Apple Previews iOS 17 Accessibility Features Ahead of WWDC


Apple today previewed a range of new accessibility features for the iPhone, iPad, and Mac.


The new software features for cognitive, speech, and vision accessibility are set to arrive later this year, likely as part of iOS 17, iPadOS 17, and macOS 14.

Assistive Access

Assistive Access helps ‌iPhone‌ and ‌iPad‌ users with cognitive disabilities by distilling apps and experiences to their core features. The mode includes a customized experience for Phone and FaceTime, which are combined into a single Calls app, as well as Messages, Camera, Photos, and Music. The feature offers a distinct interface with high-contrast buttons and large text labels, as well as tools to help trusted supporters tailor the experience for the individual they support. For example, users and trusted supporters can choose between a more visual, grid-based layout for the Home Screen and apps, or a row-based layout for users who prefer text.

Live Speech and Personal Voice Advance Speech Accessibility

Live Speech on the ‌iPhone‌, ‌iPad‌, and Mac allows users to type what they want to say and have it spoken out loud during phone and ‌FaceTime‌ calls, as well as in-person conversations. Users can also save commonly used phrases to chime in quickly during conversations.

For users at risk of losing their ability to speak — such as those with a recent diagnosis of amyotrophic lateral sclerosis (ALS) or other conditions that can progressively impact speaking ability — Personal Voice is a simple and secure way to create a voice that sounds like them.
Users can create a Personal Voice by reading along with a randomized set of text prompts to record 15 minutes of audio on ‌iPhone‌ or ‌iPad‌. This speech accessibility feature uses on-device machine learning to keep users’ information private and secure, and integrates with Live Speech so users can speak with their Personal Voice when connecting with loved ones.
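Apple has not published developer documentation for Live Speech or Personal Voice alongside this preview, but the core behavior described above, speaking typed text aloud, maps onto the long-standing AVSpeechSynthesizer API. The Swift sketch below is an illustration of that pattern only, not Apple's implementation; the TypedSpeechController class and its saved-phrase list are hypothetical, and the voice selection simply falls back to the system default since Personal Voice APIs were not detailed at the time of this preview.

```swift
import AVFoundation

// Minimal sketch of typed-text speech in the spirit of Live Speech.
// Not Apple's implementation; names here are illustrative assumptions.
final class TypedSpeechController {
    private let synthesizer = AVSpeechSynthesizer()

    /// Saved phrases the user can trigger with one tap, as described above.
    var savedPhrases: [String] = ["Be right there", "Thank you", "Can you repeat that?"]

    /// Speaks the phrase the user has typed.
    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        // Use the device's current language voice; a shipped Personal Voice
        // would presumably be selectable here once Apple documents the API.
        utterance.voice = AVSpeechSynthesisVoice(language: AVSpeechSynthesisVoice.currentLanguageCode())
        synthesizer.speak(utterance)
    }

    /// Speaks one of the saved phrases, mirroring the quick-chime-in behavior.
    func speakSavedPhrase(at index: Int) {
        guard savedPhrases.indices.contains(index) else { return }
        speak(savedPhrases[index])
    }
}
```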

Other Features

For users who are blind or have low vision, Detection Mode in Magnifier offers Point and Speak, which identifies text users point toward and reads it aloud to help them interact with physical objects such as household appliances.
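Apple has not said how Point and Speak is built, but the flow it describes, recognizing text in the camera view and reading it aloud, can be approximated with the existing Vision and AVFoundation frameworks. The sketch below is a conceptual illustration under that assumption, not the Magnifier implementation; the TextReader type and its CGImage input are hypothetical.

```swift
import Vision
import AVFoundation

// Conceptual sketch: recognize text in an image, then speak it aloud.
// Not Apple's Magnifier implementation; names are illustrative assumptions.
final class TextReader {
    // Keep the synthesizer alive for the duration of speech.
    private let synthesizer = AVSpeechSynthesizer()

    func speakRecognizedText(in image: CGImage) {
        let request = VNRecognizeTextRequest { [weak self] request, error in
            guard error == nil,
                  let observations = request.results as? [VNRecognizedTextObservation] else { return }

            // Join the top candidate for each detected line of text.
            let recognized = observations
                .compactMap { $0.topCandidates(1).first?.string }
                .joined(separator: " ")
            guard !recognized.isEmpty else { return }

            // Read the recognized text aloud.
            self?.synthesizer.speak(AVSpeechUtterance(string: recognized))
        }
        request.recognitionLevel = .accurate

        let handler = VNImageRequestHandler(cgImage: image, options: [:])
        try? handler.perform([request])
    }
}
```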

More to follow…


