Global Accessibility Awareness Day is coming up on May 18, and Apple is marking the occasion by showing off future updates to iPhones, iPads, and Macs aimed at improving accessibility. From AI-generated voices to simplified home screens, there are some great features on the way.
Apple announced a few new features today that are slated to arrive “later this year” — presumably in the upcoming iOS 17 and macOS 14 updates. They are aimed primarily at people with “cognitive, vision, hearing, and mobility accessibility” needs, as well as “individuals who are nonspeaking or at risk of losing their ability to speak.” Even if you aren’t in one of those groups, the new features might be useful when setting up a device for a friend or family member, or they could help with cutting down distractions (much like the current Focus mode).
First up is Assistive Access, a custom experience that simplifies the home screen and core applications on an iPhone or iPad. It offers stripped-down interfaces for Messages, Camera, Photos, and Music, with only core functions and larger text and buttons, and it combines Phone and FaceTime into a single Calls app. The home screen can be configured with a simplified grid layout or a row-based layout. It’s unclear whether third-party apps can appear in this mode, and if so, whether they can also offer minimal experiences.
Apple said in a press release, “The feature offers a distinct interface with high contrast buttons and large text labels, as well as tools to help trusted supporters tailor the experience for the individual they support. For example, for users who prefer communicating visually, Messages includes an emoji-only keyboard and the option to record a video message to share with loved ones.”
Apple also announced Live Speech, which lets people type what they want to say and have it spoken aloud during phone calls, FaceTime calls, and in-person conversations. The company is also working on Personal Voice, which records a person reading a series of provided phrases and builds an on-device AI audio model of their voice. Apple hopes it will be useful for people with conditions that progressively affect their ability to speak.
Source: Apple Newsroom