May 19, 2022 at 3:35 am | Author: zahidmir | Technology
Global Accessibility Awareness Day is Thursday, so Apple took to its newsroom blog this week to announce several major new accessibility features headed to the iPhone, Apple Watch, iPad, and Mac.
One of the most widely used will likely be Live Captions, which is coming to iPhone, Mac, and iPad. The feature shows AI-driven, live-updating subtitles for speech coming from any audio source on the device, whether the user is “on a phone or FaceTime call, using a video conferencing or social media app, streaming media content, or having a conversation with someone next to them.”
The text (which users can resize at will) appears at the top of the screen and ticks along as the subject speaks. Additionally, Mac users will be able to type responses and have them read aloud to others on the call. Live Captions will enter public beta on supported devices (“iPhone 11 and later, iPad models with A12 Bionic and later, and Macs with Apple silicon”) later this year.
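Apple hasn’t detailed how Live Captions works under the hood. For a sense of the underlying idea, though, here is a minimal, hypothetical Swift sketch of live on-device transcription built on Apple’s public Speech framework. Unlike the system-wide Live Captions feature, it only transcribes microphone input, and the LiveTranscriber class and onCaption callback are illustrative names rather than Apple APIs.

```swift
import AVFoundation
import Speech

// A minimal sketch of live, on-device speech transcription using Apple's
// public Speech framework. This is not Apple's Live Captions implementation
// (which captions any audio source system-wide); it only transcribes the
// microphone, but it shows the same basic idea: stream audio into a
// recognizer and refresh the caption text as partial results arrive.
final class LiveTranscriber {
    private let audioEngine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private var task: SFSpeechRecognitionTask?

    func start(onCaption: @escaping (String) -> Void) throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        request.shouldReportPartialResults = true   // refresh the caption as words arrive
        request.requiresOnDeviceRecognition = true  // keep audio processing on the device

        // Feed microphone buffers into the recognition request.
        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        // Each partial result replaces the previous caption text.
        task = recognizer?.recognitionTask(with: request) { result, _ in
            if let result = result {
                onCaption(result.bestTranscription.formattedString)
            }
        }
    }
}
```

A real app would also have to request speech-recognition authorization (SFSpeechRecognizer.requestAuthorization) and microphone access, and configure an AVAudioSession before starting the engine; the sketch omits that plumbing for brevity.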
There’s also door detection. Unfortunately, it will only work on iPhones and iPads with a lidar sensor (so the iPhone 12 Pro, iPhone 13 Pro, or recent iPad Pro models), but it sounds useful for those who are blind or have low vision. It uses the device’s camera and lidar scanner, in tandem with machine learning, to identify doors and audibly tell users where the door is located, whether it’s open or closed, how it can be opened, and what writing or labeling it might have.
Door detection will join people detection and image descriptions in a new “detection mode” intended for blind or low-vision users in iOS and iPadOS. Apple’s blog post didn’t say when that feature would launch, however.
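Apple hasn’t published how door detection is implemented, either. As a rough illustration of the kinds of building blocks such a feature could combine, the hypothetical Swift sketch below uses public APIs available on lidar-equipped devices: ARKit raycasting to estimate the distance to a vertical surface ahead, Vision text recognition to read any signage on it, and speech synthesis to announce the result. The SurfaceAnnouncer class is an illustrative name, and actually classifying the surface as a door (and deciding whether it’s open or closed) would require an additional machine-learning model not shown here.

```swift
import ARKit
import AVFoundation
import Vision
import simd

// A rough sketch of building blocks a door-detection-style feature could
// combine on a lidar-equipped device. This is not Apple's implementation.
final class SurfaceAnnouncer: NSObject, ARSessionDelegate {
    let session = ARSession()
    private let speech = AVSpeechSynthesizer()

    func start() {
        let config = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh   // uses the lidar scanner
        }
        config.planeDetection = [.vertical]
        session.delegate = self
        session.run(config)
    }

    // Called for every camera frame; a real app would throttle this.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // 1. Raycast straight ahead from the camera onto a vertical surface.
        let cam = frame.camera.transform
        let origin = SIMD3<Float>(cam.columns.3.x, cam.columns.3.y, cam.columns.3.z)
        let forward = -SIMD3<Float>(cam.columns.2.x, cam.columns.2.y, cam.columns.2.z)
        let query = ARRaycastQuery(origin: origin, direction: forward,
                                   allowing: .estimatedPlane, alignment: .vertical)
        guard let hit = session.raycast(query).first else { return }
        let hitPoint = SIMD3<Float>(hit.worldTransform.columns.3.x,
                                    hit.worldTransform.columns.3.y,
                                    hit.worldTransform.columns.3.z)
        let meters = simd_distance(origin, hitPoint)

        // 2. Read any text in the frame (a room number, an "EXIT" sign, etc.).
        let textRequest = VNRecognizeTextRequest { request, _ in
            let lines = (request.results as? [VNRecognizedTextObservation])?
                .compactMap { $0.topCandidates(1).first?.string } ?? []
            let label = lines.isEmpty ? "" : ", labeled \(lines.joined(separator: " "))"

            // 3. Announce distance and label with speech synthesis.
            let message = String(format: "Surface %.1f meters ahead%@", Double(meters), label)
            self.speech.speak(AVSpeechUtterance(string: message))
        }
        try? VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
            .perform([textRequest])
    }
}
```

A production assistive feature would more likely post its result as a UIAccessibility announcement so it integrates with VoiceOver, and would run the Vision pass far less often and off the main thread.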
Other accessibility additions that Apple says are just around the corner include 20 new VoiceOver languages, new hand gestures on Apple Watch, and a feature that allows game players to receive help from a “buddy” with another game controller without disconnecting their own. Additionally, there are new Siri and Apple Books customizations meant to expand accessibility for people with disabilities, sound recognition customizations, and Apple Watch screen mirroring on the iPhone, which gives Watch users access to many accessibility features available on the iPhone but not the Watch.
Tech enthusiasts often lament that smartphones (and personal tech in general) have become stagnant, without many exciting new developments. But that couldn’t be further from the truth for many people with disabilities. Google, Apple, and numerous researchers and startups have been making significant advancements, bringing powerful new accessibility features to mobile devices.
Listing image by Apple