Apple expands accessibility features to all devices with live captions, gesture controls, and more.

What just happened? Did you know that Thursday is Global Accessibility Awareness Day? Yeah, me neither. But apparently this made-up holiday gives companies the perfect opportunity to show how inclusive they are by announcing features that make their products more accessible. Apple is marking the day by introducing several additions to its growing list of accessibility features.

Although the features aren't expected before the end of this year, Apple has revealed several additions to the accessibility settings for Mac, iPhone, iPad, and Apple Watch. While they are meant to help people with disabilities use Apple devices more easily, some are intriguing alternatives for anyone looking for more convenient input methods – especially the new gesture controls for the Apple Watch, but more on that topic in a minute.

One of the first features revealed is Door Detection, designed for people who are blind or have low vision. It uses the camera, LiDAR scanner, and on-device machine learning on newer iPhones and iPads to help them navigate buildings more easily.

When users arrive at a new location, the feature can tell them where a door is, how far away they are from it, and how it opens – whether by turning a knob, pulling a handle, or something else. It can also read signs and symbols around the door, such as room numbers or accessibility markings.

Next, Apple is developing Live Captions for the deaf and hard of hearing. Live captions aren't entirely innovative – Android devices have had a similar feature for a while – but now those with an iPhone, iPad, or Mac can get real-time captioning overlays on video calls and FaceTime. The feature can also transcribe sounds around the user.

However, two things set Apple's Live Captions apart from Android's. One is the ability to attach name tags to FaceTime speakers, making it easy to keep track of who's talking. The other, available on Mac, lets users type responses and have them read aloud in real time. This last feature could be useful for people with aphasia or others who have difficulty speaking. Unfortunately, it will only be available in English when Apple releases the beta in the US and Canada later this year.

Finally, there are some interesting Apple Watch features. The first is Apple Watch Mirroring. This setting allows people with motor control issues to operate an Apple Watch without fumbling with its small screen. It mirrors the watch to the user's iPhone using AirPlay, enabling various input methods, including voice control, head tracking, and Made for iPhone external switch controls.

Another innovative Apple Watch accessibility feature is Quick Actions. These are simple finger gestures, like touching your index finger and thumb together (a pinch), that Apple first introduced last year. The watch detects these movements as input. This year, Apple improved detection and expanded the list of things users can control with them.

For example, a single pinch moves to the next menu item, and a double pinch goes back to the previous one. Answering or rejecting a call with a simple hand gesture while driving could come in handy even for those without motor control issues. Users can also use gestures to dismiss notifications, activate the camera shutter, pause media in the Now Playing app, and control workouts. There are likely many more possibilities, but these are the specific use cases Apple mentioned.

A few more features are coming later this year, including Buddy Controller, Siri Pause Time, Voice Control Spelling Mode, and Sound Recognition. You can read about what they do in Apple's press release.
