Apple introduces innovative accessibility features that combine the power of hardware, software and machine learning

This Thursday, May 19, is Global Accessibility Awareness Day, and Apple is taking advantage of the date to announce several new features that will expand the accessibility options offered by its products.

The main new feature coming to the iPhone is the ability to detect doors, and even read any text they may have on them. It will be especially useful for users who are blind or have low vision. The feature combines the power of LiDAR, the camera, and on-device machine learning, and will be available on iPhone and iPad models with the LiDAR scanner.

The second novelty concerns the Apple Watch. Its small screen can make viewing difficult for users with low vision, so Apple has announced Apple Watch Mirroring, which lets the watch's screen be viewed and controlled directly from the iPhone.

There is also news for VoiceOver: the feature will now support 20 new languages and locales, extending it to many more non-English speakers. Additionally, live captions will arrive on iPhone, iPad, and Mac.

Another change involves Siri: users will now be able to adjust how long Siri waits before responding to a request, since the default pause may be too short for some people.

Finally, Apple also announced an update to the feature that recognizes sounds and notifies the user when it detects them. The goal is for it to recognize home alarms and doorbells, as well as sounds from appliances.

All of these features are expected to arrive later this year on iPhone, Apple Watch, iPad, and Mac.

With information from Apple
