Apple unveils eye tracking, Music Haptics, and more for iPhone in big accessibility push


Over the next year, Apple devices will have plenty of new features to make them more accessible and easier to use.

This news comes to us via a press release from Apple, just days after the company launched the new iPad Air (2024) and iPad Pro (2024). The latter is the first device to ship with Apple's new M4 chip.

These iPads will get some of the accessibility upgrades Apple announced today, but the announcement also includes new features for the iPhone and the Apple Vision Pro headset. They range from enhanced Voice Control to eye tracking to small tricks that can help reduce motion sickness when looking at an iPhone in a moving vehicle.

Apple has not yet announced a firm timeline for when these features will debut, but they are likely to arrive with the software Apple will preview at WWDC 2024 in June, when the company details this year's operating systems, including iOS 18 and iPadOS 18.

Apple has announced a smorgasbord of upgrades, many of them powered by on-device AI, so let's dig into the new accessibility features coming this year.

First up is Eye Tracking, which lets you navigate and use your iPhone or iPad with just your eyes.

It uses the front-facing camera to track where on the screen you're looking, so you can effectively press buttons or swipe with your eyes alone; no additional hardware is required beyond the iPhone or iPad itself. The feature builds on Dwell Control, an accessibility system already in place on Apple devices.

When active, Eye Tracking uses the Neural Engine built into the device to track eye movements and interpret what you're doing, so (according to Apple) your iPhone or iPad doesn't send that data anywhere.

iPads and iPhones are also getting a new Vocal Shortcuts feature that lets you assign shortcuts and other complex tasks to a wider range of vocal utterances than these devices currently support.

In addition, a "Listen for Atypical Speech" feature is on the way, allowing the device to understand "a wider range of speech" than it currently does. This could be a big win for accessibility.

The goal is to make it easier for people with conditions that affect speech, such as those recovering from a stroke or living with cerebral palsy, to control their devices with their own voices.

In addition, Apple plans to extend the existing Voice Control capabilities of these devices to support "custom vocabulary and complex words."

Apple is also upgrading Apple Music with a new Music Haptics feature that makes it easier for people who are deaf or hard of hearing to enjoy music on their iPhones.

As far as we know, this feature will only be available on the iPhone for now; it uses the phone's Taptic Engine to vibrate the phone in sync with the track you're listening to.

According to Apple, the iPhone will play "taps, textures, and refined vibrations to the audio of the music." The goal is to give people a new way to experience music on iPhone. And although Music Haptics will initially work only with Apple Music, Apple will also release an API so developers can bring the feature to their own apps.
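Apple hasn't published the Music Haptics API yet, so the exact interface is unknown, but its existing Core Haptics framework shows the kind of playback involved. Here's a minimal sketch, using only documented Core Haptics calls, of driving the Taptic Engine with a transient "tap" followed by a softer continuous "texture" of the sort a music-to-haptics feature could generate:

```swift
import CoreHaptics

// Sketch only: plays one sharp tap and one sustained vibration.
// Requires a real device with a Taptic Engine; how Apple's actual
// Music Haptics API maps audio to events is an open question here.
func playSketchPattern() throws {
    let engine = try CHHapticEngine()
    try engine.start()

    // A sharp transient event, like a drum hit.
    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
        ],
        relativeTime: 0)

    // A softer sustained vibration, like a bass note.
    let texture = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.5),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2)
        ],
        relativeTime: 0.1,
        duration: 0.5)

    let pattern = try CHHapticPattern(events: [tap, texture], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

A real implementation would presumably schedule many such events against the song's timeline, which is the heavy lifting Apple's API would handle for developers.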

There's also a new feature called Vehicle Motion Cues coming to iPhones and iPads, aimed at making them a little more comfortable to use in the car.

When Vehicle Motion Cues is enabled, Apple says animated dots that move with the vehicle's motion are overlaid on the screen. The goal is to "help reduce sensory conflict" and fight motion sickness while still letting you scroll through your Instagram feed or read that recipe you're researching in the back seat.

Apple's CarPlay software is also getting a small chunk of accessibility updates aimed at making it easier to use on the road.

Soon, CarPlay will get improved visual accessibility features, like support for color filters that can make the interface easier for colorblind people to use, as well as options for bold and larger text on screen.

A new Voice Control feature will let you navigate CarPlay by voice alone, and Sound Recognition will notify you when a car horn or siren is detected.

Those who have invested in an Apple Vision Pro will be pleased to hear that it will get some accessibility upgrades in the future, too.

Most notably, the visionOS operating system will gain a systemwide Live Captions feature that provides real-time captions for dialogue and audio. It works with apps like FaceTime as well as live conversations happening in the room with you, an impressive accessibility upgrade for people who are deaf or hard of hearing.

Vision Pro will also gain support for some iPhone-compatible devices used by people who are deaf or hard of hearing, including hearing devices and cochlear hearing processors.

In addition, the headset will get more vision accessibility features, such as options to dim flashing lights, reduce transparency, and enable Smart Invert, which reverses the colors on the display except in images, media, and some apps that use dark color styles.

At the end of the press release, Apple included a summary of additional accessibility features coming to iPhones, iPads, and Macs.

These are generally extensions to existing accessibility apps and features on the device, so for brevity, we've put Apple's brief descriptions together in an easy-to-read list.

Many of these new features will make life a little easier for current Apple users, or make Apple devices more accessible to new ones. Put together, they're a promising sign that Apple remains committed to improving the accessibility of its products.

As someone who has developed nagging hand and wrist injuries after a lifetime of using PCs, smartphones, and tablets, it's especially meaningful to me when companies keep improving and expanding the accessibility of their devices. You may not need a good voice control system or screen reader now, but that can change over time, so it's great to see features like these arrive on devices used by so many people around the world.
