Apple is letting you control your iPhone and iPad with your eyes

  • Apple announced a suite of updates for Global Accessibility Awareness Day.

  • Eye Tracking lets users control their iPhones and iPads by monitoring where they look.

  • There's also a new music experience for deaf and hard-of-hearing users.

Apple will soon let iPad and iPhone users control their devices with just their eyes.

A new AI-powered feature named Eye Tracking, unveiled Wednesday, is designed for people with physical disabilities. Apple says it's coming later this year.

The feature uses the front-facing camera for setup and calibration, and Apple says that data is kept on the device and isn't shared with the company.

Users can navigate with Dwell Control, which monitors how long the eyes stay trained on a given control, and can activate physical buttons, swipes, and other gestures with their eyes alone.

"For nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software," CEO Tim Cook said in a statement.

Eye Tracking will be included in iOS and iPadOS, with no additional hardware required, Apple says.

The tech giant rolled out a suite of changes ahead of Global Accessibility Awareness Day on Thursday.

Other updates include Music Haptics, a new music experience for deaf and hard-of-hearing users that plays taps and other vibrations along with a song's audio.

And two new features are designed for people with speech conditions: Vocal Shortcuts enables iPhone and iPad users to launch Siri shortcuts using custom phrases, while a Listen for Atypical Speech option can recognize a wider range of speech patterns.

Another new feature may help travelers who experience motion sickness.

Vehicle Motion Cues places moving dots on the edges of the screen to represent the direction a vehicle is moving. Apple says this helps reduce motion sickness by easing the "sensory conflict between what a person sees and what they feel."
