Apple has announced several new AI-powered accessibility features for iPhone and iPad, expected to arrive later this year with iOS 18. The additions include Eye Tracking and Music Haptics, designed to give users with physical disabilities and hearing loss more intuitive, inclusive ways to interact with their devices.
Eye Tracking
One of the most notable features is Eye Tracking, which uses the front-facing camera and on-device machine learning to let users navigate their iPhone or iPad using only their eyes. Designed for individuals with physical disabilities, it offers a "built-in option for navigating iPad and iPhone with just their eyes," according to Apple. With Eye Tracking, users can move through the elements of an app and use Dwell Control to activate each one, accessing additional functions such as physical button presses, swipes, and other gestures solely with their eye movements.
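To make the Dwell Control concept concrete, here is a minimal sketch of the underlying idea in Swift: an on-screen element activates only after the gaze point has stayed within its bounds for a set dwell time. This is an illustrative model, not Apple's built-in implementation; the class name, gaze source, and default threshold are hypothetical.

```swift
import Foundation
import CoreGraphics

// Sketch of dwell-based activation, the idea behind Dwell Control:
// a target fires only after the gaze point has remained inside its
// bounds for a fixed dwell duration. Names and values are hypothetical.
final class DwellDetector {
    private let dwellDuration: TimeInterval
    private var dwellStart: Date?
    private var currentTarget: CGRect?

    init(dwellDuration: TimeInterval = 1.0) {
        self.dwellDuration = dwellDuration
    }

    /// Feed each gaze sample; returns true once a dwell completes on `target`.
    func update(gaze: CGPoint, target: CGRect, at time: Date = Date()) -> Bool {
        guard target.contains(gaze) else {
            // Gaze left the target: reset the timer.
            dwellStart = nil
            currentTarget = nil
            return false
        }
        if currentTarget != target {
            // Gaze entered a new target: start timing.
            currentTarget = target
            dwellStart = time
            return false
        }
        // Gaze has stayed on the same target: fire once the dwell time elapses.
        if let start = dwellStart, time.timeIntervalSince(start) >= dwellDuration {
            dwellStart = nil // gaze must leave and re-enter before firing again
            return true
        }
        return false
    }
}
```

In a real system the gaze samples would arrive continuously from eye-tracking hardware, with the detector consulted on every frame; the one-second default here simply mirrors the kind of adjustable dwell timing such a feature needs.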
Music Haptics
In addition to Eye Tracking, Apple is introducing Music Haptics, a new way for users who are deaf or hard of hearing to experience music. With the feature turned on, the iPhone's Taptic Engine plays taps, textures, and refined vibrations in sync with the audio, letting users feel the rhythm and beats of a track for a more immersive experience.
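For a sense of how audio-synced haptics work in code, here is a hedged sketch using Apple's existing Core Haptics framework, which plays one haptic tap per beat timestamp. This is not the Music Haptics implementation itself, which is built into iOS; the beat times below are hypothetical stand-ins for timing data derived from a track.

```swift
import Foundation
import CoreHaptics

// Illustrative sketch: play a sharp haptic transient on each beat of a song
// using Core Haptics. The beatTimes array is a hypothetical placeholder for
// real beat timestamps extracted from audio.
func playBeatHaptics() throws {
    // Core Haptics requires supported hardware (recent iPhones).
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // Hypothetical beat timestamps, in seconds from the start of playback.
    let beatTimes: [TimeInterval] = [0.0, 0.5, 1.0, 1.5, 2.0]

    // One strong, fairly sharp transient tap per beat.
    let events = beatTimes.map { t in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8),
            ],
            relativeTime: t
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```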
These AI- and machine learning-powered accessibility features reflect Apple's ongoing commitment to making technology more inclusive and accessible to all users. By integrating such functionality directly into its devices, Apple aims to ensure that individuals with disabilities can navigate and enjoy their iPhones and iPads with greater ease and independence.