Apple recently announced powerful software features designed for people with mobility, vision, hearing, and cognitive disabilities. These next-generation technologies showcase Apple’s belief that accessibility is a human right and advance the company’s long history of delivering industry-leading features that make Apple products customizable for all users.
Later this year, with software updates across all of Apple’s operating systems, people with limb differences will be able to navigate Apple Watch using AssistiveTouch; iPad will support third-party eye-tracking hardware for easier control; and for blind and low vision communities, Apple’s industry-leading VoiceOver screen reader will get even smarter using on-device intelligence to explore objects within images. In support of neurodiversity, Apple is introducing new background sounds to help minimize distractions, and for those who are deaf or hard of hearing, Made for iPhone (MFi) will soon support new bi-directional hearing aids.
“At Apple, we’ve long felt that the world’s best technology should respond to everyone’s needs, and our teams work relentlessly to build accessibility into everything we make,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “With these new features, we’re pushing the boundaries of innovation with next-generation technologies that bring the fun and function of Apple technology to even more people — and we can’t wait to share them with our users.”
Here are just a few examples of what’s to come:
AssistiveTouch for Apple Watch
To support users with limited mobility, Apple is introducing a revolutionary new accessibility feature for Apple Watch. AssistiveTouch for watchOS allows users with upper body limb differences to enjoy the benefits of Apple Watch without ever having to touch the display or controls. Using built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, which lets users navigate a cursor on the display through a series of hand gestures, like a pinch or a clench.
Eye-Tracking Support for iPad
iPadOS will support third-party eye-tracking devices, making it possible for people to control iPad using just their eyes. Later this year, compatible MFi devices will track where a person is looking onscreen and the pointer will move to follow the person’s gaze, while extended eye contact performs an action, like a tap.
Sound Actions for Switch Control replaces physical buttons and switches with mouth sounds — such as a click, pop, or “ee” sound — for users who are non-speaking and have limited mobility.
Display and Text Size settings can be customized on an app-by-app basis for all supported apps, helping users with colorblindness or other vision challenges make the screen easier to see.
Barrier-Breaking Characters on the Apple TV App
The Apple TV app will spotlight its Barrier-Breaking Characters collection, which celebrates authentic disability representation onscreen and behind the camera. It features guest curation from creators and artists like the cast of “Best Summer Ever,” who share their favorite movies and shows in an editorial experience designed by American Pop-Op and Urban Folk artist Tennessee Loveless, known for his vibrant illustrations and colorful storytelling through the lens of his colorblindness.
Source: Apple News Release