Apple’s smartwatch is getting a new AssistiveTouch feature that will let users control the wearable without ever touching its display or controls.
Using its built-in motion sensors such as the gyroscope and accelerometer, its optical heart rate sensor, and on-device machine learning, the Apple Watch can detect subtle differences in muscle movement and tendon activity.
This means users can navigate their Apple Watch with small hand gestures such as a pinch or a clench.
With these simple gestures, users can answer incoming calls and access Control Center. According to Apple, AssistiveTouch for the Apple Watch is designed to make the smartwatch easier to use for people with upper-body limb differences.
People with mobility, vision, hearing, and cognitive disabilities stand to benefit from features like AssistiveTouch for the Apple Watch.
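Apple has not said whether the gesture classifier will be exposed to developers, but the general idea of the sensing pipeline can be sketched with public frameworks. The Swift snippet below is a hypothetical illustration only: it samples Core Motion data on the watch, and a crude acceleration threshold stands in for the on-device machine-learning model Apple describes. GestureDetector, onClench, and the threshold value are invented for this sketch, not Apple API.

```swift
import CoreMotion

// Hypothetical sketch: a naive stand-in for gesture detection on watchOS.
// A real system would run a trained model over windows of motion and
// heart-sensor readings rather than a single-sample threshold.
final class GestureDetector {
    private let motion = CMMotionManager()

    /// Invoked when a clench-like motion spike is detected (invented callback).
    var onClench: (() -> Void)?

    func start() {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 50.0  // sample at 50 Hz
        motion.startDeviceMotionUpdates(to: .main) { [weak self] sample, _ in
            guard let self = self, let sample = sample else { return }
            // Clenching a fist tends to produce a brief spike in user
            // acceleration; compute its magnitude and compare to a threshold.
            let a = sample.userAcceleration
            let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
            if magnitude > 1.5 {  // arbitrary threshold, in g units
                self.onClench?()
            }
        }
    }

    func stop() {
        motion.stopDeviceMotionUpdates()
    }
}
```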
The Apple Watch is expected to get the feature “later this year.” Support for third-party eye-tracking devices is also coming to the iPad later this year, allowing users to control the tablet with their eyes.
Once the update ships, compatible MFi devices will track where a user is looking on the screen, and the pointer will follow their gaze.
An update to VoiceOver, Apple’s screen reader for people who are blind or have low vision, is also on the way. The accessibility tool will describe people, text, table data, and other objects within images. The feature can, for example, describe the position of a person or object in a picture.
Apple’s MFi hearing devices program now includes support for new bi-directional hearing aids, and Headphone Accommodations is gaining the ability to recognize audiograms, the charts that show the results of a hearing test.
Although Apple rarely discusses new features this far in advance, the preview of these powerful accessibility tools shows how deeply the Cupertino company cares about people with disabilities and special needs.
The new accessibility features will be released in the coming months, and Apple is expected to show them off at this year’s Worldwide Developers Conference (WWDC) in June when iOS 15 is announced.