- SignTime enables Apple’s customers to get product and store support in sign language.
- VoiceOver describes images in more detail. For example, the feature would describe this image by saying something like, “Slight right profile of the face of a person with curly brown hair smiling.”
- Background sounds help neurodiverse users to mask unwanted external noise.
- These are the background noise options.
On Wednesday, Apple announced a number of new accessibility features for iPhones, iPads and the Apple Watch. The new features and services will be rolled out in the coming days, weeks and months.
The first feature to appear is a new service called SignTime, which Apple says will launch tomorrow, May 20. SignTime allows users to communicate with Apple’s customer service (either AppleCare or Retail Customer Care) using sign language. The service will launch first in the US, UK and France, with American Sign Language, British Sign Language and French Sign Language, respectively. In addition, Apple Store customers can use SignTime to connect with an interpreter while shopping, or to get customer support without having to make an appointment in advance.
While the arrival of SignTime is just around the corner, software updates packed with features aimed at making Apple’s software and hardware more accessible to people with cognitive, mobility, hearing and vision disabilities will arrive on Apple’s platforms sometime later this year.
For example, users will be able to navigate the Apple Watch with AssistiveTouch, a feature already available on iPhone and iPad to help users who have difficulty touching the screen. On the watch, the feature uses the motion sensors and heart rate sensor to “detect subtle differences in muscle movement and tendon activity,” which serve as input for moving a cursor on the screen.
Apple also plans to roll out major improvements to its VoiceOver screen reader for blind and visually impaired users. Apple says VoiceOver will be able to read the contents of a receipt in a more useful way, and that it will describe the position of people and objects in photos. In its announcement, Apple gives the following as an example of what VoiceOver might say to describe a portion of a photo: “Slight right profile of the face of a person with curly brown hair smiling.”
iPadOS will soon also support third-party MFi eye-tracking devices, which let users with motor impairments operate the tablet with their eyes. A pointer follows the user’s gaze, and holding that gaze on a UI element for a short period of time performs an action.
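Apple hasn’t published how iPadOS will implement this, but the dwell-based selection described above is a well-known pattern. As a rough illustration only (all names and the one-second threshold are assumptions, not Apple’s API), the core logic looks something like this:

```python
import time


class DwellSelector:
    """Fires an action when the gaze pointer stays on one UI element long enough.

    This is an illustrative sketch of dwell selection, not Apple's implementation.
    """

    def __init__(self, dwell_seconds=1.0, clock=time.monotonic):
        self.dwell_seconds = dwell_seconds  # hypothetical threshold
        self.clock = clock                  # injectable for testing
        self._element = None                # element currently under the gaze
        self._since = None                  # when the gaze landed on it

    def update(self, element):
        """Feed the element under the gaze pointer each frame (None if nothing).

        Returns the element once the dwell threshold is reached, else None.
        """
        now = self.clock()
        if element != self._element:
            # Gaze moved to a different element: restart the dwell timer.
            self._element = element
            self._since = now
            return None
        if element is not None and now - self._since >= self.dwell_seconds:
            self._since = now  # reset so the action doesn't repeat every frame
            return element
        return None
```

In practice an eye tracker reports gaze many times per second, so `update` would be called on every sample; the timer resets whenever the gaze leaves the element, which is what prevents accidental activations as the pointer sweeps across the screen.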
Speaking of MFi devices, Apple is adding support for new bidirectional hearing aids, which will enable hands-free phone and FaceTime calls.
Because some users may be distracted or overwhelmed by background noises, Apple is introducing a new feature for neurodiverse users: soothing background sounds (such as ocean or rain sounds) that play throughout the operating system to mask external noise that can cause discomfort. “The sounds mix into or duck under other audio and system sounds,” Apple says.
Apple has not specified when these new features will be released, only that they will come with new software updates later this year.
List image by Samuel Axon