Apple Watch Mirroring and Live Captions for the hard-of-hearing community are also part of the latest features
Ahead of Global Accessibility Awareness Day, American multinational technology company Apple on Tuesday previewed several new accessibility features for people with disabilities.
The tech giant showcased a slew of software updates in a media preview, including door detection for the visually impaired, Live Captions for the deaf and hard-of-hearing community, and the Apple Watch Mirroring feature.
The features will be available later this year with software updates across Apple platforms.
Using advancements across hardware, software, and machine learning, people who are blind or have low vision can soon use their iPhone and iPad to navigate the last few feet to their destination with Door Detection.
Meanwhile, users with physical and motor disabilities who may rely on assistive features like Voice Control and Switch Control can fully control the Apple Watch from their iPhone with Apple Watch Mirroring.
The deaf and hard of hearing community can follow Live Captions on iPhone, iPad, and Mac. Apple is also expanding support for its screen reader, VoiceOver, with over 20 new languages and locales.
“Apple embeds accessibility into every aspect of our work. We are committed to designing the best products and services for everyone,” said Sarah Herrlinger, Apple’s senior director of accessibility policy and initiatives.
“We’re excited to introduce these new features, which combine innovation and creativity from teams across Apple to give users more options to use our products in ways that best suit their needs and lives,” Herrlinger added.
Door Detection is a cutting-edge navigation feature that can help users locate a door upon arriving at a new destination.
Using the feature, users can learn how far away a door is and hear its attributes described, including whether it is open or closed and, if closed, whether it can be opened by pushing, turning a knob, or pulling a handle.
Door Detection can also read signs and symbols around the door, like the room number at an office or the presence of an accessible entrance symbol.
“This new feature combines the power of LiDAR, camera, and on-device machine learning and will be available on iPhone and iPad models with the LiDAR Scanner,” stated a company press release. However, the company has said the Door Detection tool should not be relied upon in high-risk situations or emergencies, or in circumstances where a user may be harmed or injured.
Once updated, the Door Detection tool will be available in a new Detection Mode within Magnifier, Apple’s built-in app supporting blind and low vision users.
Door Detection, along with People Detection and Image Descriptions, can be used alone or simultaneously in Detection Mode, offering users with vision disabilities a go-to place with customizable tools to help navigate and access detailed descriptions of their surroundings.
“In addition to navigation tools within Magnifier, Apple Maps will offer sound and haptics feedback for VoiceOver users to identify the starting point for walking directions,” the release stated.
Door Detection works in real time by using on-device machine learning, the LiDAR Scanner, and camera. There is no data or location saved or shared, clarified Apple.
Apple Watch Mirroring helps users with upper body limb differences control their Apple Watch remotely from their paired iPhone. Users can control the Apple Watch using iPhone’s assistive features like Voice Control and Switch Control and use inputs including voice commands, sound actions, head tracking, or external Made for iPhone switches as alternatives to tapping the Apple Watch display.
Apple Watch Mirroring uses hardware and software integration, including advances built on AirPlay, to help ensure users who rely on these mobility features can benefit from unique Apple Watch apps like Blood Oxygen, Heart Rate, and Mindfulness. The feature is available on Apple Watch Series 6 and later.
Users can do even more with simple hand gestures to control Apple Watch. With new Quick Actions on Apple Watch, a double-pinch motion can answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, and start, pause, or resume a workout.
This builds on the innovative technology used in AssistiveTouch on Apple Watch, which gives users with upper body limb differences the option to control Apple Watch with gestures like a pinch or a clench without having to tap the display.
For the Deaf and hard of hearing community, Apple will introduce Live Captions on iPhone, iPad, and Mac.
Users can follow along more easily with any audio content — whether they are on a phone or FaceTime call, video conferencing or social media app, streaming media content, or having a conversation with someone next to them.
Users can also adjust the font size for ease of reading. Live Captions in FaceTime attribute auto-transcribed dialogue to call participants, making group video calls even more convenient for users with hearing disabilities.
When Live Captions are used for calls on Mac, users can type a response and have it spoken aloud in real-time to others who are part of the conversation. According to Apple, since Live Captions are generated on the device, user information stays private and secure.
Live Captions will be available in beta later this year in English (US, Canada) on iPhone 11 and later, iPad models with A12 Bionic and later, and Macs with Apple silicon.
VoiceOver, Apple’s industry-leading screen reader for blind and low vision users, will add support for more than 20 additional locales and languages, including Arabic, Bengali, Bulgarian, Catalan, Ukrainian, Tamil, Marathi, and Vietnamese.
Users can also select from dozens of new voices optimized for assistive features across languages. These new languages, locales, and voices will also be available for Speak Selection and Speak Screen accessibility features.
Additionally, VoiceOver users on Mac can use the new Text Checker tool to discover common formatting issues such as duplicative spaces or misplaced capital letters, making proofreading documents or emails even more accessible.
> With Buddy Controller, users can ask a care provider or friend to help them play a game; Buddy Controller combines any two game controllers into one, so multiple controllers can drive the input for a single player.
> With Siri Pause Time, users with speech disabilities can adjust how long Siri waits before responding to a request.
> Voice Control Spelling Mode allows users to dictate custom spellings using letter-by-letter input. It is available in English.
> Users can customize Sound Recognition to recognize sounds specific to a person’s environment, like their home’s unique alarm, doorbell, or appliances.
> The Apple Books app will offer new themes and introduce customization options such as bolding text and adjusting line, character, and word spacing for an even more accessible reading experience.
dhanusha@khaleejtimes.com