Apple has introduced a range of new accessibility features aimed at assisting users with disabilities and impairments. These tools, set to be released later this year for iPhone, iPad, and Mac, include Assistive Access, Live Speech, Personal Voice, and more. Assistive Access caters to individuals with cognitive disabilities, offering a tailored experience with a distinct interface featuring high-contrast buttons and large text labels. Meanwhile, Live Speech aids non-speaking individuals in communication, while Personal Voice uses machine learning to create a synthesized voice that sounds like the user, benefiting those at risk of losing their ability to speak due to conditions like ALS.
To commemorate Global Accessibility Awareness Day, Apple announced these new features to empower individuals with speech, vision, and cognitive disabilities to use their devices more effectively. Assistive Access, designed for iPhone and iPad, distills apps down to their essential features and provides a custom interface with high-contrast buttons and large text labels. Phone and FaceTime have been combined into a single Calls app, which sits alongside streamlined versions of Messages, Camera, Photos, and Music, making communication and media sharing easier for users with cognitive disabilities. Apple developed Assistive Access with feedback from people with cognitive disabilities and their supporters.
Live Speech allows non-speaking users on iPhone, iPad, and Mac to type messages that are then spoken aloud during phone calls, FaceTime conversations, and in-person interactions. The feature caters to those who are unable to speak or have gradually lost their speech. Another feature, Personal Voice, enables users to create a voice that resembles their own through machine learning. Initially available to English speakers on devices with Apple silicon, this feature is particularly beneficial for individuals at risk of losing their ability to speak due to conditions like ALS.
Point and Speak is designed for people with visual impairments and uses the iPhone and iPad's camera and LiDAR scanner. Built into the Magnifier app, it reads aloud the text on physical objects that users point at, such as the button labels on home appliances. Point and Speak supports VoiceOver and can be used alongside features like People Detection, Door Detection, and Image Descriptions, helping users with visual impairments navigate their physical surroundings. The feature supports multiple languages.
Further features to be introduced later in the year include the ability for users who are deaf or hard of hearing to pair Made for iPhone hearing devices directly with a Mac and customize them from there, as well as a Voice Control guide with phonetic suggestions. Apple is also enhancing accessibility for individuals with low vision by making text size easier to adjust across various Mac apps. Moreover, users who are sensitive to rapid animations can automatically pause images with moving elements, like GIFs, in Messages and Safari.
As Apple prepares for its Worldwide Developers Conference (WWDC), where iOS 17 and iPadOS 17 are expected to be showcased, the company’s commitment to accessibility remains evident. Notably, Apple BKC and Apple Saket, retail stores in India, have been designed with accessibility in mind, featuring ample space between display tables for wheelchair navigation and braille on staircases for the visually impaired. Portable hearing loops are also available to enhance customer experiences.