Apple has previewed software features for cognitive, vision, hearing, and mobility accessibility, along with tools for individuals who are nonspeaking or at risk of losing their ability to speak.
The company says these updates draw on advances in hardware and software, rely on on-device machine learning to protect user privacy, and extend Apple's long-standing commitment to making products accessible to everyone.
Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, says Apple works in deep collaboration with community groups representing a broad spectrum of users with disabilities to develop accessibility features that make a real impact on people’s lives.
Coming later this year, users with cognitive disabilities can use iPhone and iPad with greater ease and independence with Assistive Access; nonspeaking individuals can type to speak during calls and conversations with Live Speech; and those at risk of losing their ability to speak can use Personal Voice to create a synthesized voice that sounds like them for connecting with family and friends.
For users who are blind or have low vision, Detection Mode in Magnifier offers Point and Speak, which identifies text users point toward and reads it out loud to help them interact with physical objects such as household appliances.
Additional accessibility features are also planned for later this year.