Let iPhone speak for you in your own voice

We take the features of our constant companions, our smartphones, for granted and rarely stop to think what life would be like without them. Yet for people with physical or mental disabilities, a single smartphone feature can be life-changing.

Global Accessibility Awareness Day is a good moment to pause and look at some of the new inclusivity features Apple has announced. For a person with a disability, features like these can turn tasks that once seemed impossible into everyday ones.

Point and Speak

Hardly anyone would call a smartphone’s magnifier a unique selling point. For people with visual impairments, however, it can mean the difference between doing something independently and waiting for help. The humble magnifier was once not even built into phones; users had to hunt for a third-party app, which on Android often carried the risk of installing malicious software. Today, the magnifier has evolved into a tool that does far more than enlarge what is in front of the camera.

Point and Speak is coming soon to Detection Mode in Magnifier on iPhone and iPad. It identifies the text users point to and reads it aloud, helping them interact with physical objects such as household appliances. When using a microwave, for example, Point and Speak combines input from the camera, the LiDAR scanner, and on-device machine learning to announce the text on each button as the user moves a finger across the keypad.
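To make the mechanics concrete, the sketch below shows the general recipe in Swift: recognize text in a camera frame on-device with the Vision framework, then read it aloud with AVFoundation’s speech synthesizer. The TextSpeaker type and its speakText(in:near:) method are illustrative names, and the fingertip tracking and LiDAR fusion that Point and Speak actually performs are not public API; this sketch simply speaks the recognized string closest to a supplied point.

```swift
import Foundation
import CoreGraphics
import Vision
import AVFoundation

// Illustrative sketch only: on-device text recognition plus speech
// synthesis, the two public building blocks behind a Point-and-Speak-style
// feature. Apple's real implementation also fuses LiDAR depth data to
// locate the user's fingertip, which is not exposed as public API.
final class TextSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    /// Recognizes text in `image` and speaks the string whose bounding box
    /// is closest to `point` (normalized coordinates, as Vision uses).
    func speakText(in image: CGImage, near point: CGPoint) {
        let request = VNRecognizeTextRequest { [weak self] request, error in
            guard error == nil,
                  let observations = request.results as? [VNRecognizedTextObservation]
            else { return }

            // Pick the recognized string nearest to where the user is
            // "pointing" in the frame.
            let nearest = observations.min { a, b in
                distance(from: a.boundingBox, to: point) <
                distance(from: b.boundingBox, to: point)
            }
            guard let text = nearest?.topCandidates(1).first?.string else { return }

            self?.synthesizer.speak(AVSpeechUtterance(string: text))
        }
        request.recognitionLevel = .accurate  // runs on-device

        let handler = VNImageRequestHandler(cgImage: image)
        try? handler.perform([request])
    }
}

private func distance(from box: CGRect, to point: CGPoint) -> CGFloat {
    let center = CGPoint(x: box.midX, y: box.midY)
    return hypot(center.x - point.x, center.y - point.y)
}
```

Notably, Vision’s text recognition runs entirely on the device, which mirrors the privacy approach Apple describes for these features.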

Point and Speak is built into the Magnifier app on iPhone and iPad, works seamlessly with VoiceOver, and can be used alongside other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment. All of these can be found in the Magnifier app via the accessibility settings.

Live Speech and Personal Voice

One of the recent marvels of machine learning is a system’s ability to take a sample of recorded speech and mimic the characteristics of that voice as new text is typed. Apple is putting this technology to work in two upcoming features, Live Speech and Personal Voice, which will run on newer iPhones, iPads, and Macs. With Live Speech, users can type what they want to say and have it spoken aloud during phone and FaceTime calls, as well as in in-person conversations, using a synthetic voice such as Siri’s. Personal Voice goes a step further, letting users train the software to sound like themselves; reading the prompted sentences takes only about 15 minutes. Users can also save frequently used phrases to chime in quickly during conversations with family, friends, and co-workers.
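For a sense of how the typed-text-to-speech half of this works, here is a minimal Swift sketch built on Apple’s public AVSpeechSynthesizer API. The PhraseSpeaker type and its savedPhrases list are hypothetical stand-ins for the saved-phrases feature; Personal Voice itself is trained and managed by the operating system rather than reimplemented in app code, so this sketch falls back to a standard system voice.

```swift
import AVFoundation

// Illustrative sketch: type-to-speak plus saved phrases, using the public
// AVSpeechSynthesizer API. Personal Voice training happens at the system
// level and is not something an app implements itself.
final class PhraseSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    /// Hypothetical store of frequently used phrases for quick playback.
    var savedPhrases: [String] = ["Hello!", "I'm on my way.", "I love you."]

    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        // Use a standard system voice here; a user's Personal Voice, where
        // available and authorized, would be selected in its place.
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        synthesizer.speak(utterance)
    }

    func speakSavedPhrase(at index: Int) {
        guard savedPhrases.indices.contains(index) else { return }
        speak(savedPhrases[index])
    }
}
```

Apple has since exposed a way for third-party apps to request access to a user’s Personal Voice on newer OS releases, but the basic speak-what-I-type flow works the same way with any system voice.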

Live Speech and Personal Voice are designed to support people who cannot speak or who have lost their speech over time.

These features draw on advances in Apple hardware and software, with the machine learning running entirely on-device to protect user privacy.

Apple has a centralized team working out of Cupertino to develop accessibility solutions. “Accessibility is part of everything we do at Apple,” said Sarah Herrlinger, Apple’s senior director of global accessibility policy and initiatives. “These groundbreaking features were developed with feedback from members of the disability community at every step to support a diverse group of users and help people connect in new ways.”

Another feature, Assistive Access, distills apps such as Photos, Music, Messages, FaceTime, and Phone, along with other on-device experiences, down to their essential functions to lighten the cognitive load for people with cognitive disabilities. The feature provides a distinct interface with high-contrast buttons and large text labels, as well as tools that let trusted supporters tailor the experience for the person they assist. For users who prefer to communicate visually, for example, Messages offers an emoji-only keyboard and the ability to record a video message to share with loved ones.

“The community with intellectual and developmental disabilities is bursting with creativity, but technology often presents physical, visual, or knowledge barriers for these individuals,” said Katy Schmid, senior director of national program initiatives at The Arc of the United States. “Having a feature that offers a cognitively accessible experience on the iPhone or iPad means more open doors to education, employment, safety, and autonomy. It means broadening worlds and expanding potential.”

“Ultimately, the most important thing is being able to communicate with friends and family,” said Philip Green, a board member and ALS advocate at the nonprofit Team Gleason, who has experienced significant changes to his voice since his ALS diagnosis in 2018. “Being able to tell them you love them in a voice that sounds like you makes all the difference in the world, and being able to create your synthetic voice in just 15 minutes on your iPhone is extraordinary.”

The difficult question, of course, is whether users will discover and actually use these accessibility features; it is often said that users ignore about 90% of a device’s software features. “One of the things we’ve done over the years is improve the discoverability of accessibility features on the devices themselves, not only making sure accessibility sits at the top level of Settings but also building it into the setup process,” said Herrlinger. Apple Stores also host workshops and events to help users learn about these features.
