Your iPhone Is About To Start Speaking To You In Your Own Voice
— Updated on 13 September 2023

WORDS BY Chris Singh

Notice more Apple news than usual plastered across the internet? That’s because we’re only three weeks away from Apple’s WWDC keynote, where the company is expected to announce a massive range of new features and the next major software update for the iPhone. And ahead of that event, we’ve already got word of plans for Apple to introduce a new iPhone feature called Personal Voice, which can read a set of text prompts aloud in a synthesised version of your voice.

Intended as an accessibility feature to assist users who've lost the ability to speak, or who are blind or have low vision, the iPhone's Personal Voice feature is part of a wider suite of software updates designed for cognitive, vision, hearing, and mobility accessibility. Of course, any user will be able to turn these features on, regardless of their need for them.

With the proliferation of generative AI, something like this was to be expected eventually. According to reports, Personal Voice will let users create a synthesised voice that sounds exactly like them, which they can then use to talk with friends or family members. The voice is generated through on-device machine learning to keep all information private and secure, and is trained on roughly 15 minutes of your recorded audio.

RELATED: Everything You Need To Know About The iPhone 15

The Personal Voice feature integrates with Live Speech, Apple's type-to-speak feature, which lets users type a phrase and have it spoken aloud during calls and in-person conversations. Personal Voice is basically building on that with your own voice.

You’ll need to train your iPhone to speak for you, however. The whole process should take around 15 minutes, requiring you to read a set of randomised text prompts aloud. I guess it’s a bit similar to setting up Face ID, where you have to roll your face in front of the selfie camera several times.

Personal Voice should be rolling out as part of the iOS 17 updates, which will be fleshed out in more detail during the Apple WWDC keynote. Arriving alongside the feature is Assistive Access, which can make any app or experience simpler, distilling information down to only the essentials to help users with cognitive disabilities, as well as a bundle of other accessibility features.

There are other smaller quality-of-life updates coming as part of iOS 17. Soon, users will be able to pause GIFs in Safari and Messages, adjust the rate at which Siri speaks, and use Voice Control to grab phonetic suggestions when editing any text.

“Accessibility is part of everything we do at Apple,” said the company’s Senior Director of Global Accessibility Policy and Initiatives, Sarah Herrlinger.

“These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways.”


Chris is a freelance Travel, Food, and Technology writer. He has had work published by The AU Review, Junkee Media and Australian Traveller Media and holds tertiary qualifications in Psychology and Sociology.
