Apple is preparing a new accessibility feature known as “Personal Voice” for iOS 17.
Ahead of WWDC 2023, where Apple will present all the software news related to iOS 17, iPadOS 17, tvOS 17, watchOS 10, and macOS 14, the company has revealed some accessibility features coming to the iPhone's mobile operating system. Some of these new features will be very useful.
Among the new accessibility features that Apple plans to introduce in iOS 17, two stand out above all the others: "Live Speech" and "Personal Voice". These functions will allow users with speech difficulties to digitize their voice and turn text into spoken audio.
The "Personal Voice" feature creates a voice that sounds like you
This new function that Apple has developed for iOS 17 will let users create a digital voice that sounds just like their own. It is a powerful tool designed especially for users who are at risk of losing their ability to speak.
The "Personal Voice" feature will invite iOS 17 users to read a random selection of text prompts to produce a 15-minute audio recording on iPhone or iPad. iOS 17's on-device machine learning will then recreate the user's voice so it can read out any text. "Personal Voice" works in a way similar to virtual assistants like Siri, but on a smaller scale.
“For users at risk of losing their ability to speak – such as those diagnosed with ALS (amyotrophic lateral sclerosis) – Personal Voice is a simple and safe way to create a voice that sounds like them.”
The functionality will be available in the Accessibility section of the iOS 17 Settings app and will be accompanied, as we mentioned before, by the "Live Speech" function.
"Live Speech" will allow iOS 17 users to type any text on iPhone, iPad, and Mac during FaceTime calls and have the device speak it aloud in real time. The function will also let users save frequently used phrases so they can be played back quickly.
Creating a synthetic voice that sounds like each user from just 15 minutes of audio is an impressive feat of software engineering. Once again, Apple demonstrates its leadership in the field of accessibility by offering users with speech difficulties a simple and effective solution. What do you think about the implementation of these new features for iOS 17?