The advancement of artificial intelligence isn’t slowing down: Apple iPhones will soon be able to speak in their users’ voices, the tech company announced this past Tuesday.

According to The Hill, the new iPhone feature, Personal Voice, will have users read randomized text prompts to record 15 minutes of audio, which the device then uses to generate a synthetic version of their voice. Another feature, Live Speech, will let users type what they want to say, or save commonly used phrases, for the device to speak aloud during phone calls and in-person conversations.

Apple claims it will use machine learning, a type of AI, to create the voice on the device itself rather than on external servers, so the data stays more secure and private.


The tech giant notes that these tools will help users who are speech-impaired or at risk of losing their ability to speak. For example, a man diagnosed with ALS who is losing his voice said, “If you can tell them you love them, in a voice that sounds like you, it makes all the difference in the world.” Critics, however, say the technology could pose security and privacy threats.

“There are a number of privacy concerns,” said Vahid Behzadan, a cybersecurity expert at the University of New Haven. “What if the voice model is not fully stored on your phone, but is backed up on Apple? What if your voice can be stolen by your phone to be used by others?”

Apple, for its part, has framed the features as part of its broader accessibility push. “At Apple, we’ve always believed that the best technology is technology built for everyone,” said Apple CEO Tim Cook.

CNN reported that other tech companies have also experimented with using AI to replicate a voice. Last year, Amazon said it was working on an update to its Alexa system that would allow the technology to mimic any voice, even that of a deceased family member. (The feature has not yet been released.)

In addition to the voice features, Apple announced Assistive Access, a simplified interface that combines the Phone and FaceTime apps into a single Calls app and offers streamlined versions of other popular iOS apps such as Messages, Camera, Photos and Music. The interface includes high-contrast buttons, large text labels, an option for an emoji-only keyboard and the ability to record video messages for people who may prefer visual or audio communication.

Apple is also updating its Magnifier app for users who are blind or have low vision. It will now include a detection mode to help people better interact with physical objects. The update will allow someone, for example, to point an iPhone camera at a microwave and move a finger across the keypad as the app labels and reads aloud the text on the microwave’s buttons.