Amazon has revealed an experimental Alexa feature that allows the AI assistant to mimic the voices of users’ dead relatives.
The company demoed the feature at its annual re:MARS conference, showing a video in which a child asks Alexa to read a bedtime story in the voice of his dead grandmother.
“As you saw in this experience, instead of Alexa’s voice reading the book, it’s the kid’s grandma’s voice,” said Rohit Prasad, Amazon’s head scientist for Alexa AI. Prasad introduced the clip by saying that adding “human attributes” to AI systems was increasingly important “in these times of the ongoing pandemic, when so many of us have lost someone we love.”
“While AI can’t eliminate that pain of loss, it can definitely make their memories last,” said Prasad.
Amazon has given no indication whether this feature will ever be made public, but says its systems can learn to imitate someone’s voice from just a single minute of recorded audio. In an age of abundant videos and voice notes, this means it’s well within the average consumer’s reach to clone the voices of loved ones — or anyone else they like.
Although this specific application is already controversial, with users on social media calling the feature “creepy” and a “monstrosity,” such AI voice mimicry has become increasingly common in recent years. These imitations are often known as “audio deepfakes” and are already regularly used in industries like podcasting, film and TV, and video games.
Many audio recording suites, for example, let users clone individual voices from their recordings. If a podcast host flubs a line, a sound engineer can edit what was said simply by typing in a new script. Replicating long stretches of seamless speech still takes considerable work, but small edits can be made in a few clicks.
The same technology has been used in film, too. Last year, it was revealed that a documentary about the life of chef Anthony Bourdain, who died in 2018, used AI to clone his voice in order to read quotes from emails he sent. Many fans were disgusted by the move, calling it “ghoulish” and “deceptive,” while others defended it as similar to other reconstructions commonly used in documentaries.
Amazon’s Prasad said the feature could enable customers to have “lasting personal relationships” with the deceased, and many people around the world are already using AI for this purpose. Some have created chatbots that imitate dead loved ones, training them on stored conversations. Adding accurate voices, or even video avatars, to these systems is entirely possible with today’s AI technology and is likely to become more widespread.
However, whether or not customers will want their dead loved ones to become digital AI puppets is another matter entirely.