Edgar Cervantes / Android Authority
TL;DR
- Amazon has demonstrated Alexa’s ability to speak in the voice of a deceased person.
- The AI assistant can mimic a deceased person’s voice after hearing just a short recording of it.
Imagine your smart speaker suddenly talking like a dead relative. Would that comfort you, or would it just sound awful? That’s obviously subjective, but it’s unsettling to this writer.
Nonetheless, Amazon recently showed off Alexa’s ability to mimic the voice of a dead relative. Rohit Prasad, senior VP and chief scientist for Alexa, presented the feature at Amazon’s re:MARS conference in Las Vegas.
On stage, Prasad said that although AI cannot eliminate the pain of losing a loved one, “it can definitely make their memories last.” He then played a video in which a child asks, “Alexa, can Grandma finish reading me The Wizard of Oz?” After Alexa acknowledges the request, it continues reading in the voice of the child’s grandmother.
Prasad explained that engineers found a way for Alexa to speak in anyone’s voice after hearing less than a minute of recorded audio. Previously, achieving this would have required hours upon hours of recording in a studio.
“We are unquestionably living in the golden era of AI, where our dreams and science fiction are becoming a reality,” Prasad told the event’s audience. Indeed, with the advent of machine learning, AI has reached new levels of human-like behavior. However, a feature like this may not sit well with everyone.
Amazon didn’t say anything about this capability actually coming to Alexa anytime soon. The company was likely just demonstrating the potential of its digital assistant.
While such a feature could be helpful — say, when you want your kids to hear a story in your voice through a smart speaker instead of Alexa’s — some people may be uncomfortable with how easily it could be abused. It would be trivial to make Alexa mimic the voice of a dead celebrity using a clip from a publicly available interview. The possibilities are endless.
What do you think? Should AI assistants be allowed to mimic human voices, especially those of the deceased? Share your thoughts in the comments section below.