Amazon is working on a feature that would let its Alexa voice assistant mimic any human voice after hearing the person speak for less than a minute. Even setting aside the feature’s potential creepiness, some are worried about the possibility of misuse.
Rohit Prasad, who oversees the Alexa team at Amazon, said the project’s aim is to “make the memories linger,” since “so many of us have lost someone we love” as a result of the pandemic.
Because Alexa can be taught to mimic a voice from previously recorded audio, the source need not be present or even alive. At a conference this week, Amazon showed a video clip of a young child asking Alexa to finish reading The Wizard of Oz. Alexa then switches voices to imitate the child’s grandmother and completes the story.
During the presentation, Prasad stated that hundreds of millions of Alexa-enabled devices in more than 70 countries across the world now send billions of queries every week in 17 different languages.
The potential for abuse is high. The programme could be used, for instance, to produce convincing deepfakes for political propaganda or disinformation operations. In 2020, fraudsters used voice-cloning technology to con a bank manager into transferring $35 million to support an acquisition that didn’t exist.
What do you think about the situation? Are you intrigued by the thought of having a “conversation” with someone who has passed away, or do you think Amazon is pushing the notion of voice cloning a little too far here?
Read our blog: Amazon Alexa devices now function with Google Nest cameras (techx.pk)