“Okay, Google. I don’t feel sufficiently creeped out today.”

Last week, Amazon hosted its re:Mars demonstration event in Las Vegas. As you would expect, the company previewed a number of new products in the works and sought to promote its increasingly advanced technology. But things took a turn for the weird (or at least weirder) when Rohit Prasad, senior vice president and head scientist of Alexa AI at Amazon, took the stage to demonstrate a new chatbot feature that will supposedly offer users a “companionship relationship.” If you think that sounds kind of creepy just from the description, hold onto your hats. It’s pretty strange. As Ars Technica reported this week, the upcoming version of the digital assistant Alexa will be able to persuasively mimic the voice of anyone – living or dead – after being “fed” as little as one minute of a recording of the person’s voice. Prasad then presented a demonstration in which a child gets his presumably deceased grandmother to read him a bedtime story.

Amazon is figuring out how to make its Alexa voice assistant deepfake the voice of anyone, dead or alive, with just a short recording. The company demoed the feature at its re:Mars conference in Las Vegas on Wednesday, using the emotional trauma of the ongoing pandemic and grief to sell interest.

Amazon’s re:Mars focuses on artificial intelligence, machine learning, robotics, and other emerging technologies, with technical experts and industry leaders taking the stage. During the second-day keynote, Rohit Prasad, senior vice president and head scientist of Alexa AI at Amazon, showed off a feature being developed for Alexa.

After noting the large number of lives lost during the pandemic, Prasad played a video demo in which a child asks Alexa, “Can grandma finish reading me Wizard of Oz?” Alexa responds, “Okay,” in her typical effeminate, robotic voice. But next, the voice of the child’s grandma comes out of the speaker to read L. Frank Baum’s tale.

Here’s the demo, starting at the point where Prasad introduces “Grandma” reading the story to her grandchild.

Supposedly, this version of Alexa will be able to absorb as little as a minute of someone’s voice, either live or from a previous recording, and then build a new voice for itself that mimics the person in question. Of course, if you can feed it your own voice or a recording of a deceased relative, it stands to reason that you could record someone else without their consent and create an audio deepfake of them. Just imagine the mischief that people could get up to.

While some people who go into it with their eyes open might find this amusing or entertaining when it arrives, it also seems highly disturbing. You may recall that we recently looked at the story of an engineer at Google who was suspended after claiming that one of the company’s own chatbots (LaMDA) had become sentient. LaMDA was only communicating through text on a screen, and yet it was able to fool one of its own engineers.

Now imagine that some less tech-savvy person, perhaps even a child, suddenly finds themselves in a conversation with their dead grandparent from beyond the grave. If this thing is even a fraction as convincing as LaMDA, the kid could be in for many years of therapy on a couch somewhere. Even worse, what if the Google engineer is eventually proven correct and the application really is becoming sentient? This would be a great trick to have up its sleeve.

The more I read the background story of the supposedly sentient chatbot at Google, the less convinced I am that it’s truly sentient. But its library of words and phrases to draw upon is so vast that it almost doesn’t matter. The program is simply that good at simulating a living consciousness. Combine that with the ability to deepfake anyone’s voice, unleash it on the public, and it’s not hard to imagine any number of very disturbing scenarios unfolding.

This new Alexa feature is not available to the public yet, and Prasad didn’t announce a release date. But he spoke of all the technological breakthroughs required to create it in the past tense, so it’s likely pretty close. I won’t be ordering one. (I don’t even use the normal Alexa now.) But if you do, I wish you luck. And say hi to your grandmother for me.
