Friday, May 3, 2024

Amazon unveils new Alexa technology that can mimic voices, including those of the dead




Propped atop a bedside table during this week’s Amazon tech summit, an Echo Dot was asked to complete a task: “Alexa, can Grandma finish reading me ‘The Wizard of Oz’?”

Alexa’s typically cheery voice boomed from the kid-themed smart speaker with a panda design: “Okay!” Then, as the device began narrating a scene of the Cowardly Lion begging for courage, Alexa’s robotic twang was replaced by a more human-sounding narrator.


“Instead of Alexa’s voice reading the book, it’s the kid’s grandma’s voice,” Rohit Prasad, senior vice president and head scientist of Alexa artificial intelligence, excitedly explained Wednesday during a keynote speech in Las Vegas. (Amazon founder Jeff Bezos owns The Washington Post.)

The demo was the first glimpse into Alexa’s newest feature, which — though still in development — would allow the voice assistant to replicate people’s voices from short audio clips. The goal, Prasad said, is to build greater trust with users by infusing artificial intelligence with the “human attributes of empathy and affect.”


The new feature could “make [loved ones’] memories last,” Prasad said. But while the prospect of hearing a dead relative’s voice may tug at heartstrings, it also raises a myriad of security and ethical concerns, experts said.

“I don’t feel our world is ready for user-friendly voice-cloning technology,” Rachel Tobac, chief executive of the San Francisco-based SocialProof Security, told The Washington Post. Such technology, she added, could be used to manipulate the public through fake audio or video clips.

“If a cybercriminal can easily and credibly replicate another person’s voice with a small voice sample, they can use that voice sample to impersonate other individuals,” added Tobac, a cybersecurity expert. “That bad actor can then trick others into believing they are the person they are impersonating, which can lead to fraud, data loss, account takeover and more.”


Then there’s the risk of blurring the lines between what’s human and what’s mechanical, said Tama Leaver, a professor of internet studies at Curtin University in Australia.

“You’re not going to remember that you’re talking to the depths of Amazon … and its data-harvesting services if it’s speaking with your grandmother or your grandfather’s voice or that of a lost loved one.”

“In some ways, it’s like an episode of ‘Black Mirror,’ ” Leaver said, referring to the sci-fi series envisioning a tech-themed future.


The new Alexa feature also raises questions about consent, Leaver added — particularly for people who never imagined their voice would be belted out by a robotic personal assistant after they die.

“There’s a real slippery slope there of using deceased people’s data in a way that is both just creepy on one hand, but deeply unethical on another because they’ve never considered those traces being used in that way,” Leaver said.

Having recently lost his grandfather, Leaver said he empathized with the “temptation” of wanting to hear a loved one’s voice. But the possibility opens a floodgate of implications that society might not be prepared to take on, he said — for instance, who has the rights to the little snippets people leave to the ethers of the World Wide Web?

“If my grandfather had sent me 100 messages, should I have the right to feed that into the system? And if I do, who owns it? Does Amazon then own that recording?” he asked. “Have I given up the rights to my grandfather’s voice?”

Prasad didn’t address such details during Wednesday’s speech. He did posit, however, that the ability to mimic voices was a product of “unquestionably living in the golden era of AI, where our dreams and science fiction are becoming a reality.”


Should Amazon’s demo become a real feature, Leaver said, people might need to start thinking about how their voices and likeness could be used when they die.

“Do I have to think about in my will that I need to say, ‘My voice and my pictorial history on social media is the property of my children, and they can decide whether they want to reanimate that in chat with me or not?’ ” Leaver wondered.

“That’s a weird thing to say now. But it’s probably a question that we should have an answer to before Alexa starts talking like me tomorrow,” he added.


