December 23, 2024

Amazon Alexa unveils new technology that can mimic voices, including those of the dead

Propped atop a bedside table during this week’s Amazon tech summit, an Echo Dot was asked to complete a task: “Alexa, can Grandma finish reading me ‘The Wizard of Oz’?”

Alexa’s usually cheery voice boomed from the kid-themed, panda-faced smart speaker: “Okay!” Then, as the device began narrating a scene in which the Cowardly Lion begs for courage, Alexa’s robotic tone was replaced by a more human-sounding narrator.

“Instead of Alexa’s voice reading the book, it’s the kid’s grandmother’s voice,” Rohit Prasad, senior vice president and head scientist for Alexa AI, explained enthusiastically on Wednesday during a keynote address in Las Vegas. (Amazon founder Jeff Bezos owns The Washington Post.)

The demo was the first glimpse of Alexa’s newest feature, which, though still in development, would allow the voice assistant to replicate people’s voices from short audio clips. The goal, Prasad said, is to build greater trust with users by infusing AI with “the human attributes of empathy and affect.”

The new feature can “make [loved ones’] memories last,” Prasad said. But while the prospect of hearing a dead relative’s voice may be deeply moving, it also raises a myriad of security and ethical concerns, experts say.

“I don’t feel our world is ready for easy-to-use voice-cloning technology,” Rachel Tobac, CEO of San Francisco-based SocialProof Security, told The Washington Post. She added that such technology could be used to manipulate the public through fake audio or video clips.

“If a cybercriminal can easily and credibly clone another person’s voice with a small voice sample, they can use that sample to impersonate other individuals,” added Tobac, a cybersecurity expert. “That bad actor can then trick others into believing they are the person they are impersonating, which can lead to fraud, data loss, account takeover and more.”

There is also the danger of blurring the line between what is human and what is mechanical, said Tama Leaver, professor of internet studies at Curtin University in Australia.

“You won’t remember that you’re talking to the depths of Amazon … and its data-collection services if it’s speaking with your grandmother’s voice, or your grandfather’s, or that of a lost loved one,” Leaver said.

“It’s a bit like an episode of Black Mirror,” Leaver said, referring to the science-fiction series that depicts a technology-driven future.

Leaver added that the new Alexa feature also raises questions about consent, particularly for people who never imagined their voice would be spoken by a robotic personal assistant after they die.

“There is a real slippery slope of using deceased people’s data in ways that are creepy on one hand, but deeply unethical on the other, because they never considered those traces being used in this way,” Leaver said.

Having recently lost his grandfather, Leaver said he sympathized with the “temptation” of wanting to hear a loved one’s voice. But, he said, the possibility opens the door to consequences that society may not be ready to bear: for example, who has the rights to the little snippets people leave behind on the internet?

“If my grandfather had sent me 100 messages, do I have the right to feed that into the system? And if so, who owns it? Does Amazon then own that recording?” he asked. “Have I given up the rights to my grandfather’s voice?”

Prasad did not elaborate on such details during Wednesday’s speech. However, he posited that the ability to mimic voices was a product of “unquestionably living in the golden age of artificial intelligence, where our dreams and science fiction are becoming a reality.”

If Amazon’s demo becomes a real feature, Leaver said, people may need to start thinking about how their voices and likeness can be used after they die.

“Do I need to say in my will, ‘My voice and my pictorial history on social media is the property of my children, and they can decide whether or not they want to reanimate that in chat with me’?” Leaver asked.

“That’s a strange thing to say now. But it’s probably a question we need to have an answer to before Alexa starts talking like me tomorrow.”
