In a demo, an Amazon Echo Dot was asked to complete a task: “Alexa, can Grandma finish reading me ‘The Wizard of Oz’?”

The demo was the first glimpse into Alexa’s newest feature, which — though still in development — would allow the voice assistant to replicate people’s voices from short audio clips. The goal, Prasad said, is to build greater trust with users by infusing artificial intelligence with the “human attributes of empathy and affect.” The new feature could “make [loved ones’] memories last,” Prasad said.

But while the prospect of hearing a dead relative’s voice may tug at heartstrings, it also raises a myriad of security and ethical concerns, experts said.

“If a cybercriminal can easily and credibly replicate another person’s voice with a small voice sample, they can use that voice sample to impersonate other individuals,” said Tobac, a cybersecurity expert.

“You’re not going to remember that you’re talking to the depths of Amazon … and its data-harvesting services if it’s speaking with your grandmother or your grandfather’s voice or that of a lost loved one,” Leaver said.

The new Alexa feature also raises questions about consent, Leaver added — particularly for people who never imagined their voice would be belted out by a robotic personal assistant after they die.

Having recently lost his grandfather, Leaver said he empathized with the “temptation” of wanting to hear a loved one’s voice. “Do I have to think about in my will that I need to say, ‘My voice and my pictorial history on social media is the property of my children, and they can decide whether they want to reanimate that in chat with me or not?’ ” Leaver wondered. “But it’s probably a question that we should have an answer to before Alexa starts talking like me tomorrow,” he added. (The Washington Post)





