A new voice AI tool is already being used to spoof celebrity voices

ElevenLabs, an AI voice cloning tool, is already being abused. The company is now weighing ways to curb that abuse.

ElevenLabs, an AI startup, recently released a beta version of its platform that lets users either clone an existing voice or generate entirely new synthetic voices for text-to-speech. Within days, however, the tool was being put to malicious use. The company tweeted that it is observing “a growing number of instances of abuse of voice cloning” and is considering “implementing additional security measures” in response.

Motherboard discovered posts on 4chan containing audio clips of generated voices that sound strikingly like celebrities reading or saying highly controversial material.

ElevenLabs did not specify what it meant by “abuse.” Among the clips circulating, one features a voice resembling Emma Watson reading from Mein Kampf; others contain violent, racist, homophobic, or transphobic remarks. A 4chan post collecting a large number of these audio files linked to the startup’s platform, but it is not certain that every clip was actually produced with ElevenLabs technology.

Given what has already happened with video, the emergence of audio “deepfakes” should come as no surprise. Advances in artificial intelligence and machine learning have fueled a surge in deepfake videos, particularly deepfake pornography, in which existing footage is altered to feature celebrity faces. Emma Watson’s face has been used in some of those videos as well.

The company is now considering ways to stop these abuses.

ElevenLabs is currently gathering suggestions from both experts and users on how to prevent misuse of its technology. One idea under discussion is adding extra layers of account verification before enabling voice cloning, such as requiring an ID or a payment method. The company is also considering asking users to submit a recording of themselves reading a prescribed text, to verify that they hold the rights to the voice they want to clone. Finally, the startup is weighing pulling its Voice Lab tool entirely and handling only requests it can vet manually. More to come!

We appreciate everyone who tried out our beta platform over a crazy weekend. While voice cloning is increasingly being misused, we are also seeing a rise in positive applications of the technology. We want to hear feedback and ideas from the Twitter community!
