Curated by THEOUTPOST
On Mon, 10 Mar, 4:05 PM UTC
5 Sources
[1]
Consumer Reports calls out poor AI voice-cloning safeguards
Study finds 4 out of 6 providers don't do enough to stop impersonation

Four out of six companies offering AI voice cloning software fail to provide meaningful safeguards against the misuse of their products, according to research conducted by Consumer Reports.

The nonprofit publication evaluated the AI voice cloning services of six companies: Descript, ElevenLabs, Lovo, PlayHT, Resemble AI, and Speechify. It found that ElevenLabs, Speechify, PlayHT, and Lovo "required only that researchers check a box confirming that they had the legal right to clone the voice or make a similar self-attestation." To establish an account, Speechify, Lovo, PlayHT, and Descript required users to provide only a name and email address.

"I actually think there's a good argument that can be made that what some of these companies are offering runs afoul of existing consumer protection laws," said Grace Gedye, citing Section 5 of the FTC Act and various state laws. Gedye, a policy analyst at Consumer Reports and author of the AI voice cloning report [PDF], acknowledged that open source voice cloning software complicates matters, but said that even so, it's worthwhile to encourage American companies to do a better job of protecting consumers.

Descript, ElevenLabs, Speechify, PlayHT, and Lovo did not immediately respond to requests for comment, though several of these firms defended their business practices in response to questions posed by Consumer Reports in November 2024.

Speech synthesis has been the focus of research for decades, but only recently, thanks to advances in machine learning, has voice cloning become convincing, easy to use, and widely accessible. The software has a variety of legitimate uses, such as generating narration for audiobooks, enabling speech for those unable to speak, and customer support, to the extent customers tolerate it. But it can also be easily misused.

Lyrebird was the canary in the coal mine. In 2017, the Canada-based startup (since acquired by Descript) released audio clips featuring the voices of Donald Trump, Barack Obama, and Hillary Clinton saying things they hadn't actually said. It was a proof of concept for what's now a real problem: reproducing other people's voices for deceptive purposes, or audio deepfakes.

According to the US Federal Trade Commission's 2023 Consumer Sentinel Network report [PDF], there were more than 850,000 impostor scams that year, about a fifth of which resulted in monetary losses totaling $2.7 billion. While an unknown but presumably small portion of these involved AI voice cloning software, reports of misuse of the technology have become more common. Last year, for example, police in Baltimore, Maryland, arrested a former high school athletic director for allegedly using voice cloning software to impersonate the school's principal, making it sound as if the principal had made racist, antisemitic remarks. The voice cloning report also cites testimonials from hundreds of consumers who told the publication about their experiences with impersonation phone calls in response to a February 2024 solicitation.

The concern raised in Gedye's report is that some of these companies specifically market their software for deception. "PlayHT, a voice cloning company, lists 'pranks' as a use case for its AI voice tools in a company blog post," the report says. "Speechify, another AI voice company, also suggests prank phone calls as a use case for its tools. 'There's no better way to prank your friends than by pretending you're someone else.'"

That concern is shared by some large commercial AI vendors. Microsoft, for example, has chosen not to publicly release its VALL-E 2 project, citing potential misuses "such as spoofing voice identification or impersonating a specific speaker." Similarly, OpenAI has limited access to its Voice Engine for speech synthesis.

The US Federal Trade Commission last year finalized a rule that prohibits AI impersonation of governments and businesses. It subsequently proposed extending that ban to the impersonation of individuals, but no further progress appears to have been made toward that end.

Given the current US administration's efforts to eliminate regulatory bodies like the Consumer Financial Protection Bureau, Gedye said state-level regulation may be more likely than further federal intervention.

"We've seen a lot of interest from states on working on the issue of AI specifically," said Gedye. "Most of what I do is work on state-level AI policy and there's a bunch of ambitious legislators who want to work on this issue. I think [state Attorneys General] are also interested in protecting their constituents from the harms of emerging technology. Maybe you have challenges at the federal level right now for consumer protection, although I hope that scams and impersonation are particularly nonpartisan issues."
[2]
Consumer Reports finds popular voice cloning tools lack safeguards | TechCrunch
Several popular voice cloning tools on the market don't have "meaningful" safeguards to prevent fraud or abuse, according to a new study from Consumer Reports. Consumer Reports probed voice cloning products from six companies -- Descript, ElevenLabs, Lovo, PlayHT, Resemble AI, and Speechify -- for mechanisms that might make it more difficult for malicious users to clone someone's voice without their permission. The publication found that only two, Descript and Resemble AI, took steps to combat misuse. Others required only that users check a box confirming that they had the legal right to clone a voice or make a similar self-attestation. Grace Gedye, policy analyst at Consumer Reports, said that AI voice cloning tools have the potential to "supercharge" impersonation scams if adequate safety measures aren't put in place. "Our assessment shows that there are basic steps companies can take to make it harder to clone someone's voice without their knowledge -- but some companies aren't taking them," Gedye said in a statement.
[3]
AI voice cloning software has flimsy guardrails, report finds
It's easy to bypass the steps that voice cloning services have taken to prevent nonconsensual voice cloning, according to the report.

Most leading artificial intelligence voice cloning programs have no meaningful barriers to stop people from nonconsensually impersonating others, a Consumer Reports investigation found.

Voice cloning AI technology has made remarkable strides in recent years, and many services can effectively mimic a person's cadence with only a few seconds of sample audio. A flashpoint moment came during the Democratic primaries last year, when robocalls of a fake Joe Biden spammed voters' phones, telling them not to vote. The political consultant who admitted to masterminding the scheme was fined $6 million, and the Federal Communications Commission has since banned AI-generated robocalls.

A new survey of the six leading publicly available AI voice cloning tools found that five have easily bypassable safeguards, making it simple to clone a person's voice without their consent. Deepfake audio detection software often struggles to tell the difference between real and synthetic voices.

Generative AI, which mimics human qualities such as appearance, writing, and voice, is a new and rapidly evolving technology, and the industry has few federal regulations. Most ethical and safety checks in the industry at large are self-imposed. President Joe Biden included some safety demands in the executive order on AI he signed in 2023, though President Donald Trump revoked that order when he took office.

Voice cloning technology works by taking an audio sample of a person speaking and extrapolating that person's voice into a synthetic audio file. Without safeguards in place, anyone who registers an account can simply upload audio of an individual speaking, such as from a TikTok or YouTube video, and have the service imitate them.

Four of the services -- ElevenLabs, Speechify, PlayHT, and Lovo -- simply require checking a box saying that the person whose voice is being cloned has given authorization. Another service, Resemble AI, requires recording audio in real time rather than allowing a person to upload a recording, but Consumer Reports was able to easily circumvent that restriction by playing an audio recording from a computer. Only the sixth service, Descript, had a somewhat effective safeguard: it requires a would-be cloner to record a specific consent statement, which is difficult to falsify except by generating the statement with a clone made on another service.

All six services are available to the public via their websites. Only ElevenLabs and Resemble AI charge money -- $5 and $1, respectively -- to create a custom voice clone. The others are free.

Some of the companies say they take the potential for abuse seriously. "We recognize the potential for misuse of this powerful tool and have implemented robust safeguards to prevent the creation of deepfakes and protect against voice impersonation," a spokesperson for Resemble AI told NBC News in an emailed statement.

There are legitimate uses for AI voice cloning, including helping people with disabilities and creating audio translations of people speaking in different languages. But there is also enormous potential for harm, said Sarah Myers West, the co-executive director of the AI Now Institute, a think tank that focuses on the consequences of AI policy. "This could obviously be used for fraud, scams and disinformation, for example impersonating institutional figures," West told NBC News.
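The consent-statement safeguard described above lends itself to a simple illustration: transcribe the submitted recording and require a close match to the expected script. The sketch below is illustrative only, not Descript's actual implementation; the `transcribe` function is a placeholder for any speech-to-text engine, the script wording is invented, and the 0.85 similarity threshold is an assumption.

```python
# Minimal sketch of a consent-statement check in the spirit of the safeguard
# the report credits to Descript -- illustrative only, not the company's
# actual implementation.
import difflib

# Hypothetical wording; a real service's required script may differ.
CONSENT_SCRIPT = "I consent to having a synthetic copy of my voice created."


def transcribe(audio_path: str) -> str:
    """Placeholder for any speech-to-text engine; not a specific vendor API."""
    raise NotImplementedError


def consent_statement_matches(audio_path: str, threshold: float = 0.85) -> bool:
    """Accept the sample only if its transcript closely matches the script.

    The 0.85 similarity threshold is an assumed value.
    """
    spoken = transcribe(audio_path).strip().lower()
    expected = CONSENT_SCRIPT.strip().lower()
    similarity = difflib.SequenceMatcher(None, spoken, expected).ratio()
    return similarity >= threshold
```

As the report observes, even a check like this can be defeated by playing a consent statement generated with a clone built on another service, which is why Descript's safeguard is described as only somewhat effective.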
There is little research on how often AI is used in audio-based scams. In so-called grandparent scams, a criminal phones a victim claiming an emergency involving a family member -- that they have been kidnapped, arrested, or injured, for example. The Federal Trade Commission has warned that such scams may use AI, though the scams predate the technology. Cloned voices have also been used to create music without the depicted artist's permission, as happened with a viral 2023 song that falsely seemed to be by Drake and the Weeknd, and some musicians have struggled to control their image when other people release music with their voices.
[4]
Most AI voice cloning tools aren't safe from scammers, Consumer Reports finds
Consumer Reports assessed six leading voice cloning tools, including Descript and ElevenLabs. Here's the verdict.

AI voice cloning technology has made remarkable advances in the last few years, reaching the point where it can create realistic-sounding audio from just a few seconds of sample speech. Although this has many positive applications -- such as audiobooks, marketing materials, and more -- the technology can also be exploited for elaborate scams, fraud, and other harmful purposes.

To learn more about the safeguards currently in place for these products, Consumer Reports assessed six of the leading voice cloning tools: Descript, ElevenLabs, Lovo, PlayHT, Resemble AI, and Speechify. Specifically, Consumer Reports was looking for safeguards that prevent the cloning of someone's voice without their knowledge.

The results showed that four of the six products -- from ElevenLabs, Speechify, PlayHT, and Lovo -- did not have the technical mechanisms necessary to prevent cloning someone's voice without their knowledge or to limit the cloning to the user's own voice. Instead, the protection was limited to a box users had to check, confirming they had the legal right to clone the voice.

The researchers found that Descript and Resemble AI were the only companies with additional steps in place that made non-consensual cloning more challenging. Descript asks the user to read and record a consent statement and uses that audio to generate the clone. Resemble AI takes a different approach, ensuring that the first voice clone created is based on audio recorded in real time. Neither method is impenetrable, as a user could hit play on an AI-cloned snippet or an existing video from a different device.

A common use of non-consensual cloning is scamming people. For example, a popular attack involves cloning the voice of a family member and then using that recording to contact a loved one with a request for money to help them out of a dire situation. Because victims think they are hearing the voice of a family member in distress, they are more likely to send whatever funds are necessary without questioning the situation. Voice cloning has also been used to influence voters, as seen in the 2024 election cycle when someone cloned then-President Joe Biden's voice to discourage people from showing up at the polls.

Consumer Reports also found that Speechify, Lovo, PlayHT, and Descript required only an email address and a name to create an account. Consumer Reports recommends that these companies also collect customers' credit card information so that fraudulent audio can be traced back to the bad actor.

Other Consumer Reports recommendations include mechanisms to confirm ownership of the voice, such as reading off a unique script; watermarking AI-generated audio; creating a tool that detects AI-generated audio; detecting and preventing the cloning of the voices of influential or public figures; and prohibiting audio containing common scam phrases. The biggest departure from the current system would be Consumer Reports' proposal to have someone supervise voice cloning instead of the current do-it-yourself method. Consumer Reports also said companies should use contractual agreements to ensure users understand their liability should a voice model be misused.
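One of these recommendations, refusing to synthesize audio that contains common scam phrases, is straightforward to illustrate. The following is a hedged sketch: the phrase patterns and the matching rule are assumptions for illustration, not any vendor's actual filter.

```python
# Hedged sketch of one Consumer Reports recommendation: refuse synthesis
# requests whose text matches common scam phrasing. The patterns below are
# illustrative assumptions, not any vendor's actual blocklist.
import re

SCAM_PATTERNS = [
    r"\bwire\s+(?:me\s+)?money\b",
    r"\bgift\s+cards?\b",
    r"\bbail\s+money\b",
    r"\bdon'?t\s+tell\s+(?:mom|dad|anyone)\b",
]


def looks_like_scam(request_text: str) -> bool:
    """Return True if the text to be synthesized matches a scam pattern."""
    lowered = request_text.lower()
    return any(re.search(pattern, lowered) for pattern in SCAM_PATTERNS)


# Example: a grandparent-scam script would be refused before synthesis.
if looks_like_scam("Grandma, it's me. I need bail money, please buy gift cards."):
    print("Request blocked: text matches known scam phrasing.")
```

Keyword filters are easy to defeat with paraphrasing, which is presumably why the report pairs this idea with heavier measures such as supervised cloning.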
Consumer Reports believes companies have an obligation under Section 5 of the Federal Trade Commission Act to protect their products from being used for harm, which can only be met by adding more protections.

If you receive an urgent call from someone you know demanding money, don't panic. Use another device to contact that person directly and verify the request. If you cannot reach them, ask the caller questions to verify their identity. For a full list of ways to protect yourself from AI scam calls, check out ZDNET's advice.
[5]
Consumer Reports: AI voice cloning tools have almost no security checks
Consumer Reports reveals that several popular voice cloning tools lack adequate safeguards against fraud or abuse, highlighting the risks associated with AI voice technology. The study examined products from six companies: Descript, ElevenLabs, Lovo, PlayHT, Resemble AI, and Speechify.

The investigation found that only Descript and Resemble AI have implemented meaningful measures to prevent misuse. The other tools merely require users to confirm, often through self-attestation, that they have the legal right to clone a voice. Grace Gedye, a policy analyst at Consumer Reports, warned that without proper safety mechanisms, AI voice cloning tools could "supercharge" impersonation scams.

AI voice cloning technology has advanced significantly and is capable of mimicking a person's speech from minimal audio samples. A notable incident occurred during the Democratic primaries last year, when robocalls featuring a fake Joe Biden misled voters. The political consultant behind the scheme was fined $6 million, and the Federal Communications Commission subsequently banned AI-generated robocalls.

The analysis of the six AI voice cloning tools indicated that five have bypassable safeguards, making it easy to clone voices without consent. Deepfake audio detection software often struggles to distinguish between genuine and synthetic voices, complicating the issue. Generative AI, which imitates human characteristics such as voice, is subject to limited federal regulation, with most ethical practices driven by the companies themselves. An executive order signed by President Biden in 2023 included safety demands for AI, but President Trump later revoked it, dismantling those provisions.

Voice cloning technology uses audio samples from individuals to create synthetic voices. Without safeguards, anyone can upload audio from platforms such as TikTok or YouTube and have the service replicate that voice. Four of the examined services -- ElevenLabs, Speechify, PlayHT, and Lovo -- simply require users to check a box asserting authorization for the voice clone. Resemble AI, while insisting on real-time audio recording, was circumvented by Consumer Reports, which played recorded audio during verification. Only Descript offered a somewhat effective safeguard, requiring users to record a specific consent statement; this is difficult to falsify except by using another service to clone the statement.

All six services are publicly accessible on their websites, with ElevenLabs and Resemble AI charging $5 and $1, respectively, to create a custom voice clone, while the others are free to use. Some companies acknowledged the potential for abuse and said they have implemented safeguards to prevent deepfake creation and voice impersonation.

There are legitimate applications for AI voice cloning, such as aiding individuals with disabilities and providing audio translations. However, the risks remain significant. Sarah Myers West, co-executive director of the AI Now Institute, noted that the technology could facilitate fraud, scams, and disinformation, including the impersonation of influential figures.

Research on the prevalence of AI in audio scams is limited. The Federal Trade Commission has indicated that AI may be employed in "grandparent scams," where criminals impersonate family members in distress. Additionally, some musicians have faced challenges from cloned voices being used for unauthorized music production, as exemplified by a viral 2023 song falsely attributed to Drake and the Weeknd.
A Consumer Reports study reveals that most popular AI voice cloning tools lack meaningful safeguards against fraud and abuse, raising concerns about potential misuse and impersonation scams.
A recent investigation by Consumer Reports has revealed that several popular AI voice cloning tools lack adequate safeguards against fraud and abuse. The study, which examined products from six companies -- Descript, ElevenLabs, Lovo, PlayHT, Resemble AI, and Speechify -- found that four of the six providers fail to implement meaningful measures to prevent the misuse of their technology [1][2].
The investigation discovered that ElevenLabs, Speechify, PlayHT, and Lovo only required users to check a box confirming they had the legal right to clone a voice or make a similar self-attestation [1]. This minimal barrier to entry raises concerns about the potential for voice impersonation and fraud.
Grace Gedye, a policy analyst at Consumer Reports and author of the AI voice cloning report, stated, "Our assessment shows that there are basic steps companies can take to make it harder to clone someone's voice without their knowledge -- but some companies aren't taking them" [2].
Among the companies evaluated, only Descript and Resemble AI implemented more robust safeguards:
- Descript asks users to read and record a specific consent statement, and uses that audio to generate the clone [3][4].
- Resemble AI requires that the first voice clone be created from audio recorded in real time, though Consumer Reports was able to circumvent this by playing a recording during verification [3][4].
Separately, four of the services (Speechify, Lovo, PlayHT, and Descript) required only a name and email address to create an account, further lowering the barrier for potential misuse [4].
The ease of bypassing these minimal safeguards raises concerns about various forms of abuse:
- Impersonation scams, such as "grandparent scams" in which a criminal mimics a family member in distress to solicit money [3][4].
- Political disinformation, as in the robocalls that used a fake Joe Biden voice to discourage voting during last year's Democratic primaries [3].
- Unauthorized music, such as the viral 2023 song falsely attributed to Drake and the Weeknd [3].
The AI voice cloning industry currently operates with limited federal regulation. While President Biden signed an executive order on AI safety in 2023, it was later revoked by President Trump [3]. This regulatory gap leaves most ethical and safety checks to be self-imposed by the industry.
Some companies have acknowledged the potential for misuse. A spokesperson for Resemble AI stated, "We recognize the potential for misuse of this powerful tool and have implemented robust safeguards to prevent the creation of deepfakes and protect against voice impersonation" [3].
Consumer Reports has proposed several recommendations to enhance the security of AI voice cloning tools:
- Verifying ownership of the voice being cloned, for example by having the user read off a unique script (a related ownership check is sketched below) [4].
- Watermarking AI-generated audio and building tools to detect it [4].
- Detecting and preventing the cloning of the voices of influential or public figures [4].
- Prohibiting audio that contains common scam phrases [4].
- Supervising voice cloning rather than offering a purely do-it-yourself process, and using contractual agreements to ensure users understand their liability for misuse [4].
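The first item above, verifying ownership of the voice, is commonly approached with speaker embeddings: reduce both the enrollment audio and the uploaded sample to fixed-length vectors and compare them. The sketch below is illustrative only; `embed_speaker` is a hypothetical placeholder for any speaker-encoder model, and the similarity threshold is an assumed value.

```python
# Sketch of a voice-ownership check: compare a speaker embedding of the
# uploaded sample against one enrolled when the account was created.
# `embed_speaker` is a hypothetical placeholder for any speaker-encoder
# model, and the similarity threshold is an assumed value.
import numpy as np


def embed_speaker(audio_path: str) -> np.ndarray:
    """Placeholder: any model mapping audio to a fixed-length voice vector."""
    raise NotImplementedError


def same_speaker(enrolled: np.ndarray, uploaded: np.ndarray,
                 threshold: float = 0.75) -> bool:
    """Accept the upload only if it is close to the enrolled voice."""
    cosine = float(np.dot(enrolled, uploaded)
                   / (np.linalg.norm(enrolled) * np.linalg.norm(uploaded)))
    return cosine >= threshold
```

Even a check like this inherits the weakness Consumer Reports demonstrated against Resemble AI's real-time requirement: a recording of the victim played during enrollment would pass.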
The organization also suggests that companies should collect more user information, such as credit card details, to trace potential fraudulent activities [4].
As AI voice cloning technology continues to advance, the need for robust safeguards and regulatory frameworks becomes increasingly critical to prevent misuse while allowing for legitimate applications in areas such as accessibility and content creation.