Curated by THEOUTPOST
On Wed, 18 Sept, 4:05 PM UTC
6 Sources
[1]
28% Of Adults Have Fallen Victim To AI Voice Scam: 'It Can Clone Your Voice In 3 Seconds And Empty Out Your Bank Account'
Criminals are using AI to replicate voices and trick people into giving away money or personal information.

Starling Bank, a UK bank, has issued a warning about a recent surge in scams that employ artificial intelligence (AI) to mimic individuals' voices. The British bank is cautioning the global community about the emergence of AI voice cloning scams. According to a press release, the bank is currently handling hundreds of such cases, and these fraudulent activities could potentially target anyone with a social media account.

Recent data released by Starling Bank shows that 28% of UK adults say they have been targeted by an AI voice cloning scam at least once in the past year. The same data suggests that criminals can now replicate a person's voice using as little as three seconds of audio.

To raise awareness about AI voice cloning scams, Starling Bank has launched the 'Safe Phrases' campaign in conjunction with the government's Stop! Think Fraud campaign.

"People regularly post content online, which has recordings of their voice, without ever imagining it's making them more vulnerable to fraudsters," said Lisa Grahame, chief information security officer at Starling Bank, in the press release.

Starling Bank's Safe Phrases campaign suggests that individuals establish a unique "Safe Phrase" known only to their close friends and family. The phrase can then be used to verify a caller's identity. If someone is contacted by a person claiming to be a friend or family member who does not know the agreed-upon Safe Phrase, they should immediately suspect a fraudulent attempt.

A reported case from Arizona, US, last year involved a woman who claimed that scammers had used AI to replicate her 15-year-old daughter's voice and demanded a $1 million ransom. A Safe Phrase could have helped the family spot the deception.

Financial fraud offences in England and Wales are on the rise as criminals employ increasingly sophisticated techniques to extort money. According to UK Finance, these offences increased by 46 per cent last year. Last year, it was also reported that fraudsters were creating fake job advertisements targeting UK job seekers, who were deceived into selling counterfeit products online while the fraudsters absconded with the profits. Additionally, Starling's research revealed that the average UK adult has been targeted by a fraud scam five times in the past twelve months.

"Scammers only need three seconds of audio to clone your voice, but it would only take a few minutes with your family and friends to create a Safe Phrase to thwart them. So it's more important than ever for people to be aware of these types of scams being perpetuated by fraudsters and how to protect themselves and their loved ones from falling victim," Grahame added.

She said: "We hope that through campaigns such as this, we can arm the public with the information they need to keep themselves safe. Simply having a Safe Phrase in place with trusted friends and family - which you never share digitally - is a quick and easy way to ensure you can verify who is on the other end of the phone."

To launch the campaign, Starling Bank has enlisted the participation of renowned actor James Nesbitt.
His voice has been cloned using AI technology, underscoring the ease with which anyone could potentially fall victim to such scams.
[2]
AI Cloning Hoax Can Copy Your Voice in 3 Seconds -- and It's Emptying Bank Accounts. Here's How to Protect Yourself.
The financial institution is encouraging people to come up with AI-proof "safe phrases" to share with loved ones.

A U.K. bank is warning the world to watch out for AI voice cloning scams. The bank said in a press release that it's dealing with hundreds of cases and the hoaxes could affect anyone with a social media account.

According to new data from Starling Bank, 28% of UK adults say they have already been targeted by an AI voice cloning scam at least once in the past year. The same data revealed that nearly half of UK adults (46%) have never heard of an AI voice cloning scam and are unaware of the danger.

"People regularly post content online, which has recordings of their voice, without ever imagining it's making them more vulnerable to fraudsters," said Lisa Grahame, chief information security officer at Starling Bank, in the press release.

The scam, powered by artificial intelligence, needs merely a snippet (only three or so seconds) of audio to convincingly duplicate a person's speech patterns. Considering many of us post much more than that on a daily basis, the scam could affect the population en masse, per CNN. Once a voice is cloned, criminals cold-call victims' loved ones to fraudulently solicit funds.

In response to the growing menace, Starling Bank recommends adopting a verification system among relatives and friends using a unique safe phrase that you only share with loved ones out loud -- not by text or email.

"We hope that through campaigns such as this, we can arm the public with the information they need to keep themselves safe," Grahame added. "Simply having a safe phrase in place with trusted friends and family -- which you never share digitally -- is a quick and easy way to ensure you can verify who is on the other end of the phone."
[3]
Warning social media videos could be exploited by scammers to clone voices
Research released by Starling Bank finds 28% of people have been targeted at least once in the past year.

Consumers have been warned that their social media videos could be exploited by scammers to clone their voices with AI and then trick their family and friends out of cash. Scammers look for videos that have been uploaded online and need only a few seconds of audio to replicate how the target talks. They then call or send voicemails to friends and family, asking them to send money urgently.

Research released by the digital lender Starling Bank found that 28% of people had been targeted by an AI voice cloning scam at least once in the past year. However, 46% of people did not even know this type of scam exists, and 8% said they would be likely to send whatever money was requested, even if they thought the call from their loved one seemed strange.

Lisa Grahame, chief information security officer at Starling Bank, said: "People regularly post content online which has recordings of their voice, without ever imagining it's making them more vulnerable to fraudsters."

The lender is now suggesting that people use a safe phrase with close friends and family to check whether a call is genuine. "Scammers only need three seconds of audio to clone your voice, but it would only take a few minutes with your family and friends to create a safe phrase to thwart them," Grahame said. "So it's more important than ever for people to be aware of these types of scams being perpetuated by fraudsters, and how to protect themselves and their loved ones from falling victim."

There is always a chance that safe words could be compromised. Anyone wary of a voice call or message could also call a trusted friend or family member to sense-check the request, or call 159 to speak directly to their bank.

The UK's cybersecurity agency said in January that AI was making it increasingly difficult to identify phishing messages, where users are tricked into handing over passwords or personal details. These increasingly sophisticated scams have even managed to dupe big international businesses. Hong Kong police began an investigation in February after an employee at an unnamed company claimed she had been duped into paying HK$200m (£20m) of her firm's money to fraudsters in a deepfake video conference call impersonating senior officers of the company. The criminal is believed to have downloaded videos in advance and then used artificial intelligence to add fake voices to the video conference.

Lord Hanson, the Home Office minister with responsibility for fraud, said: "AI presents incredible opportunities for industry, society and governments, but we must stay alert to the dangers, including AI-enabled fraud."
[4]
Starling warns of rise in voice cloning scams
The study found that over a quarter (28%) of UK adults say they have been targeted by an AI voice cloning scam at least once in the past year.

Starling says fraudsters can now use voice cloning technology to replicate a person's voice from as little as three seconds of audio, which can easily be captured from a video someone has uploaded online or to social media. Scam artists can then identify that person's family members and use the cloned voice to stage a phone call, voice message or voicemail, asking for money that is needed urgently.

In the survey, nearly 1 in 10 say they would send whatever was requested in this situation, even if they thought the call seemed strange. Despite the prevalence of this attempted fraud tactic, just 30% say they would confidently know what to look out for if they were being targeted with a voice cloning scam.

To help combat the fraudsters, Starling Bank has launched the Safe Phrases campaign, in support of the government's Stop! Think Fraud campaign, encouraging the public to agree a 'Safe Phrase' with their close friends and family that no one else knows, to allow them to verify that they are really speaking to them.

Lisa Grahame, chief information security officer at Starling Bank, comments: "People regularly post content online which has recordings of their voice, without ever imagining it's making them more vulnerable to fraudsters. Simply having a Safe Phrase in place with trusted friends and family - which you never share digitally - is a quick and easy way to ensure you can verify who is on the other end of the phone."

To launch the campaign, Starling Bank has recruited leading actor James Nesbitt to have his own voice cloned by AI technology, demonstrating just how easy it is for anyone to be scammed.

Commenting on the campaign, Nesbitt says: "I think I have a pretty distinctive voice, and it's core to my career. So to hear it cloned so accurately was a shock. You hear a lot about AI, but this experience has really opened my eyes to how advanced the technology has become, and how easy it is to be used for criminal activity if it falls into the wrong hands. I have children myself, and the thought of them being scammed in this way is really scary. I'll definitely be setting up a Safe Phrase with my own family and friends."
[5]
Money blog: Britons warned against posting their voice online - as celebrity joins AI scam campaign
The phone rings and it is the distinctive voice of Northern Irish actor James Nesbitt at the end of the line, seemingly speaking to his daughter.

"Hi Peggy, I'm on my way to a shoot and trying to send you some money for the weekend. Can you send me a picture of your card so I've got your details? Thanks hun. Bye!"

Except James Nesbitt is the one who answered the phone - and he has never said those words before.

Voice cloning scams are the latest, terrifying, use of AI and could catch out millions of Britons this year, according to new research released by Starling Bank. Voice cloning, where fraudsters use AI technology to replicate the voice of a friend or family member, can be done from as little as three seconds of audio - which can be easily captured from a video someone has uploaded online or to social media.

To launch its campaign against the scams, Starling Bank has recruited leading actor James Nesbitt to have his voice cloned by AI technology, demonstrating just how easy it is for anyone to be targeted.

The bank's Safe Phrases campaign, run in support of the government's Stop! Think Fraud campaign, encourages the public to agree on a "safe phrase" with their close friends and family that no one else knows, allowing them to verify that they are really speaking to each other and catch out scammers.

Speaking after hearing his cloned voice, Nesbitt said: "I think I have a pretty distinctive voice, and it's core to my career. So to hear it cloned so accurately was a shock. You hear a lot about AI, but this experience has really opened my eyes (and ears) to how advanced the technology has become, and how easy it is to be used for criminal activity if it falls into the wrong hands. I have children myself, and the thought of them being scammed in this way is really scary."

How bad are AI scams?

Data from Starling Bank found that more than a quarter (28%) of UK adults say they have been targeted by an AI voice cloning scam at least once in the past year. Yet nearly half (46%) have never even heard of such scams. Nearly one in ten people (8%) said they would send money if asked to in such a call, even if they thought it seemed strange. This means millions of pounds are at risk.

Lisa Grahame, chief information security officer at Starling Bank, said: "People regularly post content online which has recordings of their voice, without ever imagining it's making them more vulnerable to fraudsters."

Lord Sir David Hanson, fraud minister at the Home Office, said: "AI presents incredible opportunities for industry, society and governments but we must stay alert to the dangers, including AI-enabled fraud. As part of our commitment to working with industry and other partners, we are delighted to support initiatives such as this through the Stop! Think Fraud campaign and provide the public with practical advice about how to stay protected from this appalling crime."
[6]
PSA: AI-generated voice cloning scams are on the rise - secret code recommended - 9to5Mac
A survey by a UK bank suggests that AI-generated voice cloning scams are on the rise, with 28% of respondents claiming to have been targeted. It's recommended that people agree a secret code to guard against the possibility of being taken in ...

A voice cloning scam is when a criminal uses AI to generate a fake version of the voice of a friend or family member, claiming to be in trouble and needing money urgently. While these scams have been around for years in text form, the use of AI voice tech gives attackers the ability to fool many more people.

The Metro reports that today's AI tech can generate a convincing-sounding imitation of someone's voice using as little as three seconds of source material - and it's not hard to find social media videos with a sentence or two. A survey of over 3,000 people by Starling Bank found that voice cloning scams [are] now a widespread problem [...]

In the survey, nearly 1 in 10 (8%) say they would send whatever they needed in this situation, even if they thought the call seemed strange - potentially putting millions at risk. Despite the prevalence of this attempted fraud tactic, just 30% say they would confidently know what to look out for if they were being targeted with a voice cloning scam.

The bank recommends that people agree code phrases they will use if they ever actually do need to contact a close friend or family member for urgent assistance.
AI-powered voice cloning scams are becoming increasingly prevalent, with 28% of UK adults reporting they have been targeted in the past year. Banks and experts warn of the sophisticated techniques scammers use to exploit social media content and empty bank accounts.
In an alarming trend, artificial intelligence-powered voice cloning scams are becoming increasingly sophisticated and widespread. Recent reports indicate that 28% of UK adults have been targeted by these scams in the past year, and that a person's voice can be cloned from as little as three seconds of audio 1. This technology allows scammers to impersonate loved ones or authority figures, potentially leading to significant financial losses for unsuspecting victims.
Scammers exploit publicly available voice samples, often sourced from social media platforms like TikTok and Instagram. By using AI algorithms, they can create convincing voice clones that mimic the speech patterns and intonations of their targets 3. These cloned voices are then used in phone calls or voice messages to deceive victims into transferring money or sharing sensitive information.
The banking sector has raised serious concerns about the proliferation of these scams. Starling Bank, a UK-based digital bank, has reported a significant increase in voice cloning fraud attempts and says it is already handling hundreds of cases 4. The bank warns that the scams could affect anyone with a social media account and urges customers to be vigilant 2.
The financial impact of these scams can be devastating. Victims have reported losing substantial sums of money, with some cases involving tens of thousands of dollars. To combat this threat, experts recommend agreeing a "safe phrase" with close friends and family that is never shared digitally, treating urgent requests for money with suspicion even when the voice sounds familiar, and verifying any such call through a trusted channel, such as phoning the person back on a known number or, in the UK, calling 159 to speak directly to your bank.
As the threat of AI voice cloning scams grows, there are calls for increased regulation and technological solutions. Some experts suggest developing AI detection tools that can identify cloned voices 5. However, the rapid advancement of AI technology presents ongoing challenges for both regulators and cybersecurity professionals.
Raising public awareness about these scams is crucial. Financial institutions and cybersecurity experts are emphasizing the importance of educating consumers about the risks associated with AI voice cloning. By understanding how these scams operate, individuals can better protect themselves and their financial assets from this emerging threat.