Curated by THEOUTPOST
On Thu, 13 Feb, 12:09 AM UTC
5 Sources
[1]
AI-Enhanced Romance Scams are Stealing Hearts and Bank Accounts
Be cautious not to "swipe right" into a scam as Valentine's Day approaches. Researchers at Tenable®, Inc., the exposure management company, warn that romance scams remain the biggest consumer threat today. In fact, 39% of Indians' love interests have turned out to be scammers, and romance scams are surging in the country, with a 400% increase in romance-related spam and email scams. Scammers exploit dating apps and messaging platforms to target victims looking for companionship. Unsurprisingly, scammers are now leveraging generative AI to refine their messages, making their deception more convincing than ever.

"Many of these scammers operate from overseas and don't speak fluent English," said Satnam Narang, senior staff research engineer at Tenable. "AI helps them craft sophisticated, emotionally compelling messages that make their scams more believable and harder to detect."

Romance scams affect people of all ages and backgrounds. In India in particular, scams on matrimonial websites are on the rise, and a majority of women (78%) have encountered fake profiles. Scammers most often create fake profiles using stolen photographs, gain the trust of victims and eventually manipulate them into sending money for "customs clearance of gifts" or "medical emergencies". Elderly individuals, former military personnel, and those seeking financial arrangements are among the most vulnerable to romance scams.

Scammers deploy various tactics, from impersonating service members using stolen photos to orchestrating fake "sugar mummy and daddy" schemes, luring victims into fraudulent financial transactions. Others entice victims into adult video chats that require paid registrations, generating illicit profits in the process.

The most dangerous form of romance scam today is 'romance baiting,' previously known as pig butchering. In these long-term cons, scammers establish fake relationships to build trust before convincing their victims to invest in bogus cryptocurrency or stock platforms. This method has now overtaken other romance scams in terms of prevalence and financial impact.

"People have lost their life savings to romance scams, and it's heartbreaking," said Narang. "Victims are often blamed for falling for these schemes, but these scams are highly manipulative and exploit vulnerabilities that anyone could have."

Recovering stolen funds is notoriously difficult, particularly when cryptocurrency is involved. To make matters worse, scammers often double down by targeting victims again, posing as recovery agents who promise to retrieve lost funds -- for a fee.

The best defence is scepticism. If someone you've never met in person asks for money, whether for a sudden emergency, a business opportunity, or an investment, consider it a major red flag. If you believe you've been scammed, report the crime to your local law enforcement and cybercrime authorities immediately.
[2]
Romance scams - now AI powered - are on the rise
For many years, scammers have been exploiting dating apps and messaging platforms to target victims looking for companionship. Romance scams affect people of all ages and backgrounds, but older individuals, former military personnel and those seeking financial arrangements are among the most vulnerable. In November last year, Meta said it had identified and removed more than two million accounts in 2024 connected to romance scams. Many were conducted from Southeast Asia, especially Myanmar, Cambodia, Laos and the Philippines.

Love hurts

Scammers deploy various tactics, from impersonating service members using stolen photos to orchestrating fake "sugar mummy and daddy" schemes, luring victims into fraudulent financial transactions. Others entice victims into adult video chats that require paid registrations, generating illicit profits in the process.

More recently, scammers are thought to be leveraging generative AI to refine their messages, making their deception more convincing than ever. Grammar and spelling are no longer a barrier: scammers are harnessing AI to craft sophisticated, emotionally compelling messages and dating-app profiles in a variety of languages, making their scams more believable and harder to detect.

The most dangerous form of romance scam today is 'romance baiting,' previously known as 'pig butchering' -- a translation of the Chinese phrase "Shāzhūpán" used to describe a type of financial scam. In these long-term cons, scammers establish fake relationships to build trust before convincing their victims to invest in bogus cryptocurrency or stock platforms. This method has now overtaken other romance scams in terms of prevalence and financial impact.

Romance scams: Life savings lost

People have lost their life savings to romance scams, and it's heartbreaking. In January, a French woman revealed she had been tricked out of £700,000 after falling for a scammer who convinced her they were the actor Brad Pitt. The sad reality of these schemes is that victims are often blamed for falling for the scammers' con. In truth, these scams are highly manipulative and exploit vulnerabilities that anyone could fall for.

Recovering stolen funds is notoriously difficult, particularly when cryptocurrency is involved. Historically, scammers would steal money from traditional finance (tradFi), such as banks and credit unions. TradFi offers better recourse for recovering stolen funds, whereas cryptocurrency is decentralised, making it much harder to claw back a victim's money. To make matters worse, other scammers swoop in on victims, posing as recovery agents who promise to retrieve lost funds -- for a fee.

Protect your heart and your money

The best defence against these romance-baiting scams is scepticism. If someone you've never met in person asks for money, whether for a sudden emergency, a business opportunity, or an investment, consider it a major red flag. If you believe you've been scammed, report the crime to your local law enforcement and cybercrime authorities immediately.

The writer is a senior staff research engineer at Tenable.
[3]
Don't be duped by AI cupid: Stay alert for romance scams this Valentine's
This content is contributed or sourced from third parties but has been subject to Finextra editorial review.

While Valentine's Day is a happy holiday for many, it also marks the day when the lonely are at their lowest. Just as it is destined to be full of red roses, heart-shaped candy, and cheesy greeting cards, Valentine's Day is fated to bring down the isolated and prey on the vulnerable. That makes it a prime time to raise awareness of the flurry of romance scams taking place, and to keep those susceptible lonely hearts vigilant.

Johnathan Frost, director, global advisory at BioCatch, stated: "Romance fraud often involves a form of coercive control, with the offender engaging in a pattern of abuse that uses tactics to hurt, humiliate, intimidate, exploit, isolate, and dominate the victim. Romance fraudsters traditionally created a situation where the victim felt obligated to offer them financial support. Typically, this involved designing a "crisis" that could be addressed with a loan or direct financial support."

According to the UK Finance 2024 Annual Fraud Report, romance scam payments increased 31% in 2023, with an average of ten payments per case, and are up 200% since 2020. Authorised romance scams are at their highest level since 2020, and 75% of authorised push payment (APP) fraud cases originate online. Alarmingly, after purchase scams, romance scams are the most common type of APP fraud.

Santander Breaks the Spell for lovelorn victims

Designed to protect hopeful lovers from online scams, Santander UK's Breaking the Spell team works diligently to prevent customers from falling victim to fraud. The team is based in Liverpool and handles roughly 50-100 cases daily, dealing with all sorts of cons from phishing to investment scams, but mostly romance fraud. The team speaks directly to customers who are at risk of falling for scams and APP fraud, and identifies other forms of coercion used to push them into transferring money to illicit sources. Chris Ainsley, head of fraud risk management at Santander, said the team has prevented up to £17 million in fraud since its inception in 2021. All 23 members of the team have a background in fraud investigation.

"The Breaking the Spell team became the highest tier of what we wanted, as at first we had people dealing with fraud victims, but we didn't have it all in one place. We didn't have our best team who could speak to customers all in one area with a multi-skill capability. It isn't just people in a contact center having a chat with you about your fraud. This is something where we can do a lot more with the customers. The main reason we came to the sort of realisation that this was very important was that we often knew when payments were fraudulent. We knew what was going on, even if that's as simple as the customers told us something that we know is wrong, because we know that that would never really happen."

Providing an example of a case the team was handling that day, Ainsley explained how they had flagged an unusual transaction with a customer who had spent £1,000 on gift cards at their local grocery store; when called, the customer said they intended to take photos of the gift cards and share them on WhatsApp with a person they were in a relationship with. Ainsley pointed out that the transaction itself was unusual, hence it was declined, and the call made it clear that the customer was the victim of a romance scam.
Ainsley stated that the most intensive cases could take months of conversations with the customer to tear them away from the fraudster's (emotional) clutches. "The team needs to understand how to manage a customer at a time of real vulnerability. There's a huge amount of skill that's involved with that, and also a huge amount of knowledge. This is one of the toughest jobs emotionally that you can have in the organisation. It is a tough job, and they do it excellently, every day."

Other than romance scams, the Breaking the Spell team also deals with a lot of investment scams, celebrity-fronted cryptocurrency scams, clone scams, and job scams. Ainsley stated that romance scams take up most of their time, as it is easier to debunk job scams or impersonations of the police or other authorities by speaking to the customer a few times and directing them to resources or a family member for guidance. With romance scams, however, the issue lies in people being unwilling to believe that they are being tricked, and in the need for multiple conversations to convince them of the fraud.

What romance scams are afoot?

According to Action Fraud Claims Advice, romance fraud has cost UK victims £400 million in the last five years. There were 232,429 APP fraud cases in 2023, with losses of £459.7 million, of which £287.3 million was returned to the victims. In 2023, £36.5 million was lost to romance scams, the highest volume ever reported, and the average of ten payments per case indicates that criminals convinced victims to make multiple smaller payments over a longer period.

Silvija Krupena, director of financial intelligence at RedCompass Labs, stated that pig-butchering scams, which I covered in last year's romance fraud update, are leading to major losses. The killer cocktail of crypto resurgence, economic crisis, and AI technology has created a vulnerable social environment ripe for catfishing. Frost added that cryptocurrency investment scams have become more sophisticated, deceiving victims into believing they are making returns and convincing them to slowly increase the amount they invest.

Marko Maras, CEO and founder of fraud prevention firm Trustfull, detailed: "The most successful scammers don't rush. They adopt a 'slow-burn' approach, carefully nurturing online relationships over weeks or even months before making their move. When the time is right, the ask is usually linked to a fake medical emergency, a golden investment opportunity, or a sudden family crisis." In 2023, 84% of all romance scams originated online and 63% were reimbursed.

Your catfish could be AI

The Norton Cyber Safety Insights Report found that 54% of British individuals on dating apps have engaged with suspicious profiles or messages at least once a week, and that 81% of those targeted have lost money to them, £1,008 on average. Individuals aged 40 to 60 were most targeted, though younger people were also duped through dating apps. According to Statista, 36% of 18-24 year olds admit that loneliness makes them more vulnerable, and more susceptible to risk, while dating online.

Greg Hancell, head of fraud at Lynx, noted that 11.1 million people in the UK used dating apps in 2023, a figure projected to reach 12 million by 2028. Hancell expounded: "As more people turn to digital platforms for companionship, fraudsters have capitalised on this trend, using increasingly sophisticated techniques to deceive and exploit victims.
"Additionally, the algorithmic approach leads to new people being amplified and therefore sticking out to criminals, who target people new to online dating and the apps as they are still learning what to look out for.

"A decade ago, if someone had met a romantic partner online but had never spoken to them on the phone or via video call, it would have been an immediate red flag if they asked them for money. Now, with GenAI and deepfake technology, which enable anyone to impersonate someone else across many of the ways we would normally communicate, criminals can build up trust with their victim, for example by messages, by video or even by audio. This allows scammers to establish long-term emotional connections, making their deception even more convincing."

Maras indicated that the pickup in online dating since COVID has created an environment in which fraudsters flourish, as it is easy for con artists to make false profiles and target those specifically looking for companionship; however, ID verification on dating apps is attempting to limit the catfishing potential.

Frost stated: "Just as legitimate businesses are using GenAI to boost productivity, criminals are using these technologies to scale their fraud operations. Research on large language models (LLMs) in personalised phishing attacks found that AI-generated scams achieved a staggering 54% click-through rate (CTR) -- matching human fraudsters but at a fraction of the cost. When AI was combined with human expertise, the CTR jumped even higher to 56%, demonstrating the dangerous efficiency of these tools."

Maras added: "For instance, do you know those calls from unfamiliar numbers in remote locations, where all you hear is silence when you pick up? One possible reason for them is that fraudsters could be recording and sampling your voice, feeding it into an AI-powered generator to create highly realistic deepfake audio. These fake voice notes can then be used to deceive loved ones or to trick a victim of a romance scam.

"The reality is that generative AI has become so advanced that we can no longer automatically trust what we see or hear online. As fraudsters become more sophisticated, it's crucial for individuals to exercise extra caution, and for messaging and social media platforms to implement stronger verification measures to identify and block scammers early on."

Krupena added: "With AI farming, criminals can scale these operations using AI-generated profiles and voice clones to create eerily lifelike interactions. Then there's the rise of AI chatbot 'girlfriends' or 'boyfriends'. These are designed to emotionally entangle users before they extract money through subscriptions or direct fraud. Sextortion also remains a serious and growing threat, especially for minors, who are coerced into sharing explicit content that is later weaponised for blackmail.

"Meanwhile, deepfake technology is making deception even more dangerous. Scammers can now create ultra-realistic fake videos and audio, as seen in the recent €830,000 Brad Pitt deepfake scam. These evolving tactics demonstrate how fraudsters are weaponising trust and emotions to cause lasting financial and psychological harm."

What regulation is in place to combat romance fraud?

There are AML rules and regulations in place to combat scams and fraud in the EU and the UK. The UK's Payment Systems Regulator requires financial institutions to compensate victims up to £85,000 for transactions not made via cryptocurrency or internationally.
Maras pointed to the UK Confirmation of Payee and Verification of Payee schemes as services that prevent fraud by authenticating the recipient. Frost detailed how banks are increasing their fraud capabilities to flag suspicious activity and identify when victims are unknowingly engaging in APP scams or money mule situations. He highlighted that behavioural analysis has seen progress with banks in Australia, one of which blocked AU$50 million in fraudulent payments, but more banks need to identify how customers' behaviours change over time as fraudsters socially engineer situations to control them.

Krupena emphasised that more pressure should be placed on social media to combat the scams and fraud that run rampant on their platforms: "Social media platforms like Meta must do more, as most fraud originates on Facebook and Instagram. Where are the awareness campaigns, misinformation labels, and warnings for high-risk content like dubious investment advice, marketplace scams, and fake job offers? Banks struggle to detect socially engineered scams because victims believe their payments are legitimate. True change requires a united effort from social media, banks, regulators, and law enforcement."

Keep safe from romance scams - everyone is susceptible

It can be hard to know when you are being scammed, as AI technology is now able to pose as your loved ones and fraudsters are well-trained in what they do. Their aim is to build up trust with the victim to take more money over time, so stay alert to these warning signs. Maras advised never sending money to someone you haven't met in person, and Krupena stated that if you face the misfortune of falling victim to a romance scam, don't be embarrassed: the technology is sophisticated and it could happen to anyone. Be sure to report it and seek reimbursement.

Frost concluded: "Research shows that romance fraudsters seek to strike a balance between the romantic and financial aspects of the communication, helping them hide their criminal intent. Be alert to the use of language that creates a strong emotional response and question any attempts to isolate you from sources of support such as friends and family. Romance fraud is a financially and emotionally devastating crime that uses grooming strategies that try and prevent victims from using logic or reason to protect themselves. For those who are doubtful about the relationship you are in, please seek support."
[4]
Scammers using AI to dupe the lonely looking for love
San Francisco (AFP) - Meta on Wednesday warned internet users to be wary of online acquaintances promising romance but seeking cash, as scammers use deepfakes to prey on those looking for love. "This is a new tool in the toolkit of scammers," Meta global threat disruption policy director David Agranovich told journalists during a briefing. "These scammers evolve consistently; we have to evolve to keep things right."

Detection systems in Meta's family of apps, including Instagram and WhatsApp, rely heavily on behavior patterns and technical signals rather than on imagery, meaning they can spot scammer activity despite the AI trickery, according to Agranovich. "It makes our detection and enforcement somewhat more resilient to generative AI," Agranovich said. He gave the example of a recently disrupted scheme that apparently originated in Cambodia and targeted people in Chinese and Japanese languages. Researchers at OpenAI determined that the "scam compound" seemed to be using the San Francisco artificial intelligence company's tools to generate and translate content, according to Meta.

Generative AI technology has been around for more than a year, but in recent months its use by scammers has grown significantly, "ethical hacker" and SocialProof Security chief executive Rachel Tobac said during the briefing. GenAI tools available for free from major companies allow scammers to change their faces and voices on video calls as they pretend to be someone they are not. "They can also use these deep fake bots that allow you to build a persona or place phone calls using a voice clone and a human actually doesn't even need to be involved," Tobac said. "They call them agents, but they're not being used for customer support work. They're being used for scams in an automated fashion."

Tobac urged people to be "politely paranoid" when an online acquaintance encourages a romantic connection, particularly when it leads to a request for money to deal with a supposed emergency or business opportunity.

Winter blues

The isolation and glum spirits that can come with winter weather, along with the Valentine's Day holiday, make this a time of opportunity for scammers. "We definitely see an influx of scammers preying on that loneliness in the heart of winter," Tobac said. The scammer's main goal is money, with the tactic of building trust quickly and then contriving a reason for needing cash or personal data that could be used to access financial accounts, according to Tobac. "Being politely paranoid goes a long way, and verifying people are who they say they are," Tobac said.

Scammers operate across the gamut of social apps, with Meta seeing only a portion of the activity, according to Agranovich. Last year, Meta took down more than 408,000 accounts from West African countries being used by scammers to pose as military personnel or businessmen to romance people in Australia, Britain, Europe, the United States and elsewhere, according to the tech titan. Along with taking down nefarious networks, Meta is testing facial recognition technology to check potential online imposters detected by its systems or reported by users.
[5]
Meta warns users not to fall for romance scammers posing as celebrities or military
The company has already taken down 116,000 accounts associated with the scams in 2025.

Think you might have met someone "attractive, single and successful" on Facebook or Instagram? You might want to think again, Meta says. Ahead of Valentine's Day, the company is once again warning users not to fall for romance scams. These kinds of schemes, in which scammers create fictitious identities to form online relationships with unsuspecting victims, aren't exactly new. (The FTC says that people lost more than a half billion dollars to romance scams in 2021.) But the people behind these scams are apparently persistent. Meta says that already in 2025 it has taken down more than 116,000 accounts and pages across Facebook and Instagram that were linked to romance scams. In 2024, it removed more than 408,000 such accounts.

According to Meta, these scam accounts often originate in West African countries, with scammers impersonating members of the US military or famous celebrities. In both cases, they'll claim to be "looking for love" and will strike up conversations with people on Facebook, Instagram and WhatsApp as well as other messaging platforms. Eventually, the scammer will request gift cards, crypto, or other types of payments.

Meta has taken steps to fight these types of schemes. The company said last year it would bring back facial recognition tech to address celebrity impersonation. It also works with other companies to shut down organized groups of scammers. Still, David Agranovich, director of threat disruption at Meta, noted that "scammers evolve consistently." Researchers also say that AI has made it even easier for scammers to assume convincing fictitious identities. "In the last three or four months, there's a couple of different tools that have come out where they're free, they're accessible, they're easy to use, and they allow the attacker to transform their face dynamically within the video call," Rachel Tobac, CEO of SocialProof Security, said during a call with reporters. "They can also use these deepfake bots that allow you to build a persona, place phone calls, use a voice clone and a human actually doesn't even need to be involved."
As Valentine's Day approaches, AI-enhanced romance scams are on the rise, posing a significant threat to online daters and lonely individuals. Scammers are leveraging advanced technologies to create more convincing personas and messages, leading to substantial financial losses for victims worldwide.
As Valentine's Day approaches, cybersecurity experts are warning of a surge in AI-powered romance scams. These sophisticated schemes are stealing hearts and bank accounts, with scammers leveraging generative AI to craft more convincing and emotionally compelling messages [1].
Satnam Narang, senior staff research engineer at Tenable, explains, "Many of these scammers operate from overseas and don't speak fluent English. AI helps them craft sophisticated, emotionally compelling messages that make their scams more believable and harder to detect" [1].
Romance scams are not limited to any particular region. In India, there has been a 400% increase in romance-related spam and email scams, with 39% of Indians' love interests turning out to be scammers [1]. Meta reported taking down more than 408,000 accounts from West African countries in 2024, which were being used to pose as military personnel or businessmen to romance people in Australia, Britain, Europe, and the United States [4].
While romance scams affect people of all ages and backgrounds, certain groups are particularly vulnerable: older individuals, former military personnel, and those seeking financial arrangements are among the most frequently targeted [1].
Scammers employ various tactics, including impersonating service members with stolen photos, orchestrating fake "sugar mummy and daddy" schemes, and luring victims into adult video chats that require paid registrations [1].
The most dangerous form of romance scam today is 'romance baiting,' previously known as 'pig butchering.' In these long-term cons, scammers establish fake relationships to build trust before convincing their victims to invest in bogus cryptocurrency or stock platforms [2].
Scammers are now leveraging generative AI to refine their messages and create more convincing personas. Rachel Tobac, CEO of SocialProof Security, notes, "They can also use these deep fake bots that allow you to build a persona or place phone calls using a voice clone and a human actually doesn't even need to be involved" [4].
The financial impact of these scams is significant. In the UK alone, romance fraud has cost victims £400 million in the last five years [3]. Recovering stolen funds is notoriously difficult, particularly when cryptocurrency is involved. To make matters worse, some scammers pose as recovery agents, promising to retrieve lost funds for a fee [2].
Meta and other platforms are taking steps to combat these scams. Meta has removed over 116,000 accounts and pages linked to romance scams in 2025 alone [5]. The company is also testing facial recognition technology to check potential online imposters [4].
Experts advise users to be skeptical of online acquaintances asking for money. Chris Ainsley, head of fraud risk management at Santander, emphasizes the importance of specialized teams like their "Breaking the Spell" unit, which has prevented up to £17 million in fraud since 2021 [3].
As AI continues to evolve, staying vigilant and educated about these scams becomes increasingly crucial for online safety, especially during emotionally vulnerable times like Valentine's Day.