8 Sources
[1]
Swipe Left on These 5 Major Dating App Red Flags
Remember The 5 Love Languages? The premise is that people perceive and show deep emotions through five distinct love languages. The names are trademarked, of course, but they boil down to doing favors, enjoying intimate moments, giving gifts, sharing kind words, and spending time together. It turns out that romance scammers often use tactics akin to a "love language" as well. But unlike the romantic behaviors in the book, a scammer's love language isn't meant to build empathy or understanding. Scammers just want their victims to be besotted and vulnerable as quickly as possible so they can steal their money or identity and move on.

So, who are the people getting scammed? How do scammers groom their victims? To find out, I asked Ashley Rose, CEO of Living Security, an online security training firm, what people can watch out for as they seek their soulmate online.

Decoding the 5 Love Languages of Romance Scammers

Rose told me that sometimes, there are warning signs that may indicate an online scam is afoot. I've boiled these signs down to five love languages that romance scammers may use when luring a victim on a dating app.

1. Acts of Speed

Scammers usually try to lock down the relationship quickly. "Scammers [are] trying to get your heart connected. They're saying, 'I love you,' and they're moving very fast in the relationship," said Rose.

2. Receiving Money

Money requests in any form are a giant red flag when it comes to online dating, especially for older victims. "Don't send money to anyone you meet online," Rose said. "That should just be across the board."

3. Requests for Information

Sometimes, your personal information is even more valuable to a scammer than cash. Rose said, "Things like your birthday, your Social Security number, your banking information -- all of that can [allow] the scammer to take on your identity."

4. Faked Photos, Videos, or Voice Calls

Generative AI tools help anyone, including online financial scammers, create believable fake profiles on dating platforms. They can use real people's photos and videos posted on social media platforms to generate new audio and photos that appear indistinguishable from reality. Scammers can even use AI to impersonate people on video calls or live video streams. In other words, be careful, because you can't know if you're being catfished or defrauded until you see your lover in the flesh.

Always verify whether the person sending you photos is the person you're speaking to online. Rose said, "If they're sending you images, you can do reverse image searches on Google. Many of these scammers are taking people's pictures from social media accounts or somewhere else online."

5. Virtual-Only Touch

Be wary of people who always have an excuse not to meet in person. "If you're trying to meet up and you set a time and date, and then, 'Oh, I have a business trip that came up,' or somebody got sick, or this happened, they're avoiding in-person meetups," Rose said, warning of common red flags.

The Most Popular Online Romance Scams in 2026

Don't feel bad when you realize you're talking to a scammer. According to new research from McAfee, 53% of American survey respondents say they've been asked for financial information or money while talking to someone they met on a dating site. These requests came via QR codes or links sent via direct messages, email, or text. Don't give them what they want! Here's a rundown of other scam attempts to watch out for on dating apps this year.

Fake, AI-Generated Photos and Profiles

McAfee's survey results reveal that 35% of Americans who use dating apps have encountered fake or AI-generated photos or profiles. Men are more likely than women both to report an AI catfish profile (27% versus 19%) and to encounter one (30% versus 20% of women surveyed).
Younger people are also in scammers' sights now. About 1 in 3 adults under 45 say they've encountered fake or AI-generated profiles while browsing dating platforms, with 21% of people aged 18 to 24 saying they encounter possible romance scammers every day. Survey respondents older than 45 said they encounter fake dating profiles far less often, though it's unclear whether they were correctly identifying fake or AI-generated photos and profiles.

Automated AI Scams

These days, scamming has less of a human touch, as would-be criminals simply outsource much of the pesky, time-consuming social engineering to AI chatbots. That's right: all a scammer needs to do is use AI to create believable, personalized dating profiles on a few platforms, and then deploy bots to keep conversations going with an unlimited number of victims. After the AI establishes trust with its suitor over days or weeks, the conversation shifts from family planning to financial planning. Suddenly, the lover will claim a medical emergency or another desperate situation, such as needing a travel visa to stay in the country. The "relationship" ends when the victim hands over the money and the chatbot stops responding. Sometimes online scammers return to their victims after the initial payment, only to request more money for different ailments or emergencies.

Phishing Links to Membership-Only Dating Sites

In the age of Raya and a plethora of other members-only dating apps catering to the most attractive, famous, and wealthy people, it's no wonder some scammers are getting in on the trend, spamming dating sites with phishing links to fraudulent platforms. In fact, one in three McAfee survey respondents said they received an invitation to an "exclusive" or "invite-only" dating app, and 14% even signed up and shared payment information.

Pig Butchering Scams

There's also the threat of the oh-so-unappetizingly-named "pig-butchering" scam. Interpol asked us to refer to these crimes as romance baiting scams, not because the other term sounds gross, but because it dehumanizes the victim, and we agree. After all, the scams are pretty dehumanizing, too.

Here's how it works: A scammer contacts their victims via a dating app, social media, or SMS and starts a conversation. After building trust, they ask the victim to download an app or visit a website to invest in cryptocurrency. You probably know what's coming next if you've read this far. The victim enters their banking details, and the criminals clean out the money. Typically, the scammer deletes their dating profile from the platform immediately, leaving the victim high and dry.

Navigate Dating Apps With Caution

These warnings may sound a bit depressing, but you can still find love and affection online. The important thing, according to Rose, is to enter any conversation or relationship with a healthy dose of skepticism and proceed slowly from there. Scammers are usually impatient and will move on, while a real match grows at its own pace. Ready to dip your toes into the dating pool? Use the above suggestions and check out our picks for the best dating apps.
[2]
A new wave of romance scams is washing across the internet - here's how to stay safe
Romance scams are among the most emotionally damaging forms of cyber crime because they combine carefully manufactured intimacy with financial theft - the scammers go after your heart, and then your wallet. Just last week, Australian police warned more than 5,000 people they may have been targeted in a large-scale romance scam linked to overseas syndicates. The scammers used common dating apps to find victims and start online relationships, then tricked their victims into buying a fake cryptocurrency.

Importantly, the romance scammers' toolkit has changed in recent years. Artificial intelligence (AI) has lowered the cost of impersonation. Convincing profile photos can be generated in minutes, affectionate conversations can be auto-generated, and "proof" of identity can now be faked through voice and video. In the lead-up to Valentine's Day, dating apps get busier. So how can we stay safe from romance scammers?

Anatomy of a romance scam

Romance scams rely on a small number of psychological levers, applied repeatedly. Finding their victims online through various platforms, romance scammers accelerate intimacy, often expressing strong feelings unusually early. Then, they isolate their target. Often, the entire romance scam quite literally follows a script and plays out like this.

First, the "dating" profile of the scammer appears highly credible. Scammers use attractive photos - increasingly AI-generated or stolen - paired with plausible personal details and consistent messaging.

Second, the scammer pushes to move the conversation off the app. WhatsApp, Telegram or text messages are pitched as more convenient or more private. This shift is key. Once the victim has been persuaded to move the communications off the dating app, they lose access to built-in safety features that could help to protect them. If they're using their real email address or phone number, this also potentially exposes more of their personal details to the scammer.
Third comes the financial request. The scammer may cite a believable excuse - travel problems, banking issues, family emergencies. But it's not always an urgent plea for help. Many scams now evolve into investment fraud, where victims are steered into fake profit-making opportunities, often involving cryptocurrency. Victims may be encouraged to invest "together" or are shown screenshots of supposed past profits. Because the scam is framed as a shared future rather than a request for cash, it can go unrecognised.

It's harder to tell who's a real person

AI strengthens these tactics by making the scams much easier to scale up. Automated tools allow scammers to maintain frequent, emotionally warm conversations across multiple victims with minimal effort.

For years, video calls functioned as an informal identity check. If you could see someone talk and respond in real time, you could feel confident you were talking to a real person. Now, generative AI-powered deepfakes - artificial video or audio designed to imitate a person - are increasingly accessible to scammers. A simple face-swapping or voice-cloning tool can be persuasive over a short call. The scammer only needs enough plausibility to move a conversation past doubt. When the victim is already emotionally invested, they ignore red flags more easily.

How can you stay safe online?

While AI makes romance scams more convincing, effective defences do exist. You can still date online safely - as long as you stay vigilant and follow some easy steps to verify the people you engage with.

Slowing the relationship down remains one of the strongest ways of protecting yourself. If you spend more time talking to the person, there's a chance some inconsistencies will surface. Besides, scammers get tired quickly.

Keep conversations on the dating platform for longer. Don't cave in to early pressure to move off-platform, and treat such pressure as a potential red flag. Make sure you identify the person across different platforms.
Use reverse-image searches, which can expose stolen or synthetic photos. A genuine person usually has a broader, consistent digital footprint beyond a single curated profile.

Treat investment advice or requests for money as a bright-red flag. This is the most important advice. If someone you have never met in person begins steering you toward cryptocurrency, trading platforms or guaranteed returns, disengage.

Never send intimate images to someone you haven't met and verified. Financial scams can also quickly pivot to blackmail.

If you have already transferred money, acting quickly matters. Contact your bank immediately and report the incident to Scamwatch or ReportCyber. Early reporting can reduce losses and help authorities disrupt larger networks.

Remember that romance scammers are highly skilled at appearing trustworthy, so "trusting your gut" or relying on your feelings won't necessarily help you. As generative AI tools proliferate, verifying what's real online is getting harder. So take things slowly, check details in different places, and - by far the most important step - avoid anything that turns a romance into a money request, no matter how infatuated you might be.
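Reverse-image search works because copies of a photo, even lightly edited ones, produce near-identical "fingerprints." A minimal pure-Python sketch of one such fingerprint, a difference hash, is shown below; the 9x8 pixel grid and the `dhash`/`hamming_distance` helpers are illustrative assumptions, not any real service's implementation. Services like Google Images or TinEye use far more robust features.

```python
# Simplified illustration of how reverse-image matching works:
# a "difference hash" reduces an image to a short bit fingerprint,
# and similar images yield similar fingerprints. Real services use
# far more robust features; this is only a sketch of the idea.

def dhash(pixels):
    """Compute a difference hash from a grayscale pixel grid.

    `pixels` is a list of rows, each a list of 0-255 brightness
    values, assumed already resized to 9 columns x 8 rows.
    Each bit records whether a pixel is brighter than its right
    neighbor, so the hash is 8 x 8 = 64 bits long.
    """
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

# A tiny 9x8 synthetic "image" and a uniformly brightened copy of it.
original = [[(x * 7 + y * 13) % 256 for x in range(9)] for y in range(8)]
brightened = [[min(p + 10, 255) for p in row] for row in original]

# Uniform brightening preserves which neighbor is brighter, so the
# fingerprints match exactly even though every pixel value differs.
print(hamming_distance(dhash(original), dhash(brightened)))  # prints 0
```

This is why scammers' stolen photos are often discoverable: cropping or re-compressing an image changes its bytes, but barely changes a perceptual fingerprint like this one.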
[3]
4 romance scams to watch out for this V-Day -- including AI grifts
Driving the news: The FBI warned this week against romance scams, flagging that "criminals are also exploiting generative AI to commit fraud on a larger scale."

* FBI field offices in cities like San Francisco, Jacksonville and Philadelphia put out separate notices outlining local fraud incidents and how to avoid them.

By the numbers: Americans in 2025 reported $1.16 billion stolen in romance scams, according to FTC data.

* BioCatch, a startup that specializes in fraud and financial crime prevention, told Axios that 340 financial institutions reported a 63% uptick in romance scam attempts between 2024 and 2025.

What they're saying: "Evolving criminal tactics have driven this surge, leveraging the capabilities of GenAI tools to create deepfake personas, hyper-personalize convincing outreach, and exploit social media and dating apps, fueling the fraud epidemic even further," Jonathan Frost, BioCatch's director of Global Advisory, said.

Here are four scams to watch out for:

Celebrity impersonation

Zoom in: AI is making it easier for scammers to pose as celebrities in order to swindle their starstruck victims.

* The Better Business Bureau's Scam Tracker "has received numerous reports involving products supposedly endorsed by well-known and trusted celebrities," the nonprofit said in December.

* In one example, "consumers reported receiving emails from what appears to be Kim Kardashian asking them to send her money to help the victims of the California wildfires," per the BBB.

* There have been multiple cases of scammers posing as Kevin Costner, Sandra Bullock, and Jennifer Aniston.

Frost said, "Fraudsters use AI-generated personas, recruit 'models' to enhance their deception, and even deploy deepfake videos and live video calls to build trust with victims."

"Pig butchering" schemes

"Pig butchering," also known as "romance baiting," involves fraudsters encouraging victims to make investments by posing as successful crypto investors.
* "This scam makes fraud detection more difficult, as victims believe they are engaging in legitimate investing rather than being duped," Frost said.

The FBI's San Antonio office cautioned that these scams often begin on dating or social media platforms.

* "Victims are directed to professional-looking websites controlled by criminals that falsely display significant profits," the field office said.

Tragedy scams

The FTC said in 2023 the most-cited reason scammers gave when extorting money from their victims was: "I or someone close to me is sick, hurt, or in jail."

* FBI's Boston office said that in one case, a woman from Casco, Maine, lost $20,000 after starting an online romance with a man from Cuba who claimed he needed the money to return to the U.S.

"Worker abroad" scams

The FTC warned that scammers typically say they can't meet you in person.

* "They might say they're living or traveling outside the country, working on an oil rig, in the military, or working with an international organization."

* The agency said, "I'm in the military far away" was the third-most cited scam in 2023.

An elderly woman in Scotland was duped out of £17,000 by scammers using deepfake AI technology, Frost noted.

* "They convinced her to buy Steam gift cards, as the scammer highlighted that these would allow for their conversation to continue as they worked on an oil rig in the North Sea."

How to prevent and report scams

Zoom out: The FBI field offices urged people to "take a beat" when escalating online romances.

* The Boston office recommended only using "reputable, nationally recognized dating websites," keeping in mind that scammers may be using those too.

* "Research photos and profiles in other online search tools to see if the image, name, or details have been used elsewhere."

* The bureau's full list of advice can be found here.

How it works: If you suspect you're the victim of a romance scam, you can report it to the FBI and FTC.
The bottom line: Look out for more than just Cupid's arrow this V-Day.
[4]
Dating online this Valentine's Day? Here's how to spot an AI romance scam
Valentine's Day is around the corner, and cybersecurity experts warn that the season of love is also the peak season for romance scams. As people swipe, match, and open up online on dating apps and social media, scammers are using AI to scale fake relationships with speed and precision. Such romance scams are spreading fast, and artificial intelligence is making them far more convincing than the awkward catfishing attempts of the past. What once felt easy to spot now looks thoughtful, emotionally tuned, and unsettlingly real. For victims, the damage is not just emotional. It is often financial, and sometimes life-altering.

Why romance scams are exploding

Romance scams have quietly become one of the most costly forms of online fraud. According to the US Federal Trade Commission, reported losses to romance scams totalled $1.14 billion in 2023, making them one of the most financially damaging scam categories tracked by the agency. Regulators say those losses continue to rise as scams become more sophisticated and harder to detect due to the use of AI.

What sets today's scams apart is how quickly trust is built. Instead of manually crafting messages, scammers now use large language models (LLMs) to generate emotionally engaging conversations at scale. Messages feel attentive and deeply personal, even when they are sent to hundreds of people at once. Many victims do not realise anything is wrong until money enters the conversation.

How AI is changing the scam playbook

With the help of AI, scammers now adapt in real time, sending messages based on a target's tone, interests, or vulnerabilities. Text-generation tools help mirror language and emotion, making conversations feel natural and responsive. Voice cloning has added another layer of realism. With short audio samples, scammers can recreate a person's voice to send convincing voice notes or calls.
Some groups are also experimenting with deepfake video chats, using AI-generated visuals or prerecorded footage to simulate live interaction. The result is a scam that feels authentic enough to bypass warning signs people once relied on.

The emotional hook and the money trap

Cybersecurity experts say most romance scams follow a familiar pattern. Darius Kingsley, Head of Consumer Business Practices at Chase Bank, says, "Online romantic acquaintances may approach unsuspecting victims to lure them in, either through friendly texts or on dating apps, then request money or propose an investment opportunity."

Here is a practical checklist of warning signs you should look out for:

- The person claims to live or work far away, often overseas.
- Their profile looks unusually perfect or professionally curated.
- The relationship escalates very quickly with intense emotional language.
- Promises to meet in person are repeatedly delayed or cancelled.
- Conversations shift toward money, investments, or financial emergencies.
- Cryptocurrency or forex trading is mentioned early or framed as urgent.
- You are pressured to use a specific payment method.

How to protect yourself before it's too late

Experts agree that prevention starts with slowing down, because romance scams rely on urgency, secrecy, and emotional pressure. According to Steve Grobman, Senior Vice President and Chief Technology Officer at McAfee, "a healthy dose of skepticism, combined with using the right tools to protect your privacy, identity, and personal information, is a good place to start."

Here is a checklist to protect yourself from romance scams:

- Take relationships slowly and be cautious of intense emotional bonding early on.
- Verify identities by meeting them in person.
- Ask the person to turn their head fully or wave a hand in front of their face during a video call. Many AI deepfake tools struggle with sudden or exaggerated movements.
- Watch for visual glitches: faces can briefly distort, freeze, or lose alignment when the software fails to track motion correctly. Be aware that advanced deepfakes may still pass these tests.
- Never send money, gifts, or cryptocurrency to someone you have not met in person.
- Reverse-image search profile photos to check for reused images.
- Talk to friends or family about new online relationships.
- Lock down social media privacy settings to limit misuse of your data.

When people are emotionally invested, inconsistencies are easier to overlook. That is why scammers often discourage victims from discussing the relationship with friends or family, framing outside concern as jealousy or misunderstanding. John Clay, Vice President of Threat Intelligence at Trend Micro, notes that scammers use urgency and isolation to bypass rational thinking, which is why the outside perspectives of trusted family and friends are so important.

A caution for anyone dating online right now

Romance scams are not just evolving; they are becoming harder to recognise, with AI quietly doing much of the work behind the scenes. Valentine's Day simply amplifies the risk as people seek connection and lower their guard. While the technology driving these scams is changing fast, the core advice remains unchanged. Take your time. Verify who you are talking to. Be wary of the moment money, investments, or urgency enter the conversation, particularly if you have never met in person. In an age where algorithms can convincingly simulate care and intimacy, trusting your instincts may still be the most important safeguard you have.
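The money, crypto, urgency, and meeting-avoidance red flags in the checklists above can be made concrete with a toy message scanner. This is only an illustrative sketch (the `RED_FLAG_TERMS` table and `flag_message` helper are hypothetical, not any platform's real system); production dating platforms rely on trained classifiers and behavioral signals, and simple keyword matching like this is easily evaded.

```python
# Toy illustration of flagging messages that match common romance-scam
# warning signs. The categories mirror the checklist: money requests,
# crypto/investment talk, manufactured urgency, and excuses to avoid
# meeting in person.

RED_FLAG_TERMS = {
    "money": ["wire transfer", "gift card", "send money", "western union"],
    "crypto": ["bitcoin", "crypto", "forex", "guaranteed returns"],
    "urgency": ["emergency", "urgent", "right now", "before it's too late"],
    "avoidance": ["business trip", "oil rig", "overseas", "can't meet"],
}

def flag_message(text):
    """Return the sorted list of warning-sign categories a message matches."""
    lowered = text.lower()
    return sorted(
        category
        for category, terms in RED_FLAG_TERMS.items()
        if any(term in lowered for term in terms)
    )

msg = ("Darling, there's been an emergency on the oil rig. "
       "Please send money by wire transfer.")
print(flag_message(msg))  # ['avoidance', 'money', 'urgency']
```

The point of the sketch is the layering: no single phrase proves a scam, but a message that trips several categories at once is exactly the pattern the experts quoted above describe.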
[5]
Is it love? Or is it an AI romance scam?
Shayna Korol is a fellow at Future Perfect, Vox's section on making the world a better place. She reports on emerging technology, biosecurity, and human and animal health.

Happy Valentine's Day. Don't let romance scams -- which ramp up around the holiday and are at an all-time high -- break your heart. These scams cost Americans $3 billion last year alone. That's almost certainly an undercount, given victims' particular reluctance to report that they've fallen for such ruses.

Many romance scams fall under the umbrella of so-called "pig-butchering" scams, in which fraudsters build relationships with and gain the trust of victims over long periods of time. The moniker is a crude reference to fattening up a pig before the slaughter -- and they go for the whole hog, repeatedly attempting to extract money from the target. Between 2020 and 2024, these scams defrauded more than $75 billion from people around the world.

Now, AI is making these scams increasingly accessible, affordable, and profitable for scammers. In the past, romance scammers had to have a strong grasp of the English language if they wanted to effectively scam Americans. According to Fred Heiding, a postdoctoral researcher at the Harvard Kennedy School who studies AI and cybersecurity, AI-enabled translation has completely removed that roadblock -- and scammers now have millions more potential victims at their disposal.

AI is fundamentally changing the scale of the problem, serving as a force multiplier for scammers. A single person who used to manage a few scams at a time can use these toolkits to run 20 or more simultaneously, Chris Nyhuis, the founder of cybersecurity firm Vigilant, told me over email. AI-assisted scams are significantly more profitable than traditional ones, and they're increasingly cheap and easy to run. On the dark web, fraudsters can purchase romance scam toolkits complete with customer support, user reviews, and tiered pricing packages.
These toolkits come with pre-built fake personas with AI-generated photosets, conversation scripts for each stage of the scam, and deepfake video tools, Nyhuis told me. "The skill barrier to entry is essentially gone."

I wondered if romance scammers might automate themselves out of a job, but the Kennedy School's Heiding told me that "oftentimes it's just augmentation, rather than complete automation." Many of the scammers are also victims themselves, with at least 220,000 people trapped in scam centers in Southeast Asia and forced to defraud targets, facing terrible abuse if they refuse. Leveraging AI means "the crime syndicates [who run these centers] will probably just have better profit margins," Heiding said. For now, there's a human being behind the scenes of the scams, even if they're just pressing start on an AI agent. But apart from that, the process can be fully automated.

At the moment, Heiding told me, AI isn't much better than human romance scammers, but the technology evolves rapidly. In 2016, Google DeepMind's AlphaGo decisively beat one of the world's best human Go players, and forecasters expect AI to keep surpassing human abilities sooner than predicted. "I wouldn't be surprised [if] within a few years or a decade, we have AI scammers that are just thinking in completely different patterns than humans," Heiding said. "And unfortunately, they probably will be really, really good at persuading us."

Romance scams are unique: They target a core human need for love and connection. You may have heard that we're in a loneliness epidemic, officially declared by the US Surgeon General in 2023, with health risks on par with smoking up to 15 cigarettes a day. Social isolation is linked to higher rates of heart disease, dementia, depression, and even premature death - and reportedly, 1 in 6 people worldwide are lonely. And lonely people make for prime targets.

Fraudsters send out initial AI-generated messages to prospective victims.
Over time, they use lovebombing techniques to convince targets that they are in a romantic relationship. Once trust is established, they request money through methods that are difficult to reverse, like gift cards, wire transfers, or cryptocurrency. They will often invent crises that require urgent transfers. They might ghost the victim after reaching their goals, or continue the scam to squeeze more out of them.

AI romance scams use deepfake video calls, "cheap fake" social media profiles, and voice cloning technology, like other AI-enabled scams, to draw people in. But according to Nyhuis, they're "uniquely dangerous because of what they exploit. Phishing uses urgency; tech support scams use fear. Romance scams use love, which can make people think irrationally or overlook their gut feeling that something is wrong."

Older adults often experience social isolation and are frequently targeted by romance scammers. Retirement and bereavement can create circumstances that scammers deliberately manipulate, making victims feel seen and cared for, even as they steal their life savings and the homes where they planned to spend their retirement years. But anyone can be deceived by these scams. Despite being digital natives, Gen Z is three times more vulnerable to online scams than older generations since they spend so much time online, although they tend to have -- and therefore lose -- less money than older victims.

Here's something else that will break your heart: Scam victims are more likely to be targeted again. Scammers create profiles of their targets, sometimes adding them to "sucker lists" shared across criminal networks. Victims of other crimes are also more likely to be revictimized, and falling prey to a romance scam isn't a moral failing on the part of the target. But it is something to be on guard against, since the vast majority of scam victims will not be able to get their money back.
About 15 percent of Americans have lost money to online romance scams, and only 1 in 4 were able to recover all the stolen funds. Romance scams thrive on shame and secrecy. Victims are sometimes blackmailed and told that if they confide in people in their lives, the scammers will expose sensitive information.

Sanchari Das, an assistant professor and AI researcher at George Mason University, and Ruba Abu-Salma, a senior lecturer in computer science at King's College London, received a Google Academic Research Award to study AI-powered romance scams targeting older adults in 13 countries. Their research examines how AI tools can amplify traditional scam tactics and how families and communities can better support victims. The researchers are building connections with gerontological societies and aim to build educational tools to support AI romance scam victims. There's a fair amount of information already out there about prevention, but very little directing victims on what to do next.

Like so many people, I met my partner online. I'm grateful that we started dating in the late 2010s, before the explosion of AI-generated profiles on apps and dating sites. AI is getting better at tricking people across the board. It has massively improved at rendering hands, a formerly reliable tell for deepfakes, and it learns from its mistakes.

"As these technologies improve, traditional signals for spotting manipulation are no longer dependable," Das said. "At the same time, we are leveraging AI to counter these threats by detecting scam patterns, forecasting emerging tactics, and strengthening protective responses. The goal is to build systems and communities that are as adaptive as the technology itself."

Society is also getting increasingly desensitized to AI romance. One study found that almost a third of Americans have had an intimate or romantic relationship with an AI chatbot.
The 2013 movie Her, in which a man falls in love with an AI voiced by Scarlett Johansson, was set in 2025. It wasn't too far off the mark. AI chatbots are purposefully designed to keep people engaged. Many use a "freemium" model, in which basic services don't cost anything, but longer conversations and more personalized interactions come at a premium. Some "companion bots" are designed to make users form deep connections. Even though users know that their "significant other" is AI, these companion bot apps sell user data for targeted advertising and aren't transparent about their privacy policies. Is that not also a sort of intimacy scam, a way to extract resources from lonely people for as long as possible?

There are steps you can take to protect your heart, wallet, and peace of mind. It seems obvious, but refusing to send money to someone you haven't met in person will stop a romance scam in its tracks. You can demand spontaneous video calls and ask the person on the other end to do something random; deepfakes still struggle with "unscripted" actions.

"Be suspicious of anyone you've never met in person -- that's the only safe approach in a digital world increasingly filled with scams," Konstantin Levinzon, the co-founder of free VPN service provider PlanetVPN, said in a press release. "If someone you meet on a dating site seems suspicious, perform a reverse image search to check if their pictures are stolen from other sources. And if the conversation shifts to money, or if someone asks for personal information, leave the conversation immediately."

You can also use a VPN to obscure your location, since scammers might track users' locations and try to personalize their scams based on the target's city or country. If you are scammed, reporting early to the FBI Internet Crime Complaint Center, the Federal Trade Commission, and your bank increases the chances that you'll be able to recover the stolen funds. Several nonprofits offer support for victims of romance scams.
"No matter how alone you feel right now, no matter how embarrassed you are, you will recover from this and one day look back and see how you made it through it," Nyhuis said. "These scammers are good at removing hope. Don't let them take that from you."
[6]
AI is making romance scams harder to spot, and this Philadelphia expert has these red flags to look for
With Valentine's Day approaching, financial experts are warning consumers to be on alert for a rise in romance scams that are becoming harder to spot due to artificial intelligence. One in four Americans say they've encountered a fake profile or AI-generated bot online, according to newly released data from McAfee. Americans have lost more than $1.14 billion to romance scams since 2023, according to the Federal Trade Commission.

Tanyika Rickard, a community manager with Chase Bank in Philadelphia who works directly with customers who have been targeted or defrauded, said scammers often spend months building trust before ever bringing up money. "They love bomb you," she said in a recent appearance on the In Your Corner podcast with CBS News Philadelphia consumer reporter Josh Sidorowicz. "In the beginning, you're hearing, 'I can't believe I've met someone like you,' and if you are in a space where you don't have that currently, that loneliness can take over."

Rickard also warns about how artificial intelligence has transformed the landscape of romance scams. Scammers can now create entire fake identities -- including photographs, social media histories, real-time chat responses, and even AI-generated voices and video calls. Rickard said scammers also scrape victims' online presence to tailor their approach. "They've gone through your social media," she said. "They know how many grandchildren you have, they know what's important to you because you post it."

Kate Kleinert previously shared with In Your Corner how a romance scammer stole her entire life savings. Her story highlighted the emotional manipulation scammers use to con their victims. "I had been widowed for 12 years at that point," she said. "It was nice talking to a man again, and every single night he would call me and say, 'How was your day, honey?' Nobody asked me that anymore."
Rickard highlighted several warning signs and urged consumers not to answer calls from unknown numbers. "If it does not look right, nine times out of 10, it isn't," she said. "Go with your gut."

Many victims never report what happened, often out of embarrassment. More than half of adults who lost money said they never reported it, according to new AARP research. But Rickard said reporting is essential, not only to stop the scammer but to connect victims with resources that may help recover some of the money. "You absolutely should file a complaint," she said. "Come into the bank, speak to your banker, don't feel embarrassed."

Chase and other institutions regularly host fraud prevention workshops, including sessions in West Philadelphia, to help the community identify threats and protect their finances. Scammers have made a career out of defrauding people, Rickard said. "We need to make a career out of protecting each other."

You can report the crime to local law enforcement or the FBI's IC3.gov. You can also call the AARP Fraud Watch Network Helpline at 877-908-3360 for support. The In Your Corner podcast is dedicated to providing practical solutions to everyday problems. Each week will feature a different guest expert. You can find new episodes posted every Wednesday on the CBS Philadelphia YouTube channel.
[7]
Love, lies, and LLMs: How to protect yourself from AI romance scams
Romance scams used to feel like a cliché. Everyone pictured an email from an overseas "prince" that was poorly written and full of typos and pleas for cash. Now, that cliché is dead. Today's romance scams are industrial-scale operations. Attackers use artificial intelligence to clone voices, create deepfake video calls, and write scripts with large language models (LLMs). In 2024 alone, the Federal Trade Commission reported that financial losses to romance scams skyrocketed, with victims losing $1.14 billion. The real number, hidden by shame and silence, is likely triple that. Romance scams aren't just a tragedy for the victims. A successful scam is a massive risk for businesses, too. When an employee with access to sensitive data or funds is compromised, the "heartbreak hack" can harm an entire organization.
[8]
FBI Warns Of Rising AI Romance Scams Targeting Dating App Users This Valentine's Season
Deepfake Love And Crypto Traps: AI Dating Scams Expand Rapidly Ahead Of Valentine's Day

The message begins like many modern love stories: a warm greeting, a shared interest, and a polite good-morning text sent regularly. This Valentine's Day, investigators warn that behind many such online romances there may not be a true partner but a well-planned scam. The US Federal Bureau of Investigation (FBI) has flagged a sharp rise in romance scams using synthetic faces, voices, and conversations. The risk is global: one survey found that 60% of Indians had received dating invites later exposed as fake, highlighting the scale of the problem.
Romance scams have reached unprecedented levels, costing Americans $3 billion in 2024 alone. AI is transforming these scams from crude catfishing attempts into sophisticated operations using deepfake technology, voice cloning, and AI chatbots. Scammers can now manage 20+ victims simultaneously using automated toolkits complete with fake personas and conversation scripts.
Romance scams have escalated into one of the most financially devastating forms of cybercrime, with Americans losing $3 billion in 2024 alone, a figure experts believe is significantly undercounted due to victims' reluctance to report [5]. The FBI warned this week that criminals are exploiting generative AI to commit fraud on a larger scale, with field offices in cities like San Francisco, Jacksonville, and Philadelphia issuing separate notices about local fraud incidents [3]. BioCatch, a fraud prevention startup, reported that 340 financial institutions saw a 63% uptick in romance scam attempts between 2024 and 2025 [3]. These numbers reflect a disturbing trend as AI transforms what were once easily identifiable scams into sophisticated operations that feel emotionally authentic and deeply personalized.
Source: Analytics Insight
The integration of AI has fundamentally changed the scale and sophistication of romance scams. A single scammer who previously managed a few victims can now use AI chatbots to run 20 or more scams simultaneously, according to Chris Nyhuis, founder of cybersecurity firm Vigilant [5]. On the dark web, fraudsters purchase romance scam toolkits complete with customer support, user reviews, and tiered pricing packages that include pre-built fake personas with AI-generated photo sets, conversation scripts for each stage of the scam, and deepfake video tools [5]. Deepfake technology now allows scammers to conduct convincing video calls using face-swapping and voice-cloning tools [2]. What once functioned as an informal identity check, seeing someone talk and respond in real time, no longer provides the same level of security [2]. McAfee's survey revealed that 35% of Americans using dating apps have encountered fake or AI-generated photos or profiles, with men reporting encounters 50% more often than women [1].
Source: PC Magazine
Many romance scams fall under the umbrella of pig butchering scams, in which fraudsters build relationships with victims over extended periods before extracting money repeatedly [5]. Between 2020 and 2024, these scams defrauded more than $75 billion from people worldwide [5]. The FBI's San Antonio office cautioned that these schemes often begin on dating apps or social media platforms, with victims directed to professional-looking websites controlled by criminals that falsely display significant profits [3]. Australian police recently warned more than 5,000 people they may have been targeted in a large-scale romance scam linked to overseas syndicates, in which scammers used common dating apps to find victims and trick them into buying fake cryptocurrency [2]. Jonathan Frost, BioCatch's director of Global Advisory, explained that fraudsters use AI-generated personas and even deploy deepfake videos and live video calls to build trust with victims before steering them toward investment fraud [3].
Source: CBS
Cybersecurity experts have identified consistent patterns that signal potential AI romance scam activity. Ashley Rose, CEO of Living Security, notes that scammers try to lock down relationships quickly, saying "I love you" and moving very fast [1]. Requests for money in any form represent a giant red flag, particularly for older victims [1]. Personal information requests, including birthdays, Social Security numbers, and banking information, can allow scammers to assume victims' identities [1]. Romance scammers push to move conversations off dating apps onto WhatsApp, Telegram, or text messages, which eliminates built-in safety features and potentially exposes more personal details [2]. People who always have excuses not to meet in person, citing business trips, family emergencies, or illness, represent another critical warning sign [1].
Steve Grobman, Chief Technology Officer at McAfee, recommends a healthy dose of skepticism combined with the right tools to protect privacy, identity, and personal information [4]. Slowing relationships down remains one of the strongest defenses, as spending more time talking increases the chance that inconsistencies will surface [2]. Rose advises conducting reverse image searches on Google if someone sends photos, as many scammers take pictures from social media accounts or elsewhere online [1]. During video calls, experts recommend asking the person to turn their head fully or wave a hand in front of their face: many deepfakes struggle with sudden or exaggerated movements, and faces can briefly distort, freeze, or lose alignment when the software fails to track motion correctly [4]. Never send money, gifts, or cryptocurrency to someone you haven't met in person, and talk to friends or family about new online relationships; John Clay, Vice President of Threat Intelligence at Trend Micro, notes that scammers use urgency and isolation to bypass rational thinking [4].
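For readers curious how a reverse image search can recognize a scraped photo even after it has been re-saved or lightly edited, the core idea is comparing compact fingerprints rather than raw pixels. The sketch below is purely illustrative: the `dhash` function and the toy grayscale "images" are simplified stand-ins for real perceptual hashing, not Google's actual algorithm.

```python
# Illustrative sketch of a "difference hash," one simple perceptual
# fingerprint. Real services downscale a photo first; here the toy
# 4x5 grayscale grids stand in for already-downscaled images.

def dhash(gray):
    """One bit per horizontal pixel pair: 1 when brightness rises
    left-to-right. gray is a list of rows of equal length."""
    bits = []
    for row in gray:
        for left, right in zip(row, row[1:]):
            bits.append(1 if right > left else 0)
    return bits

def hamming(a, b):
    """Count differing bits; a small distance suggests the same photo."""
    return sum(x != y for x, y in zip(a, b))

# Toy data: 'resaved' is 'original' with slight noise, as if
# re-compressed; 'unrelated' is a different image entirely.
original = [[10, 20, 30, 40, 50],
            [50, 40, 30, 20, 10],
            [10, 30, 20, 40, 30],
            [60, 50, 70, 40, 80]]
resaved = [[12, 21, 29, 41, 52],
           [51, 42, 31, 19, 12],
           [11, 32, 22, 41, 29],
           [58, 52, 71, 39, 82]]
unrelated = [[90, 10, 80, 20, 70],
             [10, 90, 20, 80, 30],
             [70, 20, 90, 10, 80],
             [20, 80, 10, 90, 40]]

h0, h1, h2 = dhash(original), dhash(resaved), dhash(unrelated)
print(hamming(h0, h1))  # small distance: likely the same photo reused
print(hamming(h0, h2))  # large distance: a different photo
```

Because the hash encodes only coarse brightness gradients, small edits barely change it, which is why a stolen profile photo can still be traced back to its source account.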
Romance scams uniquely target a core human need for love and connection during what the US Surgeon General officially declared a loneliness epidemic in 2023, with health risks on par with smoking up to 15 cigarettes a day [5]. Reportedly, 1 in 6 people worldwide experience loneliness, making them prime targets for social engineering [5]. Older adults who experience social isolation are frequently targeted, as retirement and bereavement create circumstances that scammers deliberately manipulate [5]. According to Nyhuis, romance scams are uniquely dangerous because of what they exploit: while phishing uses urgency and tech support scams use fear, romance scams use love, which can make people think irrationally or overlook their gut feeling that something is wrong [5]. Fred Heiding, a postdoctoral researcher at Harvard Kennedy School who studies AI and cybersecurity, warns that AI-enabled translation has removed language barriers, giving scammers millions more potential victims [5]. At least 220,000 people are trapped in scam centers in Southeast Asia and forced to defraud targets under threat of abuse [5]. Looking ahead, Heiding predicts that within a few years to a decade, AI scammers will think in completely different patterns than humans and will probably be exceptionally effective at persuasion [5].