3 Sources
[1]
AI-powered 'nudify' apps fuel deadly wave of digital blackmail
Washington (AFP) - After a Kentucky teenager died by suicide this year, his parents discovered he had received threatening texts demanding $3,000 to suppress an AI-generated nude image of him.

The tragedy underscores how so-called sextortion scams targeting children are growing around the world, particularly with the rapid proliferation of "nudify" apps -- AI tools that digitally strip off clothing or generate sexualized imagery.

Elijah Heacock, 16, was just one of thousands of American minors targeted by such digital blackmail, which has spurred calls for more action from tech platforms and regulators. His parents told US media that the text messages ordered him to pay up or an apparently AI-generated nude photo would be sent to his family and friends.

"The people that are after our children are well organized," John Burnett, the boy's father, said in a CBS News interview. "They are well financed, and they are relentless. They don't need the photos to be real, they can generate whatever they want, and then they use it to blackmail the child."

US investigators were looking into the case, which comes as nudify apps -- which rose to prominence targeting celebrities -- are being increasingly weaponized against children. The FBI has reported a "horrific increase" in sextortion cases targeting US minors, with victims typically males between the ages of 14 and 17. The threat has led to an "alarming number of suicides," the agency warned.

Instruments of abuse

In a recent survey, Thorn, a non-profit focused on preventing online child exploitation, found that six percent of American teens have been a direct victim of deepfake nudes.

"Reports of fakes and deepfakes -- many of which are generated using these 'nudifying' services -- seem to be closely linked with reports of financial sextortion, or blackmail with sexually explicit images," the British watchdog Internet Watch Foundation (IWF) said in a report last year. "Perpetrators no longer need to source intimate images from children because images that are convincing enough to be harmful -- maybe even as harmful as real images in some cases -- can be produced using generative AI."

The IWF identified one "pedophile guide" developed by predators that explicitly encouraged perpetrators to use nudifying tools to generate material to blackmail children. The author of the guide claimed to have successfully blackmailed some 13-year-old girls.

The tools are a lucrative business. A new analysis of 85 websites selling nudify services found they may be collectively worth up to $36 million a year. The analysis from Indicator, a US publication investigating digital deception, estimates that 18 of the sites made between $2.6 million and $18.4 million over the six months to May. Most of the sites rely on tech infrastructure from Google, Amazon, and Cloudflare to operate, and remain profitable despite crackdowns by platforms and regulators, Indicator said.

'Whack-a-mole'

The proliferation of AI tools has led to new forms of abuse impacting children, including pornography scandals at universities and schools worldwide, where teenagers created sexualized images of their own classmates.

A recent Save the Children survey found that one in five young people in Spain have been victims of deepfake nudes, with those images shared online without their consent. Earlier this year, Spanish prosecutors said they were investigating three minors in the town of Puertollano for allegedly targeting their classmates and teachers with AI-generated pornographic content and distributing it in their school.

In the United Kingdom, the government this year made creating sexually explicit deepfakes a criminal offense, with perpetrators facing up to two years in jail. And in May, US President Donald Trump signed the bipartisan "Take It Down Act," which criminalizes the non-consensual publication of intimate images, while also mandating their removal from online platforms.

Meta also recently announced it was filing a lawsuit against a Hong Kong company behind a nudify app called Crush AI, which it said repeatedly circumvented the tech giant's rules to post ads on its platforms.

But despite such measures, researchers say AI nudifying sites remain resilient. "To date, the fight against AI nudifiers has been a game of whack-a-mole," Indicator said, calling the apps and sites "persistent and malicious adversaries."
[2]
AI-powered 'nudify' apps fuel deadly wave of digital blackmail - The Economic Times
A Kentucky teen's suicide has spotlighted the rise of AI-driven sextortion, where scammers use "nudify" apps to blackmail minors with fake explicit images. Reports of such abuse are growing globally, prompting legal action and crackdowns, but experts warn these AI-powered tools remain resilient and deeply harmful to young victims.
[3]
AI-powered 'nudify' apps fuel deadly wave of digital blackmail - VnExpress International
The rise of AI-generated nude images is leading to a surge in sextortion cases, particularly targeting minors. This trend has resulted in tragic consequences and calls for increased regulation.
The rapid proliferation of artificial intelligence (AI) tools has given rise to a disturbing trend: the use of "nudify" apps to generate fake nude images for blackmail. These AI-powered applications, which can digitally remove clothing from photos or create sexualized imagery, are increasingly being weaponized against minors in sextortion scams [1][2][3].
Source: France 24
The severity of this issue came to light with the tragic case of Elijah Heacock, a 16-year-old from Kentucky who died by suicide after receiving threatening texts demanding $3,000 to suppress an AI-generated nude image of him [1][2][3]. This incident is just one among thousands of cases targeting American minors, prompting urgent calls for action from tech platforms and regulators.
The FBI has reported a "horrific increase" in sextortion cases targeting U.S. minors, primarily males between 14 and 17 years old, leading to an "alarming number of suicides" [1][2][3]. A survey by Thorn, a non-profit focused on preventing online child exploitation, found that 6% of American teens have been direct victims of deepfake nudes [1][2][3].
Despite their harmful nature, nudify apps have become a lucrative business. An analysis of 85 websites selling nudify services revealed they could collectively be worth up to $36 million a year [1][2][3]. Indicator, a U.S. publication investigating digital deception, estimates that 18 of these sites made between $2.6 million and $18.4 million over six months [1][2][3].
The problem extends beyond the United States. In Spain, a Save the Children survey found that one in five young people have been victims of deepfake nudes shared without consent [1][2][3]. Spanish prosecutors are investigating cases where minors targeted classmates and teachers with AI-generated pornographic content [1][2][3].
Governments are beginning to respond to this threat. In the United Kingdom, creating sexually explicit deepfakes is now a criminal offense, with perpetrators facing up to two years in jail [1][2][3]. In the United States, President Donald Trump signed the bipartisan "Take It Down Act" in May, which criminalizes the non-consensual publication of intimate images and mandates their removal from online platforms [1][2][3].
Meta has taken legal action against a Hong Kong company behind a nudify app called Crush AI for repeatedly circumventing the platform's rules to post ads [1][2][3]. However, despite such measures, AI nudifying sites remain resilient.
Researchers describe the fight against these apps as a "game of whack-a-mole," with the sites proving to be "persistent and malicious adversaries" [1][2][3]. Most of these sites rely on tech infrastructure from major companies like Google, Amazon, and Cloudflare to operate, highlighting the complex challenge of combating this issue [1][2][3].
Source: Economic Times
As AI technology continues to advance, the need for comprehensive strategies to protect minors from digital exploitation becomes increasingly urgent. The tragic consequences of these scams underscore the importance of swift action from tech companies, lawmakers, and society as a whole to address this growing threat.
Summarized by Navi