AI-Powered 'Nudify' Apps Fuel Deadly Wave of Digital Blackmail

Reviewed by Nidhi Govil


The rise of AI-generated nude images is fueling a surge in sextortion cases, particularly those targeting minors. The trend has already had tragic consequences and is prompting calls for increased regulation.

The Rise of AI-Powered 'Nudify' Apps

The rapid proliferation of artificial intelligence (AI) tools has given rise to a disturbing trend: the use of "nudify" apps to generate fake nude images for blackmail. These AI-powered applications, which can digitally remove clothing from photos or create sexualized imagery, are increasingly being weaponized against minors in sextortion scams.[1][2][3]

Source: France 24

Tragic Consequences and Growing Concern

The severity of this issue came to light with the tragic case of Elijah Heacock, a 16-year-old from Kentucky who died by suicide after receiving threatening texts demanding $3,000 to suppress an AI-generated nude image of him.[1][2][3] This incident is just one among thousands of cases targeting American minors, prompting urgent calls for action from tech platforms and regulators.

The FBI has reported a "horrific increase" in sextortion cases targeting U.S. minors, primarily males between 14 and 17 years old, leading to an "alarming number of suicides."[1][2][3] A survey by Thorn, a non-profit focused on preventing online child exploitation, found that 6% of American teens have been direct victims of deepfake nudes.[1][2][3]

The Lucrative Business of AI Nudifiers

Despite their harmful nature, nudify apps have become a lucrative business. An analysis of 85 websites selling nudify services revealed they could collectively be worth up to $36 million a year.[1][2][3] Indicator, a U.S. publication investigating digital deception, estimates that 18 of these sites made between $2.6 million and $18.4 million over six months.[1][2][3]

Global Impact and Legal Responses

The problem extends beyond the United States. In Spain, a Save the Children survey found that one in five young people has been a victim of deepfake nudes shared without consent.[1][2][3] Spanish prosecutors are investigating cases in which minors targeted classmates and teachers with AI-generated pornographic content.[1][2][3]

Governments are beginning to respond to this threat:

  1. The United Kingdom has criminalized the creation of sexually explicit deepfakes, with perpetrators facing up to two years in jail.[1][2][3]
  2. In the U.S., President Donald Trump signed the bipartisan "Take It Down Act," criminalizing the non-consensual publication of intimate images and mandating their removal from online platforms.[1][2][3]

Tech Giants' Response and Ongoing Challenges

Meta has taken legal action against a Hong Kong company behind a nudify app called Crush AI for repeatedly circumventing the platform's rules to post ads.[1][2][3] However, despite such measures, AI nudifying sites remain resilient.

Researchers describe the fight against these apps as a "game of whack-a-mole," with the sites proving to be "persistent and malicious adversaries."[1][2][3] Most of these sites rely on tech infrastructure from major companies like Google, Amazon, and Cloudflare to operate, highlighting the complex challenge of combating this issue.[1][2][3]

Source: Economic Times

As AI technology continues to advance, the need for comprehensive strategies to protect minors from digital exploitation becomes increasingly urgent. The tragic consequences of these scams underscore the importance of swift action from tech companies, lawmakers, and society as a whole to address this growing threat.
