Deepfake nudes crisis spreads to 90 schools globally, impacting over 600 students


A global analysis reveals that deepfake sexual abuse incidents have hit around 90 schools worldwide, impacting more than 600 pupils across 28 countries since 2023. Teenage boys are using AI-powered nudify applications to create fake nude images of classmates, leaving victims traumatized while schools and law enforcement struggle to respond adequately to what experts call child sexual abuse material.

Deepfake Nudes Spread Across 90 Schools in 28 Countries

A disturbing pattern has emerged across the globe: teenage boys are downloading photos of female classmates from Instagram and Snapchat, then using AI-powered nudify applications to generate explicit imagery. The deepfake crisis in schools has escalated dramatically, with incidents now documented at around 90 schools worldwide, affecting more than 600 pupils across at least 28 countries since 2023, according to a comprehensive review by WIRED and Indicator [1]. These AI-generated images constitute child sexual abuse material (CSAM), yet many schools and law enforcement agencies remain unprepared to handle these serious cases of deepfake sexual abuse.

Source: NYT


The geographic spread is extensive. Nearly 30 cases have been reported across North America, including one incident with more than 60 alleged victims. More than 10 cases surfaced in South America, over 20 across Europe, and another dozen in Australia and East Asia combined [1]. However, these figures likely represent only a fraction of the actual scale. A Unicef survey estimates that 1.2 million children had sexual deepfakes created of them last year, while one in five young people in Spain told Save the Children researchers that classmates had created fake nude images of them [1].

Class-Action Lawsuit Targets xAI Over Grok's Role in Abuse

The legal landscape is shifting as victims fight back. A class-action lawsuit filed in March 2026 against xAI, Elon Musk's AI company, highlights how accessible technology enables this abuse. Three Jane Does in Tennessee allege that a perpetrator used xAI's assistant, Grok, to generate sexually explicit images based on their real, clothed photos, including a yearbook picture. One victim's doctored video "showed her entire body, including her genitals, without any clothes," according to the lawsuit [2]. The perpetrator allegedly circulated altered pictures of at least 18 underage girls on Discord, attaching their first names and school name to make them identifiable.

Source: Wired


Victims Face Devastating Emotional Distress and Reputational Damage

The psychological impact on victims is severe and long-lasting. A 16-year-old student in Pennsylvania's New Hope-Solebury School District learned in her school lunchroom that deepfake pornographic images of her were circulating. "Your nudes got leaked," a friend told her, despite the fact that she had never sent such photos [3]. The scandal began in December 2024 after a falling-out with a friend, and by February 2025, explicit photos were being shown at parties and spread among students.

Source: USA Today


Once social and bubbly, the now-18-year-old has become reclusive, afraid to visit the grocery store or pharmacy out of fear that others have seen the images [3]. Jane Doe 1 from the Tennessee lawsuit "feels acute anxiety about who has viewed these files online and feels a complete lack of control over the ongoing dissemination," according to court documents. Two of the Jane Does fear engaging in normal activities like attending class, and all three report reputational damage when people believe the images are real [2]. While most reported cases involve female victims, teenage boys have also experienced harassment and extortion using deepfake technology.

Schools Struggle With Inadequate Investigation and Support

Many families report that schools fail to provide adequate responses. The Pennsylvania family is pursuing legal action against the school district after what they describe as an inadequate investigation and a lack of support. A Title IX investigation conducted by the school found no proof that the accused perpetrators "actually circulated deepfake nudes and/or rumors of the same," according to a letter from the school district's solicitor reviewed by USA TODAY [3]. The parents pulled their daughter from public school after persistent bullying and victimization.

A 2024 survey by the Center for Democracy and Technology of 3,170 K-12 students, teachers, and parents revealed alarming gaps in preparedness. Six in 10 teachers were unaware of school policies for addressing deepfake sexual images, only 16 percent said teacher training covered protecting student privacy in deepfake cases, and just 13 percent of students reported that their school explained the harm of sharing AI-generated images [3]. "I think you'd be hard-pressed to find a school that has not been affected by this," says Lloyd Richardson, director of technology at the Canadian Centre for Child Protection [1].

What Parents and Schools Should Watch For

Parental awareness remains critically low. The Pennsylvania mother emphasizes that "the school is supposed to be the one that is up to date on these types of things" and should provide outreach about deepfake risks just as it does for other forms of bullying [3]. Evan Harris, a national expert on emerging AI risks in schools at Pathos Consulting Group, has conducted thousands of webinars to help schools navigate this crisis and contributed to free educational programs. The accessibility of nudify applications, which can earn their creators millions of dollars annually, means this threat will likely intensify before adequate protections are in place [1]. As legal frameworks evolve and awareness grows, the focus must shift to supporting victims and holding both perpetrators and enabling platforms accountable.


TheOutpost.ai

© 2026 TheOutpost.AI All rights reserved