AI-Powered "Nudify" Sites Pose New Threat to Schools and Teens

Curated by THEOUTPOST

On Mon, 16 Dec, 8:01 AM UTC

2 Sources


A disturbing trend of AI-generated nude images in schools highlights the urgent need for policy changes and increased awareness about the misuse of artificial intelligence technology.

AI "Nudify" Sites: A Growing Threat to School Safety

In a disturbing trend, schools across the United States are grappling with a new challenge: the use of artificial intelligence to create realistic, non-consensual nude images of students. This phenomenon, facilitated by so-called "nudify" websites, has raised serious concerns about privacy, cyberbullying, and the misuse of AI technology 1.

The Westfield High School Incident

In October 2023, 14-year-old Francesca Mani and several of her classmates at Westfield High School in New Jersey became victims of this alarming practice. A male student allegedly used a popular "nudify" website called Clothoff to create fake nude images of female students from their clothed Instagram photos 1.

Francesca recounted the chaotic aftermath: "It's like rapid fire. It just goes through everyone. And so then when someone hears-- hears this, it's like, 'Wait. Like, AI?' Like, no one thinks that could, like, happen to you." 1

The Scale and Accessibility of "Nudify" Sites

Researchers have identified over a hundred such "nudify" websites, with Clothoff being one of the most popular, receiving more than 3 million visits in a single month 1. These sites are easily accessible through a simple internet search, raising concerns about their widespread use and potential for abuse.

Kolina Koltai, a senior researcher at Bellingcat specializing in AI misuse, demonstrated the ease of accessing these sites: "You'll see, as we click accept, that there's no verification. And now we're already here." 1

Legal and Ethical Implications

The creation and distribution of AI-generated nude images of minors raise significant legal and ethical questions. Yiota Souras, chief legal officer at the National Center for Missing and Exploited Children, emphasized that while these images are fake, the harm they cause is real:

"They'll suffer, you know, mental health distress and reputational harm. In a school setting it's really amplified, because one of their peers has created this imagery. So there's a loss of confidence. A loss of trust." 2

The Department of Justice has stated that AI nudes of minors are illegal under federal child pornography laws if they depict "sexually explicit conduct." However, there are concerns that some images created by "nudify" sites may not meet this definition, potentially creating a legal gray area 2.

School and Law Enforcement Response

The response to these incidents has often been inadequate. In Francesca's case, the school's handling of the situation left much to be desired. Her mother, Dorota Mani, expressed disappointment with the minimal disciplinary action taken:

"The principal informed me that one boy receives one-day suspension, and that was it. So I ask her if this is all. Are there gonna be any other consequences? And she said, 'No, that's-- for now, this is all that is going to happen.'" 1

While Dorota filed a police report, no charges have been brought. This highlights the challenges in addressing these incidents through existing legal frameworks 2.

Call for Action and Policy Changes

In response to their experience, Francesca and Dorota Mani have been advocating for schools to implement policies addressing AI misuse. Their efforts led to the Westfield school district revising its Harassment, Intimidation and Bullying policy to incorporate AI 2.

However, experts argue that more comprehensive action is needed. Souras emphasized the need for tech companies to take greater responsibility in removing harmful content promptly 2.

As AI technology continues to advance, the incident at Westfield High School serves as a stark reminder of the urgent need for updated policies, increased awareness, and better safeguards to protect students from the misuse of artificial intelligence.

Continue Reading

Law Enforcement Races to Combat AI-Generated Child Sexual Abuse Imagery

U.S. law enforcement agencies are cracking down on the spread of AI-generated child sexual abuse imagery, as the Justice Department and states take action to prosecute offenders and update laws to address this emerging threat.

7 Sources: Economic Times, AP News, ABC News, The Seattle Times

AI-Generated Nude Images of Students Spark Controversy at Pennsylvania Private School

A male student at Lancaster Country Day School allegedly used AI to create fake nude images of female classmates, leading to leadership changes, student protests, and a criminal investigation.

9 Sources: USA Today, U.S. News & World Report, The Seattle Times, AP News

AI-Driven Explosion of Fake Nudes: A Growing Concern for Women and Teenagers

The rise of AI-generated fake nude images is becoming a significant issue, affecting women and teenagers. Victims are calling for stronger laws and better enforcement to combat this form of online abuse.

2 Sources: Sky News

AI-Generated Deepfakes Target Congresswomen, Sparking Calls for Legislation

A new study reveals that 1 in 6 congresswomen have been victims of AI-generated sexually explicit deepfakes, highlighting the urgent need for legislative action to combat this growing threat.

6 Sources: Gizmodo, CCN.com, The Hill, Mashable

San Francisco Takes Legal Action Against AI-Generated Deepfake Nude Websites

San Francisco's city attorney has filed a lawsuit against websites creating AI-generated nude images of women and girls without consent. The case highlights growing concerns over AI technology misuse and its impact on privacy and consent.

12 Sources: AP News, ABC News, The Seattle Times, U.S. News & World Report
