3 Sources
[1]
More than half of U.S. teens are using AI to create fake nude images
Researchers report that more than half of surveyed U.S. teens have created sexualized fake nude images using artificial intelligence tools. The study reframes these images as a common part of teen digital behavior rather than a rare or fringe misuse. Within an anonymous group of 557 teenagers, the activity appeared widely embedded in everyday online interaction rather than limited to isolated cases. By analyzing these responses, Chad Steel at George Mason University (GMU) determined that 55.3 percent had created at least one such image. More than half of the teens also reported receiving these images, showing how quickly creation extended into routine sharing among peers. The patterns point to a behavior already normalized at scale, setting the stage for deeper questions about consent and harm.

Unlike image generators that build scenes from prompts, nudification software makes a clothed photo look nude: it starts with a real person's picture. Because one existing selfie can become raw material in seconds, the design lowers the effort needed to make a sexualized image. Steel found that teens used these apps more than broader image-creation systems, which mattered because the target was usually recognizable. Once a real face anchors the image, embarrassment, coercion, and rumor spread become harder to dismiss as fantasy.

Harm showed up almost as often as participation, and the gap between the two was smaller than many adults might expect. In the survey, 36.3 percent said someone had made a sexualized AI image of them without permission. Another 33.2 percent said someone had shared one of those images, turning a private violation into a social event. Creation and circulation matter as separate injuries, because stopping one does not automatically stop the other. Across race, age, and most other categories, the behavior looked broadly distributed rather than packed into one obvious subgroup.
Male participants still reported higher rates in several categories, including creating or sharing images of themselves, peers, and adults. Female participants tracked close to the overall pattern, which undercut the idea that only boys engage. Programs aimed at a single stereotype would miss much of the behavior the survey actually captured.

Age offered no clear protection inside the survey. The results showed that 13-year-olds and 17-year-olds reported similar levels of use and victimization. Schools could read that finding as a case for earlier lessons on consent, privacy, and image sharing, before middle-school habits harden. Elsewhere, Britain's communications regulator released a 2025 report showing that half of children ages eight through 17 had used AI tools. Familiarity with AI no longer begins in late adolescence, which helps explain why waiting until high school may miss the moment.

Earlier teen sexting research did not involve AI, yet it helps show how far the behavior has moved. A 2018 review of 39 studies put youth sexting at 14.8 percent for sending and 27.4 percent for receiving. Now the image can be generated from a prompt or edited from a real photo, which lowers effort and muddies responsibility. "Teens are no longer just digital natives but AI-natives. 'Nudification' and GenAI apps are their new 'sexting', only with more challenging issues surrounding consent," said Steel.

Federal law already reaches farther than many teenagers probably realize, even when an image is fully synthetic: obscene sexual images involving minors can be illegal even when no real child exists. That creates a clash with peer-to-peer behavior, because some teens may treat these exchanges as flirting or experimentation. Schools, parents, and lawmakers still face the harder question of how to deter harm without pretending the behavior is rare. Beyond the survey, youth-made sexual imagery already occupies a large share of online abuse reports.
The Internet Watch Foundation's 2023 report said 92 percent of the sexual abuse imagery involving children it identified and helped remove was self-generated. AI did not create the urge to make or trade intimate images, but it made editing, faking, and redistribution easier. Prevention cannot focus on AI alone when the social behavior clearly predates the software itself.

Conducted in January 2025, the research still leaves important blind spots even while it puts hard numbers on a hidden behavior. Only English-speaking U.S. teens ages 13 through 17 were surveyed by GMU, and parents had to consent before participation. Adult perpetrators, younger children, and many smaller subgroups fell outside the design, so some harms could be higher or different. Even with those limits, the results are too large to treat as a narrow problem waiting for more data.

This is not a fringe behavior confined to a few reckless teens, but a widespread digital practice shaped by questions of consent, social pressure, and harm. That reality makes earlier education, stronger safeguards, and better support for victims not just important but overdue.
[2]
Teen girls use AI to create sexual images, study says
A new study suggests that teen girls use so-called nudification apps at the same rate as teen boys. The artificial intelligence-powered undressing tools allow users to create sexualized images of a person, typically by uploading a picture of them. The results surprised Dr. Chad M.S. Steel, a digital forensics researcher at George Mason University who studies technology-facilitated crimes against children.

"Males tend to be more involved in any type of online sexual endeavors, whether it's sexting or viewing pornographic material or the like, there's usually a much stronger signal for males than females," Steel said of the findings, which were published Wednesday in the journal PLOS One.

In January 2025, Steel conducted an online survey of 557 English-speaking adolescents ages 13 to 17. Even a year ago, Steel found widespread use of nudification tools. Fifty-five percent of the respondents said they'd created a sexualized image, and 54 percent said they'd received one.

More than a third of teens said they'd been victims of the technology: more than a third reported that someone had made a non-consensual image of them, and a third said an image of theirs had been shared without their permission. Roughly 1 in 6 teen girls and boys used nudification tools frequently to see how they looked. About the same share of teen girls shared such imagery "once or twice" with someone else. A slightly smaller percentage of boys reported the same behavior.

Steel didn't ask the teens why they used nudification tools, though sexting is a common practice among adolescents. He suspects that the popularity of "try it on" clothing and makeup visualization tools among girls builds familiarity with the same type of engagement as nudification apps. Coupled with male coercion for sexually explicit imagery, teen girls may find themselves using a familiar technology to deal with the pressure, Steel explained.

Dr. Linda Charmaraman studies girls' wellbeing with an emphasis on social media and digital health but wasn't involved in the study. She reviewed the findings and told Mashable that teens are in a delicate developmental period as they form their identities and seek social connection and acceptance.

"When you combine that time of development with AI, it can bring further risks," Charmaraman, director of the Youth, Media, & Wellbeing Research Lab at Wellesley College, wrote in an email. "For example, there might be a lot of pressure for girls to create certain kinds of content in order to fit in with their peers and to possibly promote their social status."

Boys did report higher usage of generative AI than girls to create and distribute sexual imagery, both with and without the permission of the subject. Steel said that he would like to see his results replicated among a much larger sample of teens. "In this case, I'd love to find out that I had an extremely unusual subset," Steel said.

Charmaraman said that the survey's nationally representative sample and effective quality checks indicate it reached diverse households. Yet she wondered whether the way the survey was advertised could have attracted "technology-savvy" participants, potentially skewing the results.
[3]
A concerningly high number of teens admit to using AI for making sexualized pictures
A new study reveals just how normalized AI nudification has become among US teens, and the findings are hard to ignore. OpenAI CEO Sam Altman is pushing to release an "adult version" of ChatGPT, while Elon Musk has advocated allowing Grok to generate R-rated content. If you're not already convinced that allowing AI to create adult content is a bad idea, perhaps this new survey will change your mind.

More than half of American teenagers have used AI tools to generate nude images of themselves or others, according to a new study published in the open-access journal PLOS ONE by Chad Steel of George Mason University. The survey collected responses from 557 English-speaking US residents between the ages of 13 and 17. The survey was anonymous and conducted with parental consent.

What did the study find?

The results are hard to sit with. 55.3% of teens surveyed reported using nudification tools to create at least one image of themselves or others. 54.4% said they had received AI-generated nude images. Even more concerning, 36.3% reported that a sexualized AI image of themselves had been created by someone else without their consent, and 33.2% said those images were shared without their permission. The results were largely consistent across different demographics. However, male participants reported higher rates of creating and distributing these images, both consensually and non-consensually.

Why is this a big concern?

People have been sending nudes to each other since the dawn of smartphones. However, doing that requires a willing participant. AI nudification tools don't. Anyone with a photo of you and access to one of these apps can create a fake nude image without your knowledge or consent. Victims of this kind of abuse experience consequences similar to those of other forms of child sexual exploitation material, including a sense of dehumanization and lasting disruption to their lives.
Steel sums it up well: "Teens are no longer just digital natives but AI-natives. 'Nudification' and GenAI apps are their new 'sexting,' only with more challenging issues surrounding consent." The hope is that findings like these will prompt lawmakers and educators to act before the problem becomes even more difficult to address.
A George Mason University study found that 55% of surveyed U.S. teens have used AI to create fake nude images, with more than a third reporting non-consensual image creation. The research reveals AI nudification has become normalized among adolescents, raising urgent questions about consent and privacy in the age of AI-powered tools.
More than half of American teenagers have used AI to create fake nude images of themselves or others, according to groundbreaking research published in the journal PLOS One [1][2]. Chad Steel, a digital forensics researcher at George Mason University, surveyed 557 English-speaking U.S. teens ages 13 through 17 in January 2025, revealing that 55.3 percent had created at least one sexualized image using AI-powered nudification apps [3]. The study reframes AI nudification as a common part of teen digital behavior rather than a rare or fringe misuse, with 54.4 percent reporting they had received such images [1].

The research exposed alarming rates of victimization: 36.3 percent of surveyed teens reported that someone had made a sexualized AI image of them without permission [2], and another 33.2 percent said someone had shared one of those images, turning a private violation into a social event [1]. The gap between participation and harm proved smaller than many adults might expect, highlighting how quickly creation extends into routine sharing among peers. Unlike traditional image generators that build scenes from prompts, nudification tools start with a real person's picture, lowering the effort needed to make a sexualized image [1]. Once a real face anchors the image, embarrassment, coercion, and rumor spread become harder to dismiss as fantasy.
Source: Mashable
The findings surprised Steel, who noted that males typically show stronger signals in online sexual endeavors [2]. While male participants reported higher rates of creating and distributing sexualized images both consensually and non-consensually, roughly 1 in 6 teen girls and boys used nudification tools frequently to see how they looked [2]. About the same share of teen girls shared such imagery once or twice with someone else, tracking close to the overall pattern and undercutting the idea that only boys engage [1]. Steel suspects the popularity of "try it on" clothing and makeup visualization tools among girls builds familiarity with the same type of engagement as nudification apps, coupled with male coercion for sexually explicit imagery [2].

Earlier teen sexting research provides context for how far the behavior has moved. A 2018 review of 39 studies put youth sexting at 14.8 percent for sending and 27.4 percent for receiving [1]. Now the image can be generated from a prompt or edited from a real photo, which lowers effort and muddies responsibility. "Teens are no longer just digital natives but AI-natives. 'Nudification' and GenAI apps are their new 'sexting,' only with more challenging issues surrounding consent," Steel explained [1][3]. The critical difference lies in consent and privacy: traditional sexting requires a willing participant, while anyone with a photo and access to these apps can create a fake nude image without knowledge or permission [3].
The results showed that 13-year-olds and 17-year-olds reported similar levels of use and victimization, suggesting age offered no clear protection inside the survey [1]. Across race, age, and most other categories, the behavior looked broadly distributed rather than packed into one obvious subgroup. Schools could read that finding as a case for earlier lessons on consent, privacy, and image sharing, before middle-school habits harden [1]. Britain's communications regulator released a 2025 report showing that half of children ages eight through 17 had used AI tools, indicating familiarity with AI no longer begins in late adolescence [1].
Victims of this kind of abuse experience consequences similar to those of other forms of child sexual exploitation material, including a sense of dehumanization and lasting disruption to their lives [3]. Dr. Linda Charmaraman, director of the Youth, Media, & Wellbeing Research Lab at Wellesley College, noted that teens are in a delicate developmental period as they form their identities and seek social connection and acceptance [2]. Federal law already reaches farther than many teenagers probably realize: obscene sexual images involving minors can be illegal even when no real child exists [1]. The Internet Watch Foundation's 2023 report said 92 percent of the sexual abuse imagery involving children it identified and helped remove was self-generated; AI did not create the urge to make or trade intimate images, but it made editing, faking, and redistribution easier [1].
Source: Earth.com
The survey's nationally representative sample indicates it reached diverse households, though only English-speaking U.S. teens ages 13 through 17 were surveyed, and parents had to consent before participation [1]. Steel said he would like to see his results replicated among a much larger sample of teens [2]. The findings arrive as OpenAI CEO Sam Altman pushes to release an "adult version" of ChatGPT, while Elon Musk has advocated allowing Grok to generate R-rated content [3]. Schools, parents, and lawmakers face the harder question of how to deter harm without pretending the behavior is rare, making earlier education and strong digital consent frameworks essential for youth wellbeing [1].

Summarized by Navi