2 Sources
[1]
Teen girls use AI to create sexual images, study says
A new study suggests that teen girls use so-called nudification apps at the same rate as teen boys. The artificial intelligence-powered undressing tools allow users to create sexualized images of a person, typically by uploading a picture of them.

The results surprised Dr. Chad M.S. Steel, a digital forensics researcher at George Mason University who studies technology-facilitated crimes against children. "Males tend to be more involved in any type of online sexual endeavors, whether it's sexting or viewing pornographic material or the like, there's usually a much stronger signal for males than females," Steel said of the findings, which were published Wednesday in the journal PLOS One.

In January 2025, Steel conducted an online survey of 557 English-speaking adolescents ages 13 to 17. Even a year ago, Steel found widespread use of nudification tools. Fifty-five percent of the respondents said they'd created a sexualized image, and 54 percent said they'd received one. More than a third of teens said they'd been victims of the technology: more than a third reported that someone had made a non-consensual image of them, and a third said an image of theirs had been shared without their permission.

Roughly 1 in 6 teen girls and boys used nudification tools frequently to see how they looked. About the same share of teen girls shared such imagery "once or twice" with someone else; a slightly smaller percentage of boys reported the same behavior.

Steel didn't ask the teens why they used nudification tools, though sexting is a common practice among adolescents. He suspects that the popularity of "try it on" clothing and makeup visualization tools among girls builds familiarity with the same type of engagement as nudification apps. Coupled with male coercion for sexually explicit imagery, teen girls may find themselves using a familiar technology to deal with the pressure, Steel explained.

Dr. Linda Charmaraman studies girls' wellbeing with an emphasis on social media and digital health but wasn't involved in the study. She reviewed the findings and told Mashable that teens are in a delicate developmental period as they form their identities and seek social connection and acceptance. "When you combine that time of development with AI, it can bring further risks," Charmaraman, director of the Youth, Media, & Wellbeing Research Lab at Wellesley College, wrote in an email. "For example, there might be a lot of pressure for girls to create certain kinds of content in order to fit in with their peers and to possibly promote their social status."

Boys did report higher usage of generative AI than girls to create and distribute sexual imagery, both with and without the permission of the subject. Steel said that he would like to see his results replicated among a much larger sample of teens. "In this case, I'd love to find out that I had an extremely unusual subset," Steel said. Charmaraman said that the survey's nationally representative sample and effective quality checks indicate it reached diverse households. Yet she wondered whether the way the survey was advertised could have attracted "technology-savvy" participants, potentially skewing the results.
[2]
A concerningly high number of teens admit to using AI for making sexualized pictures
A new study reveals just how normalized AI nudification has become among US teens, and the findings are hard to ignore. OpenAI CEO Sam Altman is pushing to release an "adult version" of ChatGPT, while Elon Musk has advocated allowing Grok to generate R-rated content. If you're not already convinced that allowing AI to create adult content is a bad idea, perhaps this new survey will change your mind.

More than half of American teenagers have used AI tools to generate nude images of themselves or others, according to a new study published in the open-access journal PLOS ONE by Chad Steel of George Mason University. The survey collected responses from 557 English-speaking US residents between the ages of 13 and 17; it was anonymous and conducted with parental consent.

What did the study find?

The results are hard to sit with. 55.3% of teens surveyed reported using nudification tools to create at least one image of themselves or others, and 54.4% said they had received AI-generated nude images. Even more concerning, 36.3% reported that a sexualized AI image of themselves had been created by someone else without their consent, and 33.2% said those images were shared without their permission. The results were largely consistent across demographics, though male participants reported higher rates of creating and distributing these images, both consensually and non-consensually.

Why is this a big concern?

People have been sending nudes to each other since the dawn of smartphones, but doing that requires a willing participant. AI nudification tools don't. Anyone with a photo of you and access to one of these apps can create a fake nude image without your knowledge or consent. Victims of this kind of abuse experience consequences similar to those of other forms of child sexual exploitation material, including a sense of dehumanization and lasting disruption to their lives.
Steel sums it up well: "Teens are no longer just digital natives but AI-natives. 'Nudification' and GenAI apps are their new 'sexting,' only with more challenging issues surrounding consent." The hope is that findings like these will prompt lawmakers and educators to act before the problem becomes even more difficult to address.
A groundbreaking study from George Mason University reveals that 55% of American teenagers have used AI-powered nudification apps to create sexualized images. The research exposes widespread non-consensual image creation, with more than a third of teens reporting victimization. The findings challenge assumptions about gender differences in such behavior and raise urgent questions about consent in the AI era.
More than half of American teenagers have used artificial intelligence tools to generate nude or sexualized images, according to research published in the journal PLOS One by Dr. Chad M.S. Steel, a digital forensics researcher at George Mason University who studies technology-facilitated crimes against children [1][2]. The January 2025 survey of 557 English-speaking adolescents ages 13 to 17 found that 55.3% of respondents reported using AI-powered nudification apps to create at least one image of themselves or others, while 54.4% said they had received AI-generated nude images [1].

These AI-powered nudification apps allow users to create and share sexualized images by uploading a photograph of a person, effectively "undressing" them through artificial intelligence [1]. Steel characterizes the phenomenon bluntly: "Teens are no longer just digital natives but AI-natives. 'Nudification' and GenAI apps are their new 'sexting,' only with more challenging issues surrounding consent" [2]. The critical difference from traditional sexting is that these tools don't require a willing participant: anyone with access to a photo can generate sexualized images without the subject's knowledge or permission [2].
Source: Mashable
The study reveals disturbing patterns of non-consensual image creation among adolescents. More than 36% of teens reported that someone had created a sexualized AI image of them without their consent, and 33.2% said such images were shared without their permission [1][2]. These victims experience consequences similar to other forms of child sexual exploitation material, including dehumanization and lasting disruption to their lives [2].

The findings surprised Steel, who noted that males typically show stronger engagement in online sexual activities, including sexting and viewing pornographic material [1]. However, roughly 1 in 6 teen girls and teen boys used nudification tools frequently to see how they looked, and about the same share of teen girls shared such imagery "once or twice" with someone else, at rates nearly identical to boys [1]. Steel suspects that the popularity of "try it on" clothing and makeup visualization tools among girls builds familiarity with similar technology; combined with male coercion for sexually explicit imagery, teen girls may find themselves using familiar technology to deal with pressure [1].
Dr. Linda Charmaraman, director of the Youth, Media, & Wellbeing Research Lab at Wellesley College, emphasized that teens are in a delicate developmental period as they form identities and seek social connection. "When you combine that time of development with AI, it can bring further risks," she noted, explaining that pressure to create certain content to fit in with peers and promote social status may drive usage [1]. While boys did report higher usage of generative AI to create and distribute sexual imagery, both with and without permission, the gender gap was smaller than expected [1].

The timing of this research coincides with broader debates about AI and adult content, as OpenAI CEO Sam Altman pushes to release an "adult version" of ChatGPT while Elon Musk advocates allowing Grok to generate R-rated content [2]. Steel acknowledges he would like to see his results replicated among a much larger sample, stating, "In this case, I'd love to find out that I had an extremely unusual subset" [1]. Charmaraman noted that while the survey's nationally representative sample and quality checks reached diverse households, the way it was advertised could have attracted technology-savvy participants [1]. The hope is that these findings will prompt lawmakers and educators to act before the problem becomes even more difficult to address, particularly around issues of consent and the unique challenges posed by teens using AI tools [2].