Tennessee Teens Sue Elon Musk's xAI After Grok AI Model Generated Explicit Images of Minors

Reviewed by Nidhi Govil

Three Tennessee teenagers filed a class-action lawsuit against Elon Musk's xAI, alleging the company's Grok AI model was used to create non-consensual sexual images of them as minors. The lawsuit marks the first time minors have pursued legal action against AI companies for generating child sexual abuse material, with attorneys estimating thousands of children were victimized.

Tennessee Minors Launch Legal Battle Against xAI Over AI-Generated Abuse Material

Three Tennessee teenagers have filed a class-action lawsuit against Elon Musk's artificial intelligence company, xAI, marking the first instance where minors have pursued legal action against AI companies for enabling the generation of child sexual abuse material. Filed Monday in the U.S. District Court for the Northern District of California, the lawsuit alleges that xAI deliberately designed its Grok AI model to "profit off the sexual predation of real people, including children" [1]. The plaintiffs, identified as Jane Does 1, 2, and 3, are seeking to represent anyone who had real images of them as minors altered into sexual content by Grok [2].

Source: NPR

The case centers on a harrowing situation where a perpetrator spent months generating and distributing sexualized images of more than 18 girls. Attorney Annika K. Martin confirmed that the victims' lives were "shattered by the devastating loss of privacy and the deep sense of violation that no child should ever have to experience" [1]. The lawsuit estimates that "at least thousands of minors" were victimized and seeks an injunction to end Grok's harmful outputs, along with punitive damages for all minors harmed.

How Law Enforcement Uncovered the Grok Connection

The nightmare began in December when one victim, now over 18, received an anonymous message on Instagram from a Discord user warning that her explicit "pics" were shared in a folder with many other minors [1]. The user eventually shared AI-generated nude images and videos depicting her and 18 other minor girls, then linked her to a Discord server created by the perpetrator. The victim immediately recognized some of the other girls from her school and contacted local law enforcement, prompting a criminal investigation.

Investigating the Discord evidence, police determined that the perpetrator had maintained a close and friendly relationship with the first victim and had access to her Instagram [1]. When officers searched his phone, they found a third-party app that licensed or purchased access to Grok, which they concluded was used to morph the girls' photos from social media, yearbooks, and homecoming pictures. The perpetrator then uploaded the AI CSAM files to a file-sharing platform called Mega and used them as a "bartering tool in Telegram group chats with hundreds of other users," trading the files "for sexually explicit content of other minors" [1]. The perpetrator was arrested following the investigation [3].

xAI Allegedly Failed to Implement Industry-Standard Prevention Measures

The lawsuit alleges that xAI failed to implement child sexual abuse material prevention measures that are considered industry standards by other frontier labs [2]. Other deep-learning image generators employ various techniques to prevent the creation of child sexual abuse material from ordinary photographs, including digital watermarks that disclose AI origin. Major AI companies including Google and OpenAI have updated their image generation tools to include such watermarks, but xAI has not adopted this standard [4].

Notably, if a model allows the generation of nude or erotic content from real images, it becomes virtually impossible to prevent it from generating sexualized images of identifiable minors [2]. The complaint alleges that xAI deliberately licensed its technology to app makers, often outside the U.S., in an attempt to outsource liability for its "incredibly dangerous tool" [4]. Attorneys for the plaintiffs argue that because third-party usage still requires xAI code and servers, the company should be held responsible for negligence [2].

Musk's Denials Contradict Growing Evidence of AI CSAM Problem

As recently as January, Elon Musk denied that Grok could generate child sexual abuse material during a scandal in which xAI refused to update filters to block the chatbot from nudifying images of real people [1]. At the height of the controversy, researchers from the Center for Countering Digital Hate estimated that Grok generated approximately three million sexualized images, of which about 23,000 depicted apparent children [1]. Rather than fix Grok, xAI limited access to paying subscribers, keeping the most shocking outputs from circulating on X.

Source: Ars Technica

In an X post following revelations that nearly 10 percent of about 800 Grok Imagine outputs reviewed appeared to include AI CSAM, Musk insisted he was "not aware of any naked underage images generated by Grok," emphasizing that he'd seen "literally zero" [1]. However, Musk and xAI have actively promoted Grok's ability to be used for sexually explicit activity via its Spicy mode, which can be used for text, image, and video generation [3]. Musk's public promotion of Grok's ability to produce sexual imagery and depict real people in skimpy outfits features heavily in the suit [2].

Victims Face Lasting Emotional Distress and Reputation Damage

The harms to victims have been extensive, with all three plaintiffs experiencing extreme emotional distress over the circulation of these images and what it could mean for their reputations and social lives [2]. Victims who know the perpetrator remain uncertain whether the Grok-generated AI CSAM was shared with classmates or distributed to others at their school [1]. One girl fears the scandal will impact her college admissions, while another feels too scared to attend her own graduation. The plaintiffs were disturbed by how lifelike the AI-generated nude images and videos were, finding it hard to distinguish the sexualized photos from real-life content [1].

Even more alarming is the fear that girls will now be stalked due to Grok's outputs, as the lawsuit notes that victims' true first names and identifying features are now forever attached to videos depicting their own child sexual abuse [4]. The plaintiffs are asking the court for damages under an array of child protection laws intended to protect exploited children and prevent corporate negligence [2].

What This Lawsuit Means for AI Companies and Child Safety

The plaintiffs' attorney, Vanessa Baehr-Jones, stated that the teenagers want to change how AI companies make business decisions about sexually explicit content. "We want to make it one that does not make any business sense anymore," she said [4]. Martin emphasized that it's not enough for Musk to acknowledge only the images that can be shown as Grok-generated AI CSAM, stating "We intend to hold xAI accountable for every child they harmed in this way" [1].

This case represents a critical test of whether AI companies can be held liable when their models are used to generate child sexual abuse material, even through third-party apps. The outcome could establish precedent for how online platforms and image generators must balance innovation with child safety, potentially forcing companies to adopt stricter prevention measures or face significant legal and financial consequences. As AI-generated content becomes increasingly difficult to distinguish from reality, the lawsuit raises urgent questions about accountability in an era where technology can be weaponized against vulnerable children. xAI did not respond to requests for comment regarding the lawsuit [2][3].

TheOutpost.ai

© 2026 Triveous Technologies Private Limited