Teen Sues AI 'Clothes Removal' App Developer Over Deepfake Nudes

Reviewed by Nidhi Govil

A 17-year-old New Jersey girl is suing the developer of ClothOff, an AI-powered 'clothes removal' tool, after a classmate allegedly used it to create fake nude images of her when she was 14. The case highlights growing concerns over AI-generated sexual imagery and calls for stricter regulations.

Legal Action Against AI-Powered 'Clothes Removal' Tool

A 17-year-old girl from New Jersey has filed a lawsuit against AI/Robotics Venture Strategy3, the developer of ClothOff, an AI-powered 'clothes removal' tool. The lawsuit alleges that a classmate used the software to create fake nude images of her when she was 14 years old [1][2].

Details of the Case

The lawsuit, filed by a Yale Law School professor, his students, and a trial attorney, targets AI/Robotics Venture Strategy3, believed to be operated by residents of Belarus. The messaging platform Telegram is also named as a 'nominal defendant' for hosting bots that provided access to ClothOff [1].

According to the complaint, an Instagram photo of the plaintiff in a bathing suit was altered into a realistic nude image and shared among male classmates. The teen now lives in 'constant fear' that the fake image will resurface online [2].

Legal Demands and Developer's Response

The lawsuit demands [1]:

  1. Deletion of all AI-generated nude images of minors, and of adults depicted without their consent
  2. Removal of the ClothOff website and tool from the internet
  3. A prohibition on using these images to train AI models
ClothOff's developer claims the tool cannot process images of minors and that attempts to do so result in account bans; it also states that no user data is saved. The plaintiff's lawyers, however, allege that the software has been used to create child sexual abuse material in violation of federal and state law [2].

Wider Implications and Similar Cases

This case is part of a growing trend of legal actions against makers of 'undressing' websites and apps. In 2024, The Guardian reported that ClothOff had over 4 million monthly visitors and had been used to generate nude images of children worldwide [1].

The issue of AI-generated non-consensual imagery predates the current generative AI boom. In 2020, a deepfake bot on Telegram was found to have created over 100,000 fake nude photos of women from social media images [1].

Regulatory Response

The case adds to mounting pressure for regulation of AI-generated sexual imagery. In May, the U.S. Congress passed the Take It Down Act, making it a federal crime to publish non-consensual intimate imagery, whether real or AI-generated. The act also requires platforms to remove such content within 48 hours of a valid complaint [2].

TheOutpost.ai


© 2025 Triveous Technologies Private Limited