2 Sources
[1]
Teen sues ClothOff developer over fake nude images made with clothes removal tool
What just happened? A teenage girl is suing the maker of a "clothes removal" tool after it was used by a classmate to create at least one fake nude image of her when she was 14. The lawsuit, which also names Telegram as a defendant, is the latest in a series of legal actions taken against the makers of undressing websites and apps.

The now 17-year-old from New Jersey was one of several girls at Westfield High School who discovered two years ago that a student had used photos from their social media accounts to create AI-generated nude images. Some male classmates shared the fakes in group chats, writes The Wall Street Journal.

Now, a Yale Law School professor, his students, and a trial attorney have filed a lawsuit on behalf of the teen against AI/Robotics Venture Strategy3, developer of ClothOff, the web tool allegedly used to create the images. Also named as a "nominal defendant" in the lawsuit is Telegram; ClothOff could be accessed on the messaging app via bots. AI/Robotics Venture Strategy3 is based in the British Virgin Islands and is believed to be operated by residents of Belarus.

The plaintiff says that the creation of these images constitutes child sexual abuse material (CSAM), but the developer claims processing images of minors is impossible and that attempting to do so will lead to an account ban. The developer also says it does not save any data.

The plaintiff has asked a judge to order AI/Robotics Venture Strategy3 to delete and destroy all nude images it possesses of adults and children who didn't provide consent, to refrain from using the images to train its AI models, and to take down both the website and the ClothOff tool.

In 2024, The Guardian carried out an investigation into ClothOff, which at the time had more than 4 million visitors per month. The publication reported that the app had been used to generate nude images of children around the world. A Telegram spokesperson said clothes-removing tools and non-consensual pornography violate its terms of service and are removed when discovered. ClothOff has now been removed from the platform.

The teen boy who created the fake nudes isn't named in this suit but is being sued separately by the plaintiff. It's alleged that he used an image of the girl in a swimsuit to create the nude. The girl says she lives in "constant fear" that the faked image of her is on the internet, and that images of her and her classmates are being used to train ClothOff's AI to improve its image-generation capabilities.

The problem of using AI to create nude images of people without their consent predates the generative AI revolution. In 2020, a deepfake bot on Telegram was found to have made over 100,000 fake nude photos of women based on social media images. In 2024, the San Francisco City Attorney's office sued 16 undressing websites. More recently, Meta sued the maker of the Crush AI nudify app in June after 8,000 ads appeared on its platforms in just two weeks.
[2]
Teen Sues Developer Of AI 'Clothes Removal' App After Classmate Created Deepfake Nude
A 17-year-old New Jersey girl is suing the developer of an AI-powered "clothes removal" app after a classmate allegedly used it to generate fake nude images of her from a social media photo when she was 14.

Yale Law Group Joins Case Targeting Deepfake Exploitation

The lawsuit, filed by a Yale Law School professor, his students and a trial attorney, accuses AI/Robotics Venture Strategy 3 Ltd., the company behind the web tool ClothOff, of enabling the creation and distribution of nonconsensual, sexually explicit deepfakes, reported the Wall Street Journal. The case also names Telegram as a nominal defendant, as the app hosted bots that provided access to ClothOff.

According to the complaint, the teen's Instagram photo, which showed her in a bathing suit, was altered into a realistic nude image shared among male classmates. The lawsuit demands the deletion of all AI-generated nude images involving minors and adults without consent and seeks a court order to remove the software from the internet. ClothOff did not immediately respond to Benzinga's request for comment.

Developer Denies Wrongdoing, But Concerns Mount

ClothOff's developer, based in the British Virgin Islands and believed to be operated from Belarus, states on its website that its system cannot process images of minors and automatically deletes all data. However, the plaintiff's lawyers allege the software has been used to create child sexual abuse material, violating federal and state laws.

The teenage boy accused of creating the fake nudes is not included in the current lawsuit, although the plaintiff has filed a separate suit against him. In their response to the complaint, his attorneys stated that the "defendant is without knowledge or information sufficient to form a belief as to the truth of the allegations."

Rising Pressure To Regulate AI Deepfakes

The case adds to a growing push for regulation amid a surge in AI-generated sexual imagery. In May, Congress passed the Take It Down Act, which makes it a federal crime to publish nonconsensual intimate imagery, real or AI-generated, and requires platforms to remove such content within 48 hours of a valid complaint. The plaintiff's filing says she now "lives in constant fear" that her fake image will resurface online.
A 17-year-old New Jersey girl is suing the developer of ClothOff, an AI-powered 'clothes removal' tool, after a classmate allegedly used it to create fake nude images of her when she was 14. The case highlights growing concerns over AI-generated sexual imagery and calls for stricter regulations.
A 17-year-old girl from New Jersey has filed a lawsuit against AI/Robotics Venture Strategy3, the developer of ClothOff, an AI-powered 'clothes removal' tool. The lawsuit alleges that the software was used by a classmate to create fake nude images of her when she was 14 years old. [1][2]

The lawsuit, filed by a Yale Law School professor, his students, and a trial attorney, targets AI/Robotics Venture Strategy3, believed to be operated by residents of Belarus. The messaging platform Telegram is also named as a 'nominal defendant' for hosting bots that provided access to ClothOff. [1]

According to the complaint, the plaintiff's Instagram photo, showing her in a bathing suit, was altered into a realistic nude image and shared among male classmates. The teen now lives in 'constant fear' that the fake image will resurface online. [2]

The lawsuit demands that the company delete and destroy all nonconsensual nude images of adults and minors, refrain from using those images to train its AI models, and take down both the ClothOff website and tool. [1]

ClothOff's developer claims that processing images of minors is impossible and that attempts to do so result in account bans. It also states that no data is saved. However, the plaintiff's lawyers allege that the software has been used to create child sexual abuse material, violating federal and state laws. [2]

This case is part of a growing trend of legal actions against makers of 'undressing' websites and apps. In 2024, The Guardian reported that ClothOff had over 4 million monthly visitors and had been used to generate nude images of children worldwide. [1]

The issue of AI-generated non-consensual imagery predates the current generative AI boom. In 2020, a deepfake bot on Telegram was found to have created over 100,000 fake naked photos of women based on social media images. [1]

The case adds to the mounting pressure for regulation of AI-generated sexual imagery. In May, the U.S. Congress passed the Take It Down Act, making it a federal crime to publish non-consensual intimate imagery, whether real or AI-generated. The act also requires platforms to remove such content within 48 hours of a valid complaint. [2]
Summarized by Navi