18 Sources
[1]
Meta sues AI 'nudify' app Crush AI for advertising on its platforms | TechCrunch
Meta has sued the maker of a popular AI "nudify" app, Crush AI, that reportedly ran thousands of ads across Meta's platforms. In addition to the lawsuit, Meta says it's taking new measures to crack down on other apps like Crush AI. In a lawsuit filed in Hong Kong, Meta alleged Joy Timeline HK, the entity behind Crush AI, attempted to circumvent the company's review process to distribute ads for AI nudify services. Meta said in a blog post that it repeatedly removed ads by the entity for violating its policies, but claims Joy Timeline HK continued to place additional ads anyway. Crush AI, which uses generative AI to make fake, sexually explicit images of real people without their consent, reportedly ran more than 8,000 ads for its "AI undresser" services on Meta's platform in the first two weeks of 2025, according to the author of the Faked Up newsletter, Alexios Mantzarlis. In a January report, Mantzarlis claimed that Crush AI's websites received roughly 90% of their traffic from either Facebook or Instagram, and that he flagged several of these websites to Meta. Crush AI reportedly evaded Meta's ad review processes by setting up dozens of advertiser accounts and frequently changing domain names. Many of Crush AI's advertiser accounts, according to Mantzarlis, were named "Eraser Annyone's Clothes" followed by different numbers. At one point, Crush AI even had a Facebook page promoting its service. Facebook and Instagram are hardly the only platforms dealing with such challenges. As social media companies like X and Meta race to add generative AI to their apps, they've also struggled to moderate how AI tools can make their platforms unsafe for users, particularly minors. Researchers have found that links to AI undressing apps soared in 2024 on platforms like X and Reddit, and on YouTube, millions of people were reportedly served ads for such apps. In response to this growing problem, Meta and TikTok have banned keyword searches for AI nudify apps, but getting these services off their platforms entirely has proven challenging. In a blog post, Meta said it has developed new technology to specifically identify ads for AI nudify or undressing services "even when the ads themselves don't include nudity." The company said it is now using matching technology to help find and remove copycat ads more quickly, and has expanded the list of terms, phrases and emoji that are flagged by its systems. Meta said it is also applying the tactics it has traditionally used to disrupt networks of bad actors to these new networks of accounts running ads for AI nudify services. Since the start of 2025, Meta said, it has disrupted four separate networks promoting these services. Outside of its apps, the company said it will begin sharing information about AI nudify apps through the Tech Coalition's Lantern program, a collective effort among Google, Meta, Snap and other companies to prevent child sexual exploitation online. Meta says it has shared more than 3,800 unique URLs with this network since March. On the legislative front, Meta said it would "continue to support legislation that empowers parents to oversee and approve their teens' app downloads." The company previously supported the US Take It Down Act, and said it's now working with lawmakers to implement it.
[2]
Meta cracks down on nudify apps after being exposed
Meta's lawsuit specifically aims to prevent Hong Kong-based Joy Timeline from listing its ads for CrushAI nudify apps across its social media platforms, after the company made "multiple attempts... to circumvent Meta's ad review process." The legal action comes on the heels of a recently published CBS News investigation that found hundreds of ads for nudify apps across Meta's platforms. Meta told CBS at the time that it had since removed a number of these ads, deleted the accounts running them, and blocked the URLs associated with the nude deepfake apps, but said it's becoming harder to enforce its policies as exploitative generative AI apps find new ways to evade detection. CBS's report said that ads for AI deepfake nude tools could still be found on Instagram after Meta removed ads for apps that had been flagged by the investigation.
[3]
Meta: Oops, Ads for Deepfake 'Nudify' Apps Shouldn't Be on Facebook, Instagram
Meta is cracking down on ads that market "nudify" apps, which let people generate fake naked imagery of real-life people. Facebook and Instagram are two of the biggest sources of advertising for these apps, which can be used to make AI revenge porn. Meta is taking legal action against one of the biggest offenders, Crush AI, by suing developer Joy Timeline HK Limited. "We've filed a lawsuit in Hong Kong, where Joy Timeline HK Limited is based, to prevent them from advertising CrushAI apps on Meta platforms," the company says. In January, one report found that an estimated 90% of traffic to Crush AI came from Meta's Instagram. CBS News also found "hundreds" of ads on Meta platforms for similar apps. These services have been against Meta's policies for a while now, but Crush AI continually avoided detection by making new profiles on the company's ad tools and using domains that redirected to a different URL. "This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," Meta says. "We'll continue to take the necessary steps, which could include legal action, against those who abuse our platforms like this." Meta says it has developed new tools to catch repeat offenders; they should detect ads even when they don't include nudity. "We've worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases, and emojis that our systems are trained to detect within these ads." Meta also says it will make clearer efforts to share information with its rivals and partners through the Tech Coalition's Lantern program. The hope is that bringing this data together will help authorities take action. Meta says this process is similar to how the company reports accounts engaged in "violating child safety activity." Earlier this month, Meta was criticized by its own independently run Oversight Board, which found that Meta had under-enforced its own rules on advertisers using AI-manipulated videos of celebrities promoting scams. The most recent example is an AI-generated video of Brazilian soccer player Ronaldo Nazário endorsing an online game. Meanwhile, it's now illegal to distribute nonconsensual intimate images in the US after President Trump signed the Take It Down Act last month. Websites must now "make reasonable efforts" to take down content and copies within 48 hours of a notification from a victim. Last year, Apple removed several generative AI apps from its App Store after a 404 Media investigation discovered they could be used to create nonconsensual nude images. A few months later, San Francisco sued 16 websites that use AI to help users "undress" or "nudify" photographs of women and girls. xAI's Grok also fulfilled requests to "undress" people, though it generated images of them in bikinis or lingerie, not fully nude.
[4]
Meta sues 'nudify' app-maker that ran 87k+ Facebook ads
Despite 'multiple enforcement actions,' Joy Timeline HK allegedly wouldn't stop. Meta has sued an app maker for advertising on Facebook and Instagram a so-called "nudify" app that uses AI to generate nude and sexually explicit images and videos of individuals without their consent. The social media giant on Thursday filed the lawsuit in Hong Kong against Joy Timeline HK Limited. The company allegedly is behind the popular "nudify" app Crush AI, and placed tens of thousands of ads on Facebook and Instagram that promised to "erase any clothes" and display "nudy versions" of users' fully-clothed photos, in violation of Meta's policies. Meta says it continually removed these and other violating ads, shut down Facebook pages and Instagram accounts promoting these apps, and blocked links to Joy Timeline HK's various websites so users couldn't access them from Meta-owned platforms. But despite taking "multiple enforcement actions" against the app maker since 2023, Joy Timeline HK continued running these not-safe-for-work (NSFW) ads on Facebook and Instagram, the lawsuit says. "Given the steps taken by the Defendant to create multiple accounts to advertise the Nudify Apps in direct contravention of Meta's Terms and Policies ... and even after Plaintiff Meta has taken steps to remove offending ads promoting the Nudify Apps, it is clear that unless restrained by a competent court, the Defendant will continue to publish such Violating Ads on Facebook and Instagram," according to the court documents. These ads primarily targeted users in the US, Canada, UK, Australia, and Germany, according to the court documents. And as of February, over 135 Facebook pages displayed more than 87,000 ads for nudify apps, with at least 170 business accounts on Facebook and Instagram placing these ads, the lawsuit says. One of these ads "depicted a woman in a black crop top and bottom," according to the lawsuit. "This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," according to a Meta blog about the lawsuit. "We'll continue to take the necessary steps - which could include legal action - against those who abuse our platforms like this." In addition to suing the app maker, the tech giant said it's taking other actions to prevent these types of explicit deepfake and AI-based services from advertising online. "When we remove ads, accounts or content promoting these services, we'll share information -- starting with URLs to violating apps and websites -- with other tech companies through the Tech Coalition's Lantern program, so they can investigate and take action too," according to the blog. Since March, Meta said it has provided more than 3,800 unique URLs to participating tech companies. Plus, it's upped its own policing of advertisements for nudify apps, and "developed new technology specifically designed to identify these types of ads - even when the ads themselves don't include nudity - and use matching technology to help us find and remove copycat ads more quickly."
[5]
Meta files lawsuit against developer of CrushAI 'nudify' app
Meta is suing a company that ran ads on its services to promote an app that lets people create non-consensual, sexualized images of others using AI technology, the social media company said Thursday. The lawsuit is against Joy Timeline HK Limited, which develops the app called CrushAI and its variants. The Hong Kong-based company ran ads on Facebook and Instagram to promote CrushAI, an app that uses artificial intelligence to take a photo of someone and create nude imagery of them. Meta filed its lawsuit in Hong Kong with the intention of stopping Joy Timeline from continuing to advertise on its services, the company said. The lawsuit filing comes after "multiple attempts" by the CrushAI-maker to "circumvent Meta's ad review process and continue placing these ads, after they were repeatedly removed for breaking our rules," Meta said. "This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," Meta said. "We'll continue to take the necessary steps - which could include legal action - against those who abuse our platforms like this." Researchers have sounded alarms about the rise of so-called nudify apps, which can be found online, in app stores and on Meta's advertising platform. Sen. Dick Durbin, D-Ill., sent a letter in February to Mark Zuckerberg urging the CEO to address his company's role in letting Joy Timeline run ads that violate Meta's standards on adult nudity, sexual activity and "certain forms of bullying and harassment." Durbin's letter cited a report by tech news outlet 404 Media and research by Cornell Tech's Alexios Mantzarlis that found that at least 8,010 CrushAI-related ads ran on Meta's apps during "the first two weeks of this year." In addition to the lawsuit, Meta said it's also updating its "enforcement methods" and has "developed new technology specifically designed to identify these types of ads -- even when the ads themselves don't include nudity -- and use matching technology to help us find and remove copycat ads more quickly." Meta said it's working with external experts and in-house "specialist teams" to keep up with how nudify app makers "evolve their tactics to avoid detection." Meta also said it would "be sharing signals about these apps with other tech companies" so they can also address the apps on their respective platforms. "We've also applied the tactics we use to disrupt networks of coordinated inauthentic activity to find and remove networks of accounts operating these ads," Meta said. "Since the start of the year, our expert teams have run in-depth investigations to expose and disrupt four separate networks of accounts that were attempting to run ads promoting these services."
[6]
Meta is cracking down on AI 'nudify' apps
Researchers have identified thousands of ads for such apps on Facebook and Instagram. Meta is finally cracking down on apps that use AI to generate nonconsensual nude and explicit images of celebrities, influencers and others. The company is suing one app maker that's frequently advertised such apps on Facebook and Instagram, and taking new steps to prevent ads for similar services. The crackdown comes months after several researchers and journalists have raised the alarm about such apps. A recent report from CBS News identified at least "hundreds" of ads on Meta's platform promoting apps that allow users to "remove clothing" from images of celebrities and others. One app in particular, called Crush AI, has apparently been a prolific advertiser on Facebook and Instagram. Researcher Alexios Mantzarlis, Director of Cornell Tech's Security, Trust and Safety Initiative, reported back in January that Crush AI had run thousands of ads on Facebook and Instagram since last fall. Now, Meta says it has filed a lawsuit against Joy Timeline HK Limited, the Hong Kong-based company behind Crush AI and other nudify apps. "This follows multiple attempts by Joy Timeline HK Limited to circumvent Meta's ad review process and continue placing these ads, after they were repeatedly removed for breaking our rules," the company wrote in a blog post. Joy Timeline HK Limited didn't immediately respond to a request for comment. Meta also says it's taking new steps to prevent apps like these from advertising on its platform. "We've developed new technology specifically designed to identify these types of ads -- even when the ads themselves don't include nudity -- and use matching technology to help us find and remove copycat ads more quickly," Meta wrote. "We've worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases and emojis that our systems are trained to detect within these ads." The social network says it also plans to work with other tech platforms, including app store owners, to share relevant details about entities that abuse its platform. Nudify apps aren't the only entities that have exploited Meta's advertising platform to run ads featuring celebrity deepfakes. Meta has also struggled to contain shady advertisers that use AI-manipulated videos of public figures to promote scams. The company's independent Oversight Board, which weighs in on content moderation issues affecting Facebook and Instagram, recently criticized Meta for under-enforcing its rules prohibiting such ads.
[7]
Meta sues maker of Crush AI nudify app over Facebook and Instagram ads
What just happened? Meta's mission to crack down on AI "nudify" programs has led to it suing the maker of one of these apps. The social media giant has launched a lawsuit against Joy Timeline HK Limited, which developed an app called Crush AI. In the suit, filed in the developer's home city of Hong Kong, Meta states that Crush AI made multiple attempts to circumvent Meta's ad review process and continued placing the ads. The ads appeared across Facebook and Instagram, and while Meta repeatedly removed them for breaking the rules, Joy Timeline kept posting more. One of the many negative consequences to come from the advancement of generative AI has been the rise of nudify apps. They use the technology to generate nonconsensual nude and explicit images of people after being fed photos of the individual. Crush AI had been one of the most prolific advertisers among these apps. An investigation in January from the author of the Faked Up newsletter, Alexios Mantzarlis, found that Meta's platforms ran more than 8,000 Crush AI-related ads during the first two weeks of the year alone. He notes that roughly 90% of Crush AI's website traffic came from either Facebook or Instagram. Crush AI avoided Meta's review process by setting up dozens of advertiser accounts and frequently changing domain names. Crush AI also had a Facebook page promoting its service. Senator Dick Durbin sent a letter to Meta CEO Mark Zuckerberg in February, urging him to address the ads. Durbin wrote that the ads violated Meta's Advertising Standards, including its prohibitions on ads featuring adult nudity, sexual activity, and certain forms of bullying and harassment. Meta says that it has now developed new technology that is designed to find and remove these types of nudify ads more quickly. It has also expanded the list of terms, phrases and emoji that are flagged by its systems. The company is also working with specialist teams to stay up to date with how these app makers evolve their tactics to avoid detection. It will be sharing signals about the apps with other tech companies so they can address them on their respective platforms. In May last year, Google announced a new policy that prohibits ads for deepfake porn or those that promise to digitally undress people without consent. Soon after, the San Francisco City Attorney's office sued 16 of the most-visited "undressing" sites with the aim of shutting them down.
[8]
Meta sues app-maker as part of crack down on 'nudifying'
Meta has taken legal action against a company which ran ads on its platforms promoting so-called "nudify" apps, which typically use artificial intelligence (AI) to create fake nude images of people without their consent. It has sued the firm behind CrushAI apps to stop it posting ads altogether, following a months-long cat-and-mouse battle to remove them. In January, the blog Faked Up found 8,010 instances of ads from CrushAI promoting nudifying apps on Meta's Facebook and Instagram platforms. "This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," Meta said in a blog post.
[9]
Meta targets AI 'nudify' apps, but not for the reasons you're thinking
Whether we like it or not, AI is actively reshaping the world. But it's not just doom and gloom. There are tangible benefits that have the potential to improve various facets of our daily lives. Companies are using AI to ease traffic congestion at busy intersections, while the technology is also showing promise when it comes to diagnosing diseases before symptoms appear. On an individual scale, AI can be used for real-time translations, coding, automating repetitive tasks, learning complex topics, and for creative applications like video and image generation. The latter, though, is a double-edged sword. While AI video and image generation offer a creative outlet, the potential for the technology being misused also presents a significant challenge -- a challenge that Meta appears to be currently confronting. For reference, most major AI image and video generation apps have measures in place that prevent the generation of deepfakes, stealing other artists' work, the generation of content that promotes misinformation, and more, complete with watermarking tools to differentiate between what is real and what was generated by AI. Then, on a different side of the spectrum, there are those that actively offer tools to create AI-generated nude and/or sexually explicit images of other individuals without their consent. Meta is now cracking down on one such prominent platform, and the company has new defenses in place. Meta has filed a lawsuit against a Hong Kong-based company (Joy Timeline HK Limited) behind Crush AI and similar "nudify" applications (via Engadget). The app-maker frequently circumvented the tech giant's ad review process, with the company now taking additional steps to ensure such ads are no longer displayed on platforms like Instagram and Facebook. The lawsuit "follows multiple attempts by Joy Timeline HK Limited to circumvent Meta's ad review process and continue placing these ads, after they were repeatedly removed for breaking our rules," wrote Meta in a news release. So, while this may look like a lawsuit against the company for letting users generate sexually explicit material from images of other individuals, it is essentially just a lawsuit targeting Joy Timeline HK Limited's repeated and intentional circumvention of Meta's advertising policies. The company added that, going forward, when it detects and removes ads, accounts or content promoting said content, it will share information, "starting with URLs to violating apps and websites - with other tech companies through the Tech Coalition's Lantern program, so they can investigate and take action too." It is also rolling out additional defenses that are designed to detect these types of ads, even when the ad content itself does not contain nudity. This comes soon after the Oversight Board, an independent body that oversees content moderation decisions for platforms like Facebook and Instagram, criticized Meta for not doing enough to enforce its own rules.
[10]
Meta Sues Nudify App That Keeps Advertising on Instagram
As part of what it claims is a new crackdown, Meta is suing a nudify app and "strengthening" its enforcement. Meta said it is suing a nudify app that 404 Media reported bought thousands of ads on Instagram and Facebook, repeatedly violating its policies. Meta is suing Joy Timeline HK Limited, the entity behind the CrushAI nudify app that allows users to take an image of anyone and AI-generate a nude image of them without their consent. Meta said it has filed the lawsuit in Hong Kong, where Joy Timeline HK Limited is based, "to prevent them from advertising CrushAI apps on Meta platforms." In January, 404 Media reported that CrushAI, also known as Crushmate, among other names, had run more than 5,000 ads on Meta's platform, and that 90 percent of Crush's traffic came from Meta's platform, a clear sign that the ads were effective in leading people to tools that create nonconsensual media. Alexios Mantzarlis, now of Indicator, was first to report about Crush's traffic coming from Meta. At the time, Meta told us that "This is a highly adversarial space and bad actors are constantly evolving their tactics to avoid enforcement, which is why we continue to invest in the best tools and technology to help identify and remove violating content." "This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," Meta said in a post on its site announcing the lawsuit. "We'll continue to take the necessary steps -- which could include legal action -- against those who abuse our platforms like this." However, CrushAI is far from the only nudify app to buy ads on Meta's platforms. Last year I reported that these ads were common, and despite our reporting leading to the ads being removed and Apple and Google removing the apps from their app stores, new apps and ads continue to crop up. To that end, Meta said that now when it removes ads for nudify apps it will share URLs for those apps and sites with other tech companies through the Tech Coalition's Lantern program so those companies can investigate and take action against those apps as well. Members of that group include Google, Discord, Roblox, Snap, and Twitch. Additionally, Meta said that it's "strengthening" its enforcement against these "adversarial advertisers." "Like other types of online harm, this is an adversarial space in which the people behind it -- who are primarily financially motivated -- continue to evolve their tactics to avoid detection. For example, some use benign imagery in their ads to avoid being caught by our nudity detection technology, while others quickly create new domain names to replace the websites we block," Meta said. "That's why we're also evolving our enforcement methods. For example, we've developed new technology specifically designed to identify these types of ads -- even when the ads themselves don't include nudity -- and use matching technology to help us find and remove copycat ads more quickly. We've worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases and emojis that our systems are trained to detect within these ads." From what we've reported, and according to testing by AI Forensics, a European non-profit that investigates influential and opaque algorithms, in general it seems that content in Meta ads is not moderated as effectively as regular content users post to Meta's platforms.
Specifically, AI Forensics found that the exact same image containing nudity was removed as a normal post on Facebook but allowed when it was part of a paid ad. 404 Media's reporting has led to some pressure from Congress, and Meta's press release did mention the passage of the federal Take It Down Act last month, which holds platforms liable for hosting this type of content, but said it was not the reason for taking these actions now.
[11]
Meta sues 'nudify' app Crush AI
The lawsuit highlights a network of deceptive workarounds to Meta's ad review process. Meta has struck out against a popular app used to produce AI-generated nonconsensual intimate images -- commonly referred to as "nudify" or "undress" apps -- as the company selectively cracks down on advertisers. In a new lawsuit filed in Hong Kong against the makers behind a commonly-used app known as Crush AI, the tech giant claims parent company Joy Timeline HK intentionally bypassed Meta's ad review process using new domain names and networks of advertiser accounts in order to promote the app's AI-powered deepfake services. "This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it. We'll continue to take the necessary steps -- which could include legal action -- against those who abuse our platforms like this," Meta wrote in a press release. Meta has previously been under fire for failing to stop nudify apps from advertising on its platform, including allowing ads featuring explicit deepfake images of celebrities to appear repeatedly. In addition to its advertising policies, Meta prohibits the spread of non-consensual intimate imagery and blocks the search terms "nudify," "undress" and "delete clothing." According to an analysis by Cornell researcher Alexios Mantzarlis, Crush AI allegedly ran more than 8,000 ads across Meta platforms between the fall of 2024 and January 2025, with 90 percent of its traffic coming from Meta platforms. Broadly, AI-generated ad content has plagued users as the company has relaxed its content moderation policies in favor of automated review processes and community-generated fact-checking. Victims of AI-generated nonconsensual intimate imagery have spent years fighting for greater industry regulation and legal pathways for recourse. In May, President Trump signed the Take It Down Act, a law that criminalizes nonconsensual intimate imagery and sets mandatory takedown policies for online platforms. AI-generated child sexual abuse material (CSAM) has also proliferated across the internet in recent years, prompting widespread concern about the safety and regulation of generative AI tools. In addition to taking legal action against Crush AI, Meta announced it was developing a new detection technology to more accurately flag and remove ads for nudify apps. The company is also stepping up its work with the Tech Coalition's Lantern program, an industry initiative to pool information on child online safety, and will continue sharing information on violating companies and products. Since March, Meta has reported more than 3,800 unique URLs related to nudify apps and websites and discovered four separate networks trying to promote their services, according to the company.
[12]
Meta files lawsuit against maker of "nudify" app technology
Meta said Thursday that it's suing an app maker that uses artificial intelligence to simulate nude images of real people who appear clothed in pictures. Meta said it filed a lawsuit in Hong Kong against Joy Timeline HK Limited, the entity behind the CrushAI app, to prevent it from advertising CrushAI apps on Meta platforms. "This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," Meta said in a statement. "We'll continue to take necessary steps -- which could include legal action -- against those who abuse our platforms like this." The legal action comes after Joy Timeline made "multiple attempts" to circumvent Meta's ad review process, Meta alleges. Ads for so-called nudify apps have appeared on Meta's Facebook and Instagram platforms despite violating the social media sites' advertising policies. CrushAI, which makes the apps, promoted AI tools that it says let users upload photos and "see anyone naked," a CBS News investigation found. Meta has said the company bans "non-consensual intimate imagery" on its platforms. The company previously told CBS News that it has removed ads for nudify technology, deleted pages on its platforms that run the spots and permanently blocked websites associated with the apps. Meta on Thursday said it will share information, including ad URLs, about entities that violate its policies with other tech companies through the Tech Coalition's Lantern program, which tracks behaviors that violate their child safety rules. Since March, Meta has provided the program with information on more than 3,800 sites, which is shared with other tech companies, according to the company. Meta said advertisers of nudify apps use various means to avoid detection on its platforms, including by using inoffensive imagery to try to circumvent tech used to identify such ads on its sites. As a result, it has developed better technology to detect ads from nudify apps that are presented as benign, Meta said Thursday. "We've worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases and emojis that our systems are trained to detect within these ads," Meta said.
[13]
Meta files lawsuit against AI firm behind fake nonconsensual nude images - SiliconANGLE
Meta Platforms Inc. announced today that it's suing a company that advertised generative artificial intelligence apps on Meta's platforms that enabled users to "nudify" people from a clothed image. The lawsuit, launched in Hong Kong against Joy Timeline HK Ltd., states that the company consistently tried to circumvent Meta's review process by advertising the CrushAI app -- which allows users to virtually undress anyone they want, to "see anyone naked," usually without the person's consent. "This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," Meta said. "We'll continue to take necessary steps, which could include legal action against those who abuse our platforms like this." The lawsuit came after the social media giant investigated so-called nudify apps, which, according to that investigation, are being sold and advertised across the internet. It's long been known that people abuse deepfake technology to create sexualized images, but it seems nudifying apps have so far been allowed to proliferate without much pushback. According to research by Alexios Mantzarlis, Director of Cornell Tech's Security, Trust and Safety Initiative, from fall last year to the start of this year, Crush AI had over 8,000 ads on Facebook and Instagram. Meta doesn't allow "non-consensual intimate imagery" on its platforms, but it says such apps are hard to detect by nudity detection technology due to the benign nature of their ads or the fact that they create new domain names after their websites have been blocked. "We've worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases, and emojis that our systems are trained to detect within these ads," Meta said. It's reported that the apps are often used to virtually undress celebrities, but a more pressing concern is that they could be used with images of children. In 2024, two teens in Florida were arrested after they'd been found to have used generative AI to create sexualized images of their classmates. The nudify apps, reportedly still available in app stores, would make such activity very easy to accomplish. In the U.S., the Take It Down Act was passed in the House in April this year and was later signed into law by President Donald Trump. The law criminalizes the publication of nonconsensual sexually explicit deepfake videos and images, while making it easier for victims to have the images removed.
[14]
Meta sues Hong Kong firm in crackdown on deepfake nude apps
(Bloomberg) -- Meta Platforms Inc. is suing a Hong Kong-based business for allegedly promoting so-called nudify apps, as the Facebook owner attempts to combat a surge in naked and sexual images created by artificial intelligence. The tech giant said it filed a lawsuit in Hong Kong to stop Joy Timeline HK Ltd. from allegedly advertising CrushAI apps on Meta platforms. The apps allow people to create AI-generated nude or sexually explicit images of individuals without their consent, Meta said in a post on its website Thursday. The legal action is part of a broader crackdown on online non-consensual sexual images, which can be used for sextortion, blackmail and abuse. Meta, which also owns Instagram, said it will now share with other technology companies information about ads, accounts and content that have been removed for promoting nudify apps. In its post, Meta said it had noticed "concerning growth" in this area. "With nudify apps being advertised across the internet - and available in App Stores themselves - removing them from one platform alone isn't enough," Meta said. Joy Timeline didn't immediately respond to a request for comment sent to its public email address. Meta is under fire around the world for not doing enough to protect teenagers and young people. In Australia, a world-first ban on social media for under-16s will start this year, and other countries are tightening content oversight. But while Meta is fighting back against some fake, auto-generated material, it's pushing deeper into artificial intelligence more broadly. Advertisers promoting nudify apps are changing tactics to avoid being caught, according to Meta. Some use benign images to evade detection, while others swiftly create new domain names to replace blocked websites. In response, Meta said it has developed technology to identify such ads, even when they don't include nudity. In its Hong Kong lawsuit, Meta claims Joy Timeline repeatedly tried to circumvent ad review processes after its ads were removed for breaking Meta's rules.
[15]
Meta sues developer of 'nudify' app CrushAI
Meta filed a lawsuit against a developer for allegedly running advertisements to promote its "nudify" apps, which use artificial intelligence to create non-consensual nude or sexually explicit images. The suit accuses Joy Timeline HK Limited, the developer behind CrushAI apps, of violating Meta's rules against non-consensual intimate imagery. Meta noted its policies were updated more than a year ago to further clarify that the promotion of nudify apps or related products is not permitted on its platforms. Meta claimed the Hong Kong-based company attempted to "circumvent" Meta's ad review process and continued to run the ads even after the social media firm removed them. The Hill reached out to Joy Timeline HK Limited for comment. "This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," Meta wrote in a release Thursday. The Facebook and Instagram parent company touted how it removes these types of ads once its teams are made aware. Meta also blocks links to websites and restricts search terms like "nudify," "delete clothing," or "undress." The lawsuit is part of Meta's broader fight against nudify apps. In addition to the work on its own platforms, the technology firm said it has started sharing links for violating apps with other tech companies, providing more than 3,800 links since the end of March. Meta is also developing new technology designed to more easily identify these ads, even if they do not include nudity, and has expert teams tracking down account networks accused of running these ads. Social media companies have faced increased pressure from both lawmakers and tech safety groups to limit this type of content on their platforms. This comes just weeks after President Trump signed the Take It Down Act, making it a crime to knowingly publish sexually explicit "deepfake" images and videos online. Meta said it "welcomes legislation that helps fight intimate image abuse across the internet" and applauded the Take It Down Act.
[16]
Meta Launches Lawsuit Against 'Nudify' AI Company Over Facebook, Instagram Ads
A previous report from investigative tech outlet 404 Media estimated that 90% of CrushAI's traffic came from ads on Meta's services like Instagram. Meta Platforms (META) on Thursday announced a lawsuit against a Hong Kong-based company that has managed to work around the social media giant's advertising detection technology to promote AI-powered explicit apps. The Facebook and Instagram parent said it is suing Joy Timeline HK Limited, which owns a slate of apps called CrushAI, one of which allows users to take a picture of a person and use AI technology to make the image sexually explicit. The lawsuit was filed in Hong Kong, where Joy Timeline is based, Meta said. The company behind the so-called "nudify" app continued to circumvent Meta's ad review process after previous ads for the service had been removed from Meta's platforms, Meta said. Meta also said it was revamping its detection technology to catch more ads that may not feature explicit content like nudity but still promote a service that violates Meta's rules. The move comes after Meta received a letter earlier this year from Illinois Senator Dick Durbin, who asked Meta to answer for how it safeguards against these types of ads. The letter cited a report from the tech news outlet 404 Media, which found that an estimated 90% of traffic to CrushAI's apps was coming from ads it had placed on Meta's platforms. Joy Timeline did not immediately respond to a request for comment. The suit is the second against an AI company in as many days, coming a day after Disney (DIS) and Universal teamed up to sue AI image generation company Midjourney, alleging the company has made millions in revenue by violating the copyrights of their properties like "Star Wars" and Marvel Comics.
[17]
Meta Sues Hong Kong-Based Deepfake App Maker Accusing It Of Running 87,000 Banned Ads As Zuckerberg Faces Pressure Over AI-Generated Nudity And Exploitation - Meta Platforms (NASDAQ:META)
On Thursday, Meta Platforms, Inc. (META) filed a lawsuit against a Hong Kong-based developer of an explicit AI deepfake app, alleging the company violated its advertising policies tens of thousands of times and intensifying concerns over the misuse of artificial intelligence for non-consensual content. What Happened: Meta is suing Joy Timeline HK Limited, the creator of CrushAI, an app capable of generating sexually explicit deepfakes, CNN reported. The company claims the app maker ran over 87,000 ads that violated Meta's rules across Facebook and Instagram by promoting "nudifying" technology -- tools that digitally remove clothing from images of people without consent. According to the complaint, filed in a Hong Kong court, the defendants created a network of 170 business accounts and 135 Facebook pages, managing ads that largely targeted users in the U.S., Canada, Australia, Germany and the U.K. Meta says it has spent $289,000 responding to regulatory inquiries and investigating the violations. "This is an adversarial space," the company said in a statement, noting that such actors "evolve their tactics to avoid detection," the report added. Why It's Important: Tech platforms are under increasing pressure to combat non-consensual, explicit deepfakes, which are easier to create with AI. Victims have included public figures like Taylor Swift and Rep. Alexandria Ocasio-Cortez, as well as high school girls. In response, the Take It Down Act -- banning the sharing of such content and mandating swift removal by platforms -- was signed into law last month. In January, reports revealed that CrushAI had run thousands of ads on Meta platforms, despite Meta's policies banning adult content and sexual exploitation. This led Sen. Dick Durbin (D-IL) to question Meta CEO Mark Zuckerberg about the company's oversight. Earlier this month, CBS News found more ads promoting similar apps, some using sexualized deepfakes of celebrities. In response, Meta said it removed the ads, deleted associated Pages and permanently blocked the related URLs.
[18]
Meta Sues Company Behind AI App That Creates Fake Nude Images
Meta Platforms is suing a company that ran ads for an app that allows users to create AI-generated nude or sexually explicit images of individuals without their consent. The social-media company said Thursday that it filed its lawsuit against Joy Timeline HK, the Hong Kong-based entity behind CrushAI. Meta alleges the company repeatedly tried to circumvent its ad-review systems after ads for the app were removed for violating its policies. "This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," Meta said. "We'll continue to take the necessary steps--which could include legal action--against those who abuse our platforms like this." Joy Timeline didn't immediately respond to a request for comment. Meta said it has seen a troubling rise in so-called "nudify" apps, which use AI to create fake, non-consensual nude or sexually explicit images. In response, the company said it is ramping up enforcement efforts, including new measures to detect and remove such apps, help other platforms do the same and pursue legal action against their creators. Since March, Meta has begun sharing data on apps and websites that violate its advertising rules, distributing over 3,800 URLs to other tech companies to aid in coordinated enforcement, it said. The company is also developing new technology to detect nudify apps more effectively and is supporting legislation aimed at combating intimate-image abuse--whether AI-generated or real--across the internet.
Meta has filed a lawsuit against Joy Timeline HK Limited, the maker of the AI 'nudify' app Crush AI, for repeatedly violating advertising policies on Facebook and Instagram. The company is also implementing new measures to combat the spread of AI-generated explicit content across its platforms.
Meta, the parent company of Facebook and Instagram, has taken a significant step in combating the spread of AI-generated explicit content by filing a lawsuit against Joy Timeline HK Limited, the entity behind the popular AI "nudify" app Crush AI [1]. The lawsuit, filed in Hong Kong, alleges that the company repeatedly violated Meta's advertising policies and attempted to circumvent its review processes [2].
According to reports, Crush AI ran over 8,000 ads for its "AI undresser" services on Meta's platforms in just the first two weeks of 2025 [1]. The app, which uses generative AI to create fake, sexually explicit images of real people without their consent, reportedly received about 90% of its traffic from Facebook or Instagram [1]. As of February 2025, over 135 Facebook pages displayed more than 87,000 ads for nudify apps, with at least 170 business accounts on Facebook and Instagram placing these ads [4].
In addition to the lawsuit, Meta has announced several new measures to crack down on AI nudify apps:
New Technology: Meta has developed new technology specifically designed to identify ads for AI nudify or undressing services, even when the ads themselves don't include nudity [1].
Enhanced Detection: The company has expanded its list of safety-related terms, phrases, and emojis that its systems are trained to detect within these ads [3]; a rough sketch of how this kind of screening might work follows the list.
Information Sharing: Meta will share information about AI nudify apps through the Tech Coalition's Lantern program, a collective effort among tech companies to prevent child sexual exploitation online [1].
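Meta has not published how its detection systems work internally. Purely as an illustration of the two ideas above -- screening ad copy against an expanded list of flagged terms, phrases, and emoji, and matching new ads against previously removed ones to catch copycats -- the Python sketch below implements a naive version of both. Every term, name, and threshold here is a hypothetical stand-in, not Meta's actual system.

```python
# Illustrative sketch only: Meta's real detection systems are not public.
# Two toy ideas: (1) screen ad text against a flagged term/emoji list,
# (2) fuzzy-match new ad copy against previously removed ads ("copycats").
import difflib
import re

# Hypothetical examples of flagged terms and emoji; the real lists are not public.
FLAGGED_TERMS = {"nudify", "undress", "erase any clothes", "see anyone naked"}
FLAGGED_EMOJI = {"\U0001F51E"}  # U+1F51E "no one under eighteen", as a placeholder


def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so trivial edits don't evade matching."""
    return re.sub(r"\s+", " ", text.lower()).strip()


def flags_in_ad(ad_text: str) -> set[str]:
    """Return any flagged terms or emoji found in the ad copy."""
    norm = normalize(ad_text)
    hits = {term for term in FLAGGED_TERMS if term in norm}
    hits |= {emoji for emoji in FLAGGED_EMOJI if emoji in ad_text}
    return hits


def is_copycat(ad_text: str, removed_ads: list[str], threshold: float = 0.8) -> bool:
    """Treat an ad as a copycat if its text is highly similar to a removed ad."""
    norm = normalize(ad_text)
    return any(
        difflib.SequenceMatcher(None, norm, normalize(old)).ratio() >= threshold
        for old in removed_ads
    )


if __name__ == "__main__":
    removed = ["Erase any clothes instantly with our AI undresser!"]
    new_ad = "Erase any clothes in seconds with our AI undresser"
    print(flags_in_ad(new_ad))          # -> {'erase any clothes'}
    print(is_copycat(new_ad, removed))  # -> True
```

A production system would presumably rely on trained classifiers and image-level signals rather than literal string matching, which trivial rewording or benign imagery can defeat; the sketch only makes the two concepts concrete.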
The issue of AI-generated explicit content is not unique to Meta. Other platforms like X (formerly Twitter), Reddit, and YouTube have also struggled with moderating how AI tools can make their platforms unsafe for users, particularly minors [1]. The problem has grown significantly, with links to AI undressing apps soaring in 2024 across various platforms [1].
This case highlights the growing need for legislation and industry-wide policies to address the challenges posed by AI-generated explicit content. In the United States, the recently passed Take It Down Act makes it illegal to distribute nonconsensual intimate images, requiring websites to make reasonable efforts to remove such content within 48 hours of notification [3].
Meta's lawsuit and new measures represent a significant step in the ongoing battle against the misuse of AI technology for creating non-consensual explicit content. As the technology continues to evolve, it's clear that both tech companies and policymakers will need to remain vigilant and adaptive in their approaches to protect users and maintain the integrity of online platforms.