Over 100 AI nudify apps found on Google and Apple app stores despite policies banning them

Reviewed by Nidhi Govil


A Tech Transparency Project investigation uncovered 102 AI nudify apps across Google Play and Apple App Store that generate nonconsensual sexualized images. These apps were downloaded 705 million times and generated $117 million in revenue. While both companies removed dozens of apps after the report, the findings expose significant gaps in content moderation and raise questions about platform accountability.

Tech Transparency Project Exposes Massive Moderation Failure

The Tech Transparency Project (TTP) released a damning report Tuesday revealing that app stores operated by Google and Apple host dozens of AI nudify apps capable of digitally undressing individuals without their consent [1]. The investigation identified 55 such apps on the Google Play Store and 47 on the Apple App Store, with 38 appearing on both platforms [5]. These applications, which clearly violate both platforms' content policies, were collectively downloaded over 705 million times worldwide and generated $117 million in revenue [1].

Source: Android Authority

Both Google and Apple take significant cuts of up to 30 percent from in-app purchases and subscriptions, meaning they directly profited from apps designed to generate nonconsensual sexualized images [5]. Katie Paul, director of the Tech Transparency Project, told reporters that Grok is "really the tip of the iceberg," noting that "even more graphic content can be created by some of these other apps" [2].

How Google and Apple App Moderation Failed Users

Researchers discovered these apps by searching terms like "nudify" and "undress" across both app stores [4]. They tested free versions using AI-generated images of fully clothed women, prompting the AI image-editing tools to render subjects completely or partially nude. Face-swapping apps were also tested, successfully superimposing clothed women's faces onto images of nude bodies [2]. "It's very clear, these are not just 'change outfit' apps," Paul told CNBC. "These were definitely designed for non-consensual sexualization of people" [4].

Source: Engadget

Apple removed 28 apps after being contacted by TTP and CNBC, though two were later restored after developers resubmitted versions claiming to address guideline concerns [4]. Google suspended several apps but declined to specify exact numbers, stating its investigation remains ongoing [4]. This marks the second time both companies have faced such reports: 404 Media documented comparable issues in 2024 [1].

Disturbing Marketing to Children and National Security Risks

Perhaps most alarming is how these apps were marketed. DreamFace, an AI image and video generator still available on the Google Play Store at the time of reporting, was rated suitable for ages 13 and up on Google's platform and ages nine and up on Apple's [3]. The app accepts lewd prompts to show naked women and has generated $1 million in revenue [5]. "These aren't just being marketed to adults, but they're also being pushed to kids," Paul emphasized [2].

Data privacy concerns compound the issue. Paul warned that apps with connections to China could mean nonconsensual deepfake nudes of American citizens end up in the hands of the Chinese government due to data-sharing laws. "That's a major privacy violation on top of being a national security risk, particularly if there are political figures who have non-consensual nude imagery created of them," she said [2].

The Grok Connection and Selective Enforcement

The Tech Transparency Project report arrives amid global outrage over Elon Musk's xAI chatbot Grok, which generated approximately three million sexualized images, including 22,000 involving children [3]. Grok and X now face investigations in the EU and UK, plus lawsuits from victims [1]. Yet both companies' app stores continue offering access to X and Grok despite removing other nudify apps.

Apple even displayed sponsored ads for nudify apps when users searched banned terms. "When you search the word 'nudify' in the App Store, according to their policy, nothing should come up, but not only does Grok come up first, Apple had a sponsored ad for a nudify app," Paul revealed [2]. The selective enforcement raises questions about content moderation priorities: both companies swiftly removed the ICEBlock app while allowing deepfake technology to flourish [1].

Source: 9to5Mac

What This Means for Platform Accountability

This investigation follows TTP's December report finding 52 apps in Apple's App Store and 18 in the Google Play Store connected to U.S.-sanctioned entities tied to the war in Ukraine and human rights abuses in China [2]. Both companies subsequently removed those apps and posted jobs for compliance officers. "That was sanctions. That was an area where there are legal consequences," Paul noted. "You can imagine the other harmful apps that are out there where there aren't legal consequences" [2].

The gap between policy and enforcement remains stark. While both platforms prohibit apps depicting sexual nudity, nonconsensual imagery tools operated freely, generating substantial revenue. Apps like RemakeFace, capable of creating nonconsensual deepfake nudes through face-swapping, remained available on both stores even after the investigation went public [5]. Michelle Kuppersmith, executive director of the nonprofit that runs TTP, stated: "Apple and Google are supposed to be vetting the apps in their stores. But they've been offering dozens of apps that can be used to show people with minimal or no clothing—making them ripe for abuse" [3].
