11 Sources
[1]
Apple, Google Offer 'Nudify' Apps Despite Policies Against Them
Apple Inc. and Google have continued to offer mobile apps that let users make nonconsensual sexualized images of people despite their policies prohibiting such content, according to a report published Wednesday by the Tech Transparency Project. Searching for terms like "nudify" and "undress" in the Apple and Google app download stores gives customers access to software that can be used to alter images of celebrities and others to make them appear nude or in a state of partial undress, according to the group, a research arm of the nonprofit Campaign for Accountability. The companies also run ads for similar nudifying apps in their search results. Apps identified by the group have been downloaded 483 million times and generated $122 million in revenue, according to the report, which cited revenue estimates from market researcher AppMagic. A spokesperson for AppMagic said the Tech Transparency Project's work has resulted in several apps being removed and prompted others to change their user policies. Over the past year, politicians around the world have ratcheted up calls to curb the spread of nudifying apps. Earlier this year, the companies removed apps flagged by the Tech Transparency Project. But just a few months later, dozens of similar ones could be found, researchers from the organization said. "It's not just that the companies are failing to actually appropriately review these apps and continue to approve them and profit from them," Katie Paul, director of the project, said in an interview. "They are actually directing users to the apps themselves." From its app store searches, the group identified 18 apps with nudifying capabilities in the Apple App Store and 20 in the Google Play Store. In addition, both Apple and Google sometimes directed users to the apps via their autocomplete feature by suggesting the names of more nudifying apps as users typed, the researchers said. Some of the apps used names and images that cast them in a sexual light. 
Others could easily be used for that purpose despite not being marketed as such, making them more convenient than traditional photo-editing software. Some offered subscriptions, the Tech Transparency Project said. Apple's App Store guidelines for developers ban "overtly sexual or pornographic material." The Google Play Store bans "apps that degrade or objectify people, such as apps that claim to undress people or see through clothing, even if labeled as prank or entertainment apps." Google said that many of the apps referenced in the report have been suspended from Google Play for violations of its policies, and that an investigation is ongoing. "When violations of our policies are reported to us, we investigate and take appropriate action," the company said in an email. Apple said it removed 15 apps identified by the group after Bloomberg reached out about their presence. Among the apps taken down was PicsVid AI Hot Video Generator, which offered templates that featured women sucking on phallic lollipops, according to the researchers. PicsVid's developer didn't respond to a request for comment. Another app identified by the Tech Transparency Project, Uncensored AI -- No Filter Chat, stripped clothes from an image of a woman uploaded by the researchers. A representative of Uncensored AI's developer said the app no longer allows removal of clothes. Apple said it contacted the developers of six apps to alert them to issues that need to be addressed and that the apps are at risk of being removed. Other apps mentioned by the Tech Transparency Project didn't violate the company's guidelines, Apple said. The company added that it has proactively rejected many apps and removed others. The tech giants' enforcement efforts are "uneven and largely opaque," according to Anne Helmond, a professor at Utrecht University in the Netherlands.
"If an app presents itself as a generic image generator, it may pass review, even if it can be misused in practice," said Helmond, who is a director of the App Studies Initiative, an international research group. "Visibility is shaped by ranking and search systems that reward engagement, meaning that controversial uses can increase an app's prominence." One of the apps identified by the researchers in the Google Play Store, Video Face Swap AI: DeepFace, advertised swapping actress Anya Taylor-Joy's face onto Game of Thrones character Daenerys Targaryen. But inside the app, under a category called Girls, users could paste people's faces onto video templates of women bouncing their breasts or shaking their hips, Bloomberg found. The app, which is rated "E" for Everyone, has been downloaded over 1 million times from the store, where users could find it by typing "face swap" into the search bar. Okapi Software, the company that offers Video Face Swap AI, said it had launched an investigation into the issues raised by Bloomberg and removed some of the content, which it said had been uploaded by users. "Our app does not offer 'nudify' functionality, and we do not permit the generation of nude or sexually explicit content," Okapi said. "We take content safety and compliance seriously." A growing chorus of regulators is calling for the companies to do more to uphold their policies. Last year, President Donald Trump signed the Take It Down Act, which criminalizes the publication of non-consensual sexual content and compels social media platforms and websites to remove such posts. In April, the UK government plans to introduce legislation that would open a path to prosecute tech executives whose companies do not take down such images.
[2]
Apple and Google are reportedly pointing users to nudify apps
Earlier this year it was revealed that Apple and Google were offering "nudify" apps on their stores despite having clear policies barring such content. Nearly three months later, such apps are not only still available, but are being actively promoted on the iOS App Store and Google Play, according to a new report from the Tech Transparency Project (TTP). Many of those apps were rated "E" for Everyone, meaning they can be downloaded by children. Searching for "nudify," "undress" and other terms in those stores gives users access to apps that can make real people appear nude or put them into pornographic videos. "The platforms are key participants in the spread of AI tools that can turn real people into sexualized images," TTP wrote in the new report. The app stores even ran ads for similar nudifying apps in the search results. The group identified 18 nudify apps in Apple's App Store and 20 in Google Play. Some were marketed with sexual images, while others weren't advertised as such but could still be used for deepfakes. Those apps have collectively generated around $122 million in revenue and been downloaded 483 million times, according to the report. "It's not just that the companies are failing to actually appropriately review these apps and continue to approve them and profit from them," TTP director Katie Paul told Bloomberg. "They are actually directing users to the apps themselves." Apple and Google both have policies banning sexual or pornographic material, and Google has a specific policy against nudifying apps. Apple told Bloomberg that it removed 15 apps identified by the group, while Google said that it suspended a number of them. One of the apps cited in the report, Video Face Swap AI: DeepFace, advertises itself by showing an actress's face swapped onto another actress's body and allows users to put a real person's face on the bodies of partially undressed women. The app was rated "E" for Everyone.
The proliferation of nudify and deepfake apps has pushed some governments to propose laws against them. The UK's Children's Commissioner recently called for a ban on AI deepfake apps that create nude or sexual images of children. The US and other countries have proposed or created laws banning explicit deepfakes, and the California Attorney General recently sent Elon Musk's X a cease and desist order over Grok's explicit deepfakes.
[3]
Apple and Google reportedly point users to 'nudify' apps despite banning them
Some of these apps are said to be rated "E," meaning even kids can download them. Apple and Google are supposed to be cracking down on harmful apps. However, a new report suggests they are still helping users find one of the most controversial categories: "nudify" apps. The report, from the Tech Transparency Project (TTP), says both Apple and Google are steering users toward apps that use AI to create fake nude images, even though their policies clearly ban this kind of content (via Bloomberg). TTP, a research arm of the nonprofit Campaign for Accountability, previously shed light on the proliferation of these apps earlier this year. Both companies officially ban apps that create non-consensual sexual content. Apple's App Store guidelines and Google Play policies clearly restrict apps that promote exploitation or abuse. Nudify apps fall into this category, especially because they often need someone's photo to make explicit images without their consent. But TTP's report found that when you search for terms like "nudify" or "undress" on the iOS App Store or Google Play, you'll find dozens of apps that do exactly that. Even more concerning, the stores advertise these tools and suggest them through autocomplete. The group found 18 of these apps on Apple's store and 20 on Google Play. Together, they have been downloaded 483 million times and have made $122 million in revenue, according to AppMagic data. Many are rated "E" for Everyone, which means even children can legally download them. These apps use generative AI models similar to those behind popular image generators. Users upload a photo, and the AI predicts what a nude version might look like. The results are often disturbingly realistic. A big concern is how these platforms boost the reach of these apps. Even if Apple and Google do not host all of them directly, their systems still help users find them. This raises questions about responsibility for both what is allowed and what is promoted.
After Bloomberg requested a comment, Apple said it removed 15 apps. Android Authority has reached out to Google for a statement. However, the main issue is that the same group reported similar apps earlier this year. The companies removed some, but within months, new ones appeared again.
[4]
AI "nudify" apps are being offered to everyone on the Google Play Store
AI has some useful real-world applications, but the potential for harm is also extremely high, with AI "deepfakes" being one of the most obvious and concerning areas. This was highlighted at the start of this year, when it was revealed that "nudify" apps were prevalent on both major smartphone app stores. A report from the Tech Transparency Project (TTP) claims that nothing has changed on Google's Play Store or Apple's App Store (via Engadget). According to the report, both stores still host a number of these controversial apps, and even worse, many are rated "E" for Everyone, meaning children can download and use them.
A new scourge in app stores
So-called "nudify" apps are simple to explain. Image editors at their core, these apps use generative AI to edit images, remove unwanted elements, and create new details to slot in. You might recognize the basic idea as identical to features like Google Photos' Magic Editor. However, those features include safeguards, while many of these nudify apps do not. There are no prizes for guessing what these nudify apps are used for. Generative AI gives them the ability to digitally strip people in images, potentially creating pornographic images from shots of anyone. This is bad enough by itself, but the TTP's report also claims 31 of these apps were not adequately vetted and were rated "E" for Everyone.
As such, these apps were available to and pushed at children, opening various avenues for harm. The full report contains a long rundown of some of the apps in question and exactly what they're capable of, but suffice it to say the potential for harm is extremely high. Even worse, it's clear they're also a lucrative business: the TTP's report says the apps it looked at had generated around $122 million in revenue and had been downloaded almost 500 million times. When asked for comment by the TTP, Google stated that many of the apps mentioned in the report had been removed from its store. Apple also removed a number of the apps, while declining to comment. The problems caused by these apps are obvious, and there have already been moves to ban them where possible. The UK government has begun an attempt to ban them for good, which is a good idea, considering the damage these sorts of apps are already causing in schools and elsewhere. I have no doubt that Apple and Google removed these apps the moment they were made aware of them, as they have both claimed. However, the prevalence of these apps and their apparent value must prompt a change in how apps of this sort are rated and dealt with, as apps of this nature will only become more common if generative AI continues in its current form.
[5]
App Store search suggestions reportedly steered users to 'nudify' apps - 9to5Mac
The Tech Transparency Project (TTP) has followed up on its January report that revealed dozens of "nudify" apps on the App Store with a new investigation focused on how Apple's own search and ad systems may be helping users find them. Here are the details. According to the new report, both the App Store and Google Play Store "are helping users to find apps that create deepfake nude images of women," sometimes through promoted search results and autocomplete search suggestions. In the report, TTP says Apple and Google are still failing to prevent nudify apps from appearing in their app stores, some of which appear as suitable for minors. The group found that nearly 40% of the top 10 apps returned for searches such as "nudify," "undress," and "deepnude" could "render women nude or scantily clad." Additionally, some searches surfaced sponsored results for these apps. From the report: "(...) the first result from an App Store search for "deepfake" was an ad for FaceSwap Video by DuoFace. The app allows users to swap anyone's face from a still image onto a video. To test the app, TTP uploaded an image of a woman in a white sweater standing on a sidewalk and a video of a topless woman. After first showing a short ad, the app generated a video showing the clothed woman's face on the nude woman's body." And "Another App Store search for the term "face swap" yielded an ad for an app called AI Face Swap. The app offers preset face swap templates and allows users to swap faces on images they upload themselves. TTP uploaded a photo of a woman in a blue sweater standing in a living room and an image of a topless woman, and the app swapped their faces with no restrictions." Interestingly, in addition to contacting Apple and Google about these findings, TTP also contacted the developers of several of these apps.
In at least one instance, the app developer confirmed they were using Grok for image generation, but claimed they "had no idea it was capable of producing such extreme content." The developer pledged to tighten moderation settings for image generation. Back to the report, TTP noted that typing "AI NS" as part of a search that could lead to "AI NSFW" prompted the App Store to suggest "image to video ai nsfw." That search, in turn, returned several nudify apps in the top ten results. Despite declining to comment on TTP's request, Apple responded to the report by removing most of the apps TTP identified.
[6]
Apple and Google Direct Users to AI 'Nudify' Apps: Report
Apple and Google are helping users find apps that create deepfake nude images, according to a new investigation. So-called "nudify" apps use AI to alter photographs of real people, making them appear naked, placing them in pornographic videos, or turning them into sexually explicit chatbots. According to a report published on Wednesday by the Tech Transparency Project (TTP), Apple and Google play a significant role in the spread of these tools. TTP first reported in January that both the Apple App Store and Google Play hosted dozens of apps designed to digitally remove clothing from photographs of women. The latest investigation found that the platforms' own search and advertising systems direct users toward these apps, increasing their visibility. Both Apple and Google have policies that prohibit apps enabling the creation of nonconsensual sexualized images. However, the TTP report found that not only do such apps remain available, the search tools on both platforms actively point users to them. According to the investigation, Apple and Google displayed ads for nudify apps within search results and suggested related terms through autocomplete. Bloomberg News reports that searching for terms like "nudify" and "undress" in the companies' app stores provides access to software that can alter photographs of celebrities and others to make them appear nude or partially undressed. The companies also run ads for similar apps in those results. TTP identified 18 nudify apps on Apple's platform and 20 on Google Play. Some were marketed with sexual images, while others were not explicitly advertised that way but could still be used to create deepfakes. Data compiled by a mobile analytics firm shows the apps identified by TTP have been downloaded 483 million times and have generated more than $122 million in lifetime revenue. The investigation also found 31 nudify apps rated as suitable for minors, a notable finding amid a rise in sexual deepfake incidents in schools. 
After TTP and Bloomberg News shared the report's findings, Apple removed 15 apps and Google removed seven. The report comes as lawmakers in Minnesota are reportedly close to banning AI nudification apps outright. In the U.K., the Children's Commissioner has also called for an immediate ban on such apps, citing concerns that they enable "deepfake sexual abuse of children."
[7]
Damning report finds Apple and Google's app stores boosting nudify apps
The app stores aren't just hosting nudify apps -- they're promoting them If you assumed Apple and Google were just slow to catch harmful apps on their platforms, a new investigation suggests the problem is far worse than that. The Tech Transparency Project (TTP) found that both the App Store and Google Play aren't just hosting nudify apps. Their search and advertising systems are actively pointing users toward them. Nudify apps are AI tools that can digitally strip clothing from photos of real people. They can also generate pornographic videos or create sexually explicit chatbots using someone's likeness. The scary part? 31 of the apps TTP found were rated suitable for minors. How Apple and Google are sending users straight to these nudify apps TTP ran searches using terms like "nudify," "undress," "deepfake," and "AI NSFW" on both app stores. About 40% of the top 10 results for each term returned apps that could render women nude or scantily clad. But it doesn't stop at search results. Both platforms ran paid ads for nudify apps within those results. In Google's case, that included a carousel of sponsored apps, some of which were openly pornographic. The autocomplete feature made things worse. When TTP typed "AI NS" into the App Store search field, it suggested "image to video ai nsfw," which led to more nudifying apps in the top results. Apple controls all advertising in its App Store and has a stated policy against ads promoting adult content. Despite that, 3 of TTP's App Store searches still returned a nudify ad as the very first result. Why this is a bigger issue than you think The apps identified across both stores have been downloaded 483 million times and earned over $122 million in lifetime revenue. Apple and Google take a cut of that through paid subscriptions and in-app purchases, which TTP says may explain why enforcement has been lax.
After TTP and Bloomberg flagged these apps, Apple removed 15 of them, and Google suspended several others. However, both companies declined to explain how these apps had passed review or why age ratings allowed minors to download them. How long before Apple and Google are forced to act? The UK government has begun proposing and enacting laws against explicit deepfakes, and the US recently recorded its first criminal conviction under one such law. Pressure on Apple and Google to act more decisively is only likely to grow. Apple's own enforcement record is already under scrutiny. A letter obtained by NBC News revealed that Apple privately threatened to pull Grok from the App Store in January over sexualized deepfakes, even rejecting xAI's first fix as insufficient. Apple eventually let Grok stay, but with reports like this one piling up, both companies are running out of room to look the other way.
[8]
Apple, Google Accused of Steering Users to Nudify Apps
Apple and Google are directing users to apps that can create non-consensual nude and sexually explicit images, despite having policies that prohibit such content, according to a new investigation by the Tech Transparency Project (TTP). The report found that the Apple App Store and Google Play Store not only host these so-called "nudify" apps but also promote them through search results, autocomplete suggestions, and sponsored advertisements. These apps use artificial intelligence (AI) to digitally remove clothing from photos of real people, generate pornographic videos, and create sexually explicit AI chatbots. Massive Reach and Revenue: TTP found that the nudify apps identified in its investigation have been downloaded 483 million times, generating more than $122 million in lifetime revenue. Alarmingly, app stores rated 31 of these apps as suitable for minors, a notable finding amid increasing incidents of sexual deepfake abuse in schools. How the Investigation Was Conducted: The investigation builds on TTP's January 2026 report, which identified more than 100 such apps across both platforms. For the latest study, researchers conducted searches on new iOS and Android devices using terms such as "nudify," "undress," "deepfake," "deepnude," "adult AI," "face swap," and "AI NSFW." In several cases, sponsored ads for these apps appeared at the top of search results, and autocomplete suggestions led users to additional nudify-related queries, increasing their visibility. Apps and Features Identified: Many of the apps tested allowed users to upload photos of real individuals and generate explicit images or videos. Some face-swapping tools allowed users to place a person's face onto nude bodies, while others enabled the creation of sexualized AI companions based on real people.
Even when certain apps blocked full nudity, they often produced images of women in bikinis or sexually suggestive poses, which can still violate platform policies against degrading or objectifying individuals. Examples of apps identified in the investigation include Best Body AI -- Fashion Editor, AI Replace & Remove -- Fill App, FaceTool: Face Swap & Generate, DreamFace: AI Video Generator, RemakeFace: AI Face Swap, and Reface: Face Swap AI Generator, many of which allow users to generate nude or sexually suggestive images, swap faces onto explicit bodies, or create AI-generated videos. Company Responses and Enforcement: Apple declined to comment on the findings. However, after TTP and Bloomberg News alerted the company, it removed 15 apps from its App Store. Google said enforcement actions are ongoing. Google spokesperson Dan Jackson said, "When violations of our policies are reported to us, we investigate and take appropriate action." He also noted that age ratings on the Google Play Store are assigned by the International Age Rating Coalition (IARC). Following the investigation, Google removed seven apps. Policy and Privacy Concerns: The findings highlight gaps between the companies' stated policies and the functioning of their app store ecosystems. Apple prohibits apps containing "overtly sexual or pornographic material," while Google bars apps that "contain or promote sexual content" or "claim to undress people or see through clothing." Despite these rules, TTP concluded that both platforms play an active role in amplifying the reach of such applications. The report also flagged privacy risks associated with some apps subject to Chinese law, which could potentially compel developers to share user data, including sensitive manipulated images, with government authorities. The TTP findings echo concerns previously raised in India. 
A December 2025 report by MediaNama highlighted the widespread availability of deepfake-generating apps on the Google Play Store, exposing gaps in the Ministry of Electronics and Information Technology's (MeitY) regulatory approach. While MeitY issued a Standard Operating Procedure (SOP) in November 2025 requiring intermediaries to remove non-consensual intimate imagery within 24 hours and proposed amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, these measures largely focus on social media platforms. The report noted that app stores, key distribution channels for such AI tools, remain outside the scope of specific regulatory guidelines, allowing applications capable of generating non-consensual sexual deepfakes to continue operating despite platform policies. Broader Implications: TTP's investigation suggests that Apple and Google are not merely passive hosts but key intermediaries in the spread of AI tools capable of generating non-consensual sexual deepfakes. The companies may also benefit financially through advertising placements and commissions on in-app purchases and subscriptions. As concerns about the misuse of AI to target women and minors grow, the report indicates that the role of major app store operators is likely to face increased regulatory and public scrutiny.
[9]
Apple and Google Face Backlash Over 'Nudify' App Search Results
In addition to finding ads for those apps in both stores, TTP noted the platforms' autocomplete functions, which anticipate and suggest queries before users finish typing, can also guide users to additional apps. For example, after TTP typed the letters "AI NS," a partial spelling of "AI NSFW," the Apple App Store recommended the search term "image to video ai nsfw." TTP stated that some developers may not fully understand the capabilities of the AI tools they use. In one case, a developer admitted to using AI image generation technology and promised to implement stronger moderation measures after the issue was raised. "As we hear more and more about nonconsensual nude images targeting women and girls, Apple and Google need to reckon with their role in this ecosystem," Michelle added. As these tools evolve, app stores will face increasing pressure to strengthen moderation systems, refine algorithms, and promote stricter compliance. The future of platform governance will hinge on balancing innovation with user safety, transparency, and accountability in digital ecosystems.
[10]
Apple and Google reportedly hosting deepfake nudity apps, despite breach of policy
A new investigation by the Tech Transparency Project (TTP) has found that Apple and Google are not just failing to remove AI-powered "nudify" apps from their platforms; their own search and advertising systems are actively promoting them. Type "nudify," "undress," or "deepnude" into either app store's search bar, and the results speak for themselves. The findings are concerning, to say the least. Around 40 percent of the top apps returned for those search terms were capable of digitally stripping women's clothes in photographs. Apple and Google have both also run sponsored advertisements for nudify apps within those same search results. Even autocomplete suggested new search terms that lead users further down the rabbit hole. The apps being so easily available is one thing; the scale at which they are used is another. The nudify apps from the TTP report have been downloaded a combined 483 million times and generated over $122 million in revenue. Apple and Google collect a cut of in-app purchases and subscription fees from these apps, which makes it difficult to take their policies seriously. When that much profit is involved, the incentive to act vanishes altogether. If you think that is bad, it gets worse: 31 of the apps identified were rated suitable for minors. At a time when schools across the world are struggling with sexual deepfake scandals involving students, the idea that a minor could not just stumble across but also be directed to an app capable of generating nonconsensual nude imagery is very dangerous. Both companies have long maintained policies that should, in theory, not allow these apps. Apple bars content that is "offensive" or "pornographic." Google bans apps that "claim to undress people or see through clothing."
Yet TTP's testing found that nearly half of the apps surfacing in search results violated those very standards. After the report was shared with the companies and Bloomberg News, Apple removed 14 apps and Google removed seven. So it was always possible to act; it simply wasn't a priority. This is not a case of technology outpacing regulation, or of AI evolving faster than policy can keep up. This is a case of two of the most powerful companies in the world knowing that harmful apps exist on their platforms and choosing revenue over accountability. The tools to act were always there. The will, it seems, was not.
[11]
App Store, Google Play Store accused of promoting nudify apps through search suggestions: All details
Apple has removed some apps, but watchdog calls for stricter and more consistent enforcement
A new investigation by the Tech Transparency Project has raised fresh concerns about the presence of so-called nudify apps in the Apple and Google app stores. According to the report, both Apple's App Store and Google Play Store continue to feature apps capable of creating deepfake nude images, sometimes surfaced through search suggestions and paid promotions. Many of the top search results for terms related to such content included apps that can digitally alter images to depict women in explicit or semi-nude forms. The report also mentioned several instances in which promoted listings appeared at the top of search results. In one case, a face-swapping app was displayed as a sponsored result for a deepfake-related query, and testing revealed that it could insert a person's face into explicit video content with no meaningful safeguards. Similar behavior was observed in other apps, which offered face-swap templates that let users combine clothed and naked images with few restrictions. Furthermore, the report flagged autocomplete suggestions as a concern: partial search inputs were found to direct users to more explicit queries, which in turn surfaced additional apps of this type among the top results. TTP also stated that some developers may not fully comprehend the capabilities of the AI tools they are utilising. In one case, a developer admitted to using AI image generation technology and promised to implement stronger moderation measures after the issue was raised. The companies have not issued a detailed public response to the findings, but Apple has taken action by removing several of the apps, the report said.
However, the watchdog group believes that more consistent enforcement and stronger safeguards are required to keep such apps from reappearing on mainstream platforms.
A new Tech Transparency Project report reveals Apple and Google are not just hosting AI-powered nudify apps—they're actively directing users to them through search suggestions and sponsored ads. The apps have generated $122 million in revenue and been downloaded 483 million times, with many rated suitable for children despite creating nonconsensual sexualized images.
Apple and Google are actively steering users toward nudify apps that create deepfake nude images, despite having explicit app store policies banning such content, according to a new report published Wednesday by the Tech Transparency Project [1]. The investigation reveals a troubling pattern: both tech giants are not merely failing to block these AI-powered nudify apps but are actually promoting them through autocomplete features and sponsored search results [2].

Searching for terms like "nudify," "undress," and "deepnude" in the Apple and Google app stores gives users access to software designed to alter images of people, often without their consent, to make them appear nude or partially undressed [1]. The Tech Transparency Project, a research arm of the nonprofit Campaign for Accountability, identified 18 apps with nudifying capabilities in the Apple App Store and 20 in the Google Play Store [3].
The financial scope of this issue is staggering. Apps identified by the Tech Transparency Project have collectively been downloaded 483 million times and generated $122 million in revenue, according to estimates from market researcher AppMagic [1]. "It's not just that the companies are failing to actually appropriately review these apps and continue to approve them and profit from them," Katie Paul, director of the Tech Transparency Project, explained in an interview. "They are actually directing users to the apps themselves" [1].

What makes this situation particularly alarming is that many of these apps were rated "E" for Everyone, meaning children can legally download and use them [2]. This rating classification suggests a fundamental breakdown in content moderation and app rating processes at both companies.

Apple's and Google's app stores don't just host these applications; their systems actively help users discover them. When users typed "AI NS" as part of a search, the App Store suggested "image to video ai nsfw," which then returned several nudify apps in the top ten results [5].
Additionally, both platforms ran sponsored ads for nudifying apps in search results, with some appearing as the first result for relevant searches [5]. One app identified in the Google Play Store, Video Face Swap AI: DeepFace, advertised face-swapping capabilities but contained a category called "Girls" where users could paste faces onto video templates of women in sexualized poses [1]. The app, rated "E" for Everyone, had been downloaded over 1 million times [1].

Apple's App Store guidelines explicitly ban "overtly sexual or pornographic material," while the Google Play Store specifically prohibits "apps that degrade or objectify people, such as apps that claim to undress people or see through clothing, even if labeled as prank or entertainment apps" [1]. Yet enforcement remains inconsistent. After Bloomberg reached out about the Tech Transparency Project report, Apple removed 15 apps and contacted the developers of six others to alert them to policy violations [1]. Google stated that many apps referenced in the report had been suspended and that an investigation was ongoing [1]. However, this pattern has repeated itself: earlier this year, both companies removed apps flagged by the Tech Transparency Project, only for dozens of similar ones to appear months later [1].
These apps leverage generative AI models similar to those powering popular image generators. Users upload a photo, and the AI predicts what a nude version might look like, often producing disturbingly realistic results [3]. The potential for creating nonconsensual sexualized images raises serious ethical and legal concerns, particularly as deepfake technology becomes more sophisticated.

Anne Helmond, a professor at Utrecht University in the Netherlands and director of the App Studies Initiative, noted that enforcement efforts are "uneven and largely opaque." She explained that "if an app presents itself as a generic image generator, it may pass review, even if it can be misused in practice" [1]. This highlights a critical gap in content safety measures: apps marketed as general photo editors can easily bypass review processes while offering harmful capabilities.

The proliferation of these harmful AI tools has prompted governments worldwide to consider stricter regulations. The UK's Children's Commissioner recently called for a ban on AI deepfake apps that create nude or sexual images of children [2]. The US and other countries have proposed or enacted laws banning explicit deepfakes, and California's Attorney General recently sent Elon Musk's X a cease-and-desist order over Grok's explicit deepfakes [2].

For users and parents, the immediate concern centers on protecting vulnerable individuals from nonconsensual imagery. The fact that face-swapping apps rated for everyone can generate pornographic content with minimal restrictions suggests that current app store policies need fundamental restructuring. As generative AI continues to advance, the challenge of distinguishing legitimate photo-editing tools from those designed for harm will only intensify, demanding more proactive and transparent content moderation from platform holders.