13 Sources
[1]
Apple and Google Broke Their Own Rules by Promoting 'Nudify' Apps, Report Says
If you want an app you built to be downloadable from the Apple App Store or Google Play Store, it has to pass a slew of criteria, including safety standards. But a new report on Wednesday alleges that Apple and Google broke their own rules by promoting "nudify" apps that are outlawed in their app store policies. The Tech Transparency Project, part of a nonprofit tech watchdog, first revealed in January that Apple and Google app stores had over 100 nudify or undressing apps. These are apps whose sole purpose is to take images of people, usually women, and edit them so the subject appears to be unclothed, creating what's called nonconsensual intimate imagery. Many of these apps use generative AI to create deepfakes. Apple removed some of the prohibited apps at the time. But many are still out there, as evidenced in a subsequent investigation. In April, TTP found that Apple and Google still allowed users to search for a number of troubling keywords, including "nudify," "undress" and "deepnude." After a deep dive on the top 10 apps across both app stores, TTP found that 40% of the apps advertised themselves as able to "render women nude or scantily clad," according to the report. The new report also found that Google and Apple actually promoted such apps in their stores, increasing their visibility, with Google in particular creating "a carousel of ads for some of the most sexually explicit apps encountered in the investigation." Apple and Google both have language in their policies that prohibits apps with "overtly sexual or pornographic material" (Apple) and "sexually suggestive poses in which the subject is nude, blurred or minimally clothed" (Google). And they've both enforced these policies in the past -- particularly by going after porn apps. But Apple and Google make money from app developers by running advertising and taking a share of paid app subscriptions. Analytics firm AppMagic found that these "nudify" apps were downloaded 483 million times and made more than $122 million in lifetime revenue. "This revenue stream may be why the two companies have been less than vigilant when it comes to nudify apps that violate their policies," TTP writes. After the news broke this week, Apple told Bloomberg News that it removed 15 of the reported apps. Google confirmed it removed seven. Apple also said it blocked several of the search terms TTP flagged in its report. Apple and Google did not immediately respond to CNET's requests for comment and any updates since Wednesday. Nonconsensual sexually graphic content is a growing issue, due in part to AI. We saw with startling clarity how apps with AI can be used to make this illegal and abusive content at the beginning of the year, when Grok users made 1.4 million sexualized deepfakes over a nine-day period. Some US senators at the time called on Apple and Google to remove Grok from their app stores, but neither removed it. We learned this week that Apple privately reached out to Grok to express its concerns about its abusive AI capabilities and threatened to remove it. Grok is still available in the Apple and Google app stores and is still reportedly able to create abusive AI sexual images, despite the company saying otherwise.
[2]
Apple, Google Offer 'Nudify' Apps Despite Policies Against Them
Apple Inc. and Google have continued to offer mobile apps that let users make nonconsensual sexualized images of people despite their policies prohibiting such content, according to a report published Wednesday by the Tech Transparency Project. Searching for terms like "nudify" and "undress" in the Apple and Google app stores gives customers access to software that can be used to alter images of celebrities and others to make them appear nude or in a state of partial undress, according to the group, a research arm of the nonprofit Campaign for Accountability. The companies also run ads for similar nudifying apps in their search results. Apps identified by the group have been downloaded 483 million times and generated $122 million in revenue, according to the report, which cited revenue estimates from market researcher AppMagic. A spokesperson for AppMagic said the Tech Transparency Project's work has resulted in several apps being removed and prompted others to change their user policies. Over the past year, politicians around the world have ratcheted up calls to curb the spread of nudifying apps. Earlier this year, the companies removed apps flagged by the Tech Transparency Project. But just a few months later, dozens of similar ones could be found, researchers from the organization said. "It's not just that the companies are failing to actually appropriately review these apps and continue to approve them and profit from them," Katie Paul, director of the project, said in an interview. "They are actually directing users to the apps themselves." From its app store searches, the group identified 18 apps with nudifying capabilities in the Apple App Store and 20 in the Google Play Store. In addition, both Apple and Google sometimes directed users to the apps via their autocomplete features by suggesting the names of more nudifying apps as users typed, the researchers said. Some of the apps used names and images that cast them in a sexual light. Others could easily be used for that purpose despite not being marketed as such, making them more convenient than traditional photo-editing software. Some offered subscriptions, the Tech Transparency Project said. Apple's App Store guidelines for developers ban "overtly sexual or pornographic material." The Google Play Store bans "apps that degrade or objectify people, such as apps that claim to undress people or see through clothing, even if labeled as prank or entertainment apps." Google said that many of the apps referenced in the report have been suspended from Google Play for violations of its policies, and an investigation is ongoing. "When violations of our policies are reported to us, we investigate and take appropriate action," the company said in an email. Apple said it removed 15 apps identified by the group after Bloomberg reached out about their presence. Among the apps taken down was PicsVid AI Hot Video Generator, which offered templates that featured women sucking on phallic lollipops, according to the researchers. PicsVid's developer didn't respond to a request for comment. Another app identified by the Tech Transparency Project, Uncensored AI -- No Filter Chat, stripped clothes from an image of a woman uploaded by the researchers. A representative of Uncensored AI's developer said the app no longer allows removal of clothes. Apple said it contacted the developers of six apps to alert them to issues that need to be addressed, warning that they are at risk of being removed.
Other apps mentioned by the Tech Transparency Project didn't violate the company's guidelines, Apple said. The company added that it has proactively rejected many apps and removed others. The tech giants' enforcement efforts are "uneven and largely opaque," according to Anne Helmond, a professor at Utrecht University in the Netherlands. "If an app presents itself as a generic image generator, it may pass review, even if it can be misused in practice," said Helmond, who is a director of the App Studies Initiative, an international research group. "Visibility is shaped by ranking and search systems that reward engagement, meaning that controversial uses can increase an app's prominence." One of the apps identified by the researchers in the Google Play Store, Video Face Swap AI: DeepFace, advertised swapping actress Anya Taylor-Joy's face onto Game of Thrones character Daenerys Targaryen. But inside the app, under a category called Girls, users could paste people's faces onto video templates of women bouncing their breasts or shaking their hips, Bloomberg found. The app, which is rated "E" for Everyone, has been downloaded over 1 million times from the store, where users could find it by typing "face swap" into the search bar. Okapi Software, the company that offers Video Face Swap AI, said it had launched an investigation into the issues raised by Bloomberg and removed some of the content, which it said had been uploaded by users. "Our app does not offer 'nudify' functionality, and we do not permit the generation of nude or sexually explicit content," Okapi said. "We take content safety and compliance seriously." A growing chorus of regulators is calling for the companies to do more to uphold their policies. Last year, President Donald Trump signed the Take It Down Act, which criminalizes the publication of non-consensual sexual content and compels social media platforms and websites to remove such posts. In April, the UK government plans to introduce legislation that would open a path to prosecuting tech executives whose companies do not take down such images.
[3]
Apple and Google are reportedly pointing users to nudify apps
Earlier this year it was revealed that Apple and Google were offering "nudify" apps on their stores despite having clear policies barring such content. Nearly three months later, such apps are not only still available but are being actively promoted on the iOS App Store and Google Play, according to a new report from the Tech Transparency Project (TTP). Many of those apps were labeled "E" for Everyone, meaning they can be downloaded by children. Searching for "nudify," "undress" and other terms in those stores gives users access to apps that can make real people appear nude or put them into pornographic videos. "The platforms are key participants in the spread of AI tools that can turn real people into sexualized images," TTP wrote in the new report. The app stores even ran ads for similar nudifying apps in the search results. The group identified 18 nudify apps in Apple's App Store and 20 in Google Play. Some were marketed with sexual images, while others weren't advertised as such but could still be used for deepfakes. Those apps have collectively generated around $122 million in revenue and been downloaded 483 million times, according to the report. "It's not just that the companies are failing to actually appropriately review these apps and continue to approve them and profit from them," TTP director Katie Paul told Bloomberg. "They are actually directing users to the apps themselves." Apple and Google both have policies banning sexual or pornographic material, and Google has a specific policy against nudifying apps. Apple told Bloomberg that it removed 15 apps identified by the group, while Google said that it suspended a number of them. One of the apps cited in the report, Video Face Swap AI: DeepFace, advertises itself by showing an actress's face swapped onto another actress's body and allows users to put a real person's face on the bodies of partially undressed women. The app was rated "E" for Everyone. The proliferation of nudify and deepfake apps has pushed some governments to propose laws against them. The UK's Children's Commissioner recently called for a ban on AI deepfake apps that create nude or sexual images of children. The US and other countries have proposed or created laws banning explicit deepfakes, and the California Attorney General recently sent Elon Musk's X a cease and desist order over Grok's explicit deepfakes.
[4]
Apple and Google reportedly point users to 'nudify' apps despite banning them
Some of these apps are said to be rated "E," meaning even kids can download them. Apple and Google are supposed to be cracking down on harmful apps. However, a new report from the Tech Transparency Project (TTP) suggests they are still helping users find one of the most controversial categories: "nudify" apps. The report says both Apple and Google are steering users toward apps that use AI to create fake nude images, even though their policies clearly ban this kind of content (via Bloomberg). TTP, a research arm of the nonprofit Campaign for Accountability, previously shed light on the proliferation of this type of app earlier this year. Both companies officially ban apps that create non-consensual sexual content. Apple's App Store guidelines and Google Play policies clearly restrict apps that promote exploitation or abuse. Nudify apps fall into this category, especially because they often need someone's photo to make explicit images without their consent. But TTP's report found that when you search for terms like "nudify" or "undress" on the iOS App Store or Google Play, you'll find dozens of apps that do exactly that. Even more concerning, the stores advertise these tools and suggest them through autocomplete. The group found 18 of these apps on Apple's store and 20 on Google Play. Together, they have been downloaded 483 million times and have made $122 million in revenue, according to AppMagic data. Many are rated "E" for Everyone, which means even children can legally download them. These apps use generative AI models similar to those behind popular image generators. Users upload a photo, and the AI predicts what a nude version might look like. The results are often disturbingly realistic. A big concern is how these platforms boost the reach of these apps. Even if Apple and Google do not host all of them directly, their systems still help users find them. This raises questions about responsibility for both what is allowed and what is promoted. After Bloomberg requested comment, Apple said it removed 15 apps. Android Authority has reached out to Google for a statement. However, the main issue is that the same group reported similar apps earlier this year. The companies removed some, but within months, new ones appeared again.
[5]
AI "nudify" apps are being offered to everyone on the Google Play Store
AI has some useful real-world applications, but the potential for harm is also extremely high, with AI "deepfakes" being one of the most obvious and concerning areas. This was highlighted at the start of this year, when it was revealed that "nudify" apps were prevalent on both major smartphone app stores. A report from the Tech Transparency Project (TTP) has claimed that nothing has changed for Google's Play Store and Apple's App Store (via Engadget). According to the report, both stores still carry a number of these controversial apps, and even worse, many are rated "E" for Everyone, meaning children can download and use them. A new scourge in app stores So-called "nudify" apps are simple to explain. Image editors at their core, these apps use generative AI to edit images, remove unwanted elements, and create new details to slot in. You might recognize the basic idea as identical to features like Google Photos' Magic Editor. However, those features have safeguards, while many of these nudify apps do not. There are no prizes for guessing what these nudify apps are used for. Generative AI gives them the ability to digitally strip people in images, potentially creating pornographic images from shots of anyone. This is bad enough by itself, but TTP's report also claims 31 of these apps were not adequately vetted and were rated "E" for Everyone. As such, they were available to, and even pushed toward, children, opening various avenues for harm. The full report contains a long rundown of some of the apps in question and exactly what they're capable of, but suffice it to say the potential for harm is extremely high. Even worse, it's clear they're also a lucrative business: TTP's report says the apps it looked at had generated around $122 million in revenue and had been downloaded almost 500 million times. When asked for comment by TTP, Google stated that many of the apps mentioned in the report had been removed from its store. Apple also removed a number of the apps, despite declining to comment. The problems caused by these apps are obvious and clear, and there have already been moves to ban them where possible. The UK government has begun an attempt to ban them for good, which is a good idea, considering the damage these sorts of apps are already causing in schools and elsewhere. I have no doubt that Apple and Google would have removed these apps the moment they were made aware of them, as they have both claimed.
However, the prevalence of these apps and their apparent profitability must prompt a change in how apps of this sort are rated and dealt with, as apps of this nature will only become more common if generative AI continues to exist in its current form.
[6]
App Store search suggestions reportedly steered users to 'nudify' apps
The Tech Transparency Project (TTP) has followed up on its January report that revealed dozens of "nudify" apps on the App Store with a new investigation focused on how Apple's own search and ad systems may be helping users find them. Here are the details. According to the new report, both the App Store and Google Play Store "are helping users to find apps that create deepfake nude images of women," sometimes through promoted search results and autocomplete search suggestions. In the report, TTP says Apple and Google are still failing to prevent nudify apps from appearing in their app stores, some of which appear as suitable for minors. The group found that nearly 40% of the top 10 apps returned for searches such as "nudify," "undress," and "deepnude" could "render women nude or scantily clad." Additionally, some searches surfaced sponsored results for these apps. From the report: "(...) the first result from an App Store search for "deepfake" was an ad for FaceSwap Video by DuoFace. The app allows users to swap anyone's face from a still image onto a video. To test the app, TTP uploaded an image of a woman in a white sweater standing on a sidewalk and a video of a topless woman. After first showing a short ad, the app generated a video showing the clothed woman's face on the nude woman's body." And: "Another App Store search for the term "face swap" yielded an ad for an app called AI Face Swap. The app offers preset face swap templates and allows users to swap faces on images they upload themselves. TTP uploaded a photo of a woman in a blue sweater standing in a living room and an image of a topless woman, and the app swapped their faces with no restrictions." Interestingly, in addition to contacting Apple and Google about these findings, TTP also contacted the developers of several of these apps. In at least one instance, the app developer confirmed they were using Grok for image generation, but claimed they "had no idea it was capable of producing such extreme content." The developer pledged to tighten moderation settings for image generation. Back to the report: TTP noted that typing "AI NS," the start of a search that could lead to "AI NSFW," prompted the App Store to suggest "image to video ai nsfw." That search, in turn, returned several nudify apps in the top ten results. Despite declining to comment on TTP's request, Apple responded to the report by removing most of the apps TTP identified.
[7]
AI nudify apps are still on the App Store, report warns
Apple removed 15 apps after the findings, but some remain available, highlighting ongoing enforcement challenges for both the Apple and Google stores. A study released by the Tech Transparency Project shows that it is not difficult to find software on the Apple App Store and the Google Play Store that can be used on real images to "make them look naked, put them into pornographic videos, or turn them into sexually explicit chatbots." These apps exist on Apple's App Store despite rules against them. TTP's research involved using search terms such as "nudify," "undress," "AI NSFW," and "deepnude," and found that about 40 percent of the apps returned were able to "render women nude or scantily clad." TTP also found that the App Store made autocomplete suggestions that led to recommendations for new terms for finding such apps. Apple did not comment on TTP's study, but TTP reported that Apple removed 15 apps after TTP shared its findings. As of this writing, the search terms "nudify" and "undress" returned no results, but "deepnude" was successful and included several apps that offered outfit or body transformations. Macworld did not check these apps to see if they could take real images and create AI nude versions. One of the apps recommended by the "deepnude" search was Grok, the AI chatbot created by X. Earlier this week, a report stated that Apple privately threatened to remove Grok from the App Store because of the app's ability to generate deepfake nude images. It's not clear if the apps that TTP reported as removed by Apple were given the same warnings as X. Grok remains in the App Store. In addition to successful searches, TTP found that the App Store responded to search queries with ads for nudify apps. Even though Apple's App Store ad policies list "Ad content that promotes adult-oriented themes or graphic content" as prohibited, TTP found that the App Store failed to enforce the policy. TTP's report comes after reports earlier this week that two apps in the App Store, Ledger Live and Freecash, were not legitimate services but scams. Ledger Live stole bitcoin accounts, while Freecash secretly harvested user data.
[8]
Apple and Google Direct Users to AI 'Nudify' Apps: Report
Apple and Google are helping users find apps that create deepfake nude images, according to a new investigation. So-called "nudify" apps use AI to alter photographs of real people, making them appear naked, placing them in pornographic videos, or turning them into sexually explicit chatbots. According to a report published on Wednesday by the Tech Transparency Project (TTP), Apple and Google play a significant role in the spread of these tools. TTP first reported in January that both the Apple App Store and Google Play hosted dozens of apps designed to digitally remove clothing from photographs of women. The latest investigation found that the platforms' own search and advertising systems direct users toward these apps, increasing their visibility. Both Apple and Google have policies that prohibit apps enabling the creation of nonconsensual sexualized images. However, the TTP report found that not only do such apps remain available, the search tools on both platforms actively point users to them. According to the investigation, Apple and Google displayed ads for nudify apps within search results and suggested related terms through autocomplete. Bloomberg News reports that searching for terms like "nudify" and "undress" in the companies' app stores provides access to software that can alter photographs of celebrities and others to make them appear nude or partially undressed. The companies also run ads for similar apps in those results. TTP identified 18 nudify apps on Apple's platform and 20 on Google Play. Some were marketed with sexual images, while others were not explicitly advertised that way but could still be used to create deepfakes. Data compiled by a mobile analytics firm shows the apps identified by TTP have been downloaded 483 million times and have generated more than $122 million in lifetime revenue. The investigation also found 31 nudify apps rated as suitable for minors, a notable finding amid a rise in sexual deepfake incidents in schools. After TTP and Bloomberg News shared the report's findings, Apple removed 15 apps and Google removed seven. The report comes as lawmakers in Minnesota are reportedly close to banning AI nudification apps outright. In the U.K., the Children's Commissioner has also called for an immediate ban on such apps, citing concerns that they enable "deepfake sexual abuse of children."
[9]
Damning report finds Apple and Google's app stores boosting nudify apps
The app stores aren't just hosting nudify apps -- they're promoting them If you assumed Apple and Google were just slow to catch harmful apps on their platforms, a new investigation suggests the problem is far worse than that. The Tech Transparency Project (TTP) found that both the App Store and Google Play aren't just hosting nudify apps. Their search and advertising systems are actively pointing users toward them. Nudify apps are AI tools that can digitally strip clothing from photos of real people. They can also generate pornographic videos or create sexually explicit chatbots using someone's likeness. The scary part? 31 of the apps TTP found were rated suitable for minors. How Apple and Google are sending users straight to these nudify apps TTP ran searches using terms like "nudify," "undress," "deepfake," and "AI NSFW" on both app stores. About 40% of the top 10 results for each term returned apps that could render women nude or scantily clad. But it doesn't stop at search results. Both platforms ran paid ads for nudify apps within those results. In Google's case, that included a carousel of sponsored apps, some of which were openly pornographic. The autocomplete feature made things worse. When TTP typed "AI NS" into the App Store search field, it suggested "image to video ai nsfw," which led to more nudifying apps in the top results. Apple controls all advertising in its App Store and has a stated policy against ads promoting adult content. Despite that, 3 of TTP's App Store searches still returned a nudify ad as the very first result. Why this is a bigger issue than you think The apps identified across both stores have been downloaded 483 million times and earned over $122 million in lifetime revenue. Apple and Google take a cut of that through paid subscriptions and in-app purchases, which TTP says may explain why enforcement has been lax. After TTP and Bloomberg flagged these apps, Apple removed 15 of them, and Google suspended several others. However, both companies declined to explain how these apps had passed review or why age ratings allowed minors to download them. How long before Apple and Google are forced to act? The UK government has begun proposing and enacting laws against explicit deepfakes, and the US recently recorded its first criminal conviction under one such law. Pressure on Apple and Google to act more decisively is only likely to grow. Apple's own enforcement record is already under scrutiny. A letter obtained by NBC News revealed that Apple privately threatened to pull Grok from the App Store in January over sexualized deepfakes, even rejecting xAI's first fix as insufficient. Apple eventually let Grok stay, but with reports like this one piling up, both companies are running out of room to look the other way.
[10]
Apple, Google Accused of Steering Users to Nudify Apps
Apple and Google are directing users to apps that can create non-consensual nude and sexually explicit images, despite having policies that prohibit such content, according to a new investigation by Tech Transparency Project (TTP). The report found that the Apple App Store and Google Play Store not only host these so-called "nudify" apps but also promote them through search results, autocomplete suggestions, and sponsored advertisements. These apps use artificial intelligence (AI) to digitally remove clothing from photos of real people, generate pornographic videos, and create sexually explicit AI chatbots. Massive Reach and Revenue: TTP discovered that users have downloaded the nudify apps identified during its investigation 483 million times, generating more than $122 million in lifetime revenue. Alarmingly, app stores rated 31 of these apps as suitable for minors, raising concerns, particularly amid increasing incidents of sexual deepfake abuse in schools. How the Investigation Was Conducted: The investigation builds on TTP's January 2026 report, which identified more than 100 such apps across both platforms. For the latest study, researchers conducted searches on new iOS and Android devices using terms such as "nudify," "undress," "deepfake," "deepnude," "adult AI," "face swap," and "AI NSFW." In several cases, sponsored ads for these apps appeared at the top of search results, and autocomplete suggestions led users to additional nudify-related queries, increasing their visibility. Apps and Features Identified: Many of the apps tested allowed users to upload photos of real individuals and generate explicit images or videos. Some face-swapping tools allowed users to place a person's face onto nude bodies, while others enabled the creation of sexualized AI companions based on real people. Even when certain apps blocked full nudity, they often produced images of women in bikinis or sexually suggestive poses, which can still violate platform policies against degrading or objectifying individuals. Examples of apps identified in the investigation include Best Body AI -- Fashion Editor, AI Replace & Remove -- Fill App, FaceTool: Face Swap & Generate, DreamFace: AI Video Generator, RemakeFace: AI Face Swap, and Reface: Face Swap AI Generator, many of which allow users to generate nude or sexually suggestive images, swap faces onto explicit bodies, or create AI-generated videos. Company Responses and Enforcement: Apple declined to comment on the findings. However, after TTP and Bloomberg News alerted the company, it removed 15 apps from its App Store. Google said enforcement actions are ongoing. Google spokesperson Dan Jackson said, "When violations of our policies are reported to us, we investigate and take appropriate action." He also noted that age ratings on the Google Play Store are assigned by the International Age Rating Coalition (IARC). Following the investigation, Google removed seven apps. Policy and Privacy Concerns: The findings highlight gaps between the companies' stated policies and the functioning of their app store ecosystems. Apple prohibits apps containing "overtly sexual or pornographic material," while Google bars apps that "contain or promote sexual content" or "claim to undress people or see through clothing." Despite these rules, TTP concluded that both platforms play an active role in amplifying the reach of such applications.
The report also flagged privacy risks associated with some apps subject to Chinese law, which could potentially compel developers to share user data, including sensitive manipulated images, with government authorities. The TTP findings echo concerns previously raised in India. A December 2025 report by MediaNama highlighted the widespread availability of deepfake-generating apps on the Google Play Store, exposing gaps in the Ministry of Electronics and Information Technology's (MeitY) regulatory approach. While MeitY issued a Standard Operating Procedure (SOP) in November 2025 requiring intermediaries to remove non-consensual intimate imagery within 24 hours and proposed amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, these measures largely focus on social media platforms. The report noted that app stores, key distribution channels for such AI tools, remain outside the scope of specific regulatory guidelines, allowing applications capable of generating non-consensual sexual deepfakes to continue operating despite platform policies. Broader Implications: TTP's investigation suggests that Apple and Google are not merely passive hosts but key intermediaries in the spread of AI tools capable of generating non-consensual sexual deepfakes. The companies may also benefit financially through advertising placements and commissions on in-app purchases and subscriptions. As concerns about the misuse of AI to target women and minors grow, the report indicates that the role of major app store operators is likely to face increased regulatory and public scrutiny.
[11]
Apple and Google Face Backlash Over 'Nudify' App Search Results
In addition to finding ads for those apps in both stores, TTP noted the platforms' autocomplete functions, which anticipate and suggest queries before users finish typing. These functions can also guide users to additional apps. For example, after TTP typed the letters "AI NS," a partial spelling of "AI NSFW," the Apple App Store recommended the search term "image to video ai nsfw." TTP stated that some developers may not fully understand the capabilities of the AI tools they use. In one case, a developer admitted to using AI image generation technology and promised to implement stronger moderation measures after the issue was raised. "As we hear more and more about nonconsensual nude images targeting women and girls, Apple and Google need to reckon with their role in this ecosystem," Michelle added. As AI tools evolve, app stores will face increasing pressure to strengthen moderation systems, refine algorithms, and promote stricter compliance. The future of platform governance will hinge on balancing innovation with user safety, transparency, and accountability in digital ecosystems.
[12]
Apple and Google reportedly hosting deepfake nudity apps, despite breach of policy
A new investigation by the Tech Transparency Project (TTP) has found that Apple and Google are not just failing to remove AI-powered "nudify" apps from their platforms; their own search and advertising systems are actively pushing them. Type "nudify," "undress," or "deepnude" into either app store's search bar, and the results speak for themselves. The findings are concerning, to say the least. Around 40 percent of the top apps returned for those search terms were capable of digitally stripping women's clothes in photographs. Apple and Google have both also run sponsored advertisements for nudify apps within those same search results. Even the autocomplete suggested new search terms that can lead you further down the rabbit hole. The apps being available so easily is one thing; the scale at which they are used is another. The nudify apps from the TTP report have been downloaded a combined 483 million times and generated over $122 million in revenue. Apple and Google collect a cut of in-app purchases and subscription fees from these apps, which makes it difficult to take their policies seriously. When that much profit is involved, the incentive to act just vanishes altogether. If you think that is bad, it gets worse. 31 of the apps identified were rated suitable for minors. At a time when schools across the world are struggling with sexual deepfake scandals involving students, the idea that a minor could not just stumble across but also be directed to an app capable of generating nonconsensual nude imagery is very dangerous. Both companies have long maintained policies that should, in theory, not allow these apps. Apple bars content that is "offensive" or "pornographic." Google bans apps that "claim to undress people or see through clothing." Yet TTP's testing found that nearly half of the apps surfacing in search results violated those very standards. After the report was shared with the companies and Bloomberg News, Apple removed 14 apps and Google removed seven. So it was always possible to act; it simply wasn't a priority. This is not a case of technology outpacing regulation, or AI evolving faster than policy can keep up. This is a case of two of the most powerful companies in the world knowing that harmful apps exist on their platforms and choosing revenue over accountability. The tools to act were always there. The will, it seems, was not.
[13]
App Store, Google Play Store accused of promoting nudify apps through search suggestions: All details
Apple has removed some apps, but the watchdog calls for stricter and more consistent enforcement A new investigation by the Tech Transparency Project has raised fresh concerns about the presence of so-called nudify apps in the Apple and Google app stores. According to the report, both Apple's App Store and Google Play Store continue to feature apps capable of creating deepfake nude images, sometimes surfaced through search suggestions and paid promotions. Many of the top search results for terms related to such content included apps that can digitally alter images to depict women in explicit or semi-nude forms. The report also mentioned several instances in which promoted listings appeared at the top of search results. In one case, a face-swapping app was displayed as a sponsored result for a deepfake-related query, and testing revealed that it could insert a person's face into explicit video content with no meaningful safeguards. This was observed in other apps as well: they offered face-swap templates that let users combine clothed and naked images with few restrictions. The report also flags autocomplete suggestions as a concern. Partial search inputs were found to direct users to more explicit queries, which in turn surfaced additional apps of this type among the top results. TTP also stated that some developers may not fully comprehend the capabilities of the AI tools they are using. In one case, a developer admitted to using AI image generation technology and promised to implement stronger moderation measures after the issue was raised. The companies have not issued a detailed public response to the findings, but Apple has taken action by removing several of the apps, the report said. The watchdog group believes that more consistent enforcement and stronger safeguards are required to keep such apps from reappearing on mainstream platforms.
Apple and Google are actively directing users to nudify apps that create nonconsensual deepfake images, despite explicit policies banning such content. A Tech Transparency Project report reveals these apps have generated $122 million in revenue and been downloaded 483 million times. Many were rated 'E for Everyone,' making them accessible to children.
Apple and Google are not just hosting nudify apps that violate their own policies: they're actively promoting them. A Tech Transparency Project report published Wednesday reveals that both tech giants have been directing users to apps designed to create nonconsensual deepfake images, with some appearing in automated search suggestions and ad carousels [1][2]. The Tech Transparency Project (TTP), a research arm of the nonprofit Campaign for Accountability, identified 18 nudify apps in the App Store and 20 in the Google Play Store [3].
These undress apps use generative AI tools to alter images of people, predominantly women, to make them appear nude or partially clothed, creating what constitutes nonconsensual imagery. When users search for terms like "nudify," "undress," and "deepnude," both platforms surface these apps and even suggest additional ones through autocomplete features [2]. Google went further by creating carousels of ads for some of the most sexually explicit apps encountered during the investigation [1].
Both Apple and Google maintain app store policies that explicitly prohibit this content. Apple's guidelines ban "overtly sexual or pornographic material," while Google Play specifically prohibits "apps that claim to undress people or see through clothing, even if labeled as prank or entertainment apps" [2]. Yet these policies appear to conflict with financial incentives. Analytics firm AppMagic found that the identified apps generated more than $122 million in lifetime app revenue and were downloaded 483 million times [1][3].
Apple and Google profit from app developers through advertising and subscription revenue shares, which may explain their inconsistent enforcement. "It's not just that the companies are failing to actually appropriately review these apps and continue to approve them and profit from them," Katie Paul, director of the Tech Transparency Project, told Bloomberg. "They are actually directing users to the apps themselves" [2][3].

A particularly alarming finding involves content moderation failures affecting child safety. Many of these apps were rated "E" for Everyone, meaning children could legally download and access tools capable of creating sexualized images [3][4]. The TTP report identified 31 apps rated E for Everyone that had not been adequately vetted [5].

One example, Video Face Swap AI: DeepFace, advertised face-swapping capabilities but contained a "Girls" category where users could paste faces onto video templates of partially undressed women. Despite this content, the app maintained an "E" rating and accumulated over 1 million downloads from the Google Play Store [2][3].
After Bloomberg reached out about the Tech Transparency Project report, Apple removed 15 apps and contacted developers of six others to alert them to policy violations [2]. Google confirmed it suspended seven apps and stated that investigations are ongoing [1]. Apple also said it blocked several search terms flagged in the report [1].

However, this marks the second time in months that TTP has exposed these apps. In January, the organization first revealed that Apple and Google app stores hosted over 100 nudify apps [1]. While some were removed then, the April investigation found that dozens of similar apps had reappeared [4]. This pattern suggests that enforcement efforts are "uneven and largely opaque," according to Anne Helmond, a professor at Utrecht University and director of the App Studies Initiative [2].

The nudify app problem reflects broader challenges with deepfake technology and nonconsensual content. Earlier this year, Grok users created 1.4 million sexualized AI deepfake images over just nine days [1]. Apple privately contacted Grok's developer to express concerns about abusive AI capabilities and threatened removal, yet the app remains available in both stores and reportedly still capable of generating explicit content [1].

Governments are beginning to respond. The UK's Children's Commissioner has called for bans on AI deepfake apps that create nude or sexual images of children [3]. The UK government has initiated attempts to ban these apps entirely, particularly given damage already occurring in schools [5]. California's Attorney General recently sent Elon Musk's X a cease and desist order over Grok's explicit deepfakes [3].

As generative AI becomes more sophisticated and accessible, the tension between user safety and platform profitability will likely intensify. The question remains whether Apple and Google will implement systemic changes to their app rating and review processes, or continue reactive removals that allow harmful apps to resurface within months.