6 Sources
[1]
Apple's new App Review Guidelines clamp down on apps sharing personal data with 'third-party AI' | TechCrunch
Apple on Thursday introduced a new set of App Review Guidelines for developers, which now specifically state that apps must disclose and obtain users' permission before sharing personal data with third-party AI. At the same time, Apple is ensuring other apps aren't leaking personal data to AI providers or other AI businesses. What's interesting about this particular update is not the requirements being described but that Apple has specifically called out AI companies as needing to come into compliance.

Before the revised language, the guideline known as rule 5.1.2(i) included language around disclosure and obtaining user consent for data sharing, noting that apps could not "use, transmit or share" someone's personal data without their permission. This rule served as part of Apple's compliance with data privacy regulations like the EU's GDPR (General Data Protection Regulation), the California Consumer Privacy Act, and others, which ensure that users have more control over how their data is collected and shared. Apps that don't follow the policy can be removed from the App Store.

The newly revised guideline adds the following sentence (emphasis ours): You must clearly disclose where personal data will be shared with third parties, including with third-party AI, and obtain explicit permission before doing so.

This change could impact apps that intend to use AI systems to collect or process information about their users, perhaps to personalize their apps or provide certain functionality. It's unclear how stringently Apple will enforce the rule, given that the term "AI" could include a variety of technologies -- not just LLMs, but also machine learning more broadly.

The updated rule is one of several revisions to the App Review Guidelines released Thursday. Other changes focus on supporting Apple's new Mini Apps Program, also announced today, as well as tweaks to rules involving creator apps, loan apps, and more.
One addition also added crypto exchanges to the list of apps that provide services in highly regulated fields.
[2]
Apple's New App Store Rules Take Aim at Personal Data Sharing With AI
Apple updated its App Review Guidelines page on Thursday, introducing changes to the handling of personal data sharing and the requirements that must be met before doing so. Noncompliant apps could be removed from the App Store.

The updated language in the rules calls out that personal data shared with third parties must be both clearly disclosed and shared only with the explicit permission of the user. The language echoes previous guidelines, but points out that these third parties also include artificial intelligence: "You must clearly disclose where personal data will be shared with third parties, including with third-party AI, and obtain explicit permission before doing so. Data collected from apps may only be shared with third parties to improve the app or serve advertising (in compliance with the Apple Developer Program License Agreement)," it says.

Apple didn't immediately respond to a request for comment. The small change is a win in a world where privacy often feels more like a concept than a reality, especially as AI technology continues to surge.
[3]
AI shut out of Apple's App Store
If you hate the idea of your information being used to train AI, you're going to love the minor but vital tweak Apple just made to the iOS App Store. "You must clearly disclose where personal data will be shared with third parties, including with third-party AI," the company told app developers -- adding that all apps must "obtain explicit permission before doing so."

The updated language -- Apple's first guidance on third-party AI -- is part of a document called App Review Guidelines. And lest the name fool you, the introduction makes clear that adhering to these guidelines is pretty much mandatory. "We will reject apps for any content or behavior that we believe is over the line," Apple tells developers later in the guidelines. "What line, you ask? Well, as a Supreme Court Justice once said, 'I'll know it when I see it.' And we think that you will also know it when you cross it."

The update, which dropped last week, marks the first time that AI has even been mentioned in the guidelines. Apple under CEO Tim Cook has been highly skeptical about AI, slow to include AI features in Siri, and sometimes hesitant to even use the letters "AI"; Cook has preferred to use the similar term "machine learning" in past keynotes.

Sourcing data to train AI models has become one of the most legally contentious activities in Silicon Valley. (Disclosure: Ziff Davis, Mashable's parent company, filed a lawsuit in April against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.) And even Apple, the AI laggard that is reportedly going to use Google Gemini to power Siri soon, isn't immune. Last month saw two lawsuits alleging Apple has improperly used other people's work for its own AI training. In separate filings, two neuroscientists and two authors said Cook's company had used data from "shadow libraries," or pirated content available online.
While Apple's response remains to be seen, the legal landscape doesn't look all that promising for the company. AI giant Anthropic settled a class-action lawsuit over shadow library usage in September for $1.5 billion. But at least Apple can now legitimately claim to be protecting its users from AI data-scraping within its apps.
[4]
Apple is tightening the rules on apps that share your data with AI
This new policy specifically adds "third-party AI" to the list of entities for which apps must disclose data sharing, a change from the previous, more general rule.

Apple implemented new App Review Guidelines for developers Thursday, mandating apps disclose and obtain user permission before sharing personal data with third-party AI. This policy change precedes Apple's planned introduction of an AI-upgraded Siri in 2026. The updated Siri will enable users to perform cross-app actions via commands, partially powered by Google's Gemini technology, as reported by Bloomberg.

Apple aims to prevent other applications from transmitting personal data to AI providers or related businesses. The specificity of this update lies in its direct mention of AI companies for compliance. Previously, guideline 5.1.2(i) required disclosure and user consent for data sharing, prohibiting apps from using, transmitting, or sharing personal data without permission. This rule addressed data privacy regulations, including the EU's GDPR and the California Consumer Privacy Act. Non-compliant apps risk removal from the App Store.

The revised guideline now includes: "You must clearly disclose where personal data will be shared with third parties, including with third-party AI, and obtain explicit permission before doing so." This revision may affect applications leveraging AI systems to collect or process user information for personalization or specific functionalities. The rigor of Apple's enforcement remains to be seen, particularly as "AI" encompasses various technologies beyond large language models, such as machine learning.

Other revisions to the App Review Guidelines, also announced Thursday, support Apple's new Mini Apps Program. Further adjustments involve rules for creator apps, loan apps, and other categories.
[5]
Apple Cracks Down on AI Data Sharing With New App Review Guidelines
The company said apps not following the guidelines might be removed. Apple has introduced new guidelines for data sharing with third-party artificial intelligence (AI) services for app developers. The new rules were mentioned in the company's App Review Guidelines last week, highlighting the company's intent to crack down on developers that do not seek explicit permission from users or that use their data in unauthorised ways. Interestingly, this update is also the first time the Cupertino-based tech giant has added the word AI to the document, signalling a broader acceptance of the technology within the company.

Apple Lays Down Rules for Third-Party AI Data Sharing

The changes to the App Review Guidelines (first spotted by TechCrunch) were made last week. Under the "Data Use and Sharing" section (5.1.2), Apple now mentions, "You must clearly disclose where personal data will be shared with third parties, including with third-party AI, and obtain explicit permission before doing so."

Looking beyond the fact that the company actually used the AI acronym, this is also the first time Apple has taken a step against the industry-wide issue of AI models being trained on user data. The guidelines now make it mandatory for app developers to disclose how a user's conversational data or personal information, such as email ID and mobile number, is being used and with whom it will be shared. Developers are also required to share details about the third-party AI models that will receive this information, and to obtain explicit permission from users for this sharing. Additionally, Apple highlights that data collected from apps can only be shared with third parties to improve the app or show ads to users. Apps that do not comply with these guidelines or with the region's data privacy laws "may be removed from sale and may result in your removal from the Apple Developer Program."
The rest of the guidelines remain the same. However, this new development should assure users that the AI apps in the App Store will not be sneakily collecting their data or using any information to train large language models (LLMs) without explicit permission.
[6]
Apple Tightens App Rules: Mandatory User Consent for Third-Party AI Data
Apple has updated its App Store Review Guidelines to require developers to notify users directly and obtain their consent before sharing personal data with third-party AI services. The updated rule, revised on November 13, 2025, arrives just months ahead of Apple's bold ambitions to introduce a more advanced, AI-powered Siri in 2026.

The timing is fitting. According to a report by Bloomberg, the new Siri will execute specific tasks within apps using simple voice commands. As previously rumored, Google's Gemini AI will partly power this new functionality, giving Apple a more deeply integrated use of external AI models. The iPhone maker aims to gain more control over how other apps handle user data, particularly regarding the transfer of data to external AI companies for processing without users fully understanding the implications.
Apple has updated its App Review Guidelines to specifically require apps to disclose and obtain explicit user permission before sharing personal data with third-party AI services, marking the first time the company has directly addressed AI in its developer policies.
Apple has introduced significant updates to its App Review Guidelines, specifically targeting how applications handle personal data sharing with third-party artificial intelligence services. The new requirements, announced Thursday, mandate that developers must clearly disclose and obtain explicit user permission before sharing any personal data with AI companies [1].
The updated guideline 5.1.2(i) now includes specific language stating: "You must clearly disclose where personal data will be shared with third parties, including with third-party AI, and obtain explicit permission before doing so" [2]. This represents the first time Apple has explicitly mentioned AI in its developer guidelines, marking a notable shift in the company's approach to artificial intelligence regulation.

The new policy builds upon existing data privacy protections that were designed to comply with regulations such as the EU's General Data Protection Regulation (GDPR) and the California Consumer Privacy Act. However, the specific callout of AI companies demonstrates Apple's recognition of the unique challenges posed by artificial intelligence systems in data handling [1].
Apps that fail to comply with these guidelines face serious consequences, including potential removal from the App Store. Apple's enforcement approach maintains its characteristic firmness, with the company stating it will "reject apps for any content or behavior that we believe is over the line" [3]. The guidelines emphasize that data collected from apps may only be shared with third parties to improve the app or serve advertising, and only in compliance with the Apple Developer Program License Agreement [5].

This policy change comes at a time when data sourcing for AI model training has become one of the most legally contentious activities in Silicon Valley. The update could significantly impact applications that intend to use AI systems to collect or process user information for personalization or specific functionality [4].
The timing of this announcement is particularly noteworthy given Apple's own AI initiatives. The company is reportedly planning to introduce an AI-upgraded Siri in 2026, which will enable users to perform cross-app actions via commands, partially powered by Google's Gemini technology [4]. This suggests Apple is establishing clear boundaries for third-party AI integration while developing its own AI capabilities.

The new guidelines reflect growing concerns about unauthorized data usage in AI training. Apple itself has faced legal challenges, with two recent lawsuits alleging the company improperly used content from "shadow libraries," or pirated content, for AI training purposes [3]. The legal landscape has proven challenging for AI companies, with Anthropic settling a class-action lawsuit over shadow library usage for $1.5 billion in September.

The enforcement rigor remains uncertain, particularly given that the term "AI" encompasses various technologies beyond large language models, including traditional machine learning systems. This broad definition could affect a wide range of applications currently operating in the App Store [1].

The updated guidelines were part of several revisions announced Thursday, which also included support for Apple's new Mini Apps Program and adjustments to rules involving creator apps, loan apps, and cryptocurrency exchanges [1].

Summarized by Navi