Nearly 200 AI Apps on App Store Expose Millions of User Records in Massive Security Breach

Reviewed by Nidhi Govil


Security firm CovertLabs has uncovered a widespread security crisis affecting nearly 200 apps on Apple's App Store, with AI apps dominating the list of offenders. The Firehound project reveals that Chat & Ask AI by Codeway alone has exposed over 406 million records from 18 million users, including chat histories, email addresses, and phone numbers—all accessible to anyone who knows where to look.

Security Researchers Uncover Massive Data Exposure Across App Store

Security firm CovertLabs has launched the Firehound project, a public registry that tracks iPhone AI applications leaking private user data across Apple's App Store. As of the initial disclosure, the registry has identified 198 iOS apps exposing sensitive information, with 196 of them actively leaking user data through improperly secured databases and cloud storage systems [1]. The discovery raises serious questions about App Store security and Apple's vetting process, which the company frequently cites as justification for maintaining control over its app ecosystem [4].

Source: MakeUseOf

The Chat & Ask AI app by Codeway leads Firehound's rankings for both most files exposed and most records exposed, with more than 406 million records from over 18 million users now accessible [1]. This includes entire chat histories totaling 380 million messages, along with user phone numbers and email addresses [4]. Security researchers describe the situation as "as bad as it gets," particularly given the sensitive nature of information users typically share with AI chatbots [5].

Source: Macworld

AI Apps Dominate List of Security Vulnerabilities

The top offenders in the Firehound registry reveal a troubling pattern: AI apps overwhelmingly dominate the list of applications with sensitive user data exposure. Following Chat & Ask AI, GenZArt has exposed 18 million records, YPT - Study Group has leaked over 13 million records, Adult Coloring Book - Pigment accounts for 7 million records, and Kmstry has exposed another 7 million records [5]. These five apps alone represent more than 20 million unique users whose personal information sits exposed to potential bad actors.

The prevalence of AI apps among the worst offenders likely stems from developers rushing to capitalize on the AI boom, potentially cutting corners on security implementations to get products to market quickly [4]. This is particularly concerning because users tend to provide AI apps with highly personal information about mental health, relationships, and finances—data that becomes significantly more dangerous when tied to identifiable information like email addresses and phone numbers [5].

How the User Data Leak Occurred and What's Being Exposed

Most apps listed on Firehound expose data through improperly secured databases or cloud storage configurations, with many listings disclosing the underlying data schemas and record counts [1]. The exposed records include chat histories, user names, email addresses, phone numbers, AI tokens, user IDs, and in some cases, user locations [3][4].
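Firehound has not published its scanning methodology, but a common pattern behind this class of leak is a Firebase-style backend whose security rules permit unauthenticated reads: a plain GET request to the database's REST endpoint returns the stored records with no credentials at all. A minimal sketch of classifying such a probe — the endpoint shape and response bodies below are illustrative assumptions, not taken from any listed app:

```python
# Hedged sketch: classifying an unauthenticated read of a Firebase-style
# Realtime Database REST endpoint (e.g. GET https://<project>.firebaseio.com/.json).
# The project name and response bodies are hypothetical, for illustration only.
import json

def is_publicly_readable(status_code: int, body: str) -> bool:
    """Return True if an auth-less read succeeded and returned parseable data.

    A locked-down database typically answers 401/403 with an error body such
    as {"error": "Permission denied"}; a misconfigured one answers 200 with
    the stored records themselves.
    """
    if status_code != 200:
        return False
    try:
        json.loads(body)  # an open database returns real JSON records
    except json.JSONDecodeError:
        return False
    return True

# Illustrative responses (not from any real app):
print(is_publicly_readable(401, '{"error": "Permission denied"}'))                # False
print(is_publicly_readable(200, '{"users": {"u1": {"email": "a@example.com"}}}')) # True
```

This is why researchers can describe the data as "accessible to anyone who knows where to look": no exploit is required, only the database URL.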

Source: Mashable

While there's no indication the leaks are intentional or that apps are actively sending data to third parties, the information remains accessible to anyone who knows where to look [5]. The Firehound project intentionally limits public access to the most sensitive scan results, requiring users to register and request restricted datasets. Priority access goes to journalists, law enforcement, and security professionals who undergo manual review [1].

Developer Response and Responsible Disclosure Process

CovertLabs has implemented a responsible disclosure framework through Firehound, allowing developers to contact the firm, learn how to fix security vulnerabilities, and have their apps removed from the registry once issues are resolved [2]. Security researchers behind the project, including OSINT specialist @Harrris0n, have already seen some progress. According to a January 20 post, the Chat & Ask AI app's "problem has been addressed, and the vulnerability no longer exists" [2].

However, questions remain about the timeline of fixes reaching users. The App Store showed Chat & Ask AI at version 3.3.8, released on January 7, while Firehound's registry entry for the app is dated January 15, suggesting the fixed version may not have been released at the time of the registry update [2].

What Users Should Do and Watch For

Users should immediately stop using any apps listed in the Firehound registry and delete their data from these applications if possible [5]. Those who have used affected apps should change passwords for any accounts sharing the same email address and monitor their accounts for suspicious activity [4].

Before downloading new AI apps, users can check Firehound to verify an app's security status [2]. The incident serves as a critical reminder that users must remain mindful of the platforms they use and the information they share, particularly with AI chatbots, where the barrier to entry for developers has become increasingly low [1]. The data privacy risks extend beyond just AI apps: affected categories also include image generators, photo animators, and study applications [2].

This widespread exposure of user data highlights the tension between rapid AI development and developer security practices, raising urgent questions about how such vulnerable apps bypass Apple's review process and what measures will prevent similar incidents as AI continues to expand across mobile platforms.
