5 Sources
[1]
App Store apps are exposing data from millions of users - 9to5Mac
An effort led by security research lab CovertLabs is actively uncovering troves of (mostly) AI-related App Store apps that leak and expose user data, including names, emails, and chat history. Here are the details.

It's the slopocalypse. OSINT nerd @Harrris0n has created "Firehound". He (or others, I don't know) has begun the daunting task of hunting AI slop in the Apple App Store. They have identified (as of this writing) 198 iOS apps which leak information on users (in some capacity). Unsurprisingly, the top entries are all related to AI. Of the 198 apps listed so far, 196 expose user data.

App "Chat & Ask AI" leads Firehound's "Most files exposed" and "Most records exposed" rankings, with more than 406 million records from over 18 million users exposed. In addition to the listing on Firehound, @Harrris0n also took to X to comment on his initial findings on "Chat & Ask AI".

Most apps on Firehound appear to expose data via improperly secured databases or cloud storage, and many listings disclose the underlying data schemas and record counts. While most apps seem AI-related, other app categories are affected as well.

Firehound limits free data access and requires users to register to request restricted datasets and detailed scan results:

"Some scan results are highly sensitive. Until we can responsibly review and redact them, we can't publish everything in full. What this means: the public registry is intentionally limited. If you create an account, you can request access to restricted datasets and views. Access requests are reviewed manually. Priority is given to journalists, law enforcement, and security professionals. After signing in, you'll be prompted to submit a request from your dashboard."

Despite @vxunderground's initial claim that Firehound is cataloguing "AI Slop", that information is not directly stated on @Harrris0n's X profile, nor on the Firehound website. While many apps seem AI-related, it is currently impossible to say with certainty whether they were launched as a result of vibe coding or other AI-assisted, autonomous development tools. Still, Firehound is a reminder that users should be mindful of the platforms they use and the information they share (especially when it comes to AI chatbots), and that developers must take responsibility for properly securing user data, regardless of how low the barrier to entry may be to develop and release an app.
[2]
These iPhone AI apps expose your data, and they're all over the App Store
Users should check Firehound before downloading AI apps to verify security, while developers can use the tool for responsible disclosure and vulnerability fixes.

AI apps are everywhere, and they sure seem like they can be incredibly useful, don't they? However, users need to be mindful of AI slop, inaccuracies, and hallucinations, and it turns out a lot of AI apps are a security risk as well.

A new project by AI security firm CovertLabs takes a look at AI apps in the App Store and indexes the apps that expose user data. The index, called Firehound, is available to view online and provides a tally of the files exposed by each app. Nearly 200 apps are listed in Firehound, with a large number of them still available in the App Store. There are tons of image generators, chatbots, and photo animators, the exact kind of apps people would be searching for.

The app with the most files exposed on Firehound's registry is Chat & Ask AI by Codeway, a chatbot that has Deep Flow Software Services-FZCO listed as the seller. The app has exposed over 406 thousand files that include user chats and user information. A January 20 X post by Harrris0n (whose bio includes a direct link to CovertLabs) states that the app's "problem has been addressed, and the vulnerability no longer exists." But according to the App Store, Chat & Ask AI is at version 3.3.8, which was released on January 7. Firehound's registry entry for the app is dated January 15, 2026, so it does not appear that the fixed version had been released at that point.

The purpose of Firehound is to let developers know that breaches have been found in their apps so they can be fixed. When visiting Firehound, a "Responsible Disclosure" pop-up appears to provide developers a way to contact CovertLabs, learn how to fix the app, and have the app removed from the registry. Registration is required to access CovertLabs' research and results.

Users can make good use of Firehound as well: it can serve as a source to check the security of an AI app they may be considering in the App Store. How did these apps get onto the App Store with their security holes in the first place? That is unknown.

Firehound is a good reminder that all AI apps rely on personal information, and that users need to be aware of the data being provided and how much of it they are willing to expose. With AI being the new frontier, companies are quick to develop tools to stake a claim, but those tools may lack the proper security implementations.
[3]
Firehound ranks apps that leak your data. These are the 10 worst.
You should perhaps double-check the apps you use -- especially if you're into AI. A new project called Firehound from security firm CovertLabs tracks the apps that leak the most data, and the Top 10 is rife with AI apps.

Here's a screenshot of the data from Firehound's website. As you can see, the Top 10 list from Firehound has lots of AI apps. The project claims all kinds of data, from email addresses to chat history to names, were left accessible by apps. One would hope that apps would work to fix such vulnerabilities, should they exist.

Moving forward, it's probably best for users to tread carefully when downloading an app. It's worth stopping to think if it's a trustworthy product. And, of course, it's probably worth being cautious when sharing personal or professional information with an AI chatbot. You never know if it'll become public.
[4]
Welcome to the 'AI slop' security crisis - these 198 iOS apps were found leaking private chats and user locations
* Security researchers have discovered scores of mobile apps leaking data
* Private messages of over 20 million people are exposed
* The affected apps have been grouped under the Firehound name

Apple often uses the security of its App Store as a reason why regulators shouldn't force it to open up its app ecosystem to rival stores. After all, the argument goes, Apple vets its App Store for security and ejects apps that are careless with user data. Yet a recent discovery suggests that the App Store isn't quite as watertight as it seems.

According to malware researchers VX Underground on X, security firm CovertLabs is working on a project to document iOS apps that leak user information into the wild. At the time of VX Underground's X post, 198 guilty apps had been identified, with the top culprits all being related to artificial intelligence (AI) in some way.

The worst offender was an app named Chat & Ask AI by Codeway, which according to CovertLabs has exposed the entire chat history of some 18 million users - that's a total of 380 million messages - as well as user phone numbers and email addresses. This information is apparently "completely accessible to anyone who knows where to look" which, considering the sensitive information people often feed into AIs, is "as bad as it gets," CovertLabs says.

Study app 'YPT - Study Group' was also found to be at fault, with research indicating that information from over two million users was exposed. That includes chat messages, AI tokens, user IDs and user keys, according to VX Underground.

CovertLabs has created a repository of affected apps, which it has named Firehound. You can browse through redacted sample data to see what information was leaked, as well as see which apps have exposed the most data. Much of the data is sensitive and has been restricted, with interested parties needing to request access to the information. CovertLabs says that affected developers should reach out to the firm, at which point the app will be removed from the repository and the developers will receive help on how to fix their apps.

Bad for users, developers and Apple

The fact that many of the leakiest apps - including Chat & Ask AI, GenZArt, Kmstry and Genie - are related to AI isn't too surprising. In the rush to capitalize on the AI goldmine, it's likely that many developers have cut corners or implemented lax security measures in order to get their app out the door and onto the App Store.

But some of the blame should probably also fall at the feet of Apple. The company takes pride in the security of its App Store compared to the likes of the Google Play Store, which is often found to contain more malicious and insecure apps than Apple's effort. Yet that's not always the case - Apple's App Store has problems of its own, and the fact that such vulnerable apps have seemingly made it past the App Store's review process is not a good look for Apple.

If you use any of the affected apps, you should stop immediately. You won't be able to do much about the data that's already exposed, but you can at least stop adding more. You should also start using one of the best password managers and change the passwords of any accounts that share the email address you used for the compromised apps. If you know anyone else using these apps, warn them about the dangers. Hopefully, the affected developers will be able to secure their apps - and other developers will learn about the risks before it's too late.
[5]
These highly-rated apps are leaking your data -- find out if you're affected
* Nearly 200 App Store apps have been found to leak personal data, exposing millions of user records.
* Most of the top offenders are AI apps.
* If you're using any of the affected apps, stop immediately and delete your data from them if possible.

Security firm CovertLabs has found that nearly 200 apps on the Apple App Store are leaking user data for millions of users. In a post on X, CovertLabs described the situation as "as bad as it gets."

There's a theme among the affected apps -- most of the top offenders are AI-focused. This is problematic, as people tend to provide AI apps with more personal information -- think questions about mental health, relationships, or finances. In some cases, this personal information is tied to email addresses and phone numbers and available for anyone to see.

Which apps are affected?

CovertLabs has put together a database of affected apps called Firehound. It ranks them by the number of files exposed and lets you browse redacted samples of the types of records being leaked. Here are the worst offenders:

* Chat & Ask AI by Codeway -- 406 million records
* GenZArt -- 18 million records
* YPT - Study Group -- 13 million records
* Adult Coloring Book - Pigment -- 7 million records
* Kmstry -- 7 million records

These five apps alone represent over 20 million unique users. Chat & Ask AI has a 4.8-star rating with 318,000 reviews on the App Store. This is not a small-scale issue, unfortunately.

The cause of the leaks: sloppy coding, or something else?

The cause of the leaks is unclear. Given how many of these apps are AI-centric, it could be that in the rush to get AI tools to market, developers are cutting corners and skipping safety checks. It's also not entirely clear how these apps are making it past Apple's vetting process, which is meant to be strict. We don't want to go too hard on Apple, though -- privacy concerns exist on Android, too.

There doesn't appear to be any indication that the leaks are intentional or malicious in nature, or that the apps are sending the data to third parties -- it's more a case of personal user data sitting exposed in places that are easily accessible to bad actors. According to a post from a CovertLabs researcher, the data from the worst offender, Chat & Ask AI by Codeway, was just sitting there, "completely accessible to anyone who knows where to look."

What to do if you're affected: stop using the apps immediately

CovertLabs has offered to help app developers resolve these issues -- in fact, the Chat & Ask AI app mentioned above has already been fixed. In the meantime, if you're using any of the apps on the list, you should stop immediately. If possible, delete your data from the app and remove it from your device.
There doesn't appear to be any indication that this data has made its way into nefarious hands, but that's always a possibility, so keep an eye on your accounts. And if you're feeling extra concerned about privacy, consider taking additional measures, like installing security and privacy extensions for Chrome or adjusting the settings on your phone.
Security firm CovertLabs has uncovered a widespread security crisis affecting nearly 200 apps on Apple's App Store, with AI apps dominating the list of offenders. The Firehound project reveals that Chat & Ask AI by Codeway alone has exposed over 406 million records from 18 million users, including chat histories, email addresses, and phone numbers—all accessible to anyone who knows where to look.
Security firm CovertLabs has launched the Firehound project, a public registry that tracks iPhone AI applications leaking private user data across Apple's App Store. As of the initial disclosure, the registry had identified 198 iOS apps exposing sensitive information, with 196 of them actively leaking user data through improperly secured databases and cloud storage systems [1]. The discovery raises serious questions about App Store security and Apple's vetting process, which the company frequently cites as justification for maintaining control over its app ecosystem [4].

Source: MakeUseOf
The Chat & Ask AI app by Codeway leads Firehound's rankings for both most files exposed and most records exposed, with more than 406 million records from over 18 million users now accessible [1]. This includes entire chat histories totaling 380 million messages, along with user phone numbers and email addresses [4]. Security researchers describe the situation as "as bad as it gets," particularly given the sensitive nature of information users typically share with AI chatbots [5].

Source: Macworld
The top offenders in the Firehound registry reveal a troubling pattern: AI apps overwhelmingly dominate the list of applications exposing sensitive user data. Following Chat & Ask AI, GenZArt has exposed 18 million records, YPT - Study Group has leaked over 13 million records, Adult Coloring Book - Pigment accounts for 7 million records, and Kmstry has exposed another 7 million records [5]. These five apps alone represent more than 20 million unique users whose personal information sits exposed to potential bad actors.

The prevalence of AI apps among the worst offenders likely stems from developers rushing to capitalize on the AI boom, potentially cutting corners on security implementations to get products to market quickly [4]. This is particularly concerning because users tend to provide AI apps with highly personal information about mental health, relationships, and finances. That data becomes significantly more dangerous when tied to identifiable information like email addresses and phone numbers [5].

Most apps listed on Firehound expose data through improperly secured databases or cloud storage configurations, with many listings disclosing the underlying data schemas and record counts [1]. The exposed records include chat histories, user names, email addresses, phone numbers, AI tokens, user IDs, and in some cases, user locations [3][4].
Source: Mashable
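None of the sources name the specific backends involved, but a common pattern behind this kind of exposure is a mobile backend, such as a Firebase Realtime Database or a cloud storage bucket, left world-readable. As a minimal, hedged illustration of how little it takes to detect such a misconfiguration, the Python sketch below probes two hypothetical endpoints for unauthenticated read access; the hostnames are placeholders, not endpoints from the Firehound findings.

```python
import requests

# Hypothetical endpoints for illustration only; not taken from Firehound data.
# A Firebase Realtime Database exposes a REST API: appending ".json" to a path
# returns that node's contents whenever the security rules allow public reads
# ("?shallow=true" returns only the top-level keys). An S3-style bucket returns
# an XML object listing on an unauthenticated GET when its policy allows it.
CANDIDATE_URLS = [
    "https://example-app-default-rtdb.firebaseio.com/.json?shallow=true",
    "https://example-app-uploads.s3.amazonaws.com/",
]

def check_public_read(url: str) -> None:
    """Report whether an unauthenticated GET against `url` succeeds."""
    resp = requests.get(url, timeout=10)
    if resp.status_code == 200:
        print(f"[!] world-readable: {url} ({len(resp.content)} bytes returned)")
    elif resp.status_code in (401, 403):
        print(f"[ok] rejects unauthenticated reads: {url} (HTTP {resp.status_code})")
    else:
        print(f"[?] inconclusive: {url} (HTTP {resp.status_code})")

if __name__ == "__main__":
    for url in CANDIDATE_URLS:
        check_public_read(url)
```

The point of the sketch is that no exploit is required: a plain HTTP GET is enough to tell whether a backend rejects anonymous reads, which is consistent with the researchers' description of data "completely accessible to anyone who knows where to look."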
While there's no indication the leaks are intentional or that apps are actively sending data to third parties, the information sits accessible to anyone with knowledge of where to look [5]. The Firehound project intentionally limits public access to the most sensitive scan results, requiring users to register and request restricted datasets. Priority access goes to journalists, law enforcement, and security professionals who undergo manual review [1].
CovertLabs has implemented a responsible disclosure framework through Firehound, allowing developers to contact the firm, learn how to fix security vulnerabilities, and have their apps removed from the registry once issues are resolved [2]. Security researchers behind the project, including OSINT specialist @Harrris0n, have already seen some progress: according to a January 20 post, the Chat & Ask AI app's "problem has been addressed, and the vulnerability no longer exists" [2].

However, questions remain about the timeline of fixes reaching users. The App Store showed Chat & Ask AI at version 3.3.8, released on January 7, while Firehound's registry entry for the app is dated January 15, suggesting the fixed version may not have been released at the time of the registry update [2].

Users should immediately stop using any apps listed in the Firehound registry and delete their data from these applications if possible [5]. Those who have used affected apps should also change passwords for any accounts sharing the same email address and monitor their accounts for suspicious activity [4].
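None of the sources prescribe a specific monitoring tool, but one practical way to watch an email address for exposure is a breach-notification service. The sketch below uses Have I Been Pwned's documented v3 API purely as a generic example; the API key is a placeholder you would need to obtain from the service, and the service has no connection to Firehound or CovertLabs.

```python
import requests

# Illustrative only: Have I Been Pwned is a generic breach-notification
# service, not affiliated with Firehound or CovertLabs.
HIBP_API_KEY = "YOUR-HIBP-API-KEY"  # placeholder; issued by haveibeenpwned.com

def breaches_for(email: str) -> list:
    """Return known breaches for `email` via HIBP's v3 API (empty list if none)."""
    resp = requests.get(
        f"https://haveibeenpwned.com/api/v3/breachedaccount/{email}",
        headers={
            "hibp-api-key": HIBP_API_KEY,
            "user-agent": "leak-check-example",  # HIBP requires a user agent
        },
        timeout=10,
    )
    if resp.status_code == 404:  # 404 means the address appears in no known breach
        return []
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for breach in breaches_for("user@example.com"):
        print(breach["Name"])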
Before downloading new AI apps, users can check Firehound to verify an app's security status [2]. The incident serves as a critical reminder that users must remain mindful of the platforms they use and the information they share, particularly with AI chatbots, where the barrier to entry for developers has become increasingly low [1]. The data privacy risks extend beyond AI chatbots alone: affected categories also include image generators, photo animators, and study applications [2].

This widespread exposure of user data highlights the tension between rapid AI development and developer security practices, raising urgent questions about how such vulnerable apps bypass Apple's review process and what measures will prevent similar incidents as AI continues to expand across mobile platforms.
Summarized by Navi