4 Sources
[1]
AI and ID verification apps leaked data on millions of Android users
There are millions of apps in the Google Play Store, but not all of them are safe to use. Security researchers have recently identified several apps that contain serious security vulnerabilities.

The first app in question

According to a Forbes contributor, a seemingly harmless app called Video AI Art Generator & Maker by developer Codeway, which has been installed nearly half a million times, leaked all of its users' images and videos. Over 12 TB of data, including 1.5 million images and nearly 400,000 videos, ended up freely available on the internet. The incident wasn't malicious but resulted from a configuration error in Google Cloud, which allowed anyone to access the stored data without having to identify themselves first. For users of the app, it was a disaster. The app is no longer available in the Google Play Store, as Google responded quickly to user complaints and removed it. It had been listed since June 2023 and was used to generate images and videos quickly and easily with AI. The leaked images were all created using the app but possibly contained private content.
[2]
AI Image App Leaks 1.5 Million User-Generated Photos
An Android app that promised AI-powered photo and video makeovers instead left a large volume of user content publicly exposed, according to researchers from Cybernews. An app called Video AI Art Generator & Maker, which has been downloaded more than 500,000 times from the Google Play Store, leaked user data through a misconfigured Google Cloud Storage bucket. The bucket required no authentication, allowing anyone who discovered it to access the stored files. Cybernews reports that the exposed storage contained more than 1.5 million user-uploaded images and over 385,000 user-uploaded videos. In addition, it held millions of AI-generated files, including approximately 2.87 million AI-generated images, 2.87 million AI-generated videos, and over 386,000 AI-generated audio files. In total, the bucket stored about 8.27 million media files amounting to more than 12 terabytes of data. The app, which offered cinematic-style AI makeovers for photos and videos, launched in mid-June 2023. The researchers found that the storage bucket appeared to contain every file uploaded since the app's launch, with the oldest files dating to just before it became publicly available. The database was allegedly linked to Codeway Dijital Hizmetler Anonim Sirketi, a private company registered in Turkey. However, the app was not visible on the developer's official website, and Codeway's public Play Store profile listed only a small number of other apps. Cybernews notes that another app associated with the company, Chat & Ask AI, had previously been found to expose a large volume of user messages due to a separate backend misconfiguration. After Cybernews contacted the developers behind Video AI Art Generator & Maker, they secured the exposed database shortly afterward. 
"This data leak shows how some AI apps prioritize fast product delivery, skipping crucial security features, such as enabling authentication for the critical cloud storage bucket used to store user data, including images and videos," the researchers say. The researchers warn that many AI apps store sensitive user uploads alongside AI-generated content and often embed secrets such as API keys or passwords directly into their code. Cybernews found that roughly 72 percent of the hundreds of Google Play apps it analyzed showed similar security vulnerabilities, raising concerns about how safely user data is handled across the rapidly growing AI app ecosystem.
[3]
Top Android AI photo and video editor exposes nearly two million user images and videos
Misconfigured database holding millions of images and videos found online

* Cybernews found misconfigured database in "Video AI Art Generator & Maker" app
* Leak exposed 8.27m media files, including 2m private user photos and videos
* Developers secured database after disclosure; similar flaws seen in another Codeway app

Yet another misconfigured database leaking sensitive user data has been found, and this one is even more worrying since the data being leaked is user-uploaded photos and videos. Researchers from Cybernews recently discovered that an Android app called "Video AI Art Generator & Maker" contained a misconfigured Google Cloud storage bucket which was accessible to anyone who knew where to look. In total, more than 1.5 million user images and more than 385,000 videos were stored in the bucket. Furthermore, it stored more than 2.87 million AI-generated videos, more than 386,000 AI-generated audio files, and more than 2.87 million AI-generated images.

Several vulnerable apps

The app offered AI-generated makeovers for photos and videos, something that is particularly popular these days. It was launched in mid-June 2023 and, according to Cybernews, has been storing the multimedia people upload since then. So, in total, the exposed bucket contained 8.27 million media files, 2 million of which were private, user-uploaded content. The app was allegedly developed by Codeway Dijital Hizmetler Anonim Sirketi, a private company registered in Turkey, but we could not find this app on the Play Store or the developer's official website. Codeway's dedicated Play Store page shows only three apps. However, the official website does showcase a different app, called Chat & Ask AI, and this one also had a misconfigured backend using Google Firebase. In early February 2026, an independent researcher found that this app, one of the most popular in its category, exposed 300 million messages tied to 25 million users.
Still, Cybernews said it managed to get in touch with the developers, who secured the Video AI Art Generator & Maker database soon after. "This data leak shows how some AI apps prioritize fast product delivery, skipping crucial security features, such as enabling authentication for the critical cloud storage bucket used to store user data, including images and videos," the researchers explained.
[4]
Unsecured AI apps are leaking personal data of Android users
AI apps are a "treasure trove" of leaked data, researchers say.

Not every AI tool you stumble across in your phone's app marketplace is the same. In fact, many of them may be more of a privacy gamble than you might have thought. A plethora of unlicensed or unsecured AI apps on the Google Play store for Android, including those marketed for identity verification and editing, have exposed billions of records and personal data, cybersecurity experts have confirmed.

A recent investigation by Cybernews found that one Android-available app in particular, "Video AI Art Generator & Maker," leaked 1.5 million user images, over 385,000 videos, and millions of AI-generated media files. The security flaw was spotted by researchers, who discovered a misconfiguration in a Google Cloud Storage bucket that left personal files vulnerable to outsiders. In total, the publication reported, over 12 terabytes of users' media files were accessible via the exposed bucket. The app had 500,000 downloads at the time.

Another app, called IDMerit, exposed know-your-customer data and personally identifiable information from users across 25 countries, predominantly in the U.S. The information included full names and addresses, birthdates, IDs, and contact details, constituting a full terabyte of data. Both apps' developers resolved the vulnerabilities after researchers notified them.

Still, cybersecurity experts warn that lax security trends among these types of AI apps pose a widespread risk to users. Many AI apps, which often store user-uploaded files alongside AI-generated content, also use a highly criticized practice known as "hardcoding secrets": embedding sensitive information such as API keys, passwords, or encryption keys directly into the app's source code. Cybernews found that 72 percent of the hundreds of Google Play apps researchers analyzed had similar security vulnerabilities.
A popular Android AI app exposed over 12 terabytes of user data, including 1.5 million images and 385,000 videos, through a misconfigured Google Cloud Storage bucket. Cybersecurity researchers warn that 72 percent of AI apps analyzed show similar security vulnerabilities, raising concerns about how these rapidly deployed tools handle user privacy.
A significant data leak affecting Android users has revealed how unsecured AI applications are putting millions at risk. Video AI Art Generator & Maker, an Android app downloaded more than 500,000 times from the Google Play Store, exposed over 12 terabytes of user content through a misconfigured Google Cloud Storage bucket that required no authentication [1][2]. The exposed storage contained more than 1.5 million user-uploaded images and over 385,000 user-uploaded videos, alongside approximately 2.87 million AI-generated images, 2.87 million AI-generated videos, and over 386,000 AI-generated audio files [2]. In total, the bucket stored about 8.27 million media files, with 2 million of those being private user-generated photos and videos [3].
Cybernews researchers discovered that the backend misconfiguration allowed anyone who knew where to look to access the stored files without authentication [3]. The app, which offered cinematic-style AI makeovers for photos and videos, launched in mid-June 2023, and the storage bucket appeared to contain every file uploaded since the app's launch [2]. The database was linked to Codeway Dijital Hizmetler Anonim Sirketi, a private company registered in Turkey [2][3]. Google responded quickly to user complaints and removed the app from the Google Play Store after the vulnerability was disclosed [1].
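To make concrete what "accessible without authentication" means here, the sketch below (not Cybernews's actual tooling) probes a Google Cloud Storage bucket through its public JSON listing API. The bucket name in the usage comment is hypothetical; a properly secured bucket answers such anonymous requests with HTTP 401 or 403.

```python
import json
import urllib.error
import urllib.request


def listing_url(bucket: str) -> str:
    """Public GCS JSON API endpoint that lists a bucket's objects."""
    return f"https://storage.googleapis.com/storage/v1/b/{bucket}/o"


def is_publicly_listable(bucket: str) -> bool:
    """Return True if the bucket serves its object listing with no credentials."""
    try:
        with urllib.request.urlopen(listing_url(bucket), timeout=10) as resp:
            # A public bucket returns a JSON document whose "items" array
            # names every stored object -- visible to anyone on the internet.
            return "items" in json.load(resp)
    except urllib.error.HTTPError:
        return False  # 401/403: authentication is actually enforced


# Example (hypothetical bucket name):
# is_publicly_listable("example-app-user-uploads")
```

The reported leak corresponds to the first branch: an anonymous request that returns the listing instead of an authorization error.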
This isn't an isolated incident for Codeway. Another app associated with the company, Chat & Ask AI, had previously been found to expose a large volume of user messages due to a separate backend misconfiguration [2]. In early February 2026, an independent researcher discovered that this app exposed 300 million messages tied to 25 million users [3]. Beyond Codeway's applications, another app called IDMerit exposed know-your-customer data and personally identifiable information from users across 25 countries, predominantly in the U.S., including full names and addresses, birthdates, IDs, and contact information constituting a full terabyte of data [4].
Cybersecurity experts warn that lax security trends among AI apps pose a widespread risk to user privacy. Researchers found that roughly 72 percent of the hundreds of Google Play apps analyzed showed similar security vulnerabilities [2][4]. Many AI apps store sensitive user uploads alongside AI-generated content and often use a highly criticized practice known as hardcoding secrets, embedding sensitive information such as API keys, passwords, or encryption keys directly into the app's source code [4].
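The "hardcoding secrets" anti-pattern is easy to illustrate. The Python sketch below uses an entirely made-up key and hypothetical names: it contrasts a secret baked into shipped code (recoverable by anyone who decompiles the app) with one read from the environment, and shows the kind of simple pattern match that secret-scanning tools use to flag the former.

```python
import os
import re

# Anti-pattern: this string ships inside the app binary, so anyone who
# decompiles it recovers the key. (The key below is fake, for illustration.)
API_KEY = "AIzaSyB-hypothetical-example-key-123456"


def get_api_key() -> str:
    """Safer pattern: the secret lives outside the shipped code."""
    return os.environ.get("ART_APP_API_KEY", "")


# Secret scanners catch hardcoded keys with pattern matches like this one
# (real tools such as gitleaks ship much larger rule sets).
GOOGLE_KEY_PATTERN = re.compile(r"AIza[0-9A-Za-z_\-]{20,}")


def find_hardcoded_keys(source: str) -> list[str]:
    """Return every Google-style API key literal found in a source string."""
    return GOOGLE_KEY_PATTERN.findall(source)
```

The point of the environment-variable version is that rotating or revoking the key requires no new app release, and the key never appears in the distributed artifact at all.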
According to Cybernews researchers, "This data leak shows how some AI apps prioritize fast product delivery, skipping crucial security features, such as enabling authentication for the critical cloud storage bucket used to store user data, including images and videos" [2][3]. After Cybernews contacted the developers behind Video AI Art Generator & Maker, they secured the exposed database shortly afterward [2]. The incident wasn't malicious but stemmed from a Google Cloud configuration error that allowed anyone to access the stored data without having to identify themselves first [1]. For Android users relying on these tools, the leaked personal data represents a significant privacy failure, particularly as the rush to deploy AI-powered features appears to be outpacing basic security practices across the rapidly growing AI app ecosystem.

Summarized by Navi