2 Sources
[1]
Roblox's AI-Powered Age Verification Is a Complete Mess
Roblox's face-scanning system, which estimates people's ages before they can access the platform's chat functions, rolled out in the US and other countries around the world last week, after initially launching in a few locations in December. Roblox says it is implementing the system to allow users to safely chat with users of similar ages. But players are already in revolt because they can no longer chat with their friends, developers are demanding Roblox roll back the update, and crucially, experts say that not only is the AI mis-aging young players as adults and vice versa, the system does little to help address the problem it was designed to tackle: the flood of predators using the platform to groom young children. In fact, WIRED has found multiple examples of people advertising age-verified accounts for minors as young as 9 years old on eBay for as little as $4. After WIRED flagged the listings, eBay spokesperson Maddy Martinez said the company was removing them for violating the site's policies.

In an email, Roblox's chief safety officer Matt Kaufman told WIRED that a change of this magnitude on a platform with over 150 million daily users takes time. "You can't flip a switch while building something that hasn't existed before," he said. "Expecting the system to be flawless overnight is ignoring the scale of this undertaking." Kaufman said the company was happy with the uptake, adding that "tens of millions of users" have already verified their age, which he claimed proved that "the vast majority of our community values a safer, more age-appropriate environment." The company also addressed some of the criticism in an update on Friday, writing: "We are aware of instances where parents age check on behalf of their children leading to kids being aged to 21+. We are working on solutions to address this and we'll share more here soon."

Roblox announced the age verification requirement last July as part of a raft of new features designed to make the platform safer. The company has come under intense pressure in recent months after multiple lawsuits alleged it failed to protect its youngest users and enabled predators to groom children. The attorneys general of Louisiana, Texas, and Kentucky also filed lawsuits against the company last year making similar claims, while Florida's attorney general issued criminal subpoenas to assess whether Roblox is "aiding predators in accessing and harming children."

Roblox claims that requiring people to verify their ages before allowing them to chat with others will prevent adults from being able to freely interact with children they don't know. While the process is optional, refusing to do it means a person will no longer have access to the platform's chat functions, one of the key reasons most people use Roblox. To verify their ages, people are asked to take a short video using their device's camera, which is processed by a company called Persona that estimates their age. Alternatively, users can upload a government-issued photo ID if they are 13 or older. Roblox says all personal information is "deleted immediately after processing." However, many users online say they are unwilling to complete age verification over privacy concerns.

People who have verified their ages are only allowed to chat with a small group of other players around their own age. For example, those verified as under 9 can only chat with players up to the age of 13. Players deemed to be 16 can chat with players between 13 and 20.
[2]
Act surprised - Roblox AI-powered age verification doesn't work
At this point, I kind of have to feel sorry for Roblox. The company came under increasing criticism for failing to adequately protect children, and then courted even greater controversy when it started requiring children as young as nine years old to submit a video selfie for age verification. The latest development in the saga is that the age verification process appears to be failing badly, with the company saying that you can't expect everything to work on day one ...

Following growing criticism that the app was putting children at risk, Roblox introduced an age verification process. The company said it would do this to limit communication between adults and children in chat. The new feature got a partial launch in December of last year before being rolled out globally this month. In theory, Roblox offers a choice between age verification based on photo ID and AI age estimation from a video selfie. The reality, however, is that few young children have government-issued photo ID, and so need to go the video selfie route. While the company told us that the video selfies are only required for access to chat features, and are deleted after the checks are complete, many parents were extremely unhappy about the requirement.

Wired took a look at how well the process is working and described the results as "a complete mess." Players, developers, and parents alike are unhappy, and the system simply isn't doing the job it was designed to do. The site found that the AI system was identifying children as adults, adults as children, and that predators can buy accounts which have been age-verified as children for as little as $4:

Players are already in revolt because they can no longer chat with their friends, developers are demanding Roblox roll back the update, and crucially, experts say that not only is the AI mis-aging young players as adults and vice versa, the system does little to help address the problem it was designed to tackle: the flood of predators using the platform to groom young children. In fact, WIRED has found multiple examples of people advertising age-verified accounts for minors as young as 9 years old on eBay for as little as $4.

The report cites Roblox's chief safety officer Matt Kaufman as saying that teething problems have to be expected. "You can't flip a switch while building something that hasn't existed before," he said. "Expecting the system to be flawless overnight is ignoring the scale of this undertaking" [...] The company also addressed some of the criticism in an update on Friday, writing: "We are aware of instances where parents age check on behalf of their children leading to kids being aged to 21+. We are working on solutions to address this and we'll share more here soon."

I know I'm going to sound like a broken record at this point, but it's further fodder for my argument that Apple age verification with a privacy focus is infinitely preferable to thousands of developers using their own random processes. I also know that some commenters are going to want to wish away the need for it ...
Roblox rolled out its AI-powered age verification system globally, requiring video selfies to access chat functions. But the face-scanning technology is misidentifying users, sparking a player revolt, and failing its core mission. Age-verified accounts for minors as young as 9 are being sold on eBay for as little as $4, allowing predators to bypass protections entirely.
Roblox's face-scanning system for age verification rolled out globally last week after an initial December launch in select locations, but the implementation has triggered widespread criticism from players, developers, and safety experts alike [1]. The platform, which serves over 150 million daily users, introduced the system to restrict chat functions and prevent adults from freely interacting with children they don't know [1]. Users must now submit video selfies processed by a company called Persona for AI age estimation, or upload a photo ID if they're 13 or older, to maintain access to the platform's chat features, one of the primary reasons people use Roblox [1].
Source: 9to5Mac
The move comes after mounting pressure from multiple lawsuits alleging the company failed to protect children and facilitated grooming by predators. Attorneys general from Louisiana, Texas, and Kentucky filed suits last year, while Florida's attorney general issued criminal subpoenas to investigate whether Roblox is "aiding predators in accessing and harming children" [1].

Experts report that the AI is misidentifying users at alarming rates, with young players being classified as adults and vice versa [1]. Wired's investigation revealed that the system does little to address the core problem it was designed to solve: the flood of predators using the platform to groom young children [1]. The player revolt intensified as users discovered they could no longer chat with friends, while developers demanded the company roll back the update entirely [1].

Roblox acknowledged one critical flaw in a Friday update: "We are aware of instances where parents age check on behalf of their children leading to kids being aged to 21+. We are working on solutions to address this and we'll share more here soon" [1]. This admission highlights how easily the system can be manipulated, even unintentionally.

Perhaps most concerning, Wired found multiple examples of people advertising age-verified accounts for minors as young as 9 years old on eBay for as little as $4 [1][2]. This discovery undermines the entire premise of the verification system, as predators can simply purchase verified child accounts to bypass age restrictions. After Wired flagged the listings, eBay spokesperson Maddy Martinez confirmed the company was removing them for violating site policies [1].
Source: Wired
The verified age groups are segregated into narrow bands: those verified as under 9 can only chat with players up to age 13, while those deemed 16 can chat with players between 13 and 20 [1]. However, these protections become meaningless when accounts can be purchased and transferred.

Many users have refused to complete age verification due to privacy concerns, despite Roblox's assurance that all personal information is "deleted immediately after processing" [1]. The reality is that few young children possess government-issued photo ID, forcing them down the video selfie route if they want to maintain chat access [2].

Matt Kaufman, Roblox's chief safety officer, defended the rollout by emphasizing the scale of the undertaking: "You can't flip a switch while building something that hasn't existed before. Expecting the system to be flawless overnight is ignoring the scale of this undertaking" [1]. Kaufman claimed that "tens of millions of users" have already verified their age, arguing this proves "the vast majority of our community values a safer, more age-appropriate environment" [1].

Critics argue the debacle strengthens the case for centralized, privacy-focused solutions like Apple's age verification, rather than thousands of developers implementing their own systems [2]. As Roblox works to address these issues, the question remains whether AI-based age verification can ever be reliable enough to protect minors on platforms of this scale, or if the company will need to fundamentally rethink its approach to child safety.

Summarized by Navi
18 Nov 2025 • Policy and Regulation
