5 Sources
[1]
Roblox Rolls Out Age-Verification Requirement for Chat Amid Child Safety Criticism
The age-verification tool uses facial recognition to put players into specific age groups before they can chat online. Roblox, the online game platform that has been under fire due to child safety concerns, has introduced age-verification software that uses facial scanning to estimate the age of players. The system is currently voluntary, but by the first week of December it will be required in markets such as Australia, the Netherlands and New Zealand for players who want to chat with others online. By early January, players in all Roblox markets, including the US, will be required to use the software if they want to engage in chats with other players. Roblox said it has also launched a Safety Center hub with information for parents and parental control tools.

Roblox says the age-verification system is being put in place to limit contact between adults and children, which has been a chief concern among child-safety advocates.

How it works

Roblox's new age verification takes a 3D scan of a player's face, using a webcam or a mobile device's camera, in order to estimate the person's age. Based on that estimate, a player can use online chat with other players in their age group. In a video about the software, Roblox says it deletes captured images and video immediately after the age check is complete. The check is performed by Persona, a vendor of Roblox's. Once players complete the check, they are grouped into age ranges: under 9, 9-12, 13-15, 16-17, 18-20, or 21+. The company said that those under nine won't be allowed to chat without parental permission. The chats won't necessarily be strictly limited to those age groups; Roblox said players "can chat only with peers in their group or similar groups, as appropriate." The company said it's also taking measures such as restricting media sharing among players and using AI to monitor chats.
Ongoing controversy

One of the aims of the launch, first announced in the summer, was to address criticism that the platform has not adequately protected underage players. Roblox is currently facing dozens of lawsuits related to claims of sexual abuse and child exploitation from families of children who played Roblox. It is also the target of investigations or lawsuits from states including Florida, Texas, Louisiana and Kentucky. Roblox was dealt a setback earlier this month when a California judge denied the company's motion to move one of these suits into private resolution. The company says its safety features go beyond what other game platforms offer to protect minors: "Roblox is the first online gaming or communication platform to require facial age checks to access chat, establishing what we believe will become a new industry standard," said Matt Kaufman, chief safety officer, and Rajiv Bhatia, vice president of product, in a post about the safety features.
[2]
Roblox set to start checking people's ages. But it will need to do more to keep kids safe
Online gaming giant Roblox has just announced it will start checking users' ages from early December in an attempt to stop children and teenagers talking with adults. In what the company has described as a move that sets a "safety gold standard" for the industry, it says it will be the first online gaming or communication platform to require facial age assurance to access chat features for all users. This requirement comes into effect in Australia just days before the country's social media age restrictions launch on December 10. It also comes at a time when Roblox - which boasts nearly 380 million monthly active users worldwide - finds itself embroiled in several lawsuits and facing growing public concerns about child grooming and other harms on the platform. So how exactly will the age requirement work? And will it actually help to keep users - more than half of whom are under 16 - safe?

A global rollout

The age check requirement will be rolled out first in Australia, New Zealand and the Netherlands in early December, and expanded globally in early January. Roblox will require the checks for all users who want to access chat features. Age checks will involve either facial age estimation enabled by artificial intelligence (AI) or ID verification. Once the age check is complete, users will be grouped by age and only allowed to chat with people of similar ages. Roblox says its age checks (to be run by Persona, a third-party identity verification platform) will be "fast" and "secure", with the Roblox app using the camera on the user's device. To estimate their age, users will take a video selfie and be required to move their face in specific directions, to ensure a real person is being checked. Once the video is processed, it will be deleted immediately.

Roblox under fire

At the moment Roblox will not be included in Australia's social media ban for under 16s.
However, the company has come under fire in recent months over concerns about grooming, gambling behaviour, and other potential harms for children on its platform. In April 2025, a California man was accused of kidnapping and engaging in unlawful sexual conduct with a 10-year-old child he met on Roblox. This year, several lawsuits have been launched against Roblox. Earlier this month, Texas Attorney General Ken Paxton sued Roblox for "ignoring [American] online safety laws while deceiving parents about the dangers of its platform". Separate lawsuits were filed in Kentucky in October and Louisiana in August, accusing Roblox of harming children. Florida also filed a criminal subpoena in October alleging Roblox was "a breeding ground for predators". Roblox announced in September that it would implement safety measures in Australia "as a result of eSafety's engagement with the platform". These measures include:

- making accounts for users under age 16 private by default
- introducing tools to prevent adult users from contacting under 16s without parental consent
- switching off by default direct chat and "experience chat" within Roblox games, until a user has completed an age check
- not allowing voice chat between adults and children 15 and under.

Unlike many other platforms, Roblox does not encrypt private chats. This enables the company to monitor and moderate the conversations.

Age checks won't fix other problems

While these measures will likely be welcomed by parents and others concerned for child safety online, they are not foolproof. There are limitations to age assurance technologies, which can estimate a person to be one to three years older - or younger - than their actual age. This means some children may be assigned to an incorrect age grouping. It also means some adults over 18 may be estimated to be under 18, enabling them to chat with younger people. Parents whose accounts are linked to their child's account will be able to correct their child's age.
All users over 13 will be able to correct their age by uploading ID into the system, which may raise data privacy concerns for users. There may also be people who lack the ID necessary to make corrections, which may restrict their access to age-appropriate features on the platform. Roblox also allows users to add "trusted connections" and chat with age-checked users 13 and older with whom they have an existing real-world connection. This will be verified via a QR code or phone number. This means parents will need to check these connections carefully and continue to monitor children's interactions. While Roblox's restrictions will limit interactions to users of similar ages, that doesn't mean many of the other potential harms - such as cyberbullying - won't occur within a peer group. There are also potential harms that young users may encounter that don't involve chat features. These include virtual sexual assault, as highlighted by a recent investigation by Guardian Australia into Roblox. The eSafety Commissioner will continue to monitor Roblox and other platforms, which may be classed as age-restricted social media under the legislation if warranted. Meanwhile, parents and other carers should review eSafety's advice about the upcoming ban and the steps they can take to keep their kids safe online.
[3]
Roblox steps up age checks and groups younger users into age-based chats
Roblox is stepping up its age verification system for users who want to privately message other players and implementing age-based chats so kids, teens and adults will only be able to message people around their own age. The moves come as the popular gaming platform continues to face criticism and lawsuits over child safety, and as a growing number of states and countries implement age verification laws. The company had previously announced the age estimation tool, which is provided by a company called Persona, in July. It requires players to take a video selfie that will be used to estimate their age. Roblox says the videos are deleted after the age check is processed. Users are not required to submit a face scan to use the platform, only if they want to chat with other users. Roblox doesn't allow kids under 13 to chat with other users outside of games unless they have explicit parental permission -- and unlike many other platforms, it does not encrypt private chat conversations, so it can monitor and moderate them. While some experts have expressed caution about the reliability of facial age estimation tools, Matt Kaufman, chief safety officer at Roblox, said that between the ages of about five and 25, the system can accurately estimate a person's age within one or two years. "But of course, there's always people who may be well outside of a traditional bell curve. And in those cases, if you disagree with the estimate that comes back, then you can provide an ID or use parental consent in order to correct that," he said. After users go through the age checks, they will be assigned to age groups: under 9, 9-12, 13-15, 16-17, 18-20 and 21+. Users will then be able to chat with their age group or similar age groups, depending on their age and the type of chat. Roblox said it will start enforcing age checks in Australia, New Zealand and the Netherlands in the first week of December, and in the rest of the world in early January.
A growing number of tech companies are implementing verification systems to comply with regulations or ward off criticism that they are not protecting children. This includes Google, which recently started testing a new age-verification system for YouTube that relies on AI to differentiate between adults and minors based on their watch histories. Instagram is testing an AI system to determine if kids are lying about their ages.
[4]
Roblox announces measures to strengthen protections for minors
Roblox, the popular online gaming platform that hosts millions of user-created games, said Tuesday it is strengthening protections for minors. Roblox said it will soon require players to use AI-powered facial age-estimation technology to help verify their age. The system, combined with ID-based age verification and confirmed parental consent, is intended to "provide a more accurate measure of a user's age than simply relying on what someone types in when they create an account," the company said in a statement. Roblox CEO David Baszucki described the guardrails in an interview with CBS Mornings' Tony Dokoupil as "what we believe will become the gold standard for safety and civility on the internet." Dokoupil questioned Baszucki about potential parental concerns about the AI verification, given that it involves minors sending pictures of themselves into the app to verify the user's age. Some parents "are already skeptical about their children's safety on Roblox," Dokoupil said. Roblox is "not storing these images," Baszucki said. The photos "are deleted soon after [Roblox] process[es] them." The image is used to determine the age of the Roblox user and "then to assign them to the right people that might connect with," he added. The company will start enforcing the age-check requirement in select global markets, including Australia, New Zealand and the Netherlands, before expanding the system to other countries in early January. Roblox is also launching a dedicated online safety center to help families understand and set up the platform's parental controls. The company's enhanced protections arrive as dozens of families, along with attorneys general in Kentucky and Louisiana, are suing Roblox, Discord and other technology companies for allegedly failing to deter sexual predators from approaching children on their platforms. Florida is also investigating Roblox, with Attorney General James Uthmeier accusing the platform of failing to protect minors.
Roblox in September outlined plans to expand age checks for all users who want to access communication features on the gaming platform. The guardrails are meant to limit communication between adults and minors unless they already know each other in the real world.
[5]
Roblox cracks down: AI age checks block kids from chatting with adults after lawsuits
Roblox is introducing AI age verification for its users. The new system will use facial scans or government IDs to confirm ages. The platform faces lawsuits claiming it endangers children by allowing adults to connect with them. The move aims to enhance safety and prevent inappropriate interactions on Roblox. The verification will become mandatory globally next year.

Given the growing number of lawsuits accusing Roblox of endangering children, the company is implementing one of its largest safety updates to date. The company says that in the near future, users will have to use a government ID or an AI-powered face scan to confirm their age in order to chat. The broad action is being taken in response to strong public pressure and aims to restrict how kids engage with adults online. As lawmakers, parents, and multiple lawsuits allege the platform has failed to protect children from predators, Roblox is getting ready for one of the biggest safety overhauls in its history, CNN reports. The new system, which employs technology from identity verification company Persona, is intended to keep kids from interacting with adult strangers, the company says, addressing a long-standing criticism of Roblox. Over 150 million people use Roblox globally, about one-third of them under the age of 13. Roblox has long promoted itself as a creative environment where children can learn and experiment. However, when CNN reported that children were being groomed, abused, and even abducted by adults they met on Roblox, the platform came under heavy fire. Roblox's safety measures have been under the microscope as attorneys general from Kentucky and Louisiana filed separate lawsuits accusing the company of harming children. Florida's attorney general also issued a criminal subpoena last month, calling Roblox a "breeding ground for predators."
Families have echoed those concerns in their own legal actions, including one mother, Becca Dallas, who says her 15-year-old son died by suicide after being groomed through Roblox and the chat platform Discord. These cases have intensified public calls for stronger protections, especially for the platform's youngest users. Roblox already blocks sharing of photos and personal information, offers parental controls, and relies on both human and AI moderation. Users must also verify their age to enter "18+" environments with violence or crude humor. But executives say the new update will result in far more users completing the verification process, which they view as essential to reducing risk, as per a report by CNN. "Our priority is safety and civility," said Roblox Chief Safety Officer Matt Kaufman. "We want to make Roblox a safe, positive, age-appropriate experience for everybody. We set extremely high standards for ourselves, and we understand that the public expects the same from us." The announcement also lands on the same day a "virtual protest" organized by ParentsTogether Action is scheduled to take place on Roblox, demanding stronger default privacy settings for kids under 13. It's part of a broader movement urging online platforms to adopt more robust age checks, a trend already seen at companies like YouTube and Meta, as per a report by CNN. Under the updated policy, every user will have to verify their age before accessing chat features. Those who choose not to upload a government ID will instead be guided through an AI-powered age estimation process. Using the device's front camera, the system prompts a user to move their face in specific directions so the tool can confirm the person is real and not using an image, as per a report by CNN.
The AI will place users into ranges such as under 9, 9-12, 13-15, 16-17, 18-20, or 21+. Chat will then be restricted based on proximity within those age brackets -- meaning a user estimated to be 12, for example, will only be allowed to chat with users 15 and under. If the AI misjudges someone's age, users over 13 can upload an ID to correct it. Parents connected to their child's account can also update the age. Roblox emphasized that ID verification is only permitted for users 13 and older due to stricter privacy rules for younger children. The company also says it will delete facial images after age assignment, as per a report by CNN. "We see these changes as a way to help ensure users are able to socialize with others in age groups that are appropriate but also help limit contact between minors and adults that they do not know," said Rajiv Bhatia, Roblox vice president and head of user and discovery product. Roblox declined to release an exact accuracy rate for the age estimation tool, though Kaufman said it is "typically pretty accurate within one or two years" for people ages 5 to 25. He also noted that the system includes "fairly robust fraud checks" to prevent users from tricking the AI with images of other people or digital characters. It looks for "live" movement and flags repeated or suspicious attempts, as per a report by CNN. Age verification will begin as a voluntary option immediately. It will become mandatory in Australia, New Zealand, and the Netherlands in December, with a global rollout scheduled for early next year. For families, this means chat access will soon be strictly tied to age detection, creating a new layer of separation between minors and adults on the platform, as per a report by CNN.
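The bracket-and-proximity rule described above can be sketched in code. This is an illustrative sketch only: the age brackets come from the reporting, but the adjacency policy (same or neighbouring bracket) and all names here are assumptions, not Roblox's actual implementation.

```python
# Illustrative sketch of age-bracket chat gating. The max_gap adjacency
# policy is an assumption, not Roblox's documented rule.

BRACKETS = [
    ("under 9", 0, 8),
    ("9-12", 9, 12),
    ("13-15", 13, 15),
    ("16-17", 16, 17),
    ("18-20", 18, 20),
    ("21+", 21, 200),
]

def bracket_index(age: int) -> int:
    """Return the index of the bracket containing the given age."""
    for i, (_, lo, hi) in enumerate(BRACKETS):
        if lo <= age <= hi:
            return i
    raise ValueError(f"no bracket for age {age}")

def can_chat(age_a: int, age_b: int, max_gap: int = 1) -> bool:
    """Allow chat only between users in the same or adjacent brackets."""
    return abs(bracket_index(age_a) - bracket_index(age_b)) <= max_gap
```

Under these assumptions, a user estimated at 12 (bracket "9-12") could chat with users up to 15 (bracket "13-15") but not with anyone 16 or older, matching the example in the article.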
The update marks a major shift in how Roblox manages communication, and it reflects growing accountability demands from regulators and parents alike. With lawsuits piling up and public tension rising, Roblox's new verification system aims to reestablish trust and draw clearer boundaries for the millions of young people who log on every day.

Why is Roblox adding age verification? To reduce contact between kids and unrelated adults after lawsuits accused the platform of enabling predators.

How will users verify their age? By uploading a government ID or letting an AI tool scan their face for age estimation.
Roblox introduces facial recognition technology for age verification to prevent adults from chatting with minors, responding to mounting lawsuits and safety criticism. The system will be mandatory globally by early January.

Roblox, the popular online gaming platform with nearly 380 million monthly active users, is implementing a comprehensive age verification system using artificial intelligence-powered facial recognition technology [1]. The system, developed in partnership with identity verification company Persona, will become mandatory for all users who want to access chat features on the platform.

The rollout begins in early December in Australia, New Zealand, and the Netherlands, with global implementation scheduled for early January [3]. Currently voluntary, the age verification will soon be required for all users seeking to communicate with others on the platform.

The age verification process requires users to take a video selfie using their device's camera, moving their face in specific directions to confirm they are a real person rather than a static image [1]. The AI system analyzes facial features to estimate age within one to two years of accuracy for users between ages five and 25, according to Chief Safety Officer Matt Kaufman [3]. Once processed, users are categorized into age groups: under 9, 9-12, 13-15, 16-17, 18-20, and 21+ [1]. Chat functionality is then restricted to users within the same or similar age brackets. Roblox emphasizes that captured images and videos are deleted immediately after the age verification process is complete.

The implementation comes amid mounting legal challenges and public criticism regarding child safety on the platform. Roblox currently faces dozens of lawsuits from families alleging sexual abuse and child exploitation [1]. State attorneys general from Texas, Kentucky, Louisiana, and Florida have initiated legal action or investigations against the company. Florida Attorney General James Uthmeier has described Roblox as "a breeding ground for predators," while Texas Attorney General Ken Paxton sued the company for allegedly "ignoring online safety laws while deceiving parents about the dangers of its platform" [2]. These legal challenges have intensified following reports of children being groomed, abused, and even abducted by adults they met through the platform [5].

Beyond age verification, Roblox has implemented several complementary safety features. The platform has made accounts for users under 16 private by default and introduced tools to prevent adult users from contacting minors without parental consent [2]. Direct chat and experience chat within games are switched off by default until users complete age verification. Unlike many other platforms, Roblox does not encrypt private chat conversations, allowing the company to monitor and moderate communications using both human reviewers and AI systems [3]. The company has also launched a Safety Center hub providing information and parental control tools for families.

Roblox positions itself as setting "a new industry standard" by becoming the first online gaming platform to require facial age checks for chat access [1]. CEO David Baszucki described the measures as "what we believe will become the gold standard for safety and civility on the internet" [4].

However, experts note limitations in age assurance technologies, which can misestimate ages by one to three years [2]. This margin of error could result in some children being placed in incorrect age groups or adults being estimated as minors. Users over 13 can correct their age by uploading government identification, while parents linked to their child's account can make corrections for younger users.

Summarized by Navi