Roblox Strengthens Child Safety with Mandatory Age Checks
Roblox, one of the world’s largest online gaming platforms, is rolling out mandatory age verification to prevent children from chatting with adults. The platform aims to enhance safety, comply with global regulations, and provide age-appropriate online experiences for millions of users under 18.
New Safety Measures Across Regions
Starting in December, Roblox will introduce compulsory age checks for accounts that use chat features in Australia, New Zealand, and the Netherlands. The measures will expand worldwide in January 2026.
The platform has faced ongoing criticism and legal action over children accessing inappropriate content and interacting with adults. Several U.S. states, including Texas, Kentucky, and Louisiana, have filed lawsuits citing child safety concerns.
Roblox CEO Dave Baszucki told the BBC that parents concerned about online risks should carefully monitor their children’s activity on the platform.
Industry and Regulatory Context
These changes arrive as governments increase oversight of digital platforms. Australia is considering a ban on social media for under-16s, which may extend to gaming platforms such as Roblox. Meanwhile, the UK’s Online Safety Act imposes strict obligations on tech firms to protect children online.
Anna Lucas, online safety supervision director at Ofcom, emphasized the importance of the new age verification system. “Platforms must now take steps to keep kids safe, and we’re ensuring they meet their responsibilities,” she said. “There’s more to do, but change is happening.”
How Age Verification Works
Roblox will become the first major gaming platform to require facial age estimation for accessing chat features. The technology, built into the Roblox app, estimates users’ ages using the device camera. Images are processed by an external provider and deleted immediately after verification, according to the company.
Users who complete the verification process are placed into age groups: under 9, 9–12, 13–15, 16–17, 18–20, and 21+. Chat interactions are restricted to users in the same age bracket unless they are added as “trusted connections,” a feature limited to people the user knows personally.
Children under 13 will remain blocked from private messages and certain chat functions without parental consent. Roblox already limits image and video sharing in chats and restricts links to external sites to reduce exposure to outside risks.
Addressing Existing Risks
Previously, Roblox users could circumvent age restrictions, raising safety concerns. In a BBC test earlier this year, a 27-year-old and a 15-year-old on separate devices were able to exchange messages. Roblox acknowledged that users sometimes move conversations to external platforms, highlighting the need for stronger safeguards.
Rani Govender, policy manager for child safety online at the NSPCC, welcomed the platform’s new measures but urged continued vigilance. “Young people are exposed to unacceptable risks on Roblox, leaving many vulnerable to harm and online abuse,” she said. “The platform must ensure these changes prevent adult perpetrators from targeting children in practice.”
Parental Control and Privacy
Parents will continue to manage their child’s account, including updating age information after verification. Roblox states that the new system will allow for more “age-appropriate experiences” across the platform and hopes the approach sets a precedent for other tech companies.
Matt Kaufman, Roblox’s chief safety officer, said the facial age estimation technology is accurate within one to two years for users aged five to 25.
Campaigns and Community Action
The platform’s changes coincide with virtual protests organized by advocacy groups ParentsTogether Action and UltraViolet. The campaign staged a first-of-its-kind demonstration within Roblox, delivering a digital petition signed by over 12,000 supporters demanding stronger child safety measures.
The groups called for sweeping reforms, warning that “Roblox must stop being a playground for predators.”
Global User Base
Roblox reported more than 80 million daily active users in 2024, roughly 40% of whom are under the age of 13. By strengthening age verification and restricting adult contact, the platform aims to safeguard younger players while complying with international regulations and meeting growing parental expectations.
This article was rewritten by JournosNews.com based on verified reporting from trusted sources. The content has been independently reviewed, fact-checked, and edited for accuracy, neutrality, tone, and global readability in accordance with Google News and AdSense standards.