Roblox Introduces Age Verification for Communication Features to Enhance Child Safety

Roblox, one of the world’s most popular platforms for creating, sharing, and playing 3D virtual worlds, has announced new safety measures that will require age estimation checks for all users accessing its communication features by the end of this year.

The move comes after years of criticism and lawsuits alleging that Roblox has not done enough to protect children from online predators who exploit messaging, avatar customization, and role-playing functions to approach minors. Most recently, the state of Louisiana sued Roblox, claiming the platform created an environment where predators could “thrive, unite, hunt, and victimize kids.”

How Roblox’s Age Estimation Works

According to Roblox, the new age verification process uses facial age estimation technology:

“We estimate your age by analyzing a selfie of your face and examining your facial features. Based on this analysis, users are grouped into age categories (under 13, 13+, and 18+), and features are adjusted accordingly. If placed in the under-13 category, certain personal data, such as email and phone number, will be removed from Roblox.”

The primary objective is to reduce unsafe interactions between adults and minors, ensuring that communication features, games, and content align with a user’s developmental stage. Roblox plans to combine selfie-based age estimation, ID verification, and parental consent to enforce these restrictions.
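Roblox has not published implementation details, but the bucketing it describes (under 13, 13+, 18+) with feature gating per category can be sketched as follows. All function names and the specific feature flags here are hypothetical, for illustration only:

```python
# Illustrative sketch only -- NOT Roblox's actual implementation.
# Buckets an estimated age into the categories the announcement describes
# and gates features per category.

def categorize(estimated_age: int) -> str:
    """Map an estimated age to one of Roblox's stated age categories."""
    if estimated_age < 13:
        return "under_13"
    if estimated_age < 18:
        return "13_plus"
    return "18_plus"

def allowed_features(category: str) -> dict:
    """Hypothetical feature gating per category (flags are illustrative)."""
    return {
        # Adult-minor communication restricted outside the 18+ bucket
        "unrestricted_chat": category == "18_plus",
        # Per the announcement, email/phone are removed for under-13 accounts
        "store_email_and_phone": category != "under_13",
    }

print(categorize(12))   # under_13
print(allowed_features("under_13")["store_email_and_phone"])  # False
```

The key design point is that downstream systems only ever consume the coarse category, never the raw estimated age.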

Challenges and Risks

While facial age estimation is more effective than simple self-reported birthdays or checkboxes, it is not without flaws. The underlying AI tools can be fooled by deepfakes and spoofing techniques, or can simply misjudge a face. The process also introduces new data privacy concerns, since facial scans and ID documents must be stored securely.

Experts argue that approaches like double-anonymity verification could reduce risk by separating personal information from verification results, ensuring platforms only receive a user’s verified age category rather than sensitive identity details.
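The double-anonymity idea can be sketched conceptually: an independent verifier inspects identity documents in private and issues only a signed age-category attestation, which the platform validates without ever seeing identity data. This is a toy illustration with hypothetical names, using a shared HMAC key for brevity (a real deployment would use asymmetric signatures and a trusted verifier):

```python
# Conceptual sketch of double-anonymity verification -- not a production
# protocol. The verifier sees identity documents; the platform sees only
# a signed age category. Names and the key scheme are illustrative.

import hmac
import hashlib

VERIFIER_KEY = b"demo-shared-secret"  # real systems: asymmetric signatures

def verifier_issue_token(age_category: str) -> tuple[str, str]:
    """Verifier checks ID privately, then emits only the category + a MAC."""
    tag = hmac.new(VERIFIER_KEY, age_category.encode(), hashlib.sha256).hexdigest()
    return age_category, tag

def platform_accept(age_category: str, tag: str) -> bool:
    """Platform validates the attestation; it never receives identity data."""
    expected = hmac.new(VERIFIER_KEY, age_category.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

category, tag = verifier_issue_token("13_plus")
print(platform_accept(category, tag))        # True
print(platform_accept("18_plus", tag))       # False: tampered category rejected
```

The separation means a breach of the platform leaks no identity documents, and the verifier never learns which services a user joins.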

Addressing Predatory Behavior

Roblox has also admitted that malicious actors often attempt to lure users off the platform, where safety standards are weaker. To combat this, the company has implemented AI-driven moderation tools designed to detect early signs of child endangerment. In the first half of 2025 alone, these tools enabled Roblox to file 1,200 reports of suspected child exploitation attempts with the National Center for Missing and Exploited Children.

Tips for Parents to Keep Children Safe on Roblox

Since parents cannot supervise every session, Roblox and cybersecurity experts recommend the following safety measures:

  • Enable Parental Controls to restrict access to age-appropriate games and set screen-time limits.
  • Use Anonymized Accounts, avoiding real names or personal details.
  • Limit Friend Requests and Chats by adjusting account privacy settings.
  • Encourage Kids to Stay On-Platform, declining any request to move conversations to external apps.
  • Educate Children on Online Safety, including not sharing personal information or clicking suspicious links.
  • Play Together, making online gaming a shared experience while monitoring activities.
  • Stay Informed about Roblox’s evolving safety updates and features.
  • Secure the Device with up-to-date software and active cybersecurity protections.

The Bigger Picture

Roblox’s decision reflects a broader trend: governments and platforms are increasingly requiring robust age verification to limit minors’ exposure to inappropriate content online. While the debate over privacy and effectiveness continues, Roblox’s adoption of AI-powered age checks marks a significant step toward safer digital environments for young users.

Source: https://www.malwarebytes.com/blog/news/2025/09/roblox-introduces-age-checks-to-use-communication-features