This week, Roblox dropped a bombshell: anyone who wants to use the in-game chat will now need to verify their age by scanning their face. For a platform with 150 million daily users — many of them children — this is a massive shift.
In an interview on the Hard Fork podcast, Roblox CEO David Baszucki explained the move. We dove into the details to pull out what parents really need to know.

For nearly 20 years, any adult could chat with any child on Roblox. The platform's popularity, especially among kids as young as 5, has made it a target. In 2025 alone, over 20 federal lawsuits were filed against Roblox in the U.S. for failing to protect children from sexual exploitation.
With 11 billion hours of gameplay logged every month, the pressure to act has become immense.
To unlock chat features, users must now verify their age by scanning their face with their device's camera.
Baszucki insists the image data is "ephemeral" — it's analyzed and immediately deleted, not stored. But is this system foolproof?

Roblox claims this isn't just about a single face scan; the system combines multiple signals to estimate a user's age.
However, when the EU introduced similar tech, kids quickly learned to bypass it by taking pictures of video game characters. Roblox says it will conduct periodic re-verification to combat this, but skepticism remains high.
The Unsolved Problems
Even with this new layer, major risks persist.
1. The "Platform Migration" Problem
Predators often use Roblox to make initial contact and then move the conversation to less-moderated platforms like Discord or Snapchat. Baszucki admits that while simple phrases like "find me on Discord" are blocked, determined individuals use coded language to share usernames.
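To see why blocking exact phrases is so easy to defeat, here is a toy sketch of a naive blocklist filter. This is purely illustrative, not Roblox's actual moderation system; the phrases and function names are invented for the example.

```python
# Toy illustration (NOT Roblox's real filter): a naive blocklist
# catches exact phrases but misses trivially coded variants.
BLOCKED_PHRASES = ["find me on discord", "add me on snapchat"]

def naive_filter(message: str) -> bool:
    """Return True if the message should be blocked."""
    lowered = message.lower()
    return any(phrase in lowered for phrase in BLOCKED_PHRASES)

print(naive_filter("Find me on Discord"))         # True  — exact phrase caught
print(naive_filter("f1nd me on d1sc0rd"))         # False — leetspeak slips through
print(naive_filter("meet me on the purple app"))  # False — coded language slips through
```

A real moderation pipeline is far more sophisticated, but the core weakness is the same: determined users can always invent new code words faster than a fixed list can be updated.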
2. AI Moderators vs. Human Oversight
Roblox has shifted heavily from human moderators to AI systems, claiming they are more efficient. But as we saw with Facebook, AI is not a silver bullet for content moderation. It struggles with nuance, sarcasm, and coded language, leaving gaps in safety.
3. The Open Chat Dilemma
Why does a game for 5-year-olds even have open chat with strangers? Baszucki defends it as a vital social tool, stating that for many isolated kids, "Roblox is where they find their friends." Online communities can be a lifeline, but that doesn't absolve the platform of its responsibility to make them safe.

Technology is only one part of the solution. Here’s what parents can and should do.
1. Configure Roblox Parental Controls (Don't Trust the Defaults)
Go into your child's account settings and manually restrict chat and interactions. You can limit communication to "Friends Only" or turn it off completely. Don't rely on Roblox's new system to be the only line of defense.
2. Have the "Digital Stranger Danger" Talk
Explain why they should never share personal information or move conversations to other apps. Frame it as a rule to protect their account and their safety, not just a limitation.
3. Know What Games They're Playing
Roblox isn't one game; it's a universe of millions of user-created games. Some are inappropriate. Check their play history and look up reviews for the games they frequent.
4. Teach Critical Thinking, Not Just Rules
The most powerful defense is a child who can recognize suspicious behavior and knows they can come to you without fear of judgment or having their devices taken away.
At CODDY, we teach kids to build their own games in Roblox Studio. We see its power to inspire creativity and teach valuable coding skills. But we also believe that digital literacy is just as important as programming.
This new verification is a step in the right direction, but it's not a complete solution. The ultimate goal is to empower kids to be safe, responsible digital citizens.
Source: Adapted from a New York Times Hard Fork podcast interview for the CoddySchool blog.
