Roblox’s New Age Verification Feature Uses AI to Scan Teens’ Video Selfies

Roblox is rolling out new features aimed at making the platform safer for minors, including a revamped friend system, privacy tools, and an age verification process that asks users to record a video selfie.

Under Roblox’s old friend system, there was no distinction between people players know casually or only online and someone they consider a close friend. The platform’s new tiered system introduces Connections and Trusted Connections specifically for people that players know and trust. To access Trusted Connections and its benefits, users first need to complete an age check, which requires them to submit a video selfie. Once they’ve submitted their video, the company says it is checked against a “diverse dataset” by AI to produce an age estimate. If the user appears to be under 13, they automatically lose access to any features not deemed age-appropriate.

For users whose ages cannot be determined with “high confidence,” according to a blog post on the company’s site, the age remains unconfirmed, and they’ll need to pass ID verification instead. The company says it will allow for parental consent in the future; biometric data is deleted after 30 days, except where retention is required by a warrant or subpoena. WIRED raised the issue of 13-year-olds not having government-issued IDs with chief safety officer Matt Kaufman. “That is a problem,” Kaufman says. “In North America or maybe the United States in particular, that’s not common. In other parts of the world, it is much more common to have photo ID.” A child who can’t verify for lack of an ID can be verified through their parents. If their parents are unable to do so for any reason, the child won’t be able to use Trusted Connections.
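Taken together, the process Roblox describes amounts to a short decision flow: estimate the age from the selfie, and fall back to ID or parental verification when the estimate isn’t confident. Below is a minimal illustrative sketch of that flow in Python; the function, threshold, and field names are assumptions made for clarity, not Roblox’s actual code or API.

```python
# Illustrative sketch of the age-check flow described above. All names,
# thresholds, and return values are hypothetical, not Roblox's actual code.

HIGH_CONFIDENCE = 0.9  # assumed cutoff for a "high confidence" estimate


def resolve_age_check(estimated_age: float, confidence: float,
                      has_government_id: bool, parent_can_verify: bool) -> dict:
    """Decide a verification outcome from an AI age estimate of a video selfie."""
    if confidence >= HIGH_CONFIDENCE:
        if estimated_age < 13:
            # Users who appear under 13 lose access to features
            # not deemed age-appropriate, including Trusted Connections.
            return {"age_confirmed": True, "band": "under_13",
                    "trusted_connections": False}
        return {"age_confirmed": True, "band": "13_plus",
                "trusted_connections": True}

    # Low-confidence estimate: the age stays unconfirmed, and the user
    # falls back to ID verification or, failing that, a parent.
    if has_government_id:
        return {"age_confirmed": False, "next_step": "id_verification",
                "trusted_connections": False}
    if parent_can_verify:
        return {"age_confirmed": False, "next_step": "parent_verification",
                "trusted_connections": False}
    # No ID and no parent able to verify: no Trusted Connections.
    return {"age_confirmed": False, "next_step": None,
            "trusted_connections": False}


# Example: a confident estimate of a 15-year-old unlocks Trusted Connections.
print(resolve_age_check(estimated_age=15.0, confidence=0.95,
                        has_government_id=False, parent_can_verify=True))
```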

Teen users who pass the age check will be able to use the Trusted Connections feature to add anyone ages 13 to 17. Anyone 18 or older will need to be added either via an in-person QR code scan or via a phone number. With Trusted Connections, Roblox removes filters for inappropriate language and personally identifiable information on party voice and text chats for users 13 and up. Those communications are still subject to Roblox’s community standards and moderation, but the company hopes removing filters will keep users on its platform rather than moving to spaces like Discord. By keeping players within Roblox, the company can monitor their activity. A spokesperson told The Verge that includes “any predatory behavior aimed at manipulating or harming minors, the sexualization of minors, engaging in inappropriate sexual conversations with or requesting sexual content, and any involvement with child sexual abuse material.”
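The eligibility rules described above reduce to a single check: a verified teen can add other 13-to-17-year-olds directly, while adults require an in-person QR scan or a phone number. Here is a hypothetical sketch of that check; the function and parameter names are invented for illustration and are not Roblox’s API.

```python
# Illustrative sketch of the Trusted Connection rules described above.
# Function and parameter names are invented for illustration; this is
# not Roblox's API.

def can_add_trusted_connection(requester_is_verified_teen: bool,
                               target_age: int,
                               scanned_qr_in_person: bool = False,
                               added_via_phone_number: bool = False) -> bool:
    """Return True if a verified 13-17 user may add this person as trusted."""
    if not requester_is_verified_teen:
        # The requester must first pass the video-selfie age check.
        return False
    if 13 <= target_age <= 17:
        # Other teens can be added directly.
        return True
    # Adults (18+) require an in-person QR code scan or a phone number.
    return scanned_qr_in_person or added_via_phone_number


# Example: an 18-year-old added via an in-person QR scan is allowed.
print(can_add_trusted_connection(True, target_age=18, scanned_qr_in_person=True))
```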

Kaufman says the company wants to make Roblox “safe by default.” That’s why the company filters communications even for teenagers who haven’t verified their age. “If parents are uncomfortable with that, and it’s the right decision for their family, parents can turn off communications through parental controls,” Kaufman says.

Roblox is one of the biggest video game platforms in the world, especially among kids. In a press briefing, Kaufman said roughly 98 million people from 180 countries use the platform and that over 60 percent of users are over age 13. The company has struggled, however, with predators and minors’ safety. According to a 2024 Bloomberg report, police have arrested at least two dozen people who’ve used Roblox as a platform for grooming, abuse, or abduction. Roblox has also been the subject of several lawsuits, including a class action alleging the company harvests user data, including that of minors, and a federal lawsuit alleging a 13-year-old girl was exploited and sexually groomed via Roblox and Discord.

In the briefing, Kaufman called Roblox “one of the safest places online for people to come together and spend time with their friends and their family.”

Kirra Pendergast, founder and CEO of Safe on Social, an online safety organization operating worldwide, says Roblox’s latest safety measures are largely opt-in, putting “responsibility on minors to identify and manage risks, something that contradicts everything we know about grooming dynamics.” Machine-learning age-estimation tools, for example, can miscategorize users as older or younger than they are, and the in-person verification option, she says, “assumes that in-person QR code scanning is inherently safe.”

“Predators frequently use real-world grooming tactics,” says Pendergast. “A QR scan doesn’t verify a trusted relationship. A predator could build trust online, then manipulate the child into scanning a QR code offline, thus validating a ‘Trusted Connection’ in Roblox’s system. Real protection would require guardian co-verification of any connections, not child-initiated permissions.”

Furthermore, says Pendergast, Trusted Connections applies only to chat, which leaves “large surface areas exposed, making it a brittle barrier at best.”

When asked how an in-person QR code keeps minors safe from real-world grooming tactics, Kaufman echoes a press briefing comment that there is no “silver bullet.” Instead, he says, it’s many systems working together. “Those systems begin with our policies, our community standards,” Kaufman says. “It’s our product which does automated monitoring of things, it’s our partnerships, it’s people behind the scenes. So we have a whole suite of things that are in place to keep people safe. It is not just a QR code, or it is not just age estimation, it’s all of these things acting in concert.”

Kaufman says that Roblox is “going farther” than other platforms by not allowing kids ages 13 to 17 to have unfiltered communication without going through Trusted Connections. “We feel that we’re really setting the standard for the world in what it means to have safe, open communication for a teen audience.”

According to the briefing, the updates are part of Roblox’s typical development process and haven’t been “influenced by any particular event” or feedback. “It’s not a reaction to something,” Kaufman said. “This is part of our long-term plan to make Roblox as safe as it can possibly be.”

Kaufman also tells WIRED that the heightened scrutiny and discussion of the game haven’t had a dramatic impact on the company’s plans. “What we’re doing with this announcement is also trying to set the bar for what we think is appropriate for kids,” he says.

Looking at technology like generative AI, he says, “the technology may have changed, but the principles are still the same. We also look at AI as a real opportunity to be able to do some of the things that we do in safety at scale when we think about moderation. AI is central to that.”

In the briefing he said Roblox believes it’s important for parents and guardians to “build a dialog” with their kids about online safety. “It’s about having discussions about where they’re spending time online, who their friends are, and what they’re doing,” Kaufman said. “That discussion is probably the most important thing to do to keep people safe, and we’re investing in the tools to make that happen.” He added that Roblox is aware that families have different expectations about what’s appropriate online behavior. “If parents make a decision about what’s appropriate for their kid, it may not match the decisions that we might make or I might make for my own family,” he said. “But that’s OK, and we respect that difference.”

Dina Lamdany, who leads product for user settings and parental controls, said in that briefing that as teenagers experiment with their independence, “it’s really a moment where it’s important for them to learn the skills that they need to be safe online.” Teen users can grant dashboard access to their parents, which lets parents see who their child’s Trusted Connections are. “We won’t be notifying parents proactively right now,” Lamdany said.

Online safety, especially for minors, is an ongoing problem in gaming spaces. Nintendo recently introduced GameChat with the Switch 2, a social feature that allows players to connect with friends without leaving the platform. For younger users, it relies heavily on parental controls, while adults are expected to be proactive about who they chat with. The system’s privacy policy warns that it might also “monitor and record your video and audio interactions with other users,” though some people are concerned about that level of surveillance and argue it makes GameChat less appealing than Discord. Kaufman says that Roblox takes privacy seriously. “We’re the only large platform in the world that has a large number of kids and teens on it, and for that reason, privacy has been built into the foundation of our entire platform,” he says.

Pendergast says that if Roblox wants to lead the way in safety, it has to take harder stances.

“It must stop asking children and parents to manage their own protection and start building environments where trust isn’t optional, it’s engineered in as safety by design,” she says. “Age estimation, parental dashboards, and AI behavioral monitoring must be default, not optional, creating a baseline of systemic defense, not user-managed or user-guardian managed risk.”

Otherwise, Pendergast says, “parents and children are left to do the heavy lifting often without the digital and online safety literacy required.”
