Roblox can’t keep up. After years of criticism that its platform isn’t safe for the young gamers it caters to, the multibillion-dollar company announced in July that it was rolling out new measures to protect users, including an AI-powered age-verification system and other privacy tools. But researchers, experts, and lawyers worry the changes won’t fix Roblox’s bigger problem: staying ahead of individuals using the platform to exploit players.
On Roblox, kids do what they want. Launched in 2006, the platform was designed to let them use simple tools to create the kind of games their peers want to play. In-game currency—Robux—allows them to buy avatar outfits and other items. They can chat and trade. Most games are open-world. During Covid-19 lockdowns, it gave iPad-wielding kids a place to semi-socialize. The objective of this summer’s big hit, Grow a Garden, is pretty much what you’d expect. The game helped push daily active users past 100 million at the end of July, according to the company; various reports have claimed it was created by a 16-year-old.
That freedom has also made the platform difficult to moderate, particularly given that default settings allow anyone to chat with anyone else while they’re playing. With millions of active users, it’s hard for kids to know who’s operating the boxy avatars on any given server. Roblox’s latest safety measures are intended to make it harder for older users to contact younger ones via a feature the company calls Trusted Connections, but experts remain unsure it will fully protect minors.
The new rules also come much too late for many of Roblox’s users. The company, which made nearly $1 billion in revenue last quarter, has faced allegations for years that its platform is a haven not only for pedophiles but also for fascists and nihilist groups like No Lives Matter and 764. Now, WIRED has learned that a group of law firms from across the US is looking to file a flood of lawsuits in the coming months, all on behalf of clients who accuse the platform of facilitating the sexual exploitation and grooming of their kids.
“I would assume by the end of September there should be about 100 to hundreds of these [lawsuits] pending, and I would assume by this time next year you’ll probably be looking at over 1,000 of these filed,” Matt Dolman from Dolman Law Group tells WIRED. “We alone already have about 300 of these cases.” Dolman says the vast majority of his clients are under the age of 16 and estimates around 60 percent of the cases involve girls.
“We are deeply troubled by any incident that endangers our users, and safety is a top priority,” Roblox spokesperson Stefanie Notaney said in a statement to WIRED. “While no system is perfect, Roblox has implemented rigorous safeguards, including restrictions on sharing personal information, links, and user-to-user image sharing, and prohibiting sexual conversations.”
To date, fewer than a dozen lawsuits have been filed against Roblox by people accusing the platform of facilitating the sexual exploitation, grooming, and, in some cases, sexual assault of minors.
In May, the parents of a 15-year-old girl in Indiana filed a lawsuit alleging she was groomed by a man in his thirties who the suit claims was already on an FBI watch list. In July, the parents of a young teen girl in Alabama filed a lawsuit alleging that as a then-13-year-old she was manipulated by an adult using Robux. According to the lawsuit, the individual convinced her to meet him, and when law enforcement arrived he “was attempting to forcibly remove [the girl’s] pants.” On July 29, a lawsuit was filed on behalf of a 13-year-old girl from Iowa who was allegedly kidnapped and raped by a 37-year-old man. That lawsuit alleges that Roblox has created a “hunting ground for child-sex predators” while falsely marketing its platform as safe for children.
Those suits are just the beginning. Dolman tells WIRED that he has been meeting with “a working group” of lawyers from seven other law firms, all of whom are speaking with clients who likewise accuse Roblox of failing to protect children.
“We are currently representing over 400 victims of sexual abuse and grooming on Roblox and Discord,” alleges Martin Gould, a partner at Stinar Gould Grieco & Hensley, referring to the chat platform where many Roblox players connect outside of the games. “Some of them are pretty horrific cases.”
Discord did not respond to WIRED’s request for comment.
Tom Cartmell, a partner at Wagstaff & Cartmell, who worked on the lawsuit involving the 15-year-old girl in Indiana, tells WIRED that his firm is evaluating dozens of additional potential victims. Alexandra Walsh from Pennsylvania-based firm Anapol Weiss, which has filed half a dozen lawsuits against Roblox, said her team is “talking and working with hundreds of families whose kids have been allegedly harmed by the conduct of Roblox and other online platforms.”
A number of other firms in the working group did not respond when WIRED reached out for comment. Those that did said they were working on cases involving boys and girls who were between 10 and 15 years old when the alleged exploitation took place.
Roblox has known about these issues for years and has long been self-reporting cases of exploitation. But because of the platform’s previous setup—users didn’t have to prove their age and could register an account without a phone number or an email—the lawsuits allege that some of the most depraved individuals and groups online have used it to target young children.
At a recent press briefing, Roblox chief safety officer Matt Kaufman said that safety is “something that’s been part of the Roblox DNA since the company’s inception, almost 20 years ago.” As part of the company’s statement to WIRED, Notaney noted, “we dedicate substantial resources, including advanced technology and 24/7 human moderation, to help detect and prevent inappropriate content and behavior, including attempts to direct users off platform, where safety standards and moderation may be less stringent than ours.” But the data show that the number of suspected child exploitation cases reported from its platform has grown dramatically in recent years.
In 2019, Roblox self-reported 675 cases of suspected child sexual exploitation to the National Center for Missing and Exploited Children (NCMEC). In 2020, that figure more than tripled to over 2,200. By 2024, it had surpassed 24,000.
A spokesperson for NCMEC tells WIRED that in the first six months of 2025, Roblox has submitted “nearly as many reports as it did in all of 2024 and has doubled its reports related to online enticement.”
While the spokesperson said the self-reporting numbers are “encouraging developments that suggest an increased effort to address online enticement,” the reality is that the platform is still being weaponized. “In many of these cases, it’s parents, guardians, or the children themselves who are sounding the alarm,” says Fallon McNulty, the executive director of NCMEC’s CyberTipline. “This reinforces the urgent need for platforms to go beyond the basics and implement stronger protections.”
Still, when the biggest game during the summer break involves kids cultivating their own gardens, it’s easy to see why parents let their children continue to use the platform. “Maybe it’s because of the graphics, or maybe it’s because of the way it’s marketed, [but] there’s an assumption that Roblox is made for children, and parents assume that there are copious amounts of safeguards,” says Rachel Kowert, a research psychologist and author whose work focuses on the social impact of video games.
To filter out problematic content, Roblox uses a combination of human moderators and automated systems driven by artificial intelligence and machine learning. But researchers have repeatedly shown that these measures can be circumvented by users who swap banned words for coded language the automated systems struggle to track. In many of the lawsuits filed to date, the victims claim the alleged perpetrator moved conversations from Roblox to platforms like Discord or Snapchat, where there is even less oversight.
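To see why text filters are so easy to slip past, consider a minimal sketch of keyword-based moderation. This is purely illustrative and is not Roblox’s actual system; the blocklist, messages, and function name here are hypothetical. An exact-match filter catches a flagged term only in its canonical spelling, so trivial substitutions defeat it:

```python
# Illustrative sketch of a naive keyword filter; NOT Roblox's real
# moderation pipeline. The blocklist and messages are hypothetical.

BANNED = {"address", "phone", "snapchat"}

def naive_filter(message: str) -> bool:
    """Return True if the message should be blocked."""
    words = message.lower().split()
    return any(word in BANNED for word in words)

# A direct request is caught...
print(naive_filter("what is your address"))   # True

# ...but simple character swaps and spacing (the "coded language"
# researchers describe) sail through, because the filter only
# matches exact tokens.
print(naive_filter("what is your addr3ss"))   # False
print(naive_filter("add me on snap chat"))    # False
```

Production systems normalize spellings and layer on machine-learning classifiers, but the dynamic is the same cat-and-mouse game: each new evasion has to be learned before it can be blocked.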
When contacted by WIRED, a Snap spokesperson said the company is committed to combatting child sexual exploitation and that it is working to improve its technology to detect people who violate Snapchat’s guidelines.
One of the ways Roblox aims to prevent people from moving communications to other platforms is through Trusted Connections, a new feature that allows users over the age of 13 to have unfiltered chats with other users they know. Users who have verified their age—by sharing a video of themselves that is then scanned by AI—can add others between the ages of 13 and 17 without any additional steps.
Users over the age of 18 can only become Trusted Connections if they know a younger user in real life, according to Roblox. The connection is confirmed either by scanning a QR code in person or by the adult’s phone number appearing in the younger user’s contacts.
But critics worry these measures are already doomed to fail. On Discord, which began rolling out similar features in recent months, it took just a few days for gamers to figure out they could fool age verification. Using the photo mode in Death Stranding, they showed the platform an image of the game’s main character, whose likeness is based on actor Norman Reedus.
In addition to the lawsuits being filed against Roblox, law enforcement agencies across the US have been tracking grooming on the platform. In 2024, Bloomberg Businessweek reported that US authorities had, since 2018, arrested at least two dozen people accused of abducting or abusing victims they’d met or groomed using Roblox. The arrests have continued since.
In addition to individual predators on the platform, researchers have shown how members of nihilistic groups like No Lives Matter, CVLT, and 764—all of which operate as part of the broader Com network—have repeatedly used Roblox as a place to recruit children, encouraging them to share sexually explicit images and videos, to self-harm, or, in extreme cases, to take their own lives. Members often do this to gain clout within the groups.
The new safeguards, claims extremism researcher Ry Terran, can only do so much to prevent this. In her view, they give “young teens more opportunities to chat, not less.” The “parental controls for teens are up to the teens, not the parents,” Terran says. “But they’re calling these ‘extra safety’ features to shift the burden of safety onto kids and parents and away from themselves.”