Los Angeles County has slapped Roblox with a lawsuit accusing the wildly popular kids’ gaming platform of giving predators a gateway to groom and exploit children.
The suit alleges Roblox falsely marketed itself as a safe haven for kids while operating what officials called a “breeding ground for predators.”
The platform boasts more than 151 million daily active users, with over 40% under age 13.
Roblox “has created a massive, largely unsupervised online world where adults and children mingle with little functional oversight,” the complaint states, accusing the company of designing a platform that creates “foreseeable pathways for adults to access and target minors at scale.”
The county alleges the structure of the platform itself — not just rogue users — enables predators to find and groom children.
The suit, filed Thursday in Los Angeles Superior Court, bluntly claims that Roblox’s design “makes children easy prey for pedophiles,” arguing the company “has failed to implement reasonable and readily available safety measures” including age verification, default communication restrictions, meaningful parental controls and effective reporting systems.
Instead, officials say, key protections were either missing or too weak to prevent exploitation.
County attorneys further allege the company “has refused to invest in basic safety features and has done the exact opposite,” allowing children to create accounts with no age verification or parental involvement while permitting “high-risk interactions” that enable minors to be located and contacted.
“For years, Roblox has knowingly maintained platform conditions and design features that foreseeably allow systemic sexual exploitation and abuse of children,” the filing states.
The platform has faced a wave of legal challenges in recent years, including lawsuits filed by the attorneys general of Texas, Louisiana and Florida, as well as a growing number of private civil suits brought by families across more than 30 states.
The company has repeatedly denied wrongdoing, saying it “strongly disputes” the allegations and will defend itself vigorously.
Jason Sokolowski told The Post earlier this month that he believes the suicide of his 16-year-old daughter, Penelope, last February was the result of a years-long grooming process that began on Roblox.
The grieving dad, who lives in Vancouver, British Columbia, said he thought he was monitoring her online activity, even using a third-party tracking app, but did not fully understand the scope of the platform or how predators operate on it.
Sokolowski alleges his daughter was first contacted on Roblox before being moved to Discord, where a person he describes as a predator encouraged escalating self-harm.
After Penelope took her life, he said he discovered two years’ worth of messages on her phone, including photos she had sent of herself cutting the predator’s username into her chest.
He believes she was targeted by someone linked to “764,” which the FBI has described as a violent online group that grooms minors into acts of self-harm and violence.
“They are grooming girls to do whatever it is they can get a girl to do, whether it’s nudes or cuts or gore or violence,” Sokolowski told The Post.
Since his daughter’s death, he has blamed tech companies for failing to stop predators from contacting children, arguing that the platforms “could mitigate this overnight” if they chose to implement stronger safeguards.
The Post has sought comment from Roblox.
The gaming company maintains that safety is “built at the core” of its platform, noting that users cannot send images through Roblox chat, that AI systems monitor communications around the clock and that it reports suspected exploitation to the National Center for Missing & Exploited Children.
