


Roblox, the $27 billion online gaming platform pitched as a safe creative playground for kids, is now facing serious legal firepower from Louisiana’s top law enforcement officer. On Thursday, Attorney General Liz Murrill sued Roblox Corp. in state court, accusing the California-based company of enabling predators to target children and “facilitate the distribution of child sexual abuse material” on its platform.
"Today I’m suing Roblox — the #1 gaming site for children and teens – and a breeding ground for sex predators," Murrill said in a statement announcing the suit. "Due to Roblox’s lack of safety protocols, it endangers the safety of the children of Louisiana. Roblox is overrun with harmful content and child predators because it prioritizes user growth, revenue, and profits over child safety. Every parent should be aware of the clear and present danger poised to their children by Roblox so they can prevent the unthinkable from ever happening in their own home."
The lawsuit alleges Roblox “knowingly and intentionally fails to implement basic safety controls to protect child users from predators” and fails to adequately warn parents about the dangers on its platform. It cites years of alleged failures, pointing to games that have appeared on Roblox such as Escape to Epstein Island, Diddy Party, and Public Bathroom Simulator Vibe, which the AG’s office says have featured simulated sexual activity, including “child gang rape.”
One example described in the filing involves a July 2025 incident in Livingston Parish, Louisiana, where police executing a search warrant on a suspect in possession of child sexual abuse material found the individual actively using Roblox. The person was allegedly employing voice-altering technology to mimic the voice of a young female “for the purpose of luring and sexually exploiting minor users of the platform.”
The AG’s office notes that until late 2024, Roblox’s default settings allowed adults to directly message children under 13 with little restriction. While recent changes now block direct messages from adults to under-13 users outside of “experiences,” the lawsuit says predators can still send friend requests, chat, and even use voice features within those in-game experiences. “As such, despite these recent changes, Roblox continues to make children highly vulnerable to predators,” the filing states.
Thursday’s lawsuit comes less than a year after high-profile short seller Hindenburg Research published a sweeping investigation into Roblox, alleging the platform is not only a haven for sexual predators but also misleads investors about the size and engagement of its user base.
Hindenburg claimed its research uncovered “digital strip clubs, red light districts, sex parties and child predators lurking on Roblox” despite years of scandals and public promises to clean up the platform. When its researchers tried to register accounts under the names of convicted pedophiles, including Jeffrey Epstein, they found many variations already taken by apparent fan accounts, some openly proclaiming “I groom minors” in their usernames.
According to Hindenburg, Roblox’s open search system allowed a self-identified under-13 account to join groups like “Adult Studios,” which had thousands of members allegedly trading child pornography and soliciting sexual acts from minors.
From there, the group’s members could be traced to dozens of other public Roblox communities openly sharing illicit material. The report also described finding over 600 “Diddy” games — some of which allowed players to simulate explicit sex acts — and noted that games like Escape to Epstein Island were still accessible to child accounts.
Beyond child safety concerns, Hindenburg accused Roblox of inflating its most important growth metrics, alleging the company conflates “people” with daily active users in a way that overstates real users by 25–42% or more, in part by failing to weed out alternate accounts and bots. Engagement hours, another Wall Street-friendly statistic, were allegedly padded with “AFK” or “zombie” accounts that stayed logged in for 24 hours a day, skewing the average far above what real users actually play.
Interviews with former employees suggested Roblox maintained two sets of user data: one “de-alted” set used for internal decision-making, and another, inflated set used for financial reporting.
Hindenburg also alleged that Roblox cut trust and safety spending by 2% year-over-year in 2024 despite these ongoing risks, and that moderation work was often outsourced to overseas contractors making $12 a day, limiting the company’s ability to permanently ban repeat offenders. “If you’re limiting users’ engagement, it’s hurting your metrics… in a lot of cases, the leadership doesn’t want that,” one former senior product designer told the firm.
In its latest quarterly report, Roblox said it had 111.8 million average daily active users, a 41% increase from the previous year, with nearly half of its players under the age of 13. The Louisiana AG’s office warns that without robust age verification, predators can pose as children and minors can bypass restrictions simply by lying about their birthdate when signing up.
Asked about the lawsuit, Roblox said it “can’t comment on pending litigation” but defended its safeguards: “We dedicate substantial resources, including advanced technology and 24/7 human moderation, to help detect and prevent inappropriate content and behavior… While no system is perfect, Roblox has implemented rigorous technology and enforcement safeguards.” The company points to recent features like video-selfie-based age estimation, parental monitoring tools, and a teen-only “Trusted Connections” chat system as examples of ongoing safety improvements.
Its website says, “Our platform is designed with safety in mind, and we continually evolve our practices to address emerging challenges.”