Roblox accused of enabling systemic sexual exploitation of children in new lawsuit

Louisiana attorney general sues Roblox for exposing children to sexual predators

Roblox is once again the target of online child safety advocates, as it faces another lawsuit that claims the platform is "choosing profits over child safety."

The lawsuit, filed by Louisiana Attorney General Liz Murrill, alleges the platform has "knowingly and intentionally" failed to institute "basic safety controls," exposing young players to predatory behavior and child sex abuse material. Murrill also alleges the platform has failed to properly warn parents of the potential dangers children face when playing Roblox.


In a series of tweets posted to X, Murrill claimed the platform was "perpetuating violence against children and sexual exploitation for profit" and called many of the site's gaming worlds, which are built by users and played by millions of children around the world, "obscene garbage." Murrill also posted several images of what were allegedly publicly available game experiences hosted on the platform, including "Escape to Epstein Island" and "Public Showers." Similar legal actions have been taken against other popular social media platforms — including Meta, TikTok, and Snapchat — amid growing concern for youth online safety and mental health.


Roblox has been on a mission to reform its image following a series of reports claiming the online gaming site is dangerous for young children, allegedly because it failed to curb a network of predatory adult users. In 2023, a class action lawsuit was filed against the platform on behalf of parents, claiming the company falsely advertised its site as safe for children.

Since then, Roblox has introduced a swath of new safety features, including extensive blocking tools, parental oversight, and messaging controls. The platform recently introduced selfie-based age verification for teen players; in the lawsuit, Murrill claims a lack of age verification policies makes it easier for predators to interact with children on the platform. Earlier this year, the platform joined other social media companies in backing the newly passed Take It Down Act, which establishes takedown policies and penalties for publishing non-consensual intimate imagery, including deepfakes.
