Section 230

Doe S.F. v. Roblox Corporation

🏛 District Court, N.D. California · 1 filing
2025-12-08 · Complaint · Section 230 · First Amendment

Issue: Whether Roblox Corporation is liable under negligence, products liability, and consumer protection theories for allegedly defective platform design—specifically the absence of age verification, identity screening, and effective parental controls—that enabled an adult predator to groom and sexually exploit a 13-year-old minor user, and whether §230 of the Communications Decency Act bars those claims.

Plaintiff, a minor acting through her guardian, filed this complaint in the Northern District of California on December 8, 2025. She asserts that Roblox's app design, its default communication settings permitting adult strangers to message children, its failure to implement age or identity verification, and its systematic misrepresentations to parents about platform safety collectively caused her sexual exploitation by a predator who first contacted her on Roblox. The complaint alleges that Roblox knowingly prioritized user-growth metrics over child safety and actively concealed the prevalence of predatory conduct on its platform. Plaintiff expressly disaffirms any terms-of-service agreement, including its arbitration and delegation clauses, on the ground that as a minor she lacked contractual capacity. The document is a complaint only; no court ruling or motion disposition is reflected in the text provided.

The case tests whether product-design and failure-to-warn theories targeting a platform's architectural choices, such as self-reported age fields, default open-messaging settings, and the absence of verification tools, can survive §230 immunity when framed as claims arising from the defendant's own conduct rather than from third-party content. That distinction remains actively contested across the circuits and is central to ongoing efforts to impose platform liability for child-exploitation harms.