Section 230 Motion to Dismiss

In re: Roblox Corporation Child Sexual Exploitation and Assault Litigation

🏛 U.S. District Court for the Northern District of California · 📅 2025-12-12

Issue

Whether §230(c)(1) of the Communications Decency Act immunizes Roblox, Discord, Snap, and Meta from tort liability arising from third-party predators' use of their platforms to groom and sexually exploit minor users, and whether the First Amendment independently bars such claims by treating the platforms' content moderation and editorial decisions as protected expressive conduct.

What Happened

Defendants Roblox, Discord, Snap, and Meta filed a Consolidated Case Management Statement in this MDL, which consolidates numerous actions alleging that minor plaintiffs were groomed and sexually exploited by predators who used defendants' platforms to initiate contact before moving communications to other services or arranging in-person assaults. Defendants identified several threshold legal issues requiring resolution before discovery, including motions to compel arbitration (with sub-questions about the applicability of the Ending Forced Arbitration of Sexual Assault and Sexual Harassment Act), Section 230 immunity, and First Amendment protection for editorial and content-moderation decisions. Defendants argued that all claims are barred by §230(c)(1) because they necessarily implicate defendants' roles as publishers of third-party content, citing *Doe v. Grindr*, *Doe v. MySpace*, and related circuit precedents. They separately argued that, under *Moody v. NetChoice* and *Miami Herald v. Tornillo*, the First Amendment precludes liability for platforms' filtering, prioritization, and content-moderation choices. Defendants further contended that plaintiffs' claims independently fail for lack of proximate causation and duty, for failure to plead fraud with specificity, and because online services are not "products" subject to product liability law.

Why It Matters

This MDL consolidates a large volume of child sexual exploitation claims against major platforms. It will require the court to rule on the outer boundaries of §230 immunity and of First Amendment protection for content moderation in the context of minor-safety harms, an area where circuit courts have generally upheld immunity even as public and legislative pressure to narrow it intensifies. The court's resolution of whether platforms' algorithmic and editorial decisions constitute protected expression under *Moody*, and whether §230 bars claims framed as product liability or negligent design rather than publisher liability, could significantly shape the litigation landscape for platform child-safety suits nationwide.
