Browse Cases
9 results

Dowey v. Siems
Issue: Whether Meta is liable under product liability (design defect, failure to warn) and negligence theories for the deaths of minors who were sextorted by predators whom Meta's recommendation systems allegedly connected to the victims, or whether such claims are barred by Section 230 immunity.
This case directly tests the boundaries of Section 230's design-defect carve-out post-*Moody v. NetChoice* and in light of the Supreme Court's non-decision in *Gonzalez v. Google*. Plaintiffs invoke the emerging theory—successful in *Garcia v. Character.AI*—that platform architectural choices, recommendation algorithms, and data-sharing features constitute the platform's own product design decisions outside Section 230's scope, particularly where the platform allegedly knew its systems were connecting minors to predators and declined to implement identified safeguards. A ruling permitting these claims to proceed past a motion to dismiss would reinforce the narrowing of Section 230 immunity for algorithmic harms and establish that platforms face tort exposure for design decisions that foreseeably facilitate criminal exploitation, even when the harmful content itself is user-generated.

State v. Andreas W. Rauch Sharak
Issue: Whether Google acted as a government agent (implicating Fourth Amendment protections) when it scanned user files for CSAM and reported flagged content to law enforcement pursuant to federal reporting requirements.
This case addresses Section 230's role in incentivizing platform content moderation by providing immunity from liability for voluntary scanning and reporting of illegal content. The court's interpretation that Section 230 was designed to encourage electronic service providers (ESPs) to engage in proactive content moderation—including automated scanning—without fear of liability directly implicates ongoing debates about the scope of Section 230 protections for active versus passive moderation practices, and about whether such activities transform platforms into "information content providers" or government agents.

Trupia v. X Corp.
Issue: Whether X Corp. is immune under Section 230 and the First Amendment from claims challenging its alleged suppression or moderation of a user's posts on its social media platform.
This case directly implicates the scope of Section 230 immunity and First Amendment protection for platform content moderation decisions post-*Moody v. NetChoice*. X Corp.'s invocation of both Section 230 publisher immunity and First Amendment editorial discretion as independent bars to liability for content moderation represents the standard defense posture for platforms facing user grievances over deplatforming or suppression. The outcome will indicate how courts apply *Moody*'s editorial-discretion framework to individual users' content-moderation disputes on major social media platforms.

Doe v. Meta Platforms, Inc.
Issue: Whether Meta/Instagram can be held liable for injuries to a minor allegedly groomed by a sexual predator through a fake Instagram account and subsequently assaulted, based on claims that appear to involve platform design, recommendation features, and failure to prevent predatory use of the service.
This case has potential significance for Section 230's application to platform design and safety features, particularly age verification, fake account detection, and grooming prevention systems. If plaintiffs frame claims around Instagram's product design choices (rather than traditional publisher liability for user content), the case could test the boundary between immune editorial functions and non-immune product liability theories post-*Gonzalez*, similar to recent social media harm litigation involving minors.

Stokinger v. Armslist, LLC
Issue: Whether Armslist.com, an online firearms marketplace, is subject to personal jurisdiction in New Hampshire based on its website design and operation, and whether claims alleging that Armslist negligently designed its website to facilitate illegal firearms sales are barred by Section 230 of the Communications Decency Act.
This case presents a design-defect theory of platform liability akin to *Garcia v. Character.AI*—plaintiffs allege that the platform's design choices (not merely its hosting of third-party content) created liability exposure. The jurisdictional posture may interact with Section 230's scope: if design claims fall outside Section 230 immunity, platforms face multi-jurisdictional exposure based on purposeful availment through website architecture targeting specific states' users for harmful transactions.

Welkin v. Meta Platforms, Inc.
Issue: Whether Section 230 of the Communications Decency Act bars plaintiff's intentional infliction of emotional distress claim and request for injunctive relief arising from third-party content on Meta's platform, and whether the court has personal jurisdiction over Meta.
This motion presents a standard Section 230 defense against IIED claims based on third-party content, testing whether Meta's editorial and recommendation functions qualify for publisher immunity. The case also illustrates the routine procedural posture in platform litigation where defendants assert multiple grounds for dismissal including lack of jurisdiction, failure to state a claim, statutory immunity, and contractual forum selection, providing insight into Meta's current litigation strategy post-*Moody v. NetChoice*.

Rosenblum v. Passes, Inc.
Issue: Whether Section 230 of the Communications Decency Act immunizes Passes, Inc. from liability for child sexual abuse material (CSAM) where plaintiff alleges the platform's agents actively solicited a minor to join the platform and then marketed and distributed the resulting CSAM.
This case presents a potentially significant challenge to Section 230's scope in CSAM cases by alleging that platform agents' active recruitment and marketing of a minor creator transforms the platform from a passive host into a content developer or co-creator. If the material contribution theory survives the motion to dismiss, it could narrow Section 230 immunity for platforms whose employees or agents allegedly facilitate the creation or distribution of illegal content, particularly involving minors—extending the "content developer" exception beyond algorithmic design to direct human agency and solicitation.

Doe v. Grindr Inc.
Issue: Whether Section 230 bars state law product liability and negligence claims brought by an underage user against Grindr based on alleged design defects, failure to warn, and negligent misrepresentation, and whether plaintiff stated a plausible TVPRA sex trafficking claim sufficient to invoke FOSTA's exception to Section 230 immunity.
This decision reinforces broad Section 230 protection for dating and social platforms against product liability and design defect claims when those claims are characterized as targeting the platform's publisher function with respect to third-party content. The ruling also establishes a demanding pleading standard for invoking FOSTA's exception to Section 230, requiring plaintiffs to plausibly allege the platform's knowing participation in or benefit from sex trafficking—a threshold this plaintiff could not meet despite allegations of underage use and harm.

Karam v. Meta Platforms, Inc.
Issue: Whether Section 230 bars claims against Meta arising from the company's decision to ban or restrict plaintiff's Facebook account and its alleged failure to prevent other users from posting content about plaintiff.
This decision reinforces the broad application of Section 230 immunity to platform account termination and content moderation decisions, extending publisher immunity not only to third-party content but also to the platform's own editorial decisions about which users may access its services. The ruling demonstrates courts' continued willingness to apply Section 230 at the motion to dismiss stage to bar claims challenging fundamental platform curation functions including account access decisions.