Section 230

Ballentine v. Meta Platforms, Inc.

🏛 District Court, M.D. Florida · 2 filings
2026-02-17 Motion to Dismiss Section 230

MOTION to Dismiss Plaintiff's First Amended Complaint by… — Attachment 85

Issue: In *Ballentine v. Meta Platforms, Inc.*, Meta argues that a pro se plaintiff's racial discrimination claims arising from the disablement of his Facebook account must be dismissed because the court lacks personal jurisdiction over Meta, Section 230 categorically immunizes account-enforcement decisions from civil rights liability, and the federal statutes invoked — §§ 1981, 1982, 1983, and 1985(3) — each fail on independent pleading grounds. The case raises the unresolved question of whether Section 230's publisher immunity extends not merely to decisions about what content to remove, but to claims that the *selection of whose account to disable* was made on racially discriminatory grounds.

Meta filed this Rule 12(b) motion to dismiss Ballentine's First Amended Complaint in the Middle District of Florida, seeking dismissal with prejudice and without leave to amend. The motion presents five independent grounds for dismissal, each targeted at a separate claim or jurisdictional basis. On personal jurisdiction, Meta argues that platform accessibility in Florida and the plaintiff's in-state injury do not establish the constitutionally required contacts under *Daimler* and *Walden*. On Section 230, Meta contends that disabling an account is a quintessential publisher decision immunized regardless of how the underlying claim is labeled. On the civil rights claims, Meta argues that the plaintiff's comparator evidence is too thin to survive *Iqbal* plausibility review, that mandatory NCMEC reporting does not make Meta a state actor under *Lugar*, and that binding Eleventh Circuit precedent in *Jimenez v. WellStar* forecloses using §§ 1981 and 1982 as predicate rights for a § 1985(3) private-actor conspiracy claim.

This motion is a case study in how major platforms structure layered Rule 12(b) dismissal arguments to resolve civil rights platform-liability cases before any contested legal question reaches the merits. Meta's maximalist Section 230 position — asserted without engaging whether discriminatory *selection* of enforcement targets constitutes the platform's own conduct rather than editorial judgment — signals that the industry regards that gap in doctrine as a vulnerability worth avoiding rather than litigating. If the court dismisses on personal jurisdiction or any of the threshold pleading grounds, the harder Section 230 question goes unanswered; a ruling that reaches it would fill a genuine gap in Eleventh Circuit law. The motion also highlights a growing tension between the *Walden*-based jurisdictional framework and platforms' geographically targeted commercial advertising activity — a pressure point that will likely recur as more plaintiffs allege platform discrimination tied to monetized business use.

2026-02-17 Motion to Dismiss Section 230 First Amendment

MOTION to Dismiss for Failure to State a Claim by… — Attachment 29

Issue: Whether Section 230(c)(1) and (c)(2) immunize a third-party content moderation vendor that assisted Meta in reviewing and recommending the termination of a user's Facebook advertising account from civil rights and discrimination claims brought under 42 U.S.C. §§ 1981, 1982, 1983, and 1985(3).

Plaintiff Ballentine, a Black entrepreneur, alleged that Meta terminated his Facebook advertising account after flagging it for child sexual exploitation material, and that one of three third-party vendor defendants—Accenture, TaskUs, or Genpact—reviewed his account and recommended the ban based on his race. Accenture moved to dismiss under Rule 12(b)(6), arguing primarily that the complaint failed to plausibly allege but-for causation or racial animus. It argued in the alternative that Section 230(c)(1) bars claims against third-party content moderation service providers for assisting platforms in publication decisions, and that Section 230(c)(2) independently shields good-faith efforts to filter objectionable material. The motion presents the §230 arguments as independently sufficient grounds for dismissal, in addition to the merits-based pleading deficiencies.

This case raises the relatively underdeveloped question of whether §230 immunity extends downstream to third-party vendors that perform human content moderation review on behalf of platforms, a question with significant implications for the emerging ecosystem of platform-adjacent moderation contractors. If courts accept Accenture's argument that §230(c)(1) and (c)(2) together shield vendors assisting in publisher decisions, the outsourced content moderation industry would be substantially insulated from civil liability for moderation outcomes.