Ballentine v. Meta Platforms, Inc.
Issue
Whether Section 230(c)(1) and (c)(2) immunize a third-party content moderation vendor, which assisted Meta in reviewing a user's Facebook advertising account and recommending its termination, from civil rights and discrimination claims brought under 42 U.S.C. §§ 1981, 1982, 1983, and 1985(3).
What Happened
Plaintiff Ballentine, a Black entrepreneur, alleged that Meta terminated his Facebook advertising account after flagging it for child sexual exploitation material, and that one of three third-party vendor defendants (Accenture, TaskUs, or Genpact) reviewed his account and recommended the ban based on his race. Accenture moved to dismiss under Rule 12(b)(6), arguing primarily that the complaint failed to plausibly allege but-for causation or racial animus. Accenture also argued that Section 230(c)(1) bars claims against third-party content moderation service providers for assisting platforms in publication decisions, and that Section 230(c)(2) independently shields good-faith efforts to filter objectionable material; the motion presents each Section 230 ground as independently sufficient for dismissal, apart from the merits-based pleading deficiencies.
Why It Matters
This case raises the relatively underdeveloped question of whether Section 230 immunity extends downstream to third-party vendors that perform human content moderation review on behalf of platforms, with significant implications for the emerging ecosystem of platform-adjacent moderation contractors. If courts accept Accenture's argument that Section 230(c)(1) and (c)(2) together shield vendors assisting in publisher decisions, the outsourced content moderation industry would be substantially insulated from civil liability for moderation outcomes.
Related Filings
Other proceedings in the same litigation tracked by this monitor.