Section 230 Motion to Dismiss

Welkin v. Meta Platforms, Inc.

🏛 U.S. District Court for the Northern District of Georgia · 📅 2026-01-12 · 📑 No. 1:26-CV-00148-ELR (N.D. Ga.)

Issue

Whether §230(c) of the Communications Decency Act immunizes Meta from an intentional infliction of emotional distress (IIED) claim and a request for injunctive relief arising from Meta's alleged failure to remove a third-party Facebook impersonation profile whose content Iranian authorities reportedly used as evidence in criminal proceedings against the plaintiff's mother.

What Happened

Plaintiff Carly Welkin, a pro se Georgia resident, sued Meta in the Northern District of Georgia, alleging copyright infringement and intentional infliction of emotional distress based on Meta's partial response to her requests to remove a Facebook profile she characterized as an unauthorized impersonation of her mother. Meta moved to dismiss under Rules 12(b)(2) and 12(b)(6), arguing:

1. the court lacks both general and specific personal jurisdiction, because Meta is incorporated in Delaware with its principal place of business in California and no forum-directed conduct was alleged;
2. the copyright claim fails for lack of a registered copyright and any plausible allegation of copying by Meta;
3. the IIED claim fails under California law for want of extreme or outrageous conduct and the requisite intent, and is additionally barred by the Meta Terms of Service; and
4. both the IIED claim and the request for injunctive relief are independently barred by §230(c), because the content was provided by a third party and the claims seek to hold Meta liable for its exercise of traditional editorial functions.

In the alternative, Meta sought transfer to the Northern District of California pursuant to a mandatory forum-selection clause in its Terms of Service.

Why It Matters

The motion squarely tests whether §230(c) shields a platform from tort liability and injunctive relief when the alleged harm flows not from the platform's affirmative conduct but from its editorial decision to remove only part of third-party content flagged as an impersonation account. If granted, the motion could reinforce the breadth of publisher immunity for content-moderation decisions that stop short of complete removal.