Doe v. Meta Platforms, Inc.
Issue
Whether Meta/Instagram can be held liable for injuries to a minor who was allegedly groomed by a sexual predator through a fake Instagram account and subsequently assaulted, based on claims that appear to implicate platform design, recommendation features, and failure to prevent predatory use of the service.
What Happened
This is a complaint filed by a mother on behalf of her 13-year-old daughter, alleging that the minor was groomed by a sexual predator using a fake Instagram account and then transported by Lyft drivers to the site of a kidnapping and sexual assault. The complaint names both Meta/Instagram and Lyft as defendants. The excerpt shows only the initial pleading sections (parties, jurisdiction, venue) and does not yet reveal the specific causes of action or legal theories. However, because the case names a technology defendant (Meta/Instagram) in litigation over harms allegedly facilitated through its platform, Section 230 immunity and potential design defect claims are likely to be at issue once the substantive counts are pleaded.
Why It Matters
This case has potential significance for Section 230's application to platform design and safety features, particularly age verification, fake account detection, and grooming prevention systems. If plaintiffs frame their claims around Instagram's product design choices, rather than traditional publisher liability for user content, the case could test the boundary between immune editorial functions and non-immune product liability theories post-Gonzalez, in line with recent social media harm litigation involving minors.
Related Filings
Other proceedings in the same litigation tracked by this monitor.