First Amendment

Computer & Communications Industry Association v. Paxton

🏛 District Court, W.D. Texas · 3 filings
2025-10-16 · Other · First Amendment · AI Liability

Amicus Brief

Issue: In *CCIA v. Paxton*, bipartisan technology scholars argue that even if CCIA demonstrates a likelihood of success on the merits, the balance of equities and the public interest independently defeat preliminary injunctive relief, because the ongoing, neurologically irreversible harms to children that Texas's S.B. 2420 seeks to prevent are categorically different from the reversible compliance costs the industry faces. Two non-obvious difficulties follow: whether a court may deny a preliminary injunction on equitable grounds alone while the underlying statute's constitutionality remains genuinely contested, and whether framing a child-safety app-regulation law as content-neutral conduct regulation rather than a speech restriction alters the scrutiny analysis in ways that bear on the likelihood-of-success prong.

This is an amicus curiae brief filed at the trial court stage by Georgetown professor Meg Leta Jones and Joel Thayer of the Digital Progress Institute in opposition to CCIA's motion for a preliminary injunction against enforcement of Texas S.B. 2420, a child online safety statute. Thayer discloses that he assisted in drafting S.B. 2420. The brief argues that the harms to children linked to social media, AI chatbots, and gaming are permanent and neurological, while the industry's compliance costs are temporary and self-inflicted, given that Apple and Google already maintain age-rating infrastructure. Amici contend that suspending a democratically enacted statute is itself a cognizable sovereign harm under *Abbott v. Perez*, and they invoke the four-factor *Winter v. NRDC* standard to argue that every equitable consideration favors the state. The brief also characterizes S.B. 2420 as a content-neutral regulation of commercial conduct (contracting with and collecting data from minors) rather than a restriction on speech, and relies on the Eleventh Circuit's stay in *CCIA v. Uthmeier* and the Supreme Court's decision in *Free Speech Coalition v. Paxton* to argue that structurally similar laws have recently survived judicial scrutiny.

The brief advances two arguments worth watching across the broader wave of child online safety litigation. First, the conduct-regulation framing, under which age-gating requirements target platform business practices rather than expressive content, is the central legal lever that could determine whether strict scrutiny applies at all; if it succeeds, it substantially lowers the bar for states defending these statutes. Second, the brief surfaces a genuinely open doctrinal question that *Moody v. NetChoice* (2024) has made more acute: whether laws that in practice restrict which apps minors can access implicate platform editorial discretion regardless of how neutrally they are drafted, a tension the brief does not address. The credibility of the "disinterested scholars" posture is also contestable given Thayer's drafting role, and opposing counsel can be expected to press that point in any response.

2025-10-16 · Other · First Amendment · Section 230

Issue: Whether Texas SB 2420, which imposes age-verification, parental consent, and age-rating disclosure requirements on app stores, regulates protected speech subject to First Amendment heightened scrutiny, or instead regulates commercial conduct falling within the state's police power and governed by the *Zauderer* commercial-disclosure standard.

The Computer & Communications Industry Association (CCIA) filed suit in the Western District of Texas challenging SB 2420 and moved for a preliminary injunction, arguing the statute regulates constitutionally protected editorial or expressive conduct by app stores. The National Center on Sexual Exploitation and the Digital Childhood Institute filed an amici curiae brief opposing the preliminary injunction, arguing that SB 2420 regulates three categories of commercial conduct — contract formation with minors, age-rating disclosures, and retail distribution practices — none of which triggers heightened First Amendment scrutiny. Amici contended that *Apple Inc. v. Pepper* controls the commercial-conduct analysis, that contractual capacity regulation is a core sovereign function consistent with federal law including COPPA, and that the statute's age-rating provisions constitute compelled truthful commercial disclosures reviewable only under the more permissive *Zauderer* standard rather than strict or intermediate scrutiny.

This amici brief advances a content-neutrality framework specifically designed to distinguish SB 2420 from the statutes invalidated in *NetChoice v. Griffin* and *Brown v. Entertainment Merchants Association*. By classifying app-store gatekeeping and contracting functions as commercial conduct rather than protected editorial discretion, it offers courts a doctrinal path to uphold app-store child-safety regulations, a distinction that, if accepted, could broadly affect the constitutional viability of similar legislation in other states.

2025-10-16 · Other · First Amendment · AI Liability

Issue: Whether Texas S.B. 2420 (the App Store Accountability Act), which voids contracts between app developers and minors unless parental consent is obtained and mandates parental disclosure of data collection practices, violates the First Amendment as applied to app store owners and developers.

Plaintiff Computer & Communications Industry Association filed a motion for preliminary injunction seeking to block enforcement of Texas S.B. 2420, codified at Tex. Bus. & Com. Code §§ 121.022–.056, on First Amendment grounds. The Digital Childhood Alliance filed this amicus brief in support of Defendant Attorney General Paxton's opposition to that motion. The amicus argues that S.B. 2420 regulates commercial contract formation with minors, not expressive speech, and is therefore content-neutral conduct regulation to which the First Amendment does not apply. It further contends that the statute applies uniformly to all apps precisely because the tech industry previously challenged Ohio's narrower, content-targeted parental notification law as impermissibly content-based in *NetChoice v. Yost*, 716 F. Supp. 3d 539 (S.D. Ohio 2024). The brief draws on Senate Judiciary subcommittee testimony about a minor harmed by a Character.AI chatbot, whose family's litigation was obstructed by arbitration clauses and damages caps the minor had accepted without parental knowledge.

This brief illustrates how states are attempting to sidestep First Amendment platform-autonomy challenges by framing minor-protective legislation as commercial contract regulation rather than speech regulation, a theory that, if accepted, could substantially limit the reach of *Moody v. NetChoice* in the context of app store transactions and AI product liability for minors.