Another Texas Online Censorship Law Partially Enjoined – CCIA v. Paxton

This case concerns HB 18, one of several online censorship bills that keep churning out of the Texas legislature. This particular law requires “digital service providers” to conduct age verification of all users. (The bill builds on HB 1181, which also requires age verification by some porn sites. A challenge to that law is now pending in the U.S. Supreme Court.) Parents can “contest” an age determination, in which case the service must treat the user as a “known minor.” (Seriously? How gameable is that?) Services must also verify parental status, even though no reliable process exists today to do so. (My Segregate-and-Suppress paper makes this point.)

Once minors are segregated, services must block their access to content that “promotes, glorifies, or facilitates” the following categories:

(A) “suicide, self-harm or eating disorders”;
(B) “substance abuse”;
(C) “stalking, bullying or harassment”;
(D) “grooming, human trafficking, child pornography or other sexual exploitation or abuse”; and
(E) material that is considered obscene to minors under Texas Penal Code Section 43.24.

The law also provides privacy-like protections for minors, including bans on targeted advertising and parental controls over minors’ online activities, and it imposes transparency requirements for the algorithms used to rank content.

The court enjoins the content-blocking requirements and leaves the rest intact.

Level of scrutiny

The court says HB 18 is “a content- and speaker-based regulation aimed at DSPs whose primary function is to share and broadcast social speech.” The law exempts certain categories of content (“news, sports, commerce, and provider-generated content”) while singling out user-generated content (UGC), which triggers strict scrutiny. Citing NetChoice v. Yost and NetChoice v. Fitch, the court explains:

HB 18 discriminates based on the type of content presented on a medium, not just the type of medium. A DSP that allows users to interact socially with other users, but that “primarily functions to” provide access to news or commerce, is unregulated. An identical DSP, using the exact same communications medium and method of social interaction, but that “primarily functions to” provide updates on what a user’s friends and family are doing (e.g., via Instagram posts and stories), is regulated. If there is a difference between the regulated DSP and the unregulated DSP, it is the content of the speech on the site, not the medium through which that speech is presented.

Application of strict scrutiny

Following the Supreme Court’s instructions in Moody, the court moves through each provision to apply the applicable test.

The provisions on data protection, parental control and disclosure

The court says that these provisions are “largely unrelated to First Amendment expression.” I find this confusing. Targeted advertising, for example, is always about First Amendment expression. And as I discuss in my Segregate-and-Suppress paper, parental control provisions are deeply problematic from a free speech standpoint. The court says “it is not clear that a law requiring parents to review and change their children’s privacy settings addresses First Amendment concerns,” but that is plainly untrue with respect to minors’ consumption of content (plus there is the problem of deploying a credible solution for parental authentication). More generally, any requirement that publishers authenticate users’ ages is categorically constitutionally problematic, and without segregating minors, there is no way to apply minor-specific rules to them. The court sidesteps all of these obvious problems. Perhaps CCIA/NetChoice should reemphasize these points in further proceedings. These provisions are also inseverable from the problematic age-authentication and parental-authentication requirements, so I hope they will eventually fall as well.

The monitoring and filtering provision

The law’s provision on blocking content is not the least bit subtle. The court says: “The monitoring and filtering requirements explicitly identify and single out individual categories of speech to be filtered and blocked. That’s as content-based as it gets.”

The court questions whether blocking the enumerated categories of content qualifies as a compelling state interest, “such as regulating content that might advocate for the deregulation of drugs (possibly “promoting” “substance abuse”) or defending the morality of physician-assisted suicide (likely “promoting” “suicide”)… Many of the regulated topics are simply too vague to even say whether they constitute a compelling interest. Terms such as “promoting,” “glorifying,” “substance abuse,” “harassment,” and “grooming” are undefined, despite their potentially broad scope and politically charged nature.”

The court says the provision is not narrowly tailored and does not use the least restrictive means. The ambiguous terminology is also overbroad, so that “HB 18 is likely to filter out far more material than is necessary to achieve Texas’s purpose.” Moreover, the law is underinclusive because it restricts only UGC, not content from first-party publishers. The law also “threatens to censor social discussions of controversial topics,” a particularly damaging outcome because it cuts teens off from important parts of the internet. “Texas also prohibits minors from participating in the democratic exchange of opinions online. Even if we accept that Texas seeks to prohibit only the most harmful pieces of content, a state cannot pick and choose which categories of protected speech it wants to block teens from discussing online.”

Vagueness

The plaintiffs challenged the DSP definition as vague because it turns on the services’ “primary” function. The court says that argument is a stretch and that “the term ‘social interaction’ may have a narrow meaning in general.” (Which is what, exactly? I have no idea.) The court also says the plaintiffs themselves seem clear that they fall under the statute. Yes, but they are just exemplars of an entire industry with uncertain boundaries. The court says the companies can raise that issue in an as-applied challenge.

The court is more concerned with the verbs “promote, glorify, and facilitate” in the blocking provision. As an example, the court notes that “pro-LGBTQ content could be specifically targeted as ‘grooming,’” a legitimate fear given Texas’s weaponization of LGBTQ status in its culture wars.

As a gotcha, Paxton argued that the supposedly vague terms appear in some companies’ TOSes. The court’s response is damning: “A self-policed rule does not suffer from the same vagueness problems as a state-sponsored ban. Facebook may know its internal definition of ‘glorify,’ but it cannot be certain that Paxton or the Texas courts will adhere to the same definition.”

Section 230

The court holds that Section 230 preempts the content-blocking requirements:

Unlike HB 1181, HB 18’s monitoring and filtering requirements do impose liability based on the type of content a website hosts. A website will only filter content that falls into certain categories of prohibited speech (e.g., “glorifying” an “eating disorder” and “promoting” “substance abuse”). The monitoring and filtering requirements necessarily derive their liability from the type of content a site displays.

Courts have been skeptical of Section 230 as a sword to undermine legislation (e.g., NetChoice v. Reyes), so it’s nice to see it working here.

Conclusion

Like the 9th Circuit’s recent ruling in NetChoice v. Bonta, this decision exists in a hypothetical alternate universe in which the court ignores the age-verification mandate. But if the age-verification mandate itself is unconstitutional (which I believe it is), then the accompanying provisions should be unenforceable because of the inability to segregate minors from adults. By sidestepping the core defect in the law, the opinion occupies a narrow space that the pending Supreme Court appeal in Free Speech Coalition v. Paxton will need to resolve (I will be filing an amicus brief in that case, drawing on material from my Segregate-and-Suppress article).

If the non-enjoined provisions ultimately take effect, they could pose huge problems for the internet. I won’t detail those problems now given the unresolved uncertainty surrounding age verification, but of course other legislators and courts will take a keen interest in any space where they remain free to regulate.

While the appeal in FSC v. Paxton awaits the Supreme Court, this ruling will be appealed to the Fifth Circuit, which has yet to encounter a censorship law it doesn’t like. That decision, whichever way it goes, also appears destined for the Supreme Court. Moody v. NetChoice went through exactly the same procedural steps, so we may experience déjà vu at every stage.

Case Citation: Computer & Communications Industry Association v. Paxton, 1:24-cv-00849-RP (W.D. Tex. Aug. 30, 2024)
