KOSA: Protecting children or controlling speech?

In late July, the U.S. Senate voted to advance the Kids Online Safety Act (KOSA) to the House over fierce opposition from groups ranging from the American Civil Liberties Union (ACLU) to NetChoice. KOSA has been controversial since its first introduction in 2022 by Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), largely because it gives the government the power to enforce a vague “duty of care” provision against digital platforms. Essentially, this means that websites and apps could be sued for failing to exercise reasonable care to ensure that their platforms do not expose children to “harmful content,” whether that is graphic material or content promoting self-harm. Previous versions of the bill authorized both the Federal Trade Commission (FTC) and politically elected state Attorneys General (AGs) to enforce the “duty of care” provision, leading to concerns about biased enforcement and the suppression of protected speech.

The current version limits the powers granted to state AGs but still gives the FTC broad authority to police the Internet. While the FTC is more politically independent than state law enforcement, the agency’s history with child safety regulation still makes KOSA a ticking constitutional time bomb. Indeed, it was the over-regulation of child safety that inspired Congress to pass legislation in 1980 limiting the FTC’s regulatory powers.

Child safety and freedom of expression

Child safety online is a politically powerful but relatively underrated topic. When one thinks of digital platform governance, broad issues like antitrust, data privacy, content moderation, and cybersecurity come to mind. Yet while Congress remains deadlocked over competition law reform and federal data privacy protections (and for good reason), the Senate passed KOSA overwhelmingly, 91-3.

Despite fierce opposition from diverse interests ranging from industry groups to LGBTQ advocacy organizations, lawmakers on both sides supported the bill. Civil liberties organizations such as the Foundation for Individual Rights and Expression (FIRE) warned that KOSA’s vague language would inevitably authorize the suppression of protected speech by giving law enforcement broad authority to determine what is harmful to children. LGBTQ organizations raised legitimate concerns that conservative law enforcement would not distinguish between genuinely inappropriate content, such as the promotion of sexual material to children, and helpful resources, such as websites that offer guidance to LGBTQ youth. Indeed, Senator Blackburn is on record stating that she believed KOSA would help in “protecting minor children from transgender people in this culture.” The same can be said of left-wing enforcers, who could foreseeably use KOSA to target their own disfavored groups, such as gun dealers and Trump supporters, or whatever they might loosely define as “racism.”

But KOSA’s proponents are not without merit. Indeed, it is hard to fault senators for not wanting to be seen as opponents of child safety, or to be portrayed as defending the mental health crises, suicide promotion, pedophilia, and all the other horrors that harm children online. One vivid example was the airing of grievances that took place on January 31, 2024, during the Senate Judiciary Committee hearing on “Big Tech and the Child Exploitation Crisis.” During the hearing, parents packed the room, holding up signs of their children who had been harmed by their experiences on social media. Five CEOs sat in the witness chairs: Mark Zuckerberg of Meta, Shou Chew of TikTok, Evan Spiegel of Snapchat, Jason Citron of Discord, and Linda Yaccarino of X (formerly Twitter).

Senators from both parties took turns grilling the CEOs about the threats children face on their platforms, ranging from cyberbullying to eating disorders to sex trafficking to suicide promotion. Even the most principled defenders of free speech could do little but listen in horror as the attacks came one after another. While strict regulation of content is not the answer, the emotion in the room, from the grieving parents to the bipartisan outrage, was impossible to ignore. The most iconic moment of the hearing came when Sen. Josh Hawley (R-MO) asked Zuckerberg to stand and apologize to the parents in the audience for the harm their children had suffered on Facebook and Instagram. It was a public relations disaster for Meta, and a masterstroke for Hawley and those who support KOSA.

KOSA, as amended, is still dangerous

While KOSA’s authors maintain that the bill is aimed not at speech but at behavior, enforcement would clearly target expression on an as-applied basis. That is, the difference between lawful conduct and liability would come down to the content that allegedly promotes harmful behavior. Two websites could perform the same functions, but one could be sued for displaying material that law enforcement finds objectionable, whether it is resources for LGBTQ youth or whatever the powers that be deem “dangerous misinformation.”

The ultimate problem with the amended legislation, however, is its assumption that the FTC can constitutionally enforce KOSA’s “duty of care” provision and that, even if it could, it would do so in a way that does not unduly burden society. Most, if not all, critics of KOSA express concern that the Commission cannot be trusted to remain impartial. The president ultimately determines the agency’s political leanings, selecting the chair to create a 3-2 majority for the party that controls the White House. While the Commission may not be as political as an elected state AG, it is not absurd to believe that ideological biases, if not overt agendas, could prevail. Biden’s FTC chair, Lina Khan, has indeed taken the agency in a decidedly progressive direction, targeting big tech companies with novel antitrust theories and introducing unconventional priorities like labor rights. It would not be far-fetched to believe that a Trump administration would choose its own ideological chair.

What the FTC can learn from its own history

But even without an activist FTC chair, the Commission’s record on child safety suggests that it would likely over-regulate and become a nanny-state enforcer. KOSA authorizes the FTC to take over a traditionally parental role, namely moderating what content children can see, and the agency has tried to regulate in the name of child safety before, drawing public outrage. In 1978, the Commission infamously announced its children’s television advertising (“KidVid”) rulemaking, which sought to reduce the harm that TV ads for sugary foods posed to children. FTC officials reflecting on the rulemaking explain in a report that the proposed rule would do the following:

  1. Ban all television advertising for any product that is directed to, or seen by, audiences of which a significant proportion are children too young to understand the sales objective of the advertising;
  2. Ban television advertising for food products posing the most serious risks to oral health that is aimed at, or seen by, target groups that include a significant proportion of older children; and
  3. Require that television advertising for sugary foods not covered by the ban but aimed at, or seen by, an audience that includes a significant proportion of older children be balanced with advertiser-funded information on nutritional or health aspects.

The proposal was seen as so intrusive and disproportionate that Congress passed the FTC Improvements Act of 1980 to punish the agency, even briefly shutting down the Commission. Even the liberal-leaning Washington Post ran an editorial titled “The FTC as National Nanny,” reflecting both industry opposition and public disapproval of the rule. The KidVid rulemaking marked a break from the FTC’s traditional focus on policing commercial misconduct and a shift toward micromanaging the public good.

The report notes that the FTC, deeply bruised by this response, spent the next few decades rethinking its approach to regulation. Rather than attempting another heavy-handed solution to child safety, the FTC focused on enforcing its traditional authority over unfair and deceptive practices, while also conducting research studies on industry behavior, such as the marketing of R-rated movies and graphic M-rated video games to minors. In addition, the agency embraced a more interdisciplinary and tailored approach by working with other agencies, such as the Food and Drug Administration, the Consumer Product Safety Commission, and the Department of Health and Human Services. Finally, the agency asked companies to produce informational ads to counter the potentially harmful content it was trying to address, and encouraged private companies to develop self-governance frameworks. In essence, the agency addressed a very real problem by taking a step back and adopting a more comprehensive approach that did not compromise free expression or rely on draconian regulation.

It seems that 40 years is more than enough time for Congress to forget the lesson of the “KidVid” regulation. Instead of tempering the FTC, the Senate is eagerly giving the agency the power to once again assume the role of national nanny. There is still time, however, to reflect and remember that children’s safety can be protected without restricting free speech or needlessly attacking the tech industry. In fact, it’s likely that by trying to trade freedom for safety, we will ultimately get neither.

Ethan Yang is a Contributor at the Independent Institute, a Legal Associate at the Cato Institute, and an Adjunct Fellow at AIER. His work focuses on antitrust, consumer protection, civil liberties, and foreign affairs. During law school, Ethan served as a law clerk for the Senate Judiciary Committee, where he was the technology and antitrust staff lead for Senator Ted Cruz. He has also held positions at the Federal Trade Commission and the International Center for Law and Economics, and was a Google Public Policy Fellow. Ethan received his JD from the Antonin Scalia Law School at George Mason University and his BA in Political Science from Trinity College (CT). He is also the co-author of China Dilemma: Rethinking US-China Relations Through Public Choice Theory.
