United States: US lawmakers push for online child safety amid constitutional battles

In short

In recent years, both U.S. state and federal legislatures have stepped up their efforts to pass laws aimed at protecting minors in the digital world. However, courts have ruled in several cases that these legislative actions overstepped constitutional bounds. This article highlights major legislative initiatives to protect children and teens online at the federal level and in California and Texas, along with the lawsuits challenging the legality of the California and Texas measures, as of early September 2024.


Federal Kids Online Safety and Privacy Act (KOSPA)

KOSPA is a legislative package that combines two bills: the Kids Online Safety Act, first introduced in 2022, and the Children and Teens Online Privacy Protection Act, first introduced in 2019. On July 30, 2024, the U.S. Senate passed KOSPA by a vote of 91-3.

KOSPA is intended to protect “minors,” defined as individuals under the age of 17. KOSPA would establish certain protections for “children,” defined as individuals under the age of 13; and certain protections for “teenagers,” defined as individuals between the ages of 13 and 16.

KOSPA would impose obligations on several types of entities, including:

  • “Covered platforms”: online platforms, online video games, messaging applications, or video streaming services that connect to the Internet and that are used, or are reasonably likely to be used, by a minor, subject to various exceptions.
  • “Online platforms”: any public website, online service, online application, or mobile application that primarily provides a community forum for user-generated content.
  • “Operators” of online services that are directed to children or teenagers, or that actually know, or can reasonably be expected to know based on objective circumstances, that they are collecting personal information from a child or teenager.

Below are some examples of obligations that KOSPA would impose on companies if it were adopted in its current form:

  • Duty of care: Covered platforms would be required to exercise reasonable care in creating and implementing design features so as to prevent and mitigate various prescribed harms to minors. These harms include certain mental health disorders, addictive-like behavior, online bullying, sexual exploitation, the promotion and marketing of drugs, tobacco products, gambling, or alcohol, and financial harm.
  • Safety precautions: Covered platforms would be required to implement certain safeguards to protect a user they know is a minor. These safeguards include restrictions on the ability of others to communicate with the minor and to view the minor’s personal information, and limiting the use of design features that result in the minor’s compulsive use of their platform.
  • Parental notices, resources, and consents: Covered platforms would be required to provide distinct notices and easily accessible, user-friendly settings for parents to support a user whom the platform knows is a minor. In the case of an individual whom a covered platform knows is a child, the platform would be required to obtain verifiable consent from the child’s parent prior to the child’s first use of the platform.
  • Transparency reports: Platforms with more than 10 million monthly active users in the U.S. that primarily provide an online forum for user-generated content would be required to publish a public report at least annually that describes the reasonably foreseeable risks of harm to minors and assesses the prevention and mitigation measures taken to address such risks, based on an independent third-party audit conducted through reasonable inspection of the platform.
  • Privacy obligations: KOSPA would make numerous and significant changes to the existing Children’s Online Privacy Protection Act (COPPA). One set of changes would expand the class of “operators” covered by the amended law, including by adding teens ages 13-16 to the class of individuals protected by the law and expanding the circumstances in which an operator knows it is processing the personal information of children or teens. KOSPA would also impose new rules and restrictions, including prohibitions on profiling and on serving targeted advertising to children and teens, subject to certain limited exceptions. Operators would be subject to these new requirements to the extent they are not already subject to such requirements under COPPA.

California Age-Appropriate Design Code Act (CAADCA)

In September 2022, California passed the CAADCA with the stated intent of requiring businesses to consider the best interests of minors under the age of 18 when designing, developing, and offering online services. In September 2023, the U.S. District Court for the Northern District of California granted a preliminary injunction against enforcement of the CAADCA on the basis that the CAADCA likely violates the First Amendment. However, in August 2024, the U.S. Court of Appeals for the Ninth Circuit partially reversed the district court’s preliminary injunction, essentially holding that some, but not necessarily all, of the CAADCA is likely constitutionally invalid, and remanded the case to the district court for further proceedings.

Subject to the ongoing constitutional challenge, the CAADCA imposes a broad set of obligations and restrictions on any “business” that provides an online service likely to be accessed by minors. A “business” is a for-profit organization that determines the means and purposes of processing the personal information of California residents and meets one of three thresholds: (1) has annual gross revenues in excess of $25 million, adjusted for inflation; (2) buys, sells, or otherwise shares the personal information of 100,000 or more California residents or households annually; or (3) derives at least 50% of its annual revenues from selling or sharing the personal information of California residents.

The portions of the CAADCA that the district court and the Court of Appeals agree are likely unconstitutional are the provisions requiring companies to conduct a data protection impact assessment and take certain steps related to the assessment (i.e., Cal. Civ. Code §§ 1798.99.31(a)(1)–(4), 1798.99.31(c), 1798.99.33, and 1798.99.35(c)).

The district court will ultimately consider the constitutionality of the remaining provisions of the CAADCA. These provisions include requirements that affected businesses:

  • Estimate the age of minor users with a reasonable and appropriate degree of certainty, or treat all users who reside in California as minors.
  • Set all default privacy settings for minors to settings that provide a high level of privacy, unless an exception applies.
  • Ensure that privacy information, terms of service, policies, and community standards are provided concisely, prominently, and in plain language appropriate to the age of minors likely to use the online service.
  • Provide prominent, accessible and responsive tools that enable minors, or where applicable their parents or guardians, to exercise their privacy rights and report concerns.
  • Refrain from intentionally using the personal information of minors in a way that materially harms the physical health, mental health, or well-being of a minor.

Texas Securing Children Online through Parental Empowerment (SCOPE) Act

The SCOPE Act, which was passed in 2023 and went into effect on September 1, 2024, regulates “digital service providers” (DSPs) with the stated intent of protecting minors under the age of 18. On August 30, 2024, the U.S. District Court for the Western District of Texas, Austin Division, issued a preliminary injunction prohibiting Texas from enforcing the “monitoring and filtering requirements” of the SCOPE Act (i.e., Tex. Bus. & Com. Code § 509.053) on First Amendment grounds, while declining to enjoin the other provisions of the SCOPE Act.

The SCOPE Act defines a DSP as the owner or operator of a website or online software that: (1) determines both the means and the purposes of collecting and processing users’ personal information; (2) connects users in a manner that enables them to interact socially with other users; (3) allows a user to create a public or semi-public profile to log in to and use the digital service; and (4) allows a user to create or post content that can be viewed by other users of the digital service. The SCOPE Act also sets forth several exceptions to the definition of a DSP.

The “monitoring and filtering requirements” that the district court enjoined would have required DSPs to monitor and filter certain categories of content from being shown to known minors. Specifically, the SCOPE Act would have required DSPs to develop and implement a strategy to prevent a known minor from being exposed to content that promotes, glorifies, or facilitates various categories of harm, including suicide, substance abuse, bullying, and grooming.

The district court declined to enjoin the remaining provisions of the SCOPE Act. DSPs must therefore carefully evaluate their obligations under the SCOPE Act, including the requirements to:

  • Require users to register their age before they may create an account.
  • Verify the identity of a person claiming to be the parent of a minor, and that person’s relationship to the minor, using a commercially reasonable method.
  • Give parents the opportunity to dispute their child’s registered age.
  • Limit the collection of personal information from known minors to that information reasonably necessary to provide the digital service.
  • Not allow a known minor to make purchases or conduct other financial transactions through the digital service.
  • Disclose how their algorithms work if those algorithms automate the suggestion, promotion, or ranking of information to known minors on the digital service.

Just a snapshot

The federal proposals and the California and Texas laws outlined above are just three examples of legal developments in the online protection space for minors. Numerous other bills, laws, constitutional challenges and enforcement actions are moving forward rapidly across the U.S., including child privacy regulations, age-appropriate design rules, restrictions on addictive feeds, and parental consent requirements and management tools. Stay tuned for more updates from the Baker McKenzie team.
