United States: US lawmakers push for child safety online amid constitutional battles

In brief

In recent years, both state and federal legislatures in the US have intensified efforts to enact laws aimed at safeguarding minors in the digital world. However, several court rulings have found that these legislative actions overstepped constitutional limits. This article highlights key legislative initiatives to protect children and teenagers online at the US federal level and in California and Texas, as well as lawsuits challenging the legality of the California and Texas measures, as of early September 2024.


Federal Kids Online Safety and Privacy Act (KOSPA) 

KOSPA is a legislative package that combines two bills—the Kids Online Safety Act, which was first introduced in 2022, and the Children and Teens Online Privacy Protection Act, which was first introduced in 2019. On July 30, 2024, the US Senate passed KOSPA by a vote of 91-3.

KOSPA is intended to protect “minors”, defined to mean individuals under the age of 17. KOSPA would establish certain protections for “children”, defined to mean individuals under the age of 13, and certain protections for “teens”, defined to mean individuals between the ages of 13 and 16.

KOSPA would impose obligations on various types of entities, including:

  • “Covered platforms,” defined to mean an online platform, online video game, messaging application, or video streaming service that connects to the internet and that is used, or is reasonably likely to be used, by a minor, subject to various exceptions.
  • “Online platforms,” defined to mean any public-facing website, online service, online application, or mobile application that predominantly provides a community forum for user-generated content.
  • “Operators” of online services that are directed to children or teens or that have actual knowledge or knowledge fairly implied on the basis of objective circumstances that they are collecting personal information from a child or a teen. 

The following are a few examples of obligations that KOSPA would impose on companies if passed in its current form:

  • Duty of care: Covered platforms would have to exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate various prescribed harms to minors. These harms include certain mental health disorders, addiction-like behaviors, online bullying, sexual exploitation, promotion and marketing of drugs, tobacco products, gambling, or alcohol, and financial harms. 
  • Safeguards: Covered platforms would have to implement certain safeguards to protect a user that they know is a minor. These safeguards include restrictions on others’ ability to communicate with the minor and view the minor’s personal data, and limiting the use of design features that result in the minor’s compulsive usage of their platform. 
  • Parental notices, tools and consents: Covered platforms would have to provide various notices and readily accessible, easy-to-use settings for parents to support a user that the platform knows is a minor. In the case of an individual that a covered platform knows is a child, the platform would have to obtain verifiable consent from the child’s parent prior to the child’s initial use of the platform.
  • Transparency reports: Covered platforms that have more than 10 million monthly active users in the US and predominantly provide an online forum for user-generated content would have to publish, at least once a year, a public report describing the reasonably foreseeable risks of harms to minors and assessing the prevention and mitigation measures taken to address such risks, based on an independent, third-party audit conducted through reasonable inspection of the covered platform.
  • Privacy obligations: KOSPA would make numerous and significant amendments to the existing Children’s Online Privacy Protection Act (COPPA). One set of amendments would expand the group of “operators” subject to the amended law, including by adding teenagers 13-16 years of age to the class of individuals protected by the legislation and expanding the circumstances in which an operator has knowledge that it processes the personal information of children or teenagers. KOSPA would also impose new rules and restrictions, including prohibitions against profiling and serving targeted advertising at children and teenagers, subject to certain narrow exceptions. Operators would be subject to these new requirements to the extent they are not already subject to such requirements under COPPA.

California Age-Appropriate Design Code Act (CAADCA)

In September 2022, California enacted the CAADCA with the stated intention of requiring businesses to consider the best interests of minors under the age of 18 when designing, developing and providing online services. In September 2023, the US District Court for the Northern District of California granted a preliminary injunction against the enforcement of the CAADCA on the basis that the CAADCA likely violates the First Amendment. But in August 2024, the US Court of Appeals for the Ninth Circuit vacated in part the District Court’s preliminary injunction, finding essentially that some, but not necessarily all, of the CAADCA is likely to be constitutionally invalid, and remanded the case to the District Court for further proceedings.

Subject to the ongoing constitutional challenge, the CAADCA imposes a set of broad obligations and restrictions on any “business” that provides an online service that is likely to be accessed by minors. A “business” is any for-profit organization that determines the means and purposes of processing California residents’ personal information and meets any of the following three thresholds: (1) has annual gross revenues in excess of USD 25 million, adjusted for inflation; (2) annually buys, sells or shares the personal information of 100,000 or more California residents or households; or (3) derives at least 50% of its annual revenue from selling or sharing California residents’ personal information.

The parts of the CAADCA that the District Court and Court of Appeals both agree are likely unconstitutional are the provisions requiring businesses to create a data protection impact assessment and take certain steps related to the assessment (i.e., Cal. Civ. Code §§ 1798.99.31(a)(1)–(4), 1798.99.31(c), 1798.99.33 and 1798.99.35(c)). 

The District Court will eventually evaluate the constitutionality of the remaining provisions of the CAADCA. These provisions include requirements that covered businesses:

  • Estimate the age of minor users with a reasonable and appropriate level of certainty, or else treat all users residing in California as minors.
  • Configure all default privacy settings for minors to settings that offer a high level of privacy unless an exception applies.
  • Provide privacy information, terms of service, policies, and community standards concisely, prominently, and using clear language suited to the age of minors likely to access the online service.
  • Provide prominent, accessible, and responsive tools to help minors, or if applicable their parents or guardians, exercise their privacy rights and report concerns.
  • Not knowingly use minors’ personal information in a way that is materially detrimental to the physical health, mental health, or wellbeing of a minor.

Texas Securing Children Online through Parental Empowerment (SCOPE) Act

Enacted in 2023 and effective as of September 1, 2024, the SCOPE Act regulates “digital service providers” (DSPs) with the stated intention of protecting minors under the age of 18. On August 30, 2024, the US District Court for the Western District of Texas (Austin Division) granted a preliminary injunction enjoining Texas from enforcing the SCOPE Act’s “monitoring-and-filtering requirements” (i.e., Tex. Bus. & Com. Code § 509.053) on First Amendment grounds, while declining to block the other provisions of the SCOPE Act.

The SCOPE Act defines a DSP as the owner or operator of a website or online software that:

  • Determines both the means and purposes of collecting and processing users’ personal information;
  • Connects users in a manner that allows them to socially interact with other users;
  • Allows a user to create a public or semi-public profile for purposes of signing into and using the digital service; and
  • Allows a user to create or post content that can be viewed by other users of the digital service.

The SCOPE Act also lists various exceptions to the definition of a DSP.

The “monitoring-and-filtering requirements” that the District Court enjoined from being enforced would have required DSPs to monitor certain categories of content and filter them from being displayed to known minors. Specifically, the SCOPE Act would have required DSPs to develop and implement a strategy to prevent a known minor’s exposure to content that promotes, glorifies, or facilitates various categories of subject-matter, including suicide, substance abuse, bullying and grooming. 

The District Court refused to enjoin the other provisions of the SCOPE Act from being enforced. DSPs must therefore carefully assess their obligations under the SCOPE Act, including requirements to:

  • Require users to register their age before they can create an account.
  • Verify the identity of an individual who purports to be the parent of a minor, and their relationship to the minor, using a commercially reasonable method.
  • Allow parents to dispute the registered age of their child.
  • Limit the collection of a known minor’s personal information to information reasonably necessary to provide the digital service.
  • Refrain from allowing a known minor to make purchases or engage in other financial transactions through the digital service.
  • Publish explanations of how their algorithms work if their algorithms automate the suggestion, promotion, or ranking of information to known minors on the digital service. 

Just a snapshot

The federal proposals and California and Texas laws outlined above are just three examples of legal developments in the minors’ online protection space. Numerous other bills, laws, constitutional challenges and enforcement actions are moving forward at a rapid clip across the US, including children’s privacy regulations, age-appropriate design rules, addictive feeds restrictions, and parental consent and management tool requirements. Stay tuned for more updates from the Baker McKenzie team. 

Copyright © 2024 Baker & McKenzie. All rights reserved. Ownership: This documentation and content (Content) is a proprietary resource owned exclusively by Baker McKenzie (meaning Baker & McKenzie International and its member firms). The Content is protected under international copyright conventions. Use of this Content does not of itself create a contractual relationship, nor any attorney/client relationship, between Baker McKenzie and any person. Non-reliance and exclusion: All Content is for informational purposes only and may not reflect the most current legal and regulatory developments. All summaries of the laws, regulations and practice are subject to change. The Content is not offered as legal or professional advice for any specific matter. It is not intended to be a substitute for reference to (and compliance with) the detailed provisions of applicable laws, rules, regulations or forms. Legal advice should always be sought before taking any action or refraining from taking any action based on any Content. Baker McKenzie and the editors and the contributing authors do not guarantee the accuracy of the Content and expressly disclaim any and all liability to any person in respect of the consequences of anything done or permitted to be done or omitted to be done wholly or partly in reliance upon the whole or any part of the Content. The Content may contain links to external websites and external websites may link to the Content. Baker McKenzie is not responsible for the content or operation of any such external sites and disclaims all liability, howsoever occurring, in respect of the content or operation of any such external websites. Attorney Advertising: This Content may qualify as “Attorney Advertising” requiring notice in some jurisdictions. To the extent that this Content may qualify as Attorney Advertising, PRIOR RESULTS DO NOT GUARANTEE A SIMILAR OUTCOME. 
Reproduction: Reproduction of reasonable portions of the Content is permitted provided that (i) such reproductions are made available free of charge and for non-commercial purposes, (ii) such reproductions are properly attributed to Baker McKenzie, (iii) the portion of the Content being reproduced is not altered or made available in a manner that modifies the Content or presents the Content being reproduced in a false light and (iv) notice is made to the disclaimers included on the Content. The permission to re-copy does not allow for incorporation of any substantial portion of the Content in any work or publication, whether in hard copy, electronic or any other form or for commercial purposes.