On June 21, 2023, U.S. Senator Jon Ossoff introduced the Kids Online Safety and Privacy Act (the “Act”), S. 2073, legislation focused on the online experiences of minors. Recently passed by both the Senate and the House, the Act now awaits President Biden’s signature. If signed into law, it will impose significant obligations on online gaming and media platforms, particularly those serving users under 17.
The key components of the Kids Online Safety and Privacy Act are as follows:
A Duty of Care: The Core Responsibility
The Act establishes a broad duty of care, requiring covered platforms to “exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate” specified harms to minors. Such harms to minors include: (1) mental health disorders (e.g., anxiety, depression, eating disorders, substance abuse, and suicidal behaviors); (2) patterns of use that indicate or encourage addiction-like behaviors; (3) physical violence and online bullying; (4) sexual exploitation and abuse; (5) promotion and marketing of narcotics, tobacco products, gambling, or alcohol; and (6) predatory, unfair, or deceptive marketing practices, or other financial harms. Kids Online Safety and Privacy Act, S. 2073, 118th Cong. § 102(a) (2024).
The Act delineates guidelines for two key age groups: children under 13 (Title I and II) and teenagers aged 13 to 16 (Title II). Id. at § 201.
Title I: Protections for Children
For covered platforms accessible to children under 13, the Act imposes several stringent requirements:
- Design Restrictions to Prevent Compulsive Usage—Features like infinite scrolling, autoplay, and reward systems tied to time spent on the platform must be curtailed or eliminated to mitigate risks of overuse.
- Communication Controls—Platforms must limit other users’ ability to contact children.
- Parental Oversight—Robust parental tools must be provided to allow adults to manage privacy and account settings for their children.
- Mental Health Compliance—Designs must align with “evidence-informed medical information” to reduce risks of anxiety, depression, eating disorders, and other mental health challenges.
- User Reporting—Platforms must include a “readily-accessible and easy-to-use means to submit reports” of harms to minors.
- Time Limits—Platforms must provide an option to limit the amount of time spent on the platform.
Id. at § 103(a)-(c). For developers, these measures necessitate reevaluating how platforms are built and used. Features once designed to maximize user engagement may need to be removed or significantly modified to comply with these new regulations, as illustrated in the sketch below.
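To make the stakes concrete, the sketch below shows one way a development team might encode these Title I defaults. It is a minimal illustration only: every identifier is hypothetical, and the Act prescribes outcomes, not any particular implementation.

```typescript
// Hypothetical model of default-safe settings for an account held by a
// user under 13. All names are illustrative; none come from the Act.
interface ChildAccountSettings {
  autoplayEnabled: boolean;           // design restriction: autoplay off
  infiniteScrollEnabled: boolean;     // design restriction: no infinite scroll
  engagementRewardsEnabled: boolean;  // no rewards tied to time on platform
  directMessages: "nobody" | "approved-contacts"; // communication controls
  dailyTimeLimitMinutes: number | null; // the Act requires the option to exist
  parentalControlsLinked: boolean;    // parental oversight tools
}

// Restrictive defaults a platform might adopt toward the § 103 design
// requirements; the specific values here are assumptions, not mandates.
function defaultChildSettings(): ChildAccountSettings {
  return {
    autoplayEnabled: false,
    infiniteScrollEnabled: false,
    engagementRewardsEnabled: false,
    directMessages: "approved-contacts",
    dailyTimeLimitMinutes: 60,
    parentalControlsLinked: true,
  };
}
```

The design choice worth noting is that the protective configuration is the default state rather than an opt-in, which mirrors the Act’s focus on how features are designed rather than on after-the-fact moderation.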
Title II: Safeguards for Children and Teenagers
Teenagers are afforded greater autonomy, but covered platforms catering to this group are still required to implement protective measures:
- Feature Moderation—While engagement tools are not outright banned, they must not encourage excessive or harmful use.
- Data Security—Platforms must implement “reasonable security practices to protect the confidentiality, integrity, and accessibility of personal information” from unauthorized access.
Id. at § 201. These requirements challenge platform developers to balance user engagement with the responsibility of minimizing harm and protecting privacy; one possible approach to the security requirement is sketched below.
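Purely as an illustration of the data-security bullet, and assuming a Node.js runtime, the fragment below applies authenticated encryption (AES-256-GCM) to personal information at rest. That addresses confidentiality and integrity, two of the three properties the quoted provision names; key management and availability safeguards would sit outside this fragment.

```typescript
import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

// Encrypt a record of personal information with AES-256-GCM.
// `key` must be 32 bytes and should come from a managed key store.
function encryptPersonalInfo(plaintext: string, key: Buffer): Buffer {
  const iv = randomBytes(12); // unique nonce per record
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  // Store the IV and auth tag with the ciphertext so reads can verify integrity.
  return Buffer.concat([iv, cipher.getAuthTag(), ciphertext]);
}

// Decryption fails loudly if the stored record was tampered with.
function decryptPersonalInfo(record: Buffer, key: Buffer): string {
  const iv = record.subarray(0, 12);
  const tag = record.subarray(12, 28);
  const ciphertext = record.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString("utf8");
}
```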
A violation of the Kids Online Safety and Privacy Act would be considered an unfair or deceptive act or practice under the Federal Trade Commission Act. For covered platform operators, compliance will be critical. The Act defines a covered platform as “an online platform, online video game, messaging application, or video streaming service that connects to the internet and that is used, or is reasonably likely to be used, by a minor.” Id. at § 101(3)(A).
How This Impacts Platform Developers
The Kids Online Safety and Privacy Act represents a paradigm shift in how covered platforms operate.
Developers must:
- Reassess Design Principles—Features optimized for prolonged user engagement may now pose regulatory risks, requiring redesign or removal.
- Embed Privacy and Safety—Privacy settings and protective tools must become integral to platform architecture from the outset.
- Enable Reporting and Controls—Platforms must create accessible reporting mechanisms and tools to limit usage and prevent harm (a sketch follows this list).
- Implement Robust Security Measures—Data protection and safeguards against unauthorized access must be built in. Id. at §§ 103(a)-(c), 201.
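As a closing illustration of the reporting and usage-control bullets, the sketch below pairs a low-friction report intake with an optional daily time-limit check. The harm categories loosely track § 102(a), but every type and identifier here is an assumption, not statutory text.

```typescript
// Hypothetical harm-report intake; categories loosely track § 102(a).
type HarmCategory =
  | "mental-health"
  | "compulsive-use"
  | "violence-or-bullying"
  | "sexual-exploitation"
  | "illegal-marketing"
  | "financial-harm";

interface HarmReport {
  reporterId: string;
  subjectContentId: string;
  category: HarmCategory;
  details: string;
  receivedAt: Date;
}

// Accept the report first; validation and triage happen downstream,
// keeping the submission path readily accessible and easy to use.
function intakeReport(
  reporterId: string,
  subjectContentId: string,
  category: HarmCategory,
  details: string,
): HarmReport {
  return { reporterId, subjectContentId, category, details, receivedAt: new Date() };
}

// Enforce a user- or parent-chosen daily limit; the Act requires the
// option to exist, not any particular threshold.
function withinDailyLimit(minutesUsedToday: number, limitMinutes: number | null): boolean {
  return limitMinutes === null || minutesUsedToday < limitMinutes;
}
```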