
Navigating Children’s Online Privacy Protections: Primary Legislative Objectives of KOSA

November 22, 2024

As digital platforms place greater emphasis on younger users, legislators are calling for stricter measures to safeguard children and teens online. The first installment of this two-part advisory series provided an in-depth analysis of the Children's Online Privacy Protection Act (COPPA 2.0) and how its proposed updates would strengthen protections relating to the collection, use and disclosure of children's personal information online. This second installment examines the Kids Online Safety Act (KOSA) and how it aims to actively mitigate potential harm to children through design-feature requirements and by providing parents with tools to manage kids' online activity.

The Senate initially passed KOSA in July 2024 with overwhelming support, and the House Committee on Energy and Commerce (the "Committee") advanced the bill in September. Although there was pessimism about whether KOSA would even reach a full House vote, driven by concerns over the burden it would place on businesses to police their content, the bill ultimately saw some last-minute changes before receiving sufficient support in the Committee to proceed to a vote on the House floor.

KOSA is designed to address the broader safety risks children face online, including harmful content and exploitation. The requirements outlined in the bill apply to a "covered platform," defined as an "online platform, online video game, messaging application, or video streaming service that connects to the internet and that is used, or is reasonably likely to be used, by a minor." The bill carves out certain exceptions, including internet service providers, email services and educational institutions. It grants enforcement powers to state attorneys general and the Federal Trade Commission (FTC) under Section 18(a)(1)(B) of the FTC Act, which governs unfair or deceptive acts or practices. Key elements of KOSA include:

  • Duty of Care: KOSA introduces a duty of care requirement for covered platforms, mandating that platforms act in the best interests of minors under the age of 17 to protect them from a variety of online harms. Specifically, platforms must take reasonable steps to prevent and mitigate risks of exposure to content that could negatively impact minors' mental or physical well-being. The harms that covered platforms must protect against are enumerated in the bill; however, this list is one of its more contested elements. To garner sufficient support to advance through the Committee, the enumerated list of harms was narrowed. While the removal of some harms by the Committee has been criticized as gutting the purpose of the bill, the inclusion of others is seen as potentially having unintended consequences. This list is likely to be debated further before the bill passes.

  • Design Requirements: KOSA requires that covered platforms adhere to a variety of design requirements, including enabling default safeguard settings for minors and providing parents with tools to manage and monitor their children’s online activity.

  • Reports and Audits: Under KOSA, covered platforms must issue an annual public report describing reasonably foreseeable risks of material harm to minors and assessing the prevention and mitigation measures taken to address those risks. In drafting the report, covered platforms must undergo an independent, third-party audit.

While KOSA has garnered bipartisan support, it has also faced significant criticism, particularly from privacy advocates and civil liberties groups. Some critics argue that the bill could lead to increased surveillance and censorship, as platforms might over-moderate content to avoid liability, potentially infringing on free speech rights. The bill’s broad definition of “harm” has also raised concerns, as it could lead to overreach by the FTC and state attorneys general, who would be responsible for enforcing the law. These enforcement powers, critics warn, could be used to target content based on political or ideological grounds, raising the risk of censorship. Industry leaders have also raised concerns about the feasibility of implementing KOSA’s requirements, particularly for smaller platforms. On November 18, 2024, more than 30 state attorneys general wrote a letter to federal lawmakers urging them to back this legislation in order to “act to aid [their] state-level efforts” to bolster youth online safety.

Varnum’s Data Privacy and Cybersecurity team is closely monitoring these legislative developments and stands ready to guide clients through the complexities of the new regulations. Should these laws be enacted, businesses will need to swiftly adapt to avoid legal risks and ensure they are effectively protecting the rights and safety of younger users.

Featured Author

Bhashit (Sheek) Shah

Partner

Sheek advises clients on data privacy best practices and regulatory compliance. With experience in global privacy frameworks and laws including GDPR, CCPA, and COPPA, he helps businesses build and implement compliance programs and manage data breaches.
