It’s been an eventful month in Washington. On October 1, the federal government shut down after Senate negotiators failed to reach a funding deal, bringing most legislative activity to a halt. House Speaker Mike Johnson (R-LA) adjourned the House just before the deadline, and lawmakers have yet to reconvene.

Prior to the shutdown, the Senate conducted key hearings, including Judiciary Committee sessions that examined the harms of AI chatbots and heard fresh allegations from additional Meta whistleblowers. Two former Meta researchers testified about the suppression of internal research on platform harms, an issue of particular concern to Children and Screens. Kris Perry, Children and Screens’ Executive Director, issued a statement underscoring the urgent need for transparent, independent research on technology and digital media.

Meanwhile, the House Energy and Commerce Committee has signaled plans for a child online safety legislative hearing. That effort remains on hold until the House reconvenes. The Kids Online Safety Act (KOSA) and Children and Teens’ Online Privacy Protection Act (COPPA 2.0) are expected to feature prominently, though the scope of any hearing remains unclear.

State Developments

With federal action paused, state and judicial activity continues apace.

New York’s Attorney General released proposed rules to enforce the SAFE for Kids Act, which limits the use of personalized recommendation systems and push notifications for minors. The draft rules focus heavily on age-verification mechanisms, and the public comment period is open through December 1, 2025.

In a similar vein, the Ninth Circuit Court of Appeals largely upheld California’s Protecting Our Kids from Social Media Addiction Act, striking down only the ban on displaying “like” counts. The law closely resembles New York’s SAFE for Kids Act, and the ruling sets a significant legal precedent for state-level algorithm restrictions. If no further challenges arise, the law will take effect in 2027.

Meanwhile, California Governor Newsom signed a series of digital-safety laws, including:

    • A law regulating AI companions used by minors and holding companies liable for harms when safety precautions are ignored.
    • A law mandating device-level age verification and parental consent for minors to download apps.
    • SB-53, requiring large AI developers to disclose safety protocols and protecting whistleblowers.

International Context

Across the Atlantic, the United Kingdom’s Office of Communications (Ofcom) is under fire for its rollout of the Online Safety Act, which took full effect this summer. The law places broad new duties on social media and search platforms to ensure user safety. Early implementation drew sharp criticism as users encountered unexpected age-verification barriers — even for accessing political content on Reddit or streaming services like Spotify. These growing pains have intensified debates around privacy, censorship, and freedom of expression, just as other nations implement bans on social media use by minors and weigh broader digital media regulation.

ScreenShots Newsletter

Read our monthly newsletter, featuring the latest Children and Screens news and resources.