Federal Updates
On March 5, the House Energy and Commerce Committee (E&C) held a markup of several child online safety bills, including Sammy’s Law, the App Store Accountability Act, the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), and the newly minted Kids Internet and Digital Safety (KIDS) Act. The KIDS Act consolidates several previously standalone bills, including the Kids Online Safety Act (KOSA).
Committee members from both parties emphasized that protecting children online remains an urgent priority, discussing concerns about addictive design features, algorithmic amplification of harmful content, and the growing difficulty caregivers face in monitoring children’s digital experiences. At the same time, the markup revealed disagreements regarding the appropriate scope of federal regulation.
E&C leadership has framed the KIDS Act as a multipronged approach to online safety rather than a single, comprehensive bill. Republican members argued that the package is better safeguarded against legal challenges, while still addressing risks associated with social media platforms.
Representative Gabe Evans (R-CO) acknowledged the promise of the KIDS Act, stating, “We have to deliver common sense legislation that will survive all of the Court challenges that will inevitably come, and make sure that we give something durable that’s going to protect kids.”
The markup followed months of negotiations between Committee Democrats and Republicans. A bipartisan agreement had appeared close, but talks reportedly broke down shortly before the hearing. Ranking Member Frank Pallone Jr. (D-NJ) expressed disappointment in the lack of consensus, stating that partisan bills “do not meet the mark for kids’ safety and if they become law, would leave kids and their parents worse off than they are today.”
The legislative package also reflects several policy changes from earlier drafts, including scaling back preemption language previously included in the House proposals. Updated KOSA language removes the earlier “duty of care” requirement for social media companies and narrows the definition of covered harms. Across the package, the bills adopt an “actual knowledge” standard, meaning provisions apply only when platforms have empirical evidence, or a user declaration, that a user is a minor.
Committee Democrats have repeatedly advocated for a “constructive knowledge” standard, which would apply the laws when a user’s age can be reasonably inferred from objective circumstances. Democratic members introduced several amendments to remove preemption provisions and adopt a constructive knowledge standard, but none were adopted.
With KOSA now folded into the broader KIDS Act and modified in several key ways, many advocacy groups have expressed opposition to the revised legislation. A large coalition of organizations sent a letter opposing the KIDS Act and urging lawmakers to pass the Senate’s standalone version of KOSA instead.
In a surprise development during the hearing, the Senate passed COPPA 2.0 by unanimous consent. In response, House E&C leaders removed COPPA 2.0 from the markup to allow further bipartisan negotiations, potentially improving its chances of passage. All other bills advanced out of the committee on party-line votes.
Kris Perry, Executive Director of Children and Screens, released a statement acknowledging the House’s efforts, calling for stronger protections in the House version of KOSA, and applauding both chambers for their work on COPPA 2.0.
State Updates
State legislatures, meanwhile, continue to introduce and advance technology-related legislation at a rapid pace. While these bills address a wide range of issues, two areas have seen significant activity.
School Smartphone Bans
States continue to advance policies restricting smartphone use in schools. Over the past two years, 44 states and Washington, D.C. have implemented policies addressing personal device use during the school day.
Many of these policies are relatively strict, requiring phones to be “off and away” for the entire school day and across all grade levels. However, most laws stop short of mandating a specific policy, instead requiring districts to adopt policies that limit smartphone use or providing guidance on best practices.
As a result, there remains considerable variation between districts, since most state laws allow some level of local flexibility to account for differences in demographics, resources, and school environments. The 2025–2026 school year marks the first year that many of these policies are being fully implemented and tested.
AI Regulations
Most states now have at least one AI-related regulatory bill under active consideration, with many focused specifically on AI chatbots.
These bills vary widely in scope. Some simply require chatbots to clearly disclose that they are not human or sentient. Others place limits on how chatbots may interact with users—particularly minors—including prohibiting conversations that could lead users to believe the chatbot is sentient. Additional proposals include transparency and reporting requirements, age restrictions, or combinations of these approaches.
Because the legislative landscape is evolving rapidly, organizations such as the Young People’s Alliance and Transparency Coalition maintain trackers that monitor chatbot legislation across states.
This issue is especially relevant for Children and Screens as the organization launches its Evidence Council, an independent body of experts tasked with deliberating on urgent questions facing families and policymakers. In a 7–2 decision, the Council voted in favor of requiring parental permission for minors to use AI chatbots, reflecting growing concern that these tools may pose risks to young people.
Judicial Updates
As reported in our February issue, the ongoing trial against major social media companies continues to reveal volumes of internal company documents.
These unsealed materials provide insight into the research companies conducted on their own products, as well as their strategies to attract younger users. Recent disclosures and reporting add to growing evidence that several major social media platforms were aware that many users, including minors, exhibited unhealthy patterns of use, such as habitual or compulsive engagement. Internal findings also suggested that certain product design features could contribute to those patterns. Despite this awareness, companies were often reluctant to implement changes that might reduce user engagement or time spent on the platforms.
The plaintiff, K.G.M., also testified about her own experiences with compulsive social media use and the effects it had on her social life and mental health. In her testimony, K.G.M. said she began using social media as young as nine years old, at times misrepresenting her age to create accounts.
Dr. Kara Bagot, a child and adolescent psychiatrist, testified that social media use can produce addiction-like patterns of behavior and that K.G.M.’s experiences met the relevant clinical criteria. The jury also heard testimony from Arturo Bejar, a former Meta safety research lead and whistleblower, who said company leadership largely ignored his warnings about harms associated with their products.
Taken together, the testimony, internal documents, and employee accounts highlight a central tension in social media companies’ business models: platforms are financially incentivized to attract as many users as possible and keep them engaged for as long as possible, even when their own research suggests that doing so may harm user health and well-being.