Children inevitably inherit the consequences of the world we, as adults, create—and public policy is one of its most enduring foundations. As the Executive Director of an organization committed to improving children’s digital lives, I am deeply concerned about how today’s legislative decisions will shape their future.
The current federal budget proposal includes a decade-long moratorium on state-level enforcement of AI regulations. While Congress recognizes that a patchwork of state policies can sometimes hinder progress, curbing state authority without offering robust federal safeguards risks leaving children unprotected. Preemption should only occur within comprehensive legislation that meets or exceeds what states could provide—and offers fair, effective, and enforceable protections for all.
The United States took a light-touch regulatory approach to the internet. While that strategy fostered innovation, it also led to serious, often avoidable risks and harms. Some of these challenges were difficult to foresee, but that does not excuse inaction. Federal policymakers often chose not to act, while some states stepped forward with privacy protections, data rights, and rules for data brokers, benefiting millions of families in the process.
Both state and federal governments are essential to effective digital governance. States often act more quickly, tailoring policies to their communities and piloting solutions that later scale nationally. Meanwhile, federal policy is essential to holding large tech companies fully accountable across jurisdictions. We need both of these layers of government working in coordination, not in conflict. A moratorium undermines that balance at a critical moment when alignment is urgently needed.
As AI systems evolve rapidly and become embedded in nearly every aspect of society, federal oversight is vital. But many early policy ideas and protections will emerge first at the state level. The U.S. cannot lead in AI through speed alone. We must lead through safety, trust, and thoughtful governance. That leadership requires us to prioritize research, reflection, and responsible regulation.
In comments to the Networking and Information Technology Research and Development (NITRD) National Coordination Office (NCO), Children and Screens outlined key priorities for AI policy that puts children first. We urge policymakers at both the federal and state levels to:
- Facilitate AI alignment and safety research prioritizing children’s development,
- Support the design of AI systems that consider children’s developmental needs and capabilities, and
- Take reasonable precautions when deploying AI in educational settings.
Today’s AI tools, like so many other digital tools, are developed by and for adults, with little consideration for how they affect children. Yet it is children who will live longest with the consequences. As we race toward innovation, we must also lead with intention, for their sake and for the sake of a healthier digital future.