
Today’s ruling, in which a jury found that Meta (Instagram and Facebook) and Google (YouTube) are legally responsible for how the design of their platforms contributed to harm experienced by a young user, marks a significant shift in how society understands responsibility for the digital environments that children grow up in.

For decades, social media companies have been shielded by legal frameworks that limited accountability for how their products are designed and deployed. This decision signals a change: a jury has determined that design choices (e.g., how platforms are built to engage, retain, and influence users) can carry real-world consequences, particularly for young people.

A growing body of research has shown that certain platform features, including algorithms, persistent notifications, and engagement-driven design, can shape how children think, feel, and behave both on- and offline. These features are designed to interact with developing brains in ways that increase vulnerability for some young users.

At Children and Screens, our work centers on advancing and translating the evidence into clear, actionable insights for families, educators, and policymakers. The research is increasingly consistent: digital environments are not neutral and design features affect the health and well-being of young users. 

Importantly, this case reinforces a key principle long recognized in public health: harm does not need to have a single cause for responsibility to be shared. The evidence does not suggest that social media is the sole cause of harm for any individual child. Rather, it shows that specific design features can contribute to patterns of use and experiences that, for some young people, result in substantial and material harms.

The proceedings also brought visibility to the social media companies' strategic business priority of attracting and retaining young users, highlighting an imbalance between the scale of these systems and the capacity of families to manage them alone.

Parents, educators, and young people are navigating digital environments that have become deeply embedded in daily life. Through our resources and convenings, we consistently hear that families are being asked to manage systems that were not designed with child development as the primary consideration. Research increasingly shows that, in these environments, families are often operating at a disadvantage. 

This ruling draws a clearer line: companies are responsible for the consequences of the intentional design of their products. 

At the same time, legal decisions such as this alone are not sufficient to ensure safer digital environments for children. Strong, evidence-based policy and product design standards remain essential. Families should not have to wait for harm to occur before meaningful protections are put in place. 

We hope this outcome will shape future litigation and policy discussions. Most importantly, it underscores a broader question we face as technology continues to evolve: how can we ensure that the digital environments children grow up in support, rather than undermine, their health and development?