Policy Voices is a recurring newsletter feature that spotlights thought leaders, policymakers, and advocates whose work intersects with the mission of Children and Screens. Through short, curated Q&As, this section elevates informed voices helping to shape the policy environment around children’s digital well-being. These individuals bring deep expertise, practical insight, and real-world experience that enrich the broader conversation about how research, policy, and practice can better support children and families.
Zamaan Qureshi is an advocate, policy expert, and strategist focused on tech accountability, online safety, and privacy. He is the co-founder and co-chair of Design It For Us, where he leads advocacy on emerging technology policy, builds coalitions, and drives the youth online safety movement. His work has been featured in national media for elevating young voices in the fight for a safer digital world. He is a graduate of American University, where he earned a B.A. in international studies and political science. Zamaan is on the 2025 Forbes 30 Under 30 list.
Q: What motivated you to pursue tech policy advocacy? And how has it evolved over time?
I joined the tech policy advocacy space in 2020, as I saw the ways social media was being used to erode our democracy and our democratic institutions. My interest was piqued after I watched The Great Hack, a documentary focused on Cambridge Analytica and Facebook. In it, there’s a moment where Cambridge Analytica’s CEO laughs, insidiously, about how the firm used targeted political advertising to discourage voters from turning out to vote in Trinidad and Tobago, where my grandmother is from. I remember how infuriating watching that moment was, and how motivating it was for me to get involved in this space, because these companies thought they could get away with manipulating people for profit.
Over time, my focus has become more centered on young people’s experiences growing up on social media products, but with the same concern about Big Tech companies manipulating our attention. As a young person, I have seen firsthand how these companies target my generation, preying on our insecurities and vulnerabilities at an early age. Social media companies, through their addictive design, have trapped young people in a vicious feedback loop, keeping us coming back to products over which we would otherwise want more agency. As the discussion around youth safety evolved, and as someone in the tech policy advocacy community early on, I saw a lack of youth voices in our policy conversations. It drove me to co-found Design It For Us, a coalition of young people advocating for safer online spaces for our generation through policy, campaigns, and action. It’s been a privilege to watch this space grow and create room for young people to take center stage at decision-making tables.
Q: There are a lot of online safety bills floating around Congress, and it can be hard to understand what any of them is actually supposed to do. For those who don’t follow kids’ online safety legislation closely, can you give the 30,000-foot view of the regulatory approaches being considered at the federal level? Are there distinct models or strategies being considered?
Congress has debated many different policy interventions to rein in Big Tech and tackle addictive design to better protect consumers online. I tend to think about the regulatory approaches with the most teeth in a few different buckets. First, there are kids’ safety bills that are specifically focused on holding Big Tech accountable for protecting young people online. Within that bucket, some bills restrict young people’s access, while others hold social media companies accountable for their product design. Next, there are consumer privacy bills that protect people’s online information and data from overreach by companies like social media platforms or data brokers. The United States still does not have a comprehensive federal data privacy law, a goal that has eluded Congress for decades.
Another is antitrust — the regulation of companies’ structure and market power, premised on the idea that Big Tech has gotten far too big and now dominates markets, stifles competition, and harms consumer choice. Transparency and researcher access form their own bucket and underpin many of these other legislative proposals. Big Tech companies’ products are black boxes; peeling back the curtain and giving researchers a chance to study these companies would provide visibility into whether they are upholding their public commitments and complying with other laws. Finally, more recently, there have been new proposals taking shape to regulate emerging technologies, such as liability for AI chatbots, or bills that would repeal older laws that shield Big Tech from liability, like Section 230 of the Communications Decency Act.
Regardless of the intervention you believe in or that resonates with you, Congress has largely failed to take decisive action in this area despite the dozens of hearings it has held, and the American people have suffered the consequences of its inaction.
Q: There’s no question that kids’ online safety has gained a lot of attention in the past few years. When did this current fight for regulations start?
The fight for kids’ online safety legislation that regulates social media products largely emerged from the vital disclosures of whistleblowers. In 2021, Facebook whistleblower Frances Haugen testified that the company knew Instagram was harming young people but was not taking sufficient action to protect the young people using its products. That spurred lawmakers like Senators Blumenthal and Blackburn to introduce the Kids Online Safety Act (KOSA) to hold social media companies accountable for their design features and regulate Big Tech’s business model. The bill has taken on many shapes and forms as it has undergone edits throughout the legislative process over the years. Meanwhile, other proposals have emerged, such as the Children and Teens’ Online Privacy Protection Act, or COPPA 2.0, which would update the existing children’s data privacy law to adapt to social media. Youth and parent advocates have repeatedly shown up on Capitol Hill to tell their stories of personal trauma, pain, loss, and hope for a better digital future, and to advocate for these bills to become law.
Q: What child online safety and privacy bills are you currently focused on and why are they important to you?
We remain focused on urging Congressional lawmakers to take up a strong kids’ online safety bill that targets social media companies’ design features. We have also explored other interventions, such as antitrust bills that would force more competition in the social media space and lead to safer social media products. As Congress continues to sit on its hands, however, we have turned to the states, which we feel present more opportunities and have strong champions willing to fight for meaningful accountability. In states like California, Maryland, and Vermont, Design It For Us helped make the Age-Appropriate Design Code (AADC) law. The AADC holds Big Tech accountable for its business model and product design to ensure safer online experiences for young people. We’re excited to see states like Michigan introduce the AADC in 2026.
We have also been supportive of bills like SB-53 in California and the RAISE Act in New York, which address catastrophic risks of artificial intelligence products and compel AI companies to meet certain transparency and disclosure requirements to better protect the residents of those states from the harms of AI.
Q: How do you think the debate over kids’ online safety policy will evolve in 2026?
2026 is a crucial inflection point in the kids’ online safety policy debate. Left unchecked, social media will continue to harm young people across the country who are growing up with unregulated, predatory Big Tech products at their fingertips. And while Congress has failed to act, states have stepped up, whether through bills in state legislatures or lawsuits filed by state attorneys general. Those critical actions by states should inform Congress’s work and push the federal government to address some of the challenging questions about how to protect young people online that it has wrestled with over the past few years.
Congress also must learn from its mistakes and those of other jurisdictions. Big Tech will stop at nothing to ensure meaningful regulation never becomes law, and when it does, companies will challenge it to the ends of the earth in the courts or lobby for its non-enforcement. Congress must focus on regulating the business model of Big Tech, not on preventing access for young people, shifting the burden of responsibility for regulation, or nullifying state protections. It is squarely on Congress’s shoulders to listen to the body of work the tech policy advocacy community has built over the years to inform its decision-making.