Episode
030
Guest
Matthew P. Bergman

As courts increasingly examine the impact of social media on young people, new legal approaches are beginning to reshape how responsibility for harms is understood, shifting the focus from content to platform design. On this special episode of Screen Deep, host Kris Perry speaks with Matthew Bergman, founding attorney of the Social Media Victims Law Center, professor at Lewis & Clark Law School, and a member of Children and Screens’ Scientific Advisory Board. Bergman also served on the legal team representing the plaintiff in a landmark case examining the impact of social media platform design on youth.

Together, Kris and Matt explore why it has historically been difficult to bring lawsuits against technology companies, how recent litigation strategies are more effective than past efforts, and what these developments could mean for the future of platform accountability.

About Matt Bergman

Matthew P. Bergman is an attorney, law professor, philanthropist and community activist who has recovered over $1 billion on behalf of his clients. He is the founder of the Social Media Victims Law Center and Bergman Draper Oslund Udo law firm; a professor at Lewis & Clark Law School; and serves on the board of directors of nonprofit institutions in higher education, national security, civil rights, worker protection and the arts.


In this episode, you’ll learn:

    1. Why social media companies have historically avoided accountability for harm encountered through their products, and how that may be changing 
    2. How recent lawsuits have uncovered key internal evidence that social media platforms were designed to be addictive to young users despite known risks
    3. How legal experts are using scientific research and product liability law to help define harms to youth and establish social media platform accountability
    4. Why recent lawsuits are focusing on social media platform design rather than content, and why that matters 
    5. Why legislative change is needed to assist the effort to enforce social media platform transparency and accountability

Studies mentioned in this episode, in order of mention:

Office of the Surgeon General (OSG). (2021). Protecting Youth Mental Health: The U.S. Surgeon General’s Advisory. US Department of Health and Human Services.

Twenge, J. M. (2017). iGen: Why today’s super-connected kids are growing up less rebellious, more tolerant, less happy—and completely unprepared for adulthood (1st ed.). Atria Books.

Haidt, J. (2024). The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness. Penguin Press.

Burnell, K., Flannery, J. S., Fox, K. A., Prinstein, M. J., & Telzer, E. H. (2025). U.S. Adolescents’ Daily Social Media Use and Well-being: Exploring the Role of Addiction-like Social Media Use. Journal of Children and Media, 19(1), 194–212. https://doi.org/10.1080/17482798.2024.2402272

Flannery, J. S., Burnell, K., Kwon, S. J., Jorgensen, N. A., Prinstein, M. J., Lindquist, K. A., & Telzer, E. H. (2024). Developmental changes in brain function linked with addiction-like social media use two years later. Social Cognitive and Affective Neuroscience, 19(1), nsae008. https://doi.org/10.1093/scan/nsae008

[Kris Perry]: Welcome to Screen Deep, where we decode young brains and behavior in a digital world. I’m your host, Kris Perry, Executive Director of Children and Screens. Most episodes of Screen Deep focus on the science: what researchers and clinicians are discovering about how digital media shapes child development and what this means for families. But today, we’re looking at these complex issues from another important vantage point – the legal and policy frameworks that determine whether the companies designing these platforms can be held accountable for the harms they may cause.

My guest today is Matt Bergman, founding attorney of the Social Media Victims Law Center, a professor at Lewis & Clark Law School, and a member of Children and Screens’ National Scientific Advisory Board. Early in his career, Matt represented victims of asbestos exposure in landmark cases that helped establish accountability for companies that knowingly produced harmful products. Today, he’s applying that same legal lens to social media platforms. Matt was part of the team that secured a landmark jury verdict holding major social media platforms, specifically Meta and YouTube, legally responsible for harms linked to their design features. Through this and a growing number of lawsuits, Matt and his team have worked to establish that platforms can be held accountable when features like infinite scroll, algorithmic feeds, and beauty filters contribute to harm in young users.

Today, we’ll talk about how product liability law applies to digital platforms, what internal company documents have revealed about what social media companies know about the risks to youth, and why a legal provision known as Section 230 has become such a central issue in the debate over platform accountability.

Matt, thank you so much for joining us today.

[Matthew Bergman]: It’s a pleasure to be here.

[Kris Perry]: As you know, the work of Children and Screens and this podcast is laser-focused on helping children lead healthy lives in a digital world. While Screen Deep is primarily focused on talking with leading researchers about how what they are uncovering can help parents and families make decisions about children’s media use, I really wanted to bring you on here today to talk about this from a different angle. Namely, the growing evidence that the platforms that children and adolescents are using every day bear no small amount of the responsibility for the harms that children have encountered using them. So to start, tell us about how and why you founded the Social Media Victims Law Center a few years ago.

[Matthew Bergman]: I’ve been practicing law for 30 years, representing individuals who are harmed by toxic chemicals and in product liability cases. And in the fall of 2021, I was looking to do something a little bit more impactful. I also was incredibly affected by Frances Haugen’s revelations to Congress about what the social media industry knows about what its platforms are doing to kids, as well as the Surgeon General’s report on the mental health crisis that young people are experiencing. And I decided that I would try to parlay my experience as a product liability lawyer into adopting a new approach: hold social media companies accountable for the design of their platforms, which we believed might be permitted to go forward, as opposed to their hosting of third-party content, which countless courts had thrown out before the cases even got started.

[Kris Perry]: So part of holding social media companies accountable is establishing that they knew or know that they are causing harm to youth with their products. So what evidence have you and others found in terms of what they know about these harms and what they did or didn’t do about it?

[Matthew Bergman]: Well, one of the problems is that plaintiffs were getting shut down before the cases even got started. So we never had an opportunity to discover documents or to take depositions. We just had to take the social media companies’ word for it. And so when Frances Haugen leaked a bunch of Facebook documents to the public, for the first time we got a glimpse of what the social media companies knew their platforms were doing to kids. Over the course of three and a half years of discovery, we discovered documents that established categorically that the social media companies have designed their platforms to be addictive, taking advantage of their knowledge of the underdeveloped neurology of young people, as well as their social need for the acclaim of their peers. And as a result, it became evident that the platforms were addictive by design, no different than the tobacco companies manipulating nicotine levels in cigarettes. And so the trial was based primarily upon their own internal documents, which these companies had generated in secret over the last decade.

[Kris Perry]: Can you share more of the specifics of what you saw in those unsealed internal documents and what surprised you or confirmed what you may have already known?

[Matthew Bergman]: Well, what we saw was really what I would characterize as outrageous misconduct, where we saw references by Meta to, “Tweens are herd animals,” likening them to penguins, or, “Kids have an addict’s mentality.” Or, in the case of YouTube, “We intend to make the product addictive by nature.” Or Facebook, “A third of the girls who look at our platform feel worse about it,” or, “40% of users receive unwanted sexual content every week.”

I think one of the most resonant documents, though, was what was called the “Myst document.” In this case, the defendants basically did not expend much effort trying to convince the jury that their conduct was pristine. Rather, they sought to denigrate the plaintiff and her family and say that her mental health harms were solely the result of a bad mom in a broken family.

So, it was really important for us to essentially be able to combat that narrative with their own documents. And we obtained the Myst document from Meta that said that kids who are already having mental health challenges or are in a challenged socioeconomic stratum are more likely to become addicted to social media and more likely to develop harms as a result. So, essentially, we were able to show that they prey on the vulnerable. And I think that was an important document in kind of overcoming this “blame the victim” narrative that we were subjected to.

[Kris Perry]: Well, I’m really glad you mentioned Frances Haugen, and we know it’s been several years since she testified in front of Congress. And, as you know, Children and Screens has been here for almost 15 years now because not only the smartphone but social media platforms and all of these design features that I know you’re an expert in have been in place and been used by millions and millions of teenagers around the world. So help me understand the timeframe you’re talking about. When you are looking at those internal documents, the email traffic at Meta, the email traffic at TikTok or Snap, what timeframe is this? Do we go back to 2010? Was it more?

[Matthew Bergman]: Yeah, we go back to 2010, when it starts. And then after 2012, it starts getting more resonant.

[Kris Perry]: Mm-hmm. As somebody who’s been here for many, many years looking at the research, you know, I think of this as basically an entire generation of children who have been using smartphones and social media platforms for the bulk of their lives. And the plaintiff that you represented is an example of what millions of children and families have been going through. And I am just, I remain shocked and frankly very unhappy and upset by the fact that we’ve known these things, and the companies have known, and we’re just now getting a ruling like this.

[Matthew Bergman]: No, you’re absolutely right. And the other thing that we see, though, is the best evidence is in the companies’ own documents. And it’s also equally clear that there are and were many people of conscience within these companies who were concerned about what the platforms are doing to kids and were providing notice to the top C-suite personnel. And time and again, when the decision became whether to implement a safety measure that would help kids or pursue platform policies that would enhance profits, they chose profits.

[Kris Perry]: Yep, yep, well, I mean, that is the business model. And many of the guests that we’ve had on the show, whether they’re neuroscientists, or child development experts, or people who look at education tech, every single one has really underscored that the business model has been one of the biggest hurdles to improving the platform. So, I hope that this ruling is going to be a motivator for the companies to try and do better by centering children.

While you were preparing for the case, what other independent research or science beyond the platform’s own documents informed your thinking on how social media has harmed young or vulnerable people?

[Matthew Bergman]: The work of Jean Twenge and Jon Haidt was extremely important. Actually, listening to a podcast by Jon Haidt early on was one of the catalysts for me to get started. And that was, you know, extremely compelling to me, as well as the emerging neurologic studies done by Eva Telzer and her team at the University of North Carolina showing that, you know, we are actually looking at a physiologic addiction quite similar to nicotine or drugs. I had begun my litigation effort operating under the assumption that this was purely a behavioral addiction. And more and more the research is showing that no, it has a physiologic component to it.

[Kris Perry]: Yeah, I mean, we know that dopamine and the reward center of the brain are absolutely being not only activated but manipulated by the design of these products. Is there any other research, say at the community level, that rose up and got your attention that we could use to help advocate for kids? In other words, how could we leverage science? Not only in the litigation space, but in helping organize more and more outcry about the way these products are impacting kids?

[Matthew Bergman]: Well, a lot of the researchers associated with Children and Screens were incredibly impactful. Some of them served as experts in this case. Some of them simply published good articles that we relied on.

One of the other things, you know, as a trial lawyer, my job is also to try to make things relatable to juries and to judges. And talking to school teachers who described the impact when they imposed the cell phone ban this fall, what a profound impact it had on the quality of education that students receive, but also on the mental health that the students were experiencing. So, I think that really, you know, there’s quantitative research, which is of course very important, but there’s also, you know, allegory that I think is incredibly compelling. And to me, talking to teachers was one of those.

[Kris Perry]: Yeah, I know what you mean. This bell-to-bell ban wave that, in part, Jon Haidt and others have called for has been rapid and effective. At least initially, the reactions of both students and teachers have been pretty positive. And we’re all looking for the research down the road that will help illuminate us further on whether those bans have done something to help kids with both their academic and social-emotional well-being.

Why is it so hard to hold social media companies accountable for liability from their products? And as you think about that answer, tell us a little bit more about Section 230, because you’ve pointed out that you used a different framework here than had been tried before.

[Matthew Bergman]: So, the reason it’s so hard is that big tech is the only industry that doesn’t have a duty of reasonable care. Social media has operated under immunity for 20 years for anything associated with third party content, which is everything that social media does. Section 230 was a statute enacted in 1996 explicitly to protect kids, but it had been interpreted by the courts far afield from its original intent or statutory language to basically confer absolute immunity. And by immunity, it means you don’t even get in the courthouse door for anything that a social media platform does involving third party content. That’s how widely it was interpreted.

And look, there were some good reasons that Section 230 was enacted and there remain some good reasons for there to be protections, but not to the scope and extent to which those protections had been extended. I mean courts were granting immunity to social media companies for hosting child sex trafficking, for hosting CSAM, for hosting drug deals. And that was not anything Congress enacted. 

We adopted a new approach and said, “We are going to sue social media companies not for the content that they host on their platforms but for the design of the platforms, for the infinite scroll, for the likes, for the streaks, for the algorithms that show kids not what they want to see, but what they can’t look away from. For the design that’s based not on providing an online experience to kids that is beneficial, but simply on addicting them to stuff.” And we know from the standpoint of neurology that materials that kids look at that are psychologically discordant, that they don’t want to see, provide a greater dopamine hit than stuff that is uplifting or affirming. And so by design, these algorithms are showing kids stuff not for purposes of their viewing experience, but simply for their addiction.

And that was the theory that we were able to get around Section 230 on in most instances. It still remains a hurdle. And I was testifying in front of the Senate last week, advocating some legislative reform on that score. But this was a new theory that we adopted, and we were able to get past the motion to dismiss which, up until then, had basically stopped these cases before they got started. Once we did that, for the first time we had access to internal documents and had the opportunity to put the executives under oath and ask them pointed questions. And we were able to really establish that these platforms are addictive by design and that they are designed by some of the smartest, most sophisticated, and, I would say, avaricious corporate executives and designers in the history of the industry.

[Kris Perry]: And why that is resonating with me is that we’re not talking about products for adults, who would have a difficult time managing their behavior when it comes to this addictive design; we’re talking about really young children – those 13 and under, and those just above 13.

You used a couple of terms. One was CSAM, which we know is “child sexual abuse material,” but you also used a different term, “duty of care.” And because this is going to be spoken about more and more as we go forward, can you tell us a little bit more about what that means and break it down for our listeners so that they understand how companies do or don’t respond to a requirement of duty of care?

[Matthew Bergman]: So there’s a concept of duty of care that applies to every company and every individual in the United States. And that’s that a company has a duty to exercise reasonable care, to design products and to sell products that are reasonably safe when used as intended. Now, that does not connote a duty of clairvoyance, nor does it connote a duty of perfection. It just says that, you know, if you know something’s dangerous, don’t do it. And if you should know that something’s dangerous, don’t do it. And if you willfully deceive yourself into saying that something that’s dangerous isn’t dangerous, you’re still responsible. That’s no different than the duty that any of us have to drive the speed limit, to slow down when you see someone, you know, crossing the street. That’s just a basic duty of reasonable care. And every company in America has that, except social media.

[Kris Perry]: And we know, you used the tobacco example, and I know you worked on asbestos cases, and you used the car analogy, which is often used in this context. It’s a dangerous product, but over many, many decades, the car companies have been required to make cars safer and safer, and they have become safer. And so we see this in our daily lives all the time.

But we’re also seeing a huge lag between the pace of technology-driven change and the way people are living their lives. Our legal structure is supposed to keep people safe in the real world, but it seems slow, you know, to protect us in the virtual world. Can you address this lag between what we know about technology and its harms and getting more regulation and more accountability in place?

[Matthew Bergman]: Well, you know, the problem with regulation is you can only regulate what you know about. And the tech companies know a lot more than politicians. And so, the duty of care, you know, when utilized in the legal system, basically puts the economic incentive on the party that makes the platform. So, you know, the idea of product liability is that the entity that has the greatest ability to implement safety has that responsibility. So, you know, for instance, an automobile manufacturer might decide that, you know, we can save $50 million a year in production costs by putting bad brakes on the car. But they will, if they do that, face product liability lawsuits in car accident cases that will exceed the $50 million that they save on shoddy brakes.

Now, in that kind of a scenario, it doesn’t matter if the company is, you know, the most public-spirited or the most avaricious company in the world. Based on simple Milton Friedman economics, they are going to invest in safety. But they’ll only do that if they bear the cost of safety. So, if you tell an automobile manufacturer, “You have total immunity for anything associated with brakes,” in that case, they’re not going to have that economic incentive to put safe brakes or better brakes on their cars. What we have done with social media is we have said, “Anything associated with third party content,” in other words, virtually anything, “you’re immune from.” So the normal economic exigencies that motivate every other company in America don’t motivate social media. They are limited to either, you know, moral persuasion or bad press. And we know, you know, that people are very capable of conforming their moral principles to their economic interests. And, you know, bad press is bad press, but if you are making good profits, your shareholders are happy.

So this kind of anomaly in the law allowed for social media companies to emerge the way they did. There’s no way that any normal company would produce something that provided suicide videos to kids. And yet, in our cases, we have many examples where the social media companies, TikTok, Snapchat, YouTube, and Instagram have affirmatively sent suicidal material to kids, encouraging them to take their lives. And in the horrific cases when that takes place and we seek to hold the company responsible, they seek dismissal based on Section 230. And sometimes they get it.

[Kris Perry]: Well, so I guess I’m a little stumped because if there was always a product liability option and the products have been fully, you know, adopted worldwide for more than 15 years, why wasn’t the product liability path taken before this? Or was it?

[Matthew Bergman]: No one did it before we did.

[Kris Perry]: Which is really surprising given the level of both whistleblower testimony, but also the research and the science and the evidence that had been amassed over those 15 years. It’s an incredibly powerful path to be on, and really important.

[Matthew Bergman]: And I think that’s why, before we started doing our work, the focus was wholly on regulation. And look, we’re all for regulation, if it’s intelligent. It’s just, I think, that regulation in the absence of the economic stick, if you will, that tort liability can impose on companies is not going to get the job done, you know. If you grab them by their pocketbooks, their hearts and minds will follow.

[Kris Perry]: Yeah. I think many organizations like Children and Screens are really feeling more hopeful that this lag between the legal protections and the evidence will start to close. And that’s really important right now because current uses of technology are really intensifying the design features you mentioned, thanks to new generative AI products and their ability to produce deepfake imagery, for example. Are you looking now at AI products at all in your legal work, or staying focused on the social media side?

[Matthew Bergman]: Definitely are, but just to kind of finish up the last point: there’s a synergistic relationship between regulation and litigation. For a long time, the social media executives had testified in front of Congress that their platforms were safe, and we were able, through these lawsuits, to actually get the evidence to show Congress that these executives had lied. And so in that sense, I think the litigation process brings to light a lot of unpleasant truths that the social media companies had withheld from the public until the litigation process could go forward.

Similarly, I think regulation also is beneficial to litigation insofar as it kind of establishes a benchmark standard, what is and what isn’t necessarily the appropriate standard of care. But in answer to your question, we are, over the last year and a half, expanding our focus into generative AI, including, in particular, chatbots.

[Kris Perry]: Interesting. Do you sense any willingness on the part of social media platforms to voluntarily make these platforms safer in light of these disclosures?

[Matthew Bergman]: We have seen changes since we’ve been filing the lawsuits. We have seen some significant changes. I don’t ever want to depreciate anything that a social media platform does to make its platform safer. And, you know, even a baby step is a step in the right direction. And as recently as the trial, they were coming out with some of their design changes to increase and improve verification. So, you know, I think that is the civil justice system working the way it should be working, which is to incentivize safety.

You know, the worst thing about this is that the product defects that make social media so dangerous really could be eliminated with a flip of a switch. We could make social media 80% safer tomorrow if we wanted to. Or if they wanted to. Now, would there still be some problems? Of course. But, you know, the levels of harms that we’re seeing could be, you know, could be cut by 80%.

[Kris Perry]: So what do you think it’s going to take to motivate this kind of real change? It’s clearly not hard to do; it’s just less profitable. What’s going to motivate that change?

[Matthew Bergman]: I think what’s going to motivate them is basic Milton Friedman economics. When it is cheaper for them to make a safe product than a dangerous product, they’re going to make a safe product. I mean, how many times have the executives been excoriated in front of Congress? Doesn’t make a difference. How many people of conscience within these companies have sent pointed memos to the CEOs? Doesn’t make a difference. How many times have the Wall Street Journal or the New York Times or the Washington Post done these incredible exposés on this? It doesn’t make a difference. What will make a difference is when they have to bear the economic cost of their deliberate design decisions. And that’s what we’re here to do.

[Kris Perry]: And, you know, in both the cases last week, there were financial settlements that are significant. They will matter to the plaintiffs. They will matter to the legal team and to those harmed. And yet to the companies, they’re still not at the level that would generate the kind of change we’re talking about in this conversation. But I know that there are thousands of other cases coming along behind them, whether from attorneys general or individual plaintiffs. What do you believe the timeline is for this number of cases to reach resolution and create a bigger and bigger financial burden for the companies?

[Matthew Bergman]: I don’t know what the end point is. I know that the pressure is increasing. And, you know, the fact that two juries in two days in two different states found Meta’s conduct wanting – and, in the case of KGM, found Meta’s conduct, as well as YouTube’s, to be malicious – that has to give some pause to these companies as they consider whether litigation is their best path forward.

That said, they have all the money in the world and can hire the best lawyers to keep pushing forward and will do so. You know, I have no illusions that this is anything other than a long, hard battle. But, you know, to paraphrase Winston Churchill, “This is not the end, this is the end of the beginning.”

[Kris Perry]: And that’s progress. When you spent the time you did looking at those internal documents, did you think to yourself, “Ooh, I wish we had more science in this area. I wish this field had the opportunity to study this phenomenon longer so that we would have more evidence”? Did you think about where you might want to push this scientific field further than it has gone so far?

[Matthew Bergman]: Well, the best science is the companies’ own research. And I think that the best way to identify online harms is to give neutral third-party researchers – and by the way, I’m not talking about researchers on the plaintiff’s side or the defense side, but neutral third-party researchers – access to this incredible amount of data, because I think that will establish the appropriate guidelines and the appropriate guardrails that need to be put into effect. It is astounding when you look at some of the research studies, particularly some of the early ones, where they’re a little clunky because it’s self-reported user data. Well, why do you need self-reported user data if you have access to every user online? So, in those studies, you know, kids are asked, “How many hours do you spend online?” Or parents are asked, “How many hours do your kids spend online?” And you can, over time, develop some sense of that. But wouldn’t it be better for a researcher to be able to actually know that this child spent 4.27 hours on Monday and 6.7 on Tuesday, and that they looked at this, this, this, and this? All of that data is available. And all of that data not only is available, it is actively being analyzed on a going-forward basis by these companies who know more about us than we do.

[Kris Perry]: Yeah. And we know kids are on screens of various types during the day, so it’s very difficult to even calculate how many hours they’re on when they’re on at school and on at home.

You have a number of cases pending right now to continue this process of establishing product liability for social media companies. And the case that you recently tried in California has been all over the news, especially after the verdict was announced in favor of the plaintiff. We’ve mentioned it a number of times on this podcast, but what would be really helpful for our listeners is if you could summarize this specific case and exactly what harm was alleged and found to be caused.

[Matthew Bergman]: KGM got on YouTube when she was nine years old and became addicted to the platforms. She got on Instagram and Snap, and TikTok later on, and developed severe body dysmorphia and severe depression and anxiety and suicidal ideation. You know, thank God she’s okay. I mean, she is delicate, but she’s also resilient, and she’s okay. This was a case that was picked by the court, not by the plaintiffs, as being representative. So Kaylee could have been anybody’s daughter. This was not the worst-case scenario by any means. This was more of kind of a, quote unquote, “normal” case of harms from social media. And yet this wonderful young woman has been severely and permanently impaired, and a lot of her childhood was stolen from her as a result of the deliberate design decisions that these platforms made.

[Kris Perry]: Thank you for taking a moment to honor her commitment and her journey and the bravery she exhibited by standing up for a different approach by the companies. Tell me a little bit more about the specific arguments that the social media companies made in their own defense in this case.

[Matthew Bergman]: They made very few arguments in their own defense. It was pretty, pretty cursory. Their defense was to attack KGM and attack her mom. KGM had, you know, a tough childhood, and she and her mom had a loving relationship, but that doesn’t mean that there weren’t some serious bumps along the way and some serious things that were said that maybe she wished her mom hadn’t said. And really the focus was, you know, “It wasn’t us. This child was messed up and this child had a really bad mom.” That was the defense. And, you know, I found it to be – I don’t want to use the word “offensive,” but I found it to be off-putting, particularly given that Meta’s own documents, including most importantly the Myst report, identified that kids from challenged socioeconomic backgrounds who had pre-existing mental health challenges were more susceptible to social media addiction and social media harms. And as a result, it was more like preying on someone who was vulnerable. That’s how Mark Lanier, in his unparalleled brilliance as a trial advocate, was able to characterize it: that they were preying on the vulnerable.

[Kris Perry]: And we know that a jury of her peers saw that that was the case and that they supported her standing up, but more importantly, her having the means to get the help that she’ll probably need for the rest of her life. These settlements are intended to mitigate some of this harm, but it isn’t the same as having never been harmed. And that’s what we really want in the end: for children not to go through what she went through. And now you’ve seen how they’re going to argue these cases; you’ve seen their defense, you’ve seen this internal information, their willingness to attack vulnerable parties in trial. People are calling this a bellwether trial. How is this verdict going to affect those other trials that are coming up, knowing what you know now about the evidence and the way that the other side is going to argue?

[Matthew Bergman]: Well, you know, both parties learn a lot. It’s a real mistake as a plaintiff’s lawyer, or as any lawyer, to think, when you win a case, “I did everything perfectly,” and when you lose a case, “I messed up entirely.” That’s not true. And so, you know, when you win a case you still need to think back about what you could have done better, what went over well and what didn’t. And when you lose a case, the same thing: you have to think carefully about what you could have done better. So I anticipate that the social media companies, which have among the smartest and most capable lawyers around, are going to reassess, you know, how they approached it.

You know, there’s only so much you can do against these documents. I mean, I really believe that the best evidence, the best scientific evidence on the link between social media and severe child mental health harms, is in the companies’ own files. So it is our hope that as more juries become acquainted with these documents, they will respond in a similar way.

[Kris Perry]: As we’ve discussed, your lawsuits are contextualizing social media platform liability within a broader product liability context in the United States. What laws do you think the government can and should enact that would make it easier to enforce platform transparency and responsibility?

[Matthew Bergman]: I think clarifying Section 230 to conform to its original intent. There were five purposes of Section 230. Number one was to enhance the internet for the exchange of ideas; two was for commerce; three was to provide users with choice over what they got online; the fourth was to empower parents; and the fifth was to empower law enforcement. Unfortunately, we’ve been primarily focused on one and two. So I think clarifying Section 230. I think passing the Senate version of the Kids Online Safety Act that passed with 91 votes – I mean, imagine, in this Congress, 91 votes – would provide a real opportunity. And I think also some of the bills that have been proposed on a bipartisan basis to potentially exempt CSAM from Section 230 might be a good way to approach that. I think some of those remedies would go a long way.

[Kris Perry]: Yup, I was so heartened by that bipartisan support in the former Congress. And I know one of the major sticking points in current pending legislation is the duty of care. And I’m really glad you spoke at length about that today, because it really helps our listeners understand what a fundamental obligation that is for anyone who runs a company. And we can’t have new legislation go forward without something as basic as a duty of care.

[Matthew Bergman]: And I think it’s important when you talk about that, Kris, to remember that all we’re saying is that social media should follow the same rules as every other company in America. This is not a question of imposing greater legal obligations, greater litigation risk on social media companies than any other company has. And so that’s what the duty of care is.

[Kris Perry]: So what’s next after this big trial, this big verdict? Where do you go from here? What keeps you going, Matt, with this level of responsibility and this amazing amount of momentum that you currently have?

[Matthew Bergman]: What keeps me going is the voices of the parents, who are channeling the voices of their lost children. One of the backstories of this trial was the presence of parents almost every day in court to bear witness. The day that Mark Zuckerberg testified, the parents literally camped out in front of the courthouse the night before so that they could get in line, because the social media companies had hired people to wait in line ahead of them. So they said, “Well, we’ll show them.”

The absolute moral force of parents is both what keeps me up at night and gets me up in the morning. It’s the most weighty responsibility I’ve had as a lawyer. It’s also, at times, emotionally difficult and traumatic and has led to some mistakes that I’ve made in my practice. But it is, to me, just an ultimate honor and privilege to be representing people whose claims are so worthy. As lawyers, so much of what we do is we make distinctions out of differences and differences out of distinctions. And that’s just what lawyers do. In most work that lawyers do, the right and wrong is a little bit nebulous. And in few opportunities do you actually get to do work where the right and wrong is absolute.

You know, I had wanted to be a civil rights lawyer in Mississippi in 1962, when the right and wrong was absolute. There’s no way anybody can justify a legal system that says that Black people must drink out of different drinking fountains than white people. Well, I feel like this work has the same moral clarity. How can you ever justify a company being allowed to send suicide videos to a 16-year-old telling him to jump in front of a train, and he jumps in front of a train? How can you ever justify and countenance a company that subjects a young woman to body image dysmorphia that causes her to starve herself to death? You just can’t. And so that’s what keeps me going.

[Kris Perry]: It takes so much heart and commitment and care to do what you’re doing, and bringing the parents into this conversation today is very moving to me as a parent. And I hope it is to our listeners, too, because we can’t imagine anything more terrible than what they’ve experienced. For them to rise above their own personal tragedy, work with you, work with each other, and come to court every day to remind the companies, and all of us, of what this is doing to a generation of children was very courageous. And there are still so many more months and years ahead, but I feel like we’re going in a really good direction.

Your leadership is incredibly important in this movement, Matt, and I really want to thank you for joining us today. Your perspective on how product liability intersects with the design of today’s digital platforms adds an important dimension to this conversation, especially as families, researchers, and policymakers continue to grapple with the real-world impacts of social media on vulnerable young people.

For our listeners, if you found this conversation helpful, we encourage you to learn more about Matt’s work at the Social Media Victims Law Center and to visit Children and Screens for research-based resources on children’s media use, platform design, and digital safety. And if you haven’t already, be sure to subscribe to Screen Deep wherever you get your podcasts. New episodes are released regularly featuring leading experts helping us make sense of the digital world our children are growing up in. Thanks for listening and we’ll see you next time on Screen Deep.

Want more Screen Deep?

Explore our podcast archive for more in-depth conversations with leading voices in child development and digital media. Start listening now.