From parent and school restrictions on smartphone use to the design of the latest AI tools, what does it mean to center children’s own perspectives and rights in ensuring safety in their online worlds? On this episode of Screen Deep, host Kris Perry sits down with Sonia Livingstone, Professor at the London School of Economics and Director of Digital Futures for Children, a joint LSE and 5Rights Foundation research centre, to explore the intersection of children’s rights, parental mediation, and public policy in the digital world. Drawing on her research and policy work, Dr. Livingstone describes how parents can best help mitigate online risks, and how policies should protect children from harmful online content and design mechanisms while reinforcing their rights to privacy and autonomy. She emphasizes the importance of talking to children to learn how they feel about various technologies and of incorporating their insightful perspectives into both household rules and macro-level policy.
About Sonia Livingstone
Sonia Livingstone OBE FBA is a professor in the Department of Media and Communications at LSE. She has published 21 books on media audiences, children and young people’s risks and opportunities, and media literacy and rights in the digital environment, including “Parenting for a Digital Future: How hopes and fears about technology shape children’s lives” and “Digital Media Use in Early Childhood: Birth to Six.” Since founding the EC-funded “EU Kids Online” research network and Global Kids Online (with the UNICEF Office of Research-Innocenti), she has advised the Council of Europe, European Commission, European Parliament, UN Committee on the Rights of the Child, OECD, ITU, and UNICEF. She is currently leading the Digital Futures for Children Centre at LSE with the 5Rights Foundation. See www.sonialivingstone.net.
In this episode, you’ll learn:
- What children themselves say about the kinds of support they need from parents and communities.
- Research results on the most effective parenting strategies for keeping children safe online.
- How income, access, and background shape online safety risks.
- How the new Children & AI Design Code could help developers prioritize children’s safety in design.
- Why addressing harmful algorithms through regulation of online platforms is key to increasing child online safety.
- What the latest research says about the effectiveness of school phone bans and their impact on learning and child well-being.
Studies mentioned in this episode, in order of mention:
United Nations. (1989). Convention on the Rights of the Child. Treaty No. 27531. United Nations Treaty Series, 1577, pp. 3–178.
Livingstone, S., & Helsper, E. J. (2008). Parental Mediation of Children’s Internet Use. Journal of Broadcasting & Electronic Media, 52(4), 581–599. https://doi.org/10.1080/08838150802437396
Sáez-Linero, C., & Jiménez-Morales, M. (2025). Young, lower-class, and algorithmically persuaded: Exploring personalized advertising and its impact on social inequality. Communication & Society. https://doi.org/10.15581/003.38.2.005
Wood, S. (2024). Impact of regulation on children’s digital lives. Digital Futures for Children Centre, LSE and 5Rights Foundation.
5Rights Foundation. (2025). Children & AI design code: A protocol for the development and use of AI systems that impact children. https://5rightsfoundation.com/children-and-ai-code-of-conduct
Rahali, M., Kidron, B., & Livingstone, S. (2024). Smartphone policies in schools: What does the evidence say? Digital Futures for Children Centre, LSE and 5Rights Foundation.
Hartstein, L. E., Mathew, G. M., Reichenberger, D. A., Rodriguez, I., Allen, N., Chang, A. M., Chaput, J. P., Christakis, D. A., Garrison, M., Gooley, J. J., Koos, J. A., Van Den Bulck, J., Woods, H., Zeitzer, J. M., Dzierzewski, J. M., & Hale, L. (2024). The impact of screen use on sleep health across the lifespan: A National Sleep Foundation consensus statement. Sleep health, 10(4), 373–384. https://doi.org/10.1016/j.sleh.2024.05.001
Stoilova, M., Livingstone, S., & Nandagiri, R. (2019). Children’s data and privacy online: Growing up in a digital age. Research findings. London: London School of Economics and Political Science.
[Kris Perry]: Hello and welcome to the Screen Deep podcast, where we go on deep dives with experts in the field to decode young brains and behavior in a digital world. I’m Kris Perry, Executive Director of Children and Screens and your host. Today, I’m honored to be joined by Professor Sonia Livingstone, a pioneering voice in the field of children’s digital lives. For decades, Sonia has been at the forefront of exploring how youth experience the internet, social media, and emerging technologies, not just as users, but as individuals with rights, perspectives, and voices that matter. She is a professor in the Department of Media and Communications at the London School of Economics and has published 21 books on media audiences, children and young people’s risks and opportunities, media literacy, and rights in the digital environment. She is currently leading the Digital Futures for Children Centre at LSE with the 5Rights Foundation. Sonia’s work has helped define what it means to advocate for children in the digital age. Sonia, let’s dive in.
[Sonia Livingstone]: Okay, thank you so much for inviting me.
[Kris Perry]: Sonia, you were an early force in not only studying how children and youth are using the internet, social media, and digital technologies more broadly, but also how their perspective can lead to new ways of evaluating and thinking about how youth actually use media. Can you take us through how you came to be involved in studying children, media, and children’s rights?
[Sonia Livingstone]: I’ve been studying children and media—“new media,” we’ve often said, like children in relation to the latest, newest, whatever it is—actually since 1995, which is when “new” meant the remote control, multiple channels, and the big controversy was whether children should be allowed to have a television in their own bedroom—a bit of a privileged question, even then. And even then, I was kind of invited, actually given funding, to slightly pivot my life and think about children and how they’re always at the leading edge of changing media. And then I discovered that they were somehow at the focal point of controversy and concern. And that made it very interesting, because children’s media use is not especially a kind of party-political, left/right thing, but it is an issue people fight over in terms of what’s right for them, how much should we listen to them, what do they need at school, should they be regulated, and so forth. So, I just found it opened up lots of really interesting questions.
[Kris Perry]: So this next question may seem like a big pivot for our listeners because this podcast, it mostly focuses on the details of research and findings. But your research has led you to a systems-level perspective, sometimes known as “children’s rights,” which you’ve really helped to develop over time. And we’re hoping to find out from you if there’s really a common understanding about children’s rights in general and what are children’s rights specifically in the digital age?
[Sonia Livingstone]: So, I can think of ways of explaining how children’s rights are actually human rights and why they matter. I think there is a common understanding. And there’s a common understanding in lots of basic ways, like some of the basic rights—the right to health, the right to live, to survive, to thrive, the right to be treated with human dignity, the right to be consulted about things that matter to you. I think for adults, we kind of recognize that these are what makes a civilized society. It’s not always delivered. In fact, most countries around the world don’t manage to provide health or dignity or to listen to their citizens sufficiently. But I think the idea persists that this is what we understand “human” to be.
And especially for children, I think it is generally understood that childhood or children deserve a good education, whatever that might mean in different parts of the world, but they deserve it. It’s hard for us to imagine childhood without play. So the right to play is one of those interesting ones that is in the UN Convention on the Rights of the Child and is kind of recognized as crucial for childhood everywhere. The right to be protected. Children, I think we might all agree, should grow up safe without being abused, without being exploited. They should have enough food. I think there are lots of things that people around the world can agree on, even though exactly what that means in terms of education or play or family differs.
So I don’t think the idea of children’s rights is controversial. There are some of the rights in the Convention that are more controversial than others and perhaps—oddly to me, but common, the right of the child to be consulted about matters that affect them is something that many states and actually many parents find quite hard to kind of, you know, they think adults know best. And we could discuss that because I do believe we can always learn from asking children what they think, even if we know other things about long term consequences, or balance of interest that mean we don’t always, you know, we don’t have to do what they say. We have to listen and take account.
So when we come to the digital, those are the same rights, but the right to education now means something different when learning is online, when schools are full of EdTech, and when all the information in the world can be found, you know, on your smartphone. So is it the right to all of that? Is it the right to what we consider the “good” educational resources? Is it the right just to access the information or also to contribute to it, to be guided through it? You know, there are a lot of questions to discuss. The right to play is even more controversial online, I think, because we all love the image of a child playing outside on a sunny day with their friends making up a game. We don’t love it so much when they’re online playing a game; we worry—what are they seeing, what are they doing, are they being persuaded, and so on.
So, you know, the rights are the rights as I see it, broadly agreed on, but culturally interpreted in different ways. And now we have this whole challenge of online.
[Kris Perry]: I love the focus on play since we know that is such a fundamental part of childhood and so tied to children’s physical, mental, cognitive development, emotional development. But then it’s sort of crosswalked with this idea of digital play and where there may be harms. I mean, there are harms outside in the real world. There are harms in your digital life, but it really is interesting for you to put it inside that “child rights” frame.
Parents and families use a wide range of strategies to keep children safe online by using several kinds of restrictions, whether it’s on apps or limiting access. In 2008, you conducted research into these different parental strategies for mediating online risks. What did you find?
[Sonia Livingstone]: We found that parents are really trying a number of different strategies, and broadly the strategies fall into two camps. One, as you indicated, is kind of about restrictions: “Don’t do this. Don’t use that. Don’t spend so long on this. Use it, you know, this way, not that way.” I think that’s where parents often start. They feel that they’re the authority. They should know better, and children need to be a certain age before they can do certain things. So that’s one starting point.
The other, I think, is something parents do very instinctively, but don’t always get the same recognition for. And we call that kind of “active mediation,” or enabling. And many parents understand it’s their role to support their children in kind of guiding the way, showing the way. And that can involve talking about what’s on the screen, helping them understand what’s good and bad about what’s on the screen, inviting the child to reflect on what’s working for them or not, supporting them making good choices about what’s on the screen. You know, I think parents like themselves better when they are the person that does the enabling, than when they’re the person that’s always saying no, but they are not quite so sure how to do it. And so what we see is, very often, parents kind of begin with thinking they’ve got to restrict, then they face that impossible conundrum that they know their child is growing up in the digital age. Their child often understands the digital better than they do. Their child often says, “Everyone else is using it. You can’t stop me.” So you could, but you don’t want to be fighting with your child all the time.
So the question is, can enabling activities work? And what we see in research—I’ve done it since 2008 and ever since, really—is that the enabling activities really help build children’s confidence. They help build children’s digital skills and the kind of social skills that go with it, you know, “How do you interact with people online?” It helps them build their informational skills. “How do you judge? Is this good information or not? Is this reliable? Is this trustworthy?”
But it doesn’t always keep them away from the risks. And that’s something I find quite challenging to explain, whereas if you restrict your child, you can keep them away from the risks, but then they don’t build the confidence and they don’t get the skills, and it’s just a really hard one. And I think it’s a bit like riding a bicycle. You know, when you teach your child to ride a bike, new kinds of risks come into their lives, but also new competence and new agency. And that’s what bringing up a child is about.
[Kris Perry]: Mm-hmm. Oh, I love the bicycle analogy. Can you talk a little bit more about those enabling activities versus the restrictive position that a parent might take and what are some of the helpful ones that you’ve been able to find in your research?
[Sonia Livingstone]: I think there’s quite a wide research community now that’s been really trying to understand what do parents do and what they can do. I think we have some clear “don’ts” now, ironically, given that the first one is don’t say “don’t” to your child. Don’t kind of stop them without a reason. But I think when parents get anxious and they do get anxious about the digital world, I think their instinct is to restrict, judge, and punish. And children are very upset about having tech taken away from them or stopped. And they’re also very worried about—they’re very upset about being judged.
So the enabling is kind of the opposite of that, which is: before you judge, before you jump in, talk to the child. Why do they want to do it? What are they hoping to gain from playing a certain game or visiting a certain website? And then maybe do some of those things with them, kind of accompany them on the journey. Children love it when you let them know, or show, that you think they might know better. “Explain to me,” you know, “I’d love to understand why, how you’re doing what you’re doing.” Occasionally, you know, the child can be the teacher. But it means that you’re also building that trust relationship and you’re building the communication relationship.
And then, what we see in research is parents have lots of expertise to draw on that may not be very high tech, but maybe they’ve been using digital resources at work. Maybe they’ve got a particular hobby and that’s given them a kind of deep dive into a particular kind of—into how online communities might work. You know, so I think parents know a lot they can draw on, and it is just a process. Things start to go wrong when children say, and I find this very distressing in both interviews and surveys, “Something bad happened to me online and I haven’t told anyone,” or, “I haven’t told my parent because they’ll take it away or they’ll stop me.” And the child wants that kind of learning moment, and they might decide, “You’re right, that was too much for me, I’m gonna pull back.” But they kind of want, you know, they need to develop their confidence to make some decisions. So I think it is all about scaffolding a child’s confidence because, in the future, they’re gonna be on their own without you there anyway. So they gotta learn.
[Kris Perry]: Well, and so many of these examples of children and their digital life and their parents trying to help manage it with them, are not that different from the risks that they encounter in the real world, and how it’s the parent’s job to be an outstanding communicator from the word “go” so that you’re building trust and confidence in the child and in you so that they feel like you’re a resource rather than something to be avoided, because the risks are real. We’re going to talk more about that in a minute.
But I wanted you to just maybe tell us a little bit more about how you’re thinking about parental involvement in managing youth risk online and how it’s changed over time. We know your work has been underway for almost 30 years, but the study we just talked about was now, you know, close to 20 years ago. I mean, how much has changed since then?
[Sonia Livingstone]: I think it’s changed in many ways and interesting ways. So, one change, one kind of research we used to do in those days, 15, 20 years ago, was all about the gap. We would ask parents and children the same question. “Has your child played this game or seen this problem online?” And children would say, “Yes,” and parents would say, “No.” And we’d say, “Aha, look, there’s this gap. This is what we have to overcome so parents and children understand each other better.” Now, we find the gap is less. I think parents are keeping up better with what their children are doing, and children are turning more to their parents. And I find this very interesting, and I think it is partly that the, kind of, the worries and the risk about the digital have become an everyday conversation. You know, the child is no longer a pioneer on their own, exploring some kind of—exploring new territory. The risks have become, in some ways, bigger, and that’s a bad reason why it’s, you know, it’s familiar to everyone. It is very much on parents’ minds, but it also means that the parent-child conversation is much closer.
And I see children more often in surveys now saying, “If something bad happens, I’ll tell my parent,” and, “I trust that they will do the right thing and they will offer me the right advice.” I think that’s very positive, so that’s the big change. And maybe I also see a bit less of that “everyone’s doing it” argument, and a bit more children beginning to say, “In our house we’re like this,” or, you know, “My friends are often doing that but I can make a different decision.” I think it’s something a little—in a way, it’s kind of quite individualistic, but I think children feel a bit more empowered and parents feel a bit more empowered to say, “Well, we’re going to go our way and we’re going to talk about it and work it out because not everyone has to be the same. And we don’t all have to slavishly do the same that everyone else does. We have our values.”
[Kris Perry]: We’re talking a lot about children in general terms, but I was wondering if you could dive a little deeper into whether there are either structural or socioeconomic inequities at play with children online and their safety and what you could share about that.
[Sonia Livingstone]: Huge. And I think families are also pretty aware of all of those differences. So we could say, in terms of socioeconomic differences, that poorer parents, or parents with more pressures on them or facing more difficulties, may have less time to spend with their child, fewer resources to kind of safeguard their child, because they’re busy or they’re working two jobs or there are too many other competing demands. So those children, in some ways, we might think will be a bit less supported. But actually, maybe those are children who have to grow up a bit quicker in some ways, and they’ve got to—they learn to look out for themselves a bit in some ways. So, we don’t see evidence that poorer children or richer children are more or less safe. It’s kind of different kinds of protections around them.
We do see that children, poorer or wealthier, have very different resources in terms of gaining the benefits of online. And so, of course, the better-off children get the better tech, they get the faster connection, they get the more tutoring, they get coding clubs. There’s a lot of unequal resources in our society. So in that sense, socioeconomic differences between families kind of get reproduced in digital differences among families. And that’s, you know, that’s something where you want to target some resources on those less well-off children. Often through school, we often hope the school will be a kind of an equalising space where some children who don’t have stuff at home can get to, you know, stay after school and use the technology.
There are lots of other differences. I mean, I think we—you know, maybe we should talk a bit about children with special needs and disabilities, which affect many children. They’re not always kind of mentioned, but there’s an area of research that really addresses their needs. And that’s one of those cases where we really see the intense dilemmas that technology poses, because technology can really be the means of inclusion and it can really be the form of assistive technology that gives a child a chance. Not only kind of a physical chance to overcome some kind of disability, but also, I’ve heard in my research, a kind of hope that there’s a pathway for future work, an area where they could be skilled online, even if they’re limited offline, for example. But also, we see in research that children with special needs or mental health difficulties or disabilities get bullied more often online, and they get taunted and they receive more hate online, and they can be excluded unfairly. So, it’s a very intense challenge: the technology can really overcome some of the real world problems, but it also introduces some intense risks.
[Kris Perry]: So now, going beyond parents and families, what responsibility do the online platforms and their shareholders and executives have in keeping children safe online? And what are they doing now, and what could or should they be doing in the future?
[Sonia Livingstone]: Mmm. I think they have a huge responsibility. Okay, let me give you an example of a piece of research I just read, which was looking at the kind of algorithmic push of advertising to children. And it showed that platforms push more advertising for particular kinds of products to poorer children, who may have fewer protections around them; the platform kind of knows who we are and it feeds us the adverts that it thinks we’re going to be vulnerable to. There’s also research showing that if children have mental health difficulties, particular kinds of adverts, maybe ones promising to give them more confidence, will come their way. So I think that’s very hard for parents and teachers to combat. I think it really is for the platforms to sort out their algorithmic feed so that they don’t play to people’s vulnerabilities. Even though it may make them a profit, there is something that really does feel against children’s rights and children’s dignity to kind of play on their vulnerabilities.
And that, I think, is a kind of example of why it is so important that we think about platform regulation. Because in the first, I don’t know, first 10 years of the internet, it was really, as Microsoft used to say to us, “Where do you want to go today?” It was, “where do you choose to go? What do you want to look for?” And now as you turn it on, and it’s all pushed at you, it’s fed to you. That word “feed” is so interesting. It’s fed to you based on your data, it’s personalized in ways that you don’t understand, it’s based on calculations about who you are that you don’t really have access to. And I think at that point, it has to be for the platforms. So we see more efforts being made by governments around the world in America, in Europe, in Brazil, in China, you know, lots of countries are trying to grapple with the question, not just of what content children see, but what flow of content is being personalized to them. And it’s proving very hard to regulate. And the industry, the platforms, are pushing back in many ways.
So we have been doing some research on the regulation process and in our last report on the impact of regulation, we could see that once regulation is passed, or even when regulation is about to be passed, platforms start making lots of changes. They start giving users more information about why they see what they see. They start giving users, including children, more tools about how to kind of control what the flow of the feed or the information might be. And they also start making more of those kind of “by default” changes. So, for example, a lot of regulation says you can use a child’s information to direct educational information at them, but not for your profit, not to advertise to them. And you can see some platforms doing it. But, in the last few months or year, we’ve also seen the platforms rolling back. And they are doing less of that, less moderation, less child-friendly services, less putting the child’s needs before business interests. And that’s a huge challenge, I think.
[Kris Perry]: Well, and that sort of corresponds with the advent, the deployment of generative AI. AI isn’t new, but the way generative AI tools have been either introduced or, frankly, embedded in a number of products from search engines to toys to games is quite rapid. The algorithm you brought up, the feed, was one thing, but just in this past year, it seems generative AI has taken us to an even more automated place.
[Sonia Livingstone]: Yes, it has. It has. We’ve been interviewing children in a number of countries over the last few months about how they use generative AI. So I would first say that, for many of them, it’s exciting. It’s fun. I will say children often receive an innovation with a lot of enthusiasm, and they try it out. And so they are intrigued by, you know, a bot they can chat to, a service that can make them images or write a piece of text or answer questions in smart and very fast ways. You ask a long question and pow, there is the answer.
They love that, but they are also puzzled. Why is it suddenly one day, quietly, embedded in their Snapchat or embedded in their WhatsApp or embedded in their Instagram? They didn’t ask for it. You can’t get rid of it. So of course, children are curious, they press on it. They want to know what it does, but it wasn’t a choice. And they don’t like that. They feel that something’s being pushed on them. They can see that parents and teachers are really concerned about it. Parents often for their safety. Teachers because they worry what are they going to teach the kids if all the information and the essays can just be produced by gen AI? So, you know, they’re intrigued, but they’re also concerned and it’s a perfect moment to be talking to children about how they use gen AI. Don’t assume they’re not using it. Ask them to show you how they are using it and what they think.
[Kris Perry]: Yet another great point around communication, communication, communication. You recently co-authored a Children & AI Design Code report with 5Rights, as well as others in the field. Take us through the basics of this design code. What are the essential criteria that one should consider when evaluating whether AI systems are safe and good for children?
[Sonia Livingstone]: So, the design code is really trying to show the regulators what good could look like, you know, if they are considering a law or some education guidance. I’m actually holding it here. Can I just flash it in front of me? It’s to say, you know, this is a code that can show you what would work. So it does a number of things. It goes through a lot of what the problems can be. It’s really addressed to those who create gen AI tools and those who deploy gen AI tools—or AI tools generally. And it’s to say: think about how you’re making these tools. Think about how you’re using these tools, and, you know, here are some checklists. These are all the things that could go wrong. Think about them and don’t wait till it’s too late. Don’t wait till your product is designed and it’s really expensive to change it. Anticipate them in advance. Make a kind of balanced decision about what could be done to mitigate a risk in advance, and then decide whether this is appropriate for children and how it could be used by children.
The code also breaks down all the different people involved in using, or deciding to use, AI. So within a company, it’s the product manager; it’s also the person in charge of legal compliance, who makes sure the tech meets the relevant laws; and the person, or the part of the organization, in charge of marketing, who makes sure it’s not sold for children when it’s intended for adults, or sold for young children when it’s intended for older ones. So within a company, but then also all those who are using it, and I’m actually really interested in all those who start using AI in a school, let’s say. They may not really understand it, but they think, “Oh, this is a nice EdTech product. It can help the kids learn maths or whatever.” So they begin using it. So it’s for them to stop and think—it really is a kind of stop-and-think tool. Make sure you know who’s responsible.
And then when, if something goes wrong—which it might—make sure there’s someone in the company, somewhere in the school, somewhere in the government who is responsible for looking into it, for responding. Because I think what the public knows—children, parents, schools all tell me, they find a problem with a technology, there’s nobody to tell. There’s no one who’s going to change it. There’s no one who’s going to kind of notice. So it’s making sure people know who is the, you know, the children’s champion in that company or the responsible person who can be contacted, who can try to redesign.
So it’s trying to think of, you know, all of this tech as a kind of ecosystem around a child. It’s not just a device; there are so many different players who are contributing to building it. And it’s about them all seeing what their part is in the whole thing, so they know where they’re responsible and they know who else is responsible that they’re working with. Because many companies say, and, you know, I hear them, that they don’t want to be part of it. They don’t want to design a bit of a product that ultimately is going to harm children. You know, they want to see that someone is giving mindful attention to this whole workflow. And AI is developing so fast that it needs a code. It needs that guidance for everyone to reflect.
[Kris Perry]: Well, it seems to me the code is underpinned by a philosophy that keeping children safe doesn’t necessarily mean that you’re stifling innovation. So, what does responsible innovation look like in the AI space or in the AI-for-children space when everyone is competing so fiercely for children’s attention?
[Sonia Livingstone]: Mm-hmm. It is very hard to require people, in a way, to slow down and reflect in this kind of frenzied moment, when I know many companies and, actually, many countries are kind of looking for the first-mover advantage. They’re thinking, “If we can get in there first with our product, that will get rolled out to all schools,” or, “That will be on every child’s phone,” or, you know—and yes, that exists, but the history of technological innovation is littered with disasters, is littered with business failures, is littered with expensive lawsuits and angry parents. Right now—I don’t know about where you are, but we have a lot of parents protesting outside the government, outside schools, saying, “This is not safe, this is not right.” You know, this is incredibly bad publicity and damages brands and so on.
So I think even though it seems like a moment to rush forward, that’s a high-risk strategy. And what we see in technology innovation over and again is those early ones don’t always stay the course, and others can kind of come in on the back of your disaster and say, “Okay, I have a better way of doing it.” So it may seem like a hope, but I think the history of innovation also tells us that this is an expanding domain. There are going to be lots of different ways of developing AI and embedding AI in our lives. And the first ways are not often the ones that count. Think of MySpace, or think of Usenet groups. When I was doing my research 20 years ago, I was interviewing children about their personal home pages, and that’s what everyone was doing. It’s gone, you know. A lot of things will go and we will—I hope we’ll get a second chance, but I also hope that many will make their first chance right.
[Kris Perry]: Thank you for bringing up parents again, because earlier in this conversation we talked about their role in a sort of one-on-one relationship with their child and their child’s devices, and how to manage and mitigate at that level. And then you also brought up the role parents are playing at this larger sort of systems level, as activists and advocates for children broadly, not just their own children but all children, because they’re seeing the impact not only on their child, but on their children’s friends, everyone in their family, their children’s school. It’s reached this point, a tipping point, you might say, around the globe. And I’m wondering if you’ve seen policies that are doing a better job than others in safeguarding children online? And if so, which countries are leading the way, and why?
[Sonia Livingstone]: I’m tempted to give you the depressing answer, that I’m seeing politicians in many countries almost kind of jumping on the bandwagon of parents’ frustration and saying, “Okay, we’ll call for a ban. And my, you know, my kind of percentage share in the popularity stakes will go up.” And I am astonished at some of the bans on social media or bans on technology that are being called for, not because I think children should have everything, access to everything, you know, really, really early, but because a ban is such a blunt instrument. You know, what I would like is for people to consider what children need at different ages, at different situations and make a judgment about what’s appropriate rather than just say, “Ban it all, make it go away.”
So I’m worried by that movement also because it suggests that, for the very complex question of the role technology is going to play in our lives, it suggests that there are going to be some simple solutions. And I’m afraid there aren’t. There’s going to be lots of solutions and we will evolve them into a successful strategy. So, you know, car safety is a good example. And we’ve discussed it—it did take 50 or 100 years, but now what do we have? We do have, you know, people have to have car insurance, they have to have seat belts, their cars have to be checked for safety, the roads have speed limits. Just in America, we were saying, “20 miles an hour everywhere, this is great, this is regulation that works. I’m sure children are safer and people have got used to it.” You know, it’s taken a whole set of strategies.
So I think the Australian model of Safety by Design is that kind of a strategy. It’s trying to look for different parts of a strategy that can work. So advising parents, thinking about school curricula, regulating big tech, offering good practice for tech that is willing to try to do the right thing.
Doing the research to make sure that the strategy is evidence-based. I think in Britain we are doing quite a bit of that. We’re a bit later in the game, so we’ve yet to see how much it’s going to work. And many other countries I know are trying different strategies.
But there is something, especially when it comes to the regulation of tech, that requires a critical mass. And so in that sense, I think, what happens from the European Commission and what happens in the United States of America is really going to change the game for big tech, if it works, if it can be done. But I’m not sure and you may tell me, but my sense is that those policies to regulate don’t always go hand-in-hand with policies to educate children about technology or resources to advise and support parents. That’s kind of left to others. So, you know, the good practice would be to do all of those things together. So we have the different parts of the puzzle working hand-in-hand and we kind of know that where the teachers can’t advise anymore because it’s so technically complicated, that’s where we need regulation. But we don’t want regulation for everything. We want some things left to parents to choose. We want some things left to children, you know. So it is different parts of quite a complex puzzle. I don’t think it’s going to get simpler.
[Kris Perry]: No, and I think that list of other things we should be looking at, any time we’re creating policy change, would include studying whether the policy change had its intended impact. So a ban, and you’ve talked a lot about those bans on technology, do they work or not? As a research institute, we would love to know the answer to that. And I mean, in some ways, they’re experiments. And you’re a researcher. You just co-authored a report on phone bans in schools in a few European countries. What did the report find, and are school phone bans or restrictions effective in helping focus kids or in improving their learning environment?
[Sonia Livingstone]: Not very much, not very much. But I think for interesting reasons. So, it’s not that—it depends what you think was happening before. Many schools are introducing restrictions and policies of different kinds. Actually, some always had them. So, introducing a more draconian ban helps in that we can see it helping in some ways like children concentrating in class which is important. But then, often those schools will say, “Well, we never let the kids have the phone in class, anyway. Okay, so now they can’t bring it to school. Before they could use it in the corridor, but they never had it in class.” Or a school, schools will say, “We’ve now banned it in class,” and the children will say, “Well actually, we have it under our desks, we have it in our bags, we have the other phone.” You know, children will get around things.
So, there isn’t good evidence that phone bans really work, because A) teachers were being quite sensible before, B) kids will try to work around something they think is unfair, that hasn’t been done with their, you know, involvement in some way, and then C) because some of the problems, some of what happens in the school day is so much less influential than what happens out of school. So does a phone ban in school stop bullying? No, because the kids will be bullied at home if they’re not bullied at school. Does it mean that children sleep better? No, because the phone ban at school isn’t going to solve the sleeping problem and being online at—so, we need to think what are our outcomes? And the best evidence we’ve seen is that sensible restrictions in school with some consultation with teachers and children and parents can help children study and concentrate better, and that’s a win.
[Kris Perry]: What do you think needs to be done to make the evidence clearer about phone bans, screen time, or other approaches for guiding healthier use? Or do you think that the evidence is already clear?
[Sonia Livingstone]: I think we need to be clear about our outcomes. I think there is probably now reasonably clear evidence that sleeping with your phone delays you going to sleep and can interrupt you and wake you up, especially if you have your notifications on. And that’s a problem. I don’t think that’s terribly controversial. Personally, I like to sleep with my phone, because I listen to a podcast and it helps me drop off, and that’s it. No notifications. So there’s always, you know—it can be done, things can be achieved in different ways. But broadly, there’s some evidence about sleep, and clinicians will say, “Children not sleeping enough is definitely a problem for their schoolwork, for their mood, for their confidence, and so on.”
But beyond that, we don’t have evidence that the more you use your phone or the more screen time you have, the worse off you are. What we have is growing evidence that different kinds of uses are linked to particular kinds of problems. So, risky uses, uses of social media, let’s say, that put you in the way of bullying or mean that you see hate or that you find yourself kind of invited into self-harm groups, clearly, that’s a problem. But children, if guided, can find other ways of more positive, more helpful, more encouraging screen time that they really value.
So the question is not how much screen time, but what are they doing on the screen? And is the outcome we’re worrying about their sleep or their bullying or their school grades? I think everyone has probably—I’ve certainly interviewed the child who spends a lot of time on the phone in their evening and I say, “But how was your day?” And they say, “Well, I did all my homework. I’m getting good grades. I’ve played football. I’ve been out. I’ve seen my friends. I just need to chill and unwind. And so I do two hours of whatever it is and I’m ready for bed.” So, you know, different balances work for different people. A lot of children say about games—which we don’t talk enough about—that games are a place where they see their friends. Games are a place where they kind of feel part of their community that gets them, that understands them. Not too much, not 24/7. Yes, your grades matter and so on. So it’s not the screen time, it’s what you’re doing, what you’re getting from it and what outcomes we’re looking for for children.
[Kris Perry]: You’ve been a champion for youth voices and youth speaking for themselves on the issues of the day. What are you consistently hearing from youth that you wish parents and adults in children’s lives today could or should really hear or understand?
[Sonia Livingstone]: They are saying, and I hear this in many ways and from children of all ages, “Consult us, talk to us, ask us our experience. We might agree with you. You might be right anyway, but take into account what we are experiencing and what we see. Very simple example. The game I’m about to play lasts 45 minutes. If you say supper is ready in 40 minutes and the computer goes off, I’m upset. If you say supper is ready in 50 minutes, I’m upset because I’ve started the new game. Talk to me. Ask me how long the game plays. 45 minutes is a deal.” You know, that kind of thing. It’s not that the children want to be unreasonable, they want to be heard.
They really do believe that the world they grow up into is going to be full of technology and that they kind of need to understand it. So yes, I think they like feeling protected by their parents. They definitely like feeling their parents are there for them, but they don’t feel that they can be kept in the 19th century, as it were. You know, they’ve got to be in that world. And they know that they’ve got to figure it out for themselves eventually. So they want to kind of see where they are on that journey. And that means making—being allowed to make some mistakes without being judged or punished. Because a lot of the mistakes they make, they have a way forward, you know. So maybe you say to the child not, “Oh, that happened, that’s terrible,” but, “That happened, how do you feel? What do you think I should do? What do you think? You know, what could, what can we do together?” It’s—they like being partners in this, even when they’re very little.
And adults may be surprised, but children are also quite optimistic. In fact, I’m increasingly asking people, “How do you feel about the future? Are you looking forward to what’s coming next?” And many adults are quite fearful. Many adults are in a bit of a pessimistic zone, but children are often optimistic, and not because they have unrealistic goals, but because they see they could have a role, and I think it’s our job to build their confidence for that.
[Kris Perry]: I love that. Is there any research or finding you were involved with in your career in children’s media use that was your biggest aha moment, that changed how you thought about something or shaped your future work?
[Sonia Livingstone]: Probably lots of them, actually, because every time I work with children, I kind of see things a bit differently. One example I might give: when I began some research on how children understand the question of data, and where does data go and how is data used for them or indeed against them, a lot of people said to me, “You can’t ask children about data. They can’t see it, it’s behind the screen, they have no idea about business and so on.” And I began working with children. These were children from about 11, though later I began working with even younger. And we found ways, you know—the researcher is always trying to kind of find tasks. So we drew lots of bits of data on cards and said, “Okay, your favorite films? What you bought last week? Who your friends are? What your sexuality is?” And then we said, “Okay, who can know these different things? Can your friends know? Can your parents know? Can your school know? Can Amazon know? Can the government know?” And when we made it really concrete and broke it down, actually, they had lots of opinions and they were ready to deliberate, you know: “Why should Amazon know my address? Oh yeah, they want to deliver the parcel. Okay, they need to know my address. Can they tell anyone else? No, they shouldn’t tell anyone else. It’s only to deliver.” So it’s a bit of a long-winded answer. I think we can consult children about anything. It’s a matter of how we do it, and then they have opinions and they have things to say, and I love learning from them as well as rising to that challenge of designing the tasks.
[Kris Perry]: Sonia, thank you so much for joining me today to share your powerful insights on children’s rights, digital safety, and the evolving role of technology in young people’s lives. Your work has helped shape the global conversation in extraordinary ways, and I’m grateful for all you’ve done to center children’s voices and needs in this space. And thank you to our listeners for tuning in. To read a transcript of this episode and explore more resources on digital well-being, parenting, and youth development, visit childrenandscreens.org. Until next time, keep exploring and learning with us.