With the 24-hour news cycle and a flood of shareable and easily manipulated information, young people face the daunting task of identifying, evaluating, and discerning fact from fiction, a task even adults find difficult. On Wednesday, August 19, 2020 at 12:00pm EDT, Children and Screens held the #AskTheExperts webinar “Fact or Fake? How to Teach Your Kids to Spot Misinformation,” which offered evidence-based advice to help parents, educators, children, and teens critically analyze and differentiate between news, opinion, propaganda, misinformation, and advertising. The webinar featured some of the nation’s leading educators, researchers, and parenting experts, who shared common pitfalls young people experience online, discussed the complex information landscape, and helped attendees develop the skills to become media savvy.

Speakers

  • Joel Breakstone, PhD

    Director, Stanford History Education Group, Stanford University
    Moderator
  • Jevin D. West, PhD

    Associate Professor, Information School; Co-Founder, DataLab; Director, Center for an Informed Public, University of Washington
  • Diana Graber, MA

    Author, Raising Humans in a Digital World: Helping Kids Build a Healthy Relationship with Technology; Co-Founder, CyberWise; Founder, Cyber Civics

[Dr. Pamela Hurst-Della Pietra]: Welcome everyone to this week’s Ask the Experts workshop. I am Dr. Pamela Hurst-Della Pietra, founder of Children and Screens: Institute of Digital Media and Child Development and host of this popular weekly series. Thank you for being here today, and thank you in advance to our outstanding panelists. Children and Screens is one of the nation’s leading non-profit organizations focusing on digital media’s cognitive, psychological, physical, and social impacts on infants, toddlers, children, tweens, and teens. We partner with researchers, clinicians, public health professionals, educators, parents, and more to advance the science around digital media and child development, educate the public and policy makers, inform clinical practice, and improve the lives of young people and their families in a digital world. Along with funding research into the impacts of increased digital media use during the pandemic, we are supporting families by hosting weekly webinars that feature interdisciplinary experts discussing the topics and questions on all of our minds. Digital literacy is more important than ever in the midst of a global health crisis, an election year, and a revitalized social justice movement. Today’s workshop is all about giving you the tips and tools you need to identify whether information is accurate, well-sourced, and reliable, or whether you are being duped by deep fakes and skewed information. Our panelists have reviewed the questions you submitted and will answer as many as they can during the presentation. If you have additional questions during the workshop, please type them into the Q&A box at the bottom of your screen and indicate whether you would like to ask your question live on camera or would prefer that Dr. Breakstone read it. Please know that we may not be able to answer all of your questions, but we’ll answer as many as time permits.
We are recording today’s workshop and hope to upload a video to YouTube in the coming days. You’ll receive a link to our YouTube channel tomorrow, where you can also find videos from our past webinars. It is now my great pleasure to introduce our moderator, Dr. Joel Breakstone, the director of the Stanford History Education Group at Stanford University, which has worked to establish just how well or poorly young people identify fake news. We are so grateful that Dr. Breakstone is here today. Welcome, Joel.

 

[Dr. Joel Breakstone]: Thank you so much, Pam. I appreciate the introduction. It’s a pleasure to be joining everybody today. Before we move on to our other panelists, I would like to provide a little bit of background on what we know about how well students do or don’t make sense of online information. I’m just going to share my screen here to walk through some of the research that my colleagues and I have conducted at the Stanford History Education Group around these issues specifically. We’ve been working in this realm for the last six years, focusing on how students make sense of all sorts of digital content. Unfortunately, the research has been quite consistent: students struggle to make sense of the overwhelming amount of information that comes at them across their screens, the misinformation, disinformation, and propaganda. This headline from The Wall Street Journal, covering our research project in 2016, sums it up neatly: “Students don’t know when news is fake.” We collected almost 8,000 student responses to a series of tasks that asked students to evaluate different kinds of online content, and in examining those responses, we found that students from middle school to college had great difficulty completing even the most basic evaluations of online sources. For instance, we asked middle school students to look at the home page of Slate and simply identify whether different items were news or advertisements. And 82% of the students were unable to identify that an item marked as “sponsored content” was actually an advertisement. We saw responses like this one: the student wrote, “The purpose is not to try to lure people to use a website or product.
It’s just an article about why women don’t go into technology.” So students were not familiar with this basic term, “sponsored content,” and they were unaware that there’s a whole genre of advertisements out there, so they were unable to know when someone was seeking to influence the way they thought about a particular issue. That was in 2016. In the intervening years, an enormous amount of time and energy has been devoted to the issue of misinformation, so we sought to determine whether things have improved. Over the course of the last couple of years, we did a second study with 3,446 high school students from all across the country—a sample that represents the makeup of high school students in the United States—and asked them, again, to complete a series of tasks in which they looked at real sources from the Internet. And unfortunately, once again, we found that students really struggle to evaluate online sources. Just as one example, we gave students a video that came from Facebook. The headline was “2016 Democratic Primary Voter Fraud Caught on Tape,” and it was posted by somebody called “I on Flicks.” The note read, “Have you ever noticed that the ONLY people caught committing voter fraud are Democrats?” At the time we grabbed this, there were almost a million views; there are now well over a million views of this same video. I want to show you a brief excerpt of the video. There’s no audio, but it shows what the video claims are examples of voter fraud during the election in a series of different states. It’s just about 20 seconds. It says that it’s in Illinois; there’s a woman looking away while somebody else walks over and appears to put lots of ballots into a collection bin as the first woman seems perhaps embarrassed, and then, the worst case of all, supposedly in Arizona, somebody is again stuffing ballots into a bin.
Somebody rushes over to help with the vote stuffing. If you were to leave the video and search online, you would quickly discover that this is a series of videos from Russia, and they have nothing to do with voter fraud in the United States. However, overwhelmingly, students thought that it was a trustworthy piece of evidence. When we asked them, “Does this post provide strong evidence of voter fraud during the 2016 Democratic primary election?”, 52% of students said that it was strong evidence of voter fraud. We saw responses like this one: the student wrote, “Yes, it shows video evidence of fraud in several different states at multiple different times.” Most other students rejected the source, but not for the reasons we’d want. They didn’t ask the questions, “Where did this video come from? Who posted it?” Instead they just wanted more evidence. They said things like, “The video only shows a few specific instances, and is not enough to blame the whole Democratic party of voter fraud.” They weren’t equipped to ask these basic questions of “where’d this come from?” and “should I trust this evidence?” This is an alarming set of results, and it indicates that we need to help students get better at making sense of online information. So we wanted to figure out ways to support students in becoming more discerning consumers of information. To do that, we needed clarity about what expertise looks like: what does it mean to be a skilled evaluator of digital content? And so we set out to do a research project.
My colleagues Sam Wineburg and Sarah McGrew did a study where they asked three different groups of people to evaluate online sources: freshmen at Stanford University, young people in the heart of Silicon Valley; history professors, folks who examine sources all the time and determine whether or not something is trustworthy; and professional fact checkers from the nation’s leading news organizations—and they asked them to examine a series of sources and think out loud as they looked at them. And what we found was that there were really stark differences in the way these groups examined the sources. For example, we had each group look at an article from minimumwage.com, which is a slickly designed website. The particular article was about Denmark’s minimum wage policy, and we asked them whether or not it was a trustworthy source. minimumwage.com describes itself as a project of the Employment Policies Institute, which is, by their description, “a non-profit research organization dedicated to studying public policy issues surrounding employment growth.” Sounds very trustworthy. They link to research reports, they list their address and contact information—all of these markers of, perhaps, a trustworthy source of information. However, in fact, it is a front group for a public relations firm, Berman and Co., which works for the food and beverage industry, which has a vested interest in keeping the minimum wage low. However, the only way you can learn of that connection is to leave the website and search for information about minimumwage.com and, more importantly, the Employment Policies Institute. What we found was that many of the Stanford students and also many of the historians did not do that. So this is a graph that indicates how long it took each group to discover that Berman was behind the website and also how many in each group were able to do that.
Only 40% of the Stanford students discovered that this website was actually from Berman and Co., and only 60% of the historians did. 100% of the fact checkers did, and they did it much faster than any other group. It’s really striking. And so, given that the fact checkers are better, the key question is, what’s the difference between the groups? And these are smart people—I think it’s really important to emphasize this—but the historians and the students are like all of us: we are unequipped to deal with the current digital landscape. They did things like focusing on a single page. They read vertically, up and down, on that one page. They never left the page to find out what other sources said about it. And when they did engage in searching, it was often very unfocused. Many of them would just click on the top search result, rather than pausing to look for which source might be the best starting place. And they trusted their intelligence, their belief that they knew the answer and their sense of what the source looked like, and they relied on familiar markers—things like the appearance of the site, whether it was a “.org” or a “.com,” whether or not it had contact information, links to credible sources—and often that led them astray with this particular site. In contrast, the fact checkers almost instantly left the page, and they read laterally, this move that we’ve described as “lateral reading”: thinking about the horizontal axis of the browser, opening new tabs, and reading across those tabs by searching the broader web for information about the original site. And they engaged in click restraint: when they did get search results, rather than clicking on the first one, they would often pause and decide which might be the best source. That might take 30 seconds but often gave them much better information than they would otherwise have encountered. And they distrusted their intelligence.
They didn’t think they knew the answer. Instead, they turned to the web, understanding that it could often quickly provide much better information. Again, looking at that chart, they were able in just over three minutes to find out that Berman and Co., a public relations firm, is behind minimumwage.com. That changes the way we’re going to think about that source of information. It doesn’t mean that we have to reject it wholesale, but we should at least take that into account when we’re evaluating the information it provides. In looking at the work of the fact checkers versus the other two groups, we saw that they were guided by three big questions: Who’s behind the information? What’s the evidence? And what do other sources say? We refer to this as civic online reasoning. These are the skills and questions that we need to ask in order to be civically engaged in this information age. And so we set out to create curriculum that could help students learn these questions and the skills of fact checkers—for instance, this move of lateral reading: leaving a given website, opening new tabs, and seeing what other sources say to get more credible information. And we tested these curriculum materials last year. We did a study in a large, urban district in the Midwest, where we gave two groups of students a pretest and a post-test; some of those students received the Civic Online Reasoning curriculum, and some didn’t. And what we found was that the students in the Civic Online Reasoning classrooms became more skilled at evaluating digital content after just six lessons. This brief excerpt from a student response shows how one of the students evaluated that same video after receiving the curriculum.
So at the beginning of the study, she had said, “I don’t think it’s good because I want to see more videos,” but after the lessons, she said, “I’m wondering who’s behind the publishing of this. Who is responsible for getting this on film? What were their intentions? Is there more information about this somewhere else?” “I don’t trust this video. I’d definitely just Google ‘Pennsylvania Democratic voter fraud.’” And when she did, an article from the fact-checking site Snopes popped up, and she read it: “‘A video passed off as voter fraud committed by Democrats during the 2016 primary was filmed in Russia.’ So I already know this is not true. It is just completely false.” A really dramatic turn: she has learned a skill that can quickly give her better information about what she encounters online. The materials we used in this curriculum intervention are available for free on our website, cor.stanford.edu. They are organized around those three big questions: “Who’s behind it?” “What’s the evidence?” And “what do other sources say?” The website also features a series of videos that provide tips about teaching these skills, as well as general information from a video series that we made in collaboration with John Green and his team at Crash Course. So I hope you’ll have a chance to check those out, and I look forward to answering more questions once we get to the question and answer session at the end. If you have additional questions after this session, I would also look forward to being in touch directly via email. But for now, I would like to introduce my colleague, Jevin West, who is an Associate Professor in the Information School at the University of Washington. He is the co-founder of the DataLab and the director of the new Center for an Informed Public at UW.
His research and teaching focus on the impact of technology on science and society, with an emphasis on slowing the spread of misinformation. He co-developed a course, “Calling BS,” that teaches students how to combat misinformation wrapped in data, figures, and statistics. The course is now being taught at universities around the globe. Welcome, Jevin.

 

[Dr. Jevin West]: Thanks so much, Joel. I’m going to share my screen. The work that Joel and his group at Stanford are doing is something we reference all the time, so I just want to say that I can serve as sort of an outside source to evaluate his sources. If I were another tab in that other screen, I am that other tab. It’s really important work; we, and a lot of people in the field, look to it constantly. So okay, I’m going to share my screen. I just want to make sure that you can see it. Joel, just give me a thumbs up if you can see it. Okay, perfect. Okay. So I first want to introduce myself. Joel’s already given some of the background, but I am the director of a new center at the University of Washington called the Center for an Informed Public. We launched in December, not knowing that COVID was around the corner, as a way of providing a hub for researchers, policy makers, journalists, educators, librarians—you name it—around this issue of misinformation. Of course, COVID hit and has kept us up day and night ever since, but this is really at the center of what we do. Our mission statement in the Center is “to resist strategic misinformation, promote an informed society, and strengthen democratic discourse.” A big part of what we do is community engagement and education, so what I’m going to talk about today are some of the education efforts we have, and I’ll leave most of the research for the question and answer. But I will refer to it often. So let me start with one project that my colleague, Carl Bergstrom, and I released at the beginning of last year. This was a project we called whichfaceisreal.com. It’s a very simple game that anyone can play: just go to the website and play it with your students, your kids, or your friends.
The game’s simple: one image is real, a real person on this earth who exists, as far as I know, and the other was generated in our lab or someone else’s lab using what’s called GAN technology. These are generative adversarial networks, algorithms that create photorealistic faces, and much of the technology behind what are called deep fakes, or synthetic media. Deep fakes give anyone with access to this technology the ability to put whatever words you want in someone’s mouth and have them say pretty much anything on video. And that makes it even more difficult for our children to discern what’s real, because if they see an image of a person on social media, they take it as an indication that it might be a real person. Likewise, if they see a video of a politician or some leader saying something they never said, that can make it really, really hard. So not only do they live in this social media environment where there’s all sorts of fabricated news and all sorts of manipulated advertisements—the world they’re growing up in, I think, is much more difficult when it comes to misinformation—now they have deep fakes to contend with as well. So this project, among many that we have going, was a way to bring public attention to the technology. It doesn’t mean that we have to give up and not believe anything we see online; in fact, that concerns me almost as much for the younger generation. As we talk about misinformation and disinformation, I want them to be better discerners of the truth, and we want to give them the tools to be better BS callers, but we also don’t want to create a generation of nihilists that just trusts nothing. Because democracy depends on us trusting these institutions and gatekeepers and experts.
So when we talk about misinformation, we really want to be careful, because we want them to retain some trust in these institutions. Now, I don’t have time to play the game—you can go play it yourself—but at least in this example, if you’re looking at the images, the one on the right with the blue shirt is the one that’s real. Don’t feel bad if you chose the one in the red shirt. I look at thousands of these and still miss many of them, so don’t feel bad at all. It’s actually a really hard game. We also have a deep fake quiz that we’re going to be releasing in a couple weeks. This is in collaboration with Microsoft, USA Today, DeepTrace, and our Center for an Informed Public. It will be a game that anyone in the public can play; kids can play it. In this game, there will be images, but there will also be videos, which makes it even more difficult. You’ll see how difficult it is. Hopefully it won’t make you want to give up, like I said, but it will give you a little bit of exposure to what we might see in the upcoming election. That’s the big concern right now: that this technology is going to rear its ugly head maybe a week or a couple days before the election. It’s already happened in other countries—in Brazil, in Iraq, and in countries in Africa—before major elections, so there is some concern. Maybe it won’t happen, but we need to be prepared for it. So we have this deep fake quiz we’ll be releasing nationally; anyone can take it. You’ll get to play with some of the videos. It’s kind of fun, and it is also scary. We also have a public workshop coming up that anyone can join. If you want to learn more about deep fakes, you’ll see it highlighted—it’s on September 1st at 2 pm EDT. There’s a website there, and I’ll make these slides available.
The workshop will bring together journalists from around the world, technologists, and policy makers to talk about deep fakes, just to give you more exposure if you’ve never really heard of them. All right, let me talk a little bit about misinformation during COVID. As I said, our center is devoted to tracking rumors and misinformation during crisis events. That’s our expertise, and as you can imagine, this is absolutely considered a crisis event. You can also imagine that rumors have been flying; false rumors, of course, have been flying at viral speeds. We track those kinds of things, and we try to understand the dynamics of amplification, the individuals involved, and how things take off. Here’s an example of a graph of the number of tweets around the 5G conspiracy theory, which is the idea that 5G is behind a lot of what we’re seeing with this pandemic, that it suppresses our immune system. Of course, none of this is true. It’s not true. I want to emphasize: it’s not true. But it spread, and actually the 5G conspiracy theory existed long before the pandemic; these crisis events allow a lot of conspiracy theories to re-emerge and go viral again. And a lot of times, conspiracy theories collide. We’ll be publishing a bunch of papers on this, about COVID specifically, but I mention it because we are constantly being flooded with these conspiracy theories. One of the reasons they do so well during crisis events is that there’s a really high level of uncertainty and anxiety. People want information, and those conspiracy theorists, propagandists, and opportunists fly in and provide simple answers to some of the hard questions that science and research are grappling with right now as we try to learn as much as we can about this pandemic. We are also concerned, of course, not just about COVID, but about the upcoming U.S. election.
So we’ve engaged in a collaboration with Stanford University and several other organizations to do real-time monitoring of misinformation during the elections, specifically on nonpartisan issues—one in particular being election and voting integrity. We publicly announced that a couple weeks ago, and again, that’s something we want everyone involved in. In fact, a lot of our input—the kinds of things people see on Facebook or Twitter—will be coming from regular citizens, and it can come from students. We’ve already engaged some students, so if you have students who want to take part in this, let us know. We also have the AARP; we have librarians; we have all sorts of organizations—if they see things in their digital worlds, they’re going to send them to us. So if you want to learn more, you can go to this website or contact me directly if you’re interested in monitoring misinformation, which, by the way, is great practice in fact checking. We use it as an education opportunity, but also on a really important issue. Our general philosophy is to engage with the community as well as do research and engage with policymakers. So we have this idea of community labs, and we’ve been doing programs like what we call MisinfoDay, where we bring hundreds of high school students to our campus (back in the days when we actually came to campus, and hopefully we’ll get back to that). We spend an entire day talking about misinformation, including research out of Joel’s group and the kind of work Diana, who speaks next, will be doing. And we just saw the formation of what’s called MisinfoDay Jr., led by high school students along with high school, middle school, and even elementary school teachers.
They’re meeting every Monday, if you’re interested, to talk about how to bring this curriculum into high schools, middle schools, and elementary schools more formally. We’re doing this in Washington, but it’s expanding; I think we already have teachers from California and other states, so if you’re interested, contact me about that. Now I just want to mention a couple of things about the research that’s out there, in case you’re new to this and wondering where to start reading. First, I would go read Diana’s stuff and Joel’s stuff; that’s where I would start. After that, I would start looking at the social media and Internet usage of teens, tweens, and children zero to six. There are several good reports out there. I often refer to the EU Kids Online 2020 report, which just came out. It gives you all sorts of data on when kids are picking up social media and how they differ by gender, by country, and by social media platform. I find those really helpful for us as we go and talk to kids in high schools, middle schools, and elementary schools. Another very good report that I would refer to, on the right here, is the Common Sense report that comes out every year; this is 2019, but there’s also 2018. One of the big take-homes in these reports, which are very similar in what they have found, is that kids, as you might expect, are picking up social media earlier and earlier. They’re spending more time on it and more time on phones, certainly more than we ever did at that age.
So it’s more and more important that we talk about this exact issue we’re discussing today: in that world, they’re likely to come across manipulative advertising, news that’s purposely manipulative, clickbait—all the kinds of things we worry about in the misinformation world. I think that’s really what sets the stage for us. We work with a lot of researchers in the Information School, where I’m at, who spend a lot of time on what’s called Digital Youth. It’s an area of research that spends time with kids in research labs—very similar to Joel’s group at Stanford University—trying to understand what pulls them in and what fails to get their attention in social media and in these online environments. There are also a whole bunch of other reports I could mention. And we have a new book that we released a couple weeks ago. The title uses a stronger word, but as you’ll see if you read it, we take the subject very seriously. It’s actually something we spend a lot of time talking to high schools about. We don’t use the term with them; we use “BS” or “malarkey” or other things, but we do take it very seriously. The book talks about ways in which everyone can be empowered to live in this world, especially with misinformation that comes wrapped in data. I’ll leave it at that if you’re interested. So feel free to contact me; I know we’re going to have a question and answer session after Diana speaks, and I’m looking forward to your questions, but please feel free to reach out to me, either through our center, Twitter, or email. With that, I am going to turn it over to my fellow colleague and panelist, Diana.

 

[Dr. Joel Breakstone]: Thanks so much, Jevin. Before we move on to Diana, we wanted to have you answer one question that came through in the chat. It relates to the point you made about not wanting students—and young people in general, and perhaps all of us—to end up believing that there’s no information we can trust; there’s a crucial difference between being skeptical and being cynical, between questioning sources and taking the nihilistic view that there is no truth. So somebody asked, “My teen is developing a mistrust of any political news. What should I say to her?” Some thoughts on that front?

 

[Dr. Jevin West]: I think it’s a really good question. It’s something that actually keeps me up at night. There are times when I wake up in the morning wondering, are we talking too much about this to the public, to students, to teachers, to librarians? And eventually, after thinking about it, I say no, we need to talk about it, but at the same time we need to tell them that a lot of the institutions we depend on still work. For example, science has problems; my co-author Carl Bergstrom and I talk about this a lot in our book. We say, “Here are some issues with the replication crisis, with some of the incentive structures, with the way we fund science, etc., but it still works. We still fly in 747s. We still have these amazing computers in our pockets that give us access to almost any information.” So it still works despite its problems. What I also do when I talk to students is discuss purposeful misinformation: a lot of the time, disinformation campaigners’ intent is simply to sow distrust and inject noise into the system so that we don’t trust anything. That is one of the big goals of disinformation. Not all misinformation has that kind of goal; some people are just trying to make a buck, and they can make a lot of bucks from it. There have been plenty of examples of that: the Macedonian teenagers during the 2016 election, for one. They didn’t care who won the election; they were just making money. But those bad state actors out there, and those making a living on conspiracy theories, really are just trying to sow that distrust so that you don’t believe in anything. You retreat to small neighborhoods of just a couple of friends or neighbors you trust, and we just can’t function as a democracy if that’s the case.
So when my kids ask—I have kids too, so it's great when I get those questions because I can empathize very much—I talk to them about how, yes, there are these problems with the hyperpartisan news they find everywhere, but you can look for sources, and there are places that do better than others. And we want to get better, because it's their generation that hopefully will help fix this problem we have of extreme hyperpartisan news, a rise in misinformation, etc. But I do tell them that there is reliable information, and that some of the national media everyone wants to give a hard time to are the ones that have done a lot of the great whistleblowing around major movements like #MeToo. Those stories didn't come from people just sitting on Facebook sharing something a friend found. So there is reliable, good journalism out there, and we just have to talk about the ways we can identify it.

 

[Dr. Joel Breakstone]: Thanks so much. Really crucial points. Thank you, and we’ll have time for more questions after Diana’s presentation, but now I’d like to introduce Diana. Diana Graber is an expert on digital literacy and the author of Raising Humans in a Digital World: Helping Kids Build a Healthy Relationship with Technology. She is the co-founder of CyberWise, a resource for adults to support youth in using digital media safely and wisely, and the founder of Cyber Civics, a popular middle school digital literacy program. Diana, welcome. 

 

[Diana Graber]: Thanks so much, and thank you guys for laying out such a great case for why this is really important stuff to teach your children. Hopefully I can give parents a few tidbits of information that will help them do that. First, I'm going to share my screen. I won't be able to see you guys, so could someone tell me audibly if you can see it okay? Does that look all right?

 

[Dr. Joel Breakstone]: Yup, we can see it. 

 

[Diana Graber]: Okay, great. Okay. Alright! So as I mentioned, I'm Diana Graber. I'm the founder of Cyber Civics and CyberWise and also the author of Raising Humans in a Digital World. We'll talk about all that in just a moment, but let me see if I can get to my next screen here. Okay. So I love this quote; you've probably seen it out and about: "We're not fighting an epidemic; we're fighting an infodemic." And I think that's so true, not just for us but for our kids too, because a lot of the misinformation we're seeing on our own social media networks, and sometimes even in the mainstream news, our kids are seeing tenfold in the places where they get news. And as you probably know if you have kids, our kids are not getting their news in the same places we're getting ours. They mostly get it from their social media networks. I love this little article that came out this week from Parentology; it found that right now, Instagram is the number one provider of news to children, followed closely by Snapchat and TikTok. So tip number one I'd give parents is this: be super aware of where your kids are getting their news, because you can't really talk to them about it unless you know what they're seeing. So if your almost-thirteen- or fourteen-year-olds are using these social media networks, make sure you are too, and take a look at what they're seeing online. I do this often myself with Snapchat. I think Snapchat is a really interesting way of providing news to kids. They have this whole area called "Discover," with these little news tidbits, so I often scroll through there to see the kinds of things my students are seeing, and that gives me something to talk to them about.
Alright, so what I really wanted to talk to you about today: one of the things I do is Cyber Civics, a middle school digital literacy curriculum we founded that spans sixth, seventh, and eighth grade, which is really the time, I think, to teach kids all of the things that underscore digital literacy. And I do want to say that within the curriculum, we actually provide the Stanford study for teachers as essential reading before they teach it, because it's so important for them to know what they're up against. But there's no one silver bullet for teaching your kids about fake news. Unfortunately, I hate to tell you this, but it's a really complex issue, so we roll it out over three years, and I'm going to show you how; maybe you can take things away from this that you can do with your own families. I think it's really important to put this into the context of media literacy, because media literacy is really understanding how to use critical thinking skills to analyze media messages, but it goes even deeper than that. So what we do with kids starting in sixth grade is teach them what it means to be a good digital citizen, and one of our responsibilities as citizens of the digital world is to be super mindful of what we like and what we share online, because that is important. So that's where we start. Then as we move kids through the curriculum, we start teaching them what information is, and I thought about this as Joel was speaking, because one of the things we teach them to do is read a page of search results, like a Google results page. It's really hard to read, and when you look something up, it's hard to tell: what's the ad? Who's selling me something? Where is my real information? And often, the real information you need is super low on the page, so that's something.
We also want kids to understand how information is customized to them based on what they searched for before. Then we give kids strategies to spot misinformation. We use something called the C.R.A.P. test, which you'll see in just a moment; it's a super easy acronym that kids like and remember, and it gives them something they can use whenever they run across something online that might be questionable, or that they wonder is real. Then it's important for kids to know how news is made. That's a really essential component of media literacy: for them to come away understanding that anyone can make news and anyone can share news, and that puts a heavy responsibility on all of us as digital citizens, not only to be mindful of the things we make and post online, but of what we share with others. And then, as was mentioned earlier, it's important for kids to know about filter bubbles, clickbait, and deep fakes; these are essential components of our digital world right now, so understanding what they are, recognizing them, and knowing what to do when they see them is so, so important for kids. And then finally, what to do when they encounter misinformation. Luckily for them, a lot of the networks they use, like Instagram, have really good mechanisms that allow kids to mark things as fake news, and that's super important too, because fact checkers aren't perfect. It's a lot of work, so it's up to us as citizens of a digital world to help the fact checkers by actually marking things when we see misinformation online. Now, I know this seems like a lot, and it is, but all of these components are really important.
But I want to take a step back for a minute, because here's what I often find in the classroom—I've been teaching Cyber Civics myself now for, gosh, this'll be my eleventh year: kids love to say, when somebody makes a joke or says something they don't like, "That's fake news!" It's become this thing we throw around without really understanding what it is. So I like to show kids one of our student videos that actually answers that question: what is fake news? So I've got it here for you. It's pretty short.

 

[Video]: You’ve probably heard this term. [Compilation of various news clips saying “fake news”] But what exactly is fake news? And more importantly, why should you care about it? Let’s start by looking at its history. (Sound bite: “Wow, wait. That wasn’t a shadow. It’s something moving. What? It’s standing on legs. Those strange beings who landed in the Jersey farmlands tonight are the vanguard of an invading army from the planet Mars.”) While fake news is nothing new, the Internet has made it easier than ever for fake news to spread and even happen in the first place. Here’s why: before the Internet, most people got their news from the paper, radio, or television. Because there were fewer sources providing news, it was in the best interest of each to be as reputable as possible, but with the Internet, news moved online. Suddenly, anyone could post information on places like Facebook and Twitter. With so much information coming at us from all angles, it’s easy to get duped, especially when articles are made to look like verified news sources. People generally believe it to be true because it looks like news. This is happening more than ever. In fact, studies show that 75% of people who see fake news think it’s real news. It can be really hard to tell when something is fake; even our own eyes can be tricked. This is called a deep fake. Videos like this one use artificial intelligence to make it look like someone is saying or doing something they never actually did. Being duped by false information can have devastating effects on society and our democracy. That’s why it’s more important than ever for you to know what fake news is, be able to recognize it, and know how to stop it from spreading. So back to our original question, what is fake news? Fake news is when news stories or hoaxes are created to deliberately misinform or deceive. It also helps to know what fake news is not. News you don’t like or simply don’t agree with is not fake news. 
Stories that poke fun at real news on parody sites, for example, are not fake news. Opinion pieces on news sites are not fake news, and honest mistakes are not fake news. Still, recognizing fake news is hard. That’s why it’s up to you to be critical of what you see and hear online. A good way to do this is by using the C.R.A.P. test. Find out if the article is current. Sometimes old articles are recirculated online. Ask if the site where the article is posted is reputable. Open a second tab on your computer and look into the site that hosts the article. Find out who the author is. Is it a person with verifiable credentials? Find out the purpose or point of view of the article. Is it trying to sell you something or convince you of their position? Finally, you can always use plain, old common sense. If you see something online that makes you scratch your head, then it’s time to start doing some sleuthing. These sites can help. It can be easy to be tricked online, but if you’re smart and ask questions, you can stop fake news in its tracks. Whatever you do, don’t—

 

[Diana Graber]: Whoopsie. (Laughs) I think I hit that too soon. Sorry, it was just going to say, "Whatever you do, please don't share fake news." So I went through that super fast, but I think it is time to move on. Regardless of how you feel about all this, there are ways to help your kids know what to do when they encounter fake news. So I'm not going to give you just one tip right now; I'm going to give you a bunch of resources where you can go for a whole bunch of tips to help your children. Number one, in my book, I wrote an entire chapter on critical thinking, and at the very end of it there are two activities you can do with your children at home. One is called "Detecting C.R.A.P."—it actually includes the C.R.A.P. test and tells you how to do it—and the other is a game called "Are you a consumer or a producer?" We also have a website for parents, cyberwise.org. We have what we call our "learning hubs," and we have a whole hub dedicated to fake news. That's the URL right there, and if you go there you'll find the video I just showed you, articles we've collected about fake news, a downloadable infographic, three games you can play to help your kids identify fake news, and other things like that. And finally, I'm a big fan of the C.R.A.P. test, and here's basically why: middle school kids are natural skeptics. They love things that are funny and memorable, so when you write the word "crap" on the board and tell them that's what you're going to teach them today, kids generally remember that. And it's such an easy acronym to teach your kids: Is it current? Is it reputable? Who's the author? What's their purpose or point of view? You can apply that to web pages, websites, and articles. It's super easy to do with your kids, and if you go to our Cyber Civics website, we're giving away that lesson for free.
I love this one because it applies the test to 10 websites that we collected. Eight are fake and two are not, so it's fun to see if the kids can figure out which are which. We also send out a newsletter every two weeks through CyberWise. This week, our newsletter actually focused on fake news, so it had all kinds of resources and information in it; you can get a copy at this URL. One thing I really liked was a little article on how to report fake news on Instagram. Again, I think it's important for our kids to know that they can be proactive digital citizens by helping the networks they use mark things as fake when they see them, and it's really empowering for a young person not only to know how to look at something and decide if it's real or fake, but then to have something they can do about it. One last thing I want to talk about before I show you TikTok a little closer: one of the reasons I connected with Children and Screens is that Pamela and I, I think, really come from the same background. We think a lot of the issues and problems we see online with our children come down to one thing: so many kids are using social media networks when they're super young, before they're old enough. Nearly every social media network out there requires kids to be at least 13 years of age, and that's for a really good reason: it takes kids that long to develop the critical thinking and ethical thinking skills that are really required to look at something and understand whether it's real, true, or fake. So you can teach these things to kids under 13, but it's really not going to make a lot of sense in their brains until they get a little bit older. Scaffolding the lessons in a way that makes sense to them really helps young people make sense of a very difficult digital world. So I'm going to show you one last thing.
This is kind of a last-minute addition, but as I mentioned, on our CyberWise website we have all these learning hubs. This is a really good one, because last week we took a look at TikTok and media literacy. TikTok actually has some great little videos right now to help kids know what to do when they see misinformation online. There are five of them, and you can find them all on our website. This is a really cute one about distinguishing reputable sources from questionable ones. They're short, they speak directly to children, and kids love them, but I would really encourage you to watch them together with your children. Again, if you want to find these, go to the CyberWise website, under our TikTok Hub. And then finally, I know I went through everything super fast today, but please feel free to reach out to me directly. I will get that email. Also, those are our three websites; all three have a lot of information on this topic, and I'm personally very passionate about it—I don't know if you can tell—so feel free to reach out if you have any questions or want to know where to get these resources. So thank you. I will stop my share and go back. Alright, here I am.

 

[Dr. Joel Breakstone]: Thanks so much, Diana. So we're going to enter a question and answer session. For all the attendees, if you would like to ask a question, please put it into the Q&A tab, and when you enter your question, please also indicate whether you would like to ask it live on camera or if you would prefer to send it as text for me to read to the panelists. As people are asking their questions, I want to ask Diana one follow-up question, which is about schools in particular. How do we convince schools that it's better to teach media literacy and strategies for evaluating online information than to use filters, apps, or websites that block information from students? For instance, we often find as we're working with schools that there are firewalls that prevent students from accessing social media sites on the school's network, and as a result, it's a lot more difficult to have students figure out how to evaluate information on their own—the kind of information that is immediately accessible when they pull the phone out of their pocket and encounter the web. So, some thoughts about working with schools?

 

[Diana Graber]: Yeah, well, that's an easy one, because number one, I would say the best filter in the world is the one kids keep between their ears: their brain. And number two, I have never met a kid who can't disable or get around any filter a parent or an adult puts on their phone, computer, etc., so those are imperfect solutions. The best solution is teaching kids how to do this for themselves, and I'll tell you, having done this for so long—you guys will agree with me, I'm sure—kids love this stuff. Especially in middle school, they are natural skeptics. They love to find mistakes, they feel excited when they find things that are wrong and want to report them, and it just hits them at such an important developmental stage, so it is the most effective way to do it. The other thing is, we spent last year aligning our curriculum with the Common Core ELA standards, and it meets the same standards as English Language Arts. I mean, this is English Language Arts today, so I would urge any school out there: you have got to teach your kids this stuff. It's required, and also, where are kids reading? They're reading the Internet, so we have to teach them how to be better consumers, readers, and interpreters of what they're getting online. So have them call me if they have any questions.

 

[Dr. Joel Breakstone]: Super, thanks so much. So we have a question from Lisa Guernsey, and she is going to ask that question live. Welcome, Lisa. 

 

[Lisa Guernsey]: Hi everyone. This is so helpful. I'm at New America; we've tried to track some of these issues for educators and education policy people. I have a question about the C.R.A.P. test and lateral reading, something I've been trying to understand better. The lateral reading movement has been really helpful—I've been hearing more and more about how much of a difference it makes—and the C.R.A.P. test, I think, came before lateral reading, if I'm understanding the research. So, number one, is it possible to do both? And should students also be taught lateral reading in a very explicit way if the C.R.A.P. test doesn't do enough of that?

 

[Dr. Joel Breakstone]: In terms of lateral reading, we definitely think it's an explicit strategy we should teach and that students need to learn. Before reading deeply on a website, the most effective strategy is to leave that website to find out more about it. We don't want students to be drawn in by the information the website itself provides, because the people who own the website control all the information on it, and if students spend too much time reading the website itself, they're more likely to end up focused on the content its creators control. Our experience and research have shown that it's really crucial to get students off of a website. It seems counterintuitive, but to understand a website, you need to leave it. And what we found is that by teaching students that one skill, which we can do in a relatively short amount of time, we end up with much better results in terms of the kinds of conclusions students draw about a given website. We don't want them to focus on strategies that might have been useful in the earlier days of the web, things like, is it ".org" or ".com"? These are things students have had ingrained in them and know very well. In one of our tasks, we gave students two different websites. One was a duke.edu site about gun control, and the other was a Wikipedia page about gun control policy. The Duke page was actually somebody reposting an NRA article, and the Wikipedia article had 200 citations. Our question was, what's the best starting place for research?
Most students picked the duke.edu site just because it's a ".edu" website, and they said, "I've been taught by my teachers not to use Wikipedia." So part of this is teaching them strategies that work in our current digital environment. Looking just at the URL doesn't work very well; ".org" doesn't tell us anything other than the domain an organization paid for. And Wikipedia can be really useful as a starting place. It shouldn't be the end of our research, but well-researched Wikipedia pages are incredibly useful. So we think lateral reading is a crucial strategy to teach explicitly at the outset, and the difference between the fact checkers and the other really smart folks who spent a lot of time on the website itself is really clear. Thank you so much for that question, Lisa. Really helpful. Another question has come in: "A question for youth on how to question someone's fake posting?" So how do we deal with a peer who may be bringing up a fake item? Diana or Jevin, do you want to take that on?

 

[Dr. Jevin West]: Sure, I'm happy to jump in, and Diana, please jump in afterwards. I do want to echo one comment Diana made: students love being able to figure out when something isn't right. In fact, we have students literally run up to the board and say, "Ha, I see something—I see something on the screen!" It's a real joy to see that empowerment, because humans are curious. So what happens when students find that a peer on social media is posting something that sounds too good to be true or sounds suspicious? How should they approach it? One of the most important things is to approach it knowing that there is a human on the other side who can get their feelings hurt, who can be offended if you go after them with ad hominem attacks and call them stupid. So in our class at the university, and also in our book, we talk a lot about the importance of civility. Try to focus on asking questions: where did you find that? Start with questions like, where do you think that person got it? Just be curious and address it, as my colleague Mike Caulfield says, like an encyclopedian rather than a lawyer: ask the kinds of questions you would as a fact checker. A lot of times you can even get at values: "Oh, you think this is a really important issue." Let's say it's gun control, or whatever might be politically divisive; you can talk about a common set of values. Just really try to avoid attacking the individual. That's not easy to do; we're human, and sometimes it happens. But be curious and ask questions as if you were an investigative reporter—not a mean investigative reporter—just asking: where did you learn that? Can you take me to that site? Can you provide a link? Did you see this other website?
So just try to do that in a non-confrontational way. It's not easy to teach students that, but if we at least talk about it explicitly, they sometimes come up with those solutions themselves. They'll say, "Oh, Professor West, you could do it this way," and I'll go, "Ah!" I'd never thought about addressing it that way on social media. There are customs and cultures online within different generations that even I'm not aware of. So anyway, I think opening that discussion is how you avoid the personal attacks, and focusing on the actual source itself, doing the kinds of things we talk about more broadly, is, I think, the way to go about it.

 

[Diana Graber]: I would just jump in too. I so agree with you, Jevin. That's really the whole purpose for which I wrote Raising Humans in a Digital World: to remember, and this is what we underscore in every lesson, that behind every online interaction is a real human with real feelings. I think this whole idea of empathy is missing so much, especially with our children. So teach them: if there's something you don't like, go directly to the person and have a conversation. If you can, have it face-to-face—that's hard today—but have it directly with that person, because then you can be empathetic about why they posted it. Maybe they didn't have the advantage of learning the C.R.A.P. test and just don't understand. So always come back to that idea of empathy and the real human behind the screen. I think we cannot say that enough to our children.

 

[Dr. Joel Breakstone]: Really crucial points, thanks so much. I also need to clarify what I was talking about a moment ago; a question came through asking for a definition of lateral reading. Lateral reading is a move we saw fact checkers engage in during our research: when they encountered an unfamiliar website, they opened new tabs in their browser and read across those tabs. So think of reading laterally, rather than the vertical reading on a single web page that we saw so often lead to weaker evaluations of sources. When the Stanford students and the historians stayed on the pages, they were less likely to come to strong conclusions about a site; in contrast, the fact checkers read laterally by opening up tabs and searching the web for more information about it. One particularly important takeaway from that research was that it was an efficient strategy, because one issue we all need to take into account is that we need approaches that work quickly. We're not going to spend hours and hours tracking down every detail about the claims we encounter online; there's so much information out there. We need ways that can be done quickly and are realistic, that all of us can deploy effectively in real life. We now—

 

[Dr. Jevin West]: Joel, can I jump in real quick on what you were saying? I think this lateral reading is so important, and like I said before, I refer to it often when we talk to students. But there are these really simple tricks when you're doing lateral reading. I'll give you one that I use all the time and that we talk about with students. Let's say you run into a new site you've never seen. You just take the URL. Let's say it's "wallstreetjournal.com," and it's not quite spelled right. You can put that into Google and add a space and then "wiki" or "Wikipedia." Almost always, very reliably, Wikipedia will give you the political slant, when the site was formed. You can also use "WHOIS" or some of these other services that show you when the website was registered. Just doing that gives you at least a start toward correcting for it. And what's crazy to me—Joel was talking about this too—is that Wikipedia is actually this ray of hope in this mess of misinformation. In our lab and our center, we look at billions of social media posts across the platforms: Twitter, Reddit, and others. Wikipedia has its issues for sure, and certainly you can't always trust it, but it's actually one of the more reliable places. So there's this idea we've taught students for a long time—and I was one of those; ask me fifteen years ago if I would have trusted it, and I probably wouldn't have—but I think Wikipedia, if you're going to start somewhere, is more reliable than a lot of the other social media platforms.

 

[Dr. Joel Breakstone]: Yeah, and I think it raises this crucial point: we need practice, and students need practice, to begin to learn which strategies work—tools like adding "wiki" to a search to find more information about a site and figure out what works well. Thanks so much. We now have a question from Cyndy from Project Look Sharp.

 

[Dr. Cyndy Scheibe]: How you doing, Joel?

 

[Dr. Joel Breakstone]: I’m doing well.

 

[Dr. Cyndy Scheibe]: I work with Chris Sperry, and I know you did an article for Social Education that he guest-edited. My question has to do with teaching students to reflect on their own biases, in particular the confirmation bias we all face when we are choosing what to read, what to remember, and what to choose, and how to get people not just to critique what they're seeing but also to be aware of their own biases in this whole process. I'm loving what you guys are talking about and would love to continue to collaborate. So, any suggestions?

 

[Dr. Jevin West]: I'll go first. I can say a few things and then turn it over to Diana or Joel, because this is such an important issue. When we had our very first Misinfo Day, we had all these topics to choose from—we only had from about 10 am to 2 pm to talk to all these students about all these issues of misinformation—and we could not exclude confirmation bias. We knew it was going to be difficult to talk to students about this self-reflection and the biases that every human has. Sometimes those biases can be helpful in the world we live in, but many times they are major hindrances, especially when we're trying to discern truth from untruth. We can be amazing motivated reasoners when something fits the narrative we want. So we do exercises where we talk about the concept, and we do our best to select news stories that we think will reflect some of the biases that exist in the room. We never hit it exactly right. We're happy to share any of the data we have on the exercises we do for our Misinfo Day—just let me know—but this idea of confirmation bias, this meta-level, self-reflective thinking, really helps students see more than just the issue of misinformation: how we can get blindsided by it, and how our current news ecosystem really plays on those biases. And being aware of it doesn't mean we don't fall for it. I still fall for it; we all do. We have these biases, and there's great work going on right now—I look to David Rand at MIT and Gordon Pennycook, who are doing a bunch of really great work on confirmation bias and motivated reasoning—finding, I think, some positive results: that we're all smart enough to do this.
We just have to be aware of these biases, and aware that these platforms are running gazillions of A/B experiments right now taking advantage of them. So I think doing some of these exercises will be helpful. We're happy to share some, and I know there's a lot of great work out there as well.

 

[Diana Graber]: And I would just say, to jump on that, Jevin: that’s such a great question, Cyndy, and I think you make a wonderful case for why none of this stuff can be taught alone. It fits together like a puzzle for the kids. As I mentioned earlier, that’s why we teach the kids what Wikipedia is: first of all, how does it work, and why does it work? Then you layer on top of that the C.R.A.P. test, and then you layer on, what is visual literacy? Then you layer on your own biases, and on top of that, how does the media misrepresent different people online? All of these pieces are part of this puzzle, and that’s why I think media literacy is not a one-stop lesson. You’ve got to put all these parts together in a way that makes sense to kids, and at a time that meets them developmentally. I’m such a big proponent of that because it works. These kids will come out of high school and beyond, and they’re going to make a better Internet. That’s what I believe. 

 

[Dr. Joel Breakstone]: Thanks so much for that question, Cyndy. I really appreciate it. We’ve got another question that came in through the chat. It says, “Lots of great info aimed at primary and high school kids, but I’m dealing with college students in Boston who seem to be at about par with some/many high school kids or below them.” I think this certainly matches what we have seen. We’ve done a series of studies now that include college students, and the college students absolutely struggle in ways very similar to high school students. We’ve seen the same patterns play out: students focusing on the content of web pages rather than doing any investigation of where that content came from, and relying on the appearance of credibility, saying, “This looks like it is peer-reviewed” rather than seeking to determine that it is, or saying something looks good because of the appearance of the website, or because it has an author or contact information listed. We’ve seen that over and over again. College students are in the same boat, and our research with historians points out that many adults are as well. We’ve spent much less time doing research with adults, but these were very well-educated, smart people, and they struggled too. It speaks to the fact that none of us were prepared to deal with the current information landscape, and so we need to think through how to provide support for people to learn strategies that are effective. How do you begin to recognize a deep fake, and how do you think about finding more trustworthy information? We just did a study as part of an online nutrition class, fully asynchronous, so the students were completing all of the tasks online by themselves. We gave them a series of activities in which we taught the skill of lateral reading and had them watch some of the videos that John Green and the Crash Course team had created. 
And the results haven’t been published yet, but the initial look suggests we can move the dial a little bit. We did another study two years ago at a large state university, where we did in-person instruction, and we saw improvement from students in college classrooms. We can begin to teach these strategies in just a few lessons, and students can begin to be more discerning consumers of information if we show them how. Nobody wants to be duped, and if we give very direct instruction about better ways of finding information, we’ve found that students in college classrooms can do it. Jevin, I know this also speaks directly to your work and the course you’ve created. Do you want to talk a little bit about that? 

 

[Dr. Jevin West]: Yeah, sure. That’s where I have most of my expertise: on a university campus. My colleague and I launched a course in early 2017, something we’d actually been working on for years before the 2016 election. We call it “Calling BS,” but we use the other term, for various reasons. We take the term, like I said, very seriously, and we talk about philosophers who have examined this issue. The first time we offered it, in the spring of 2017, it filled within a minute, and it still continues to fill within minutes of registration: 160 students from majors across campus, mostly seniors, because they have priority in registration. What we have found, just as the question and comment noted, is that college students are also unprepared. There are certain things they do quite well and can learn quite quickly, but in addition to basic media literacy skills, they struggle with things like correlation and causation, selection bias, and what we call graphical malfeasance, which we see everywhere; especially with COVID, there are graphs and data and statistics everywhere, so we have a big focus on that. They have a hard time discerning reliable science venues from unreliable ones, so when they do get to a primary source, it’s hard for them to tell whether it’s reliable, how to check the credentials, all the kinds of things you would hope any citizen of the world would be able to do, and these are seniors at universities. So I totally empathize with that, and I think it’s something we really need to focus on earlier, and also at universities, of course; that’s where I spend most of my time. Some states, including my state of Washington, now have policies passed at the state legislature requiring media literacy, and we want to integrate it everywhere. As Diana says, we want to get that integrated. 
It’s this big puzzle that needs to be integrated across the disciplines, but at the university level, we’re collaborating with over 70 universities in many ways; more than 100 universities have now contacted us wanting to adopt some of the content we created or add onto it, so we’re learning from them as well. Feel free to go to our website, callingbull.org, or the full word, callingbull[shit].org, where you’ll find free videos, case studies, and materials. We put all the material up for free, and if you have other ideas, please send them our way. The book that I mentioned is based on the university course. It’s a trade book for the public, but it draws on a lot of our experiences with college students. Honestly, I think it’s the most important skill any college student can learn before they go out into the real world: teaching people how to recognize when other people are talking rot, as the essayist Alexander Smith put it. I teach classes in data science and statistics, and those are important, but nothing is as important as teaching people to be better discerners of information in a world where we’re overwhelmed with information every single day.

 

[Dr. Joel Breakstone]: Thanks, Jevin. One participant just asked for you to clarify about correlation versus causation.

 

[Dr. Jevin West]: Oh, yes, okay. This is something that, when we talk about it in class, all the students start nodding: “Yes, I know. Correlation doesn’t imply causation.” They think they know it because they hear it through college and maybe even in high school, but they don’t. We do tests just to show them, and they recognize pretty fast that they can’t always tell. This is the idea that when we see data, or a story that connects two variables, we assume there’s a causal connection. I’ll give you a fun example. It’s an easy one, actually from a real statistics paper that tries to make the point (these are the kinds of things you see around COVID all the time). Someone collected data on the number of storks in different countries, in Europe and beyond, and the number of people born in those countries, and said, “Look, there’s this very strong relationship between the number of storks and the number of people: fewer storks in a country, fewer people.” So the claim is that storks must deliver humans. If I were an alien that landed on the earth, I might say, “Aha! Look at this data. Storks deliver humans.” You could make that case by looking at the data, but you’d be falling prey to the correlation argument rather than being careful and looking at alternative explanations. The cause of that relationship is not that storks deliver babies; it’s that small countries have fewer storks and fewer people, and big countries have more people and more storks. These are the kinds of things we go through, and this is ubiquitous. 
In fact, literally every single day I find examples of selection bias problems or correlation-versus-causation issues that good researchers and good journalists fall into, and certainly things we see in social media. Just learning to discern that makes students that much more effective at critical reasoning when they see things on websites, and when they do this lateral reading and see other sources talking about it. So anyway, that’s my quick response.

 

[Dr. Joel Breakstone]: Thanks so much, Jevin. At this point, we want to see if both of you would be willing to offer one very specific, concrete tip to take away from this session, in terms of how to help young people be better at making sense of online information. 

 

[Diana Graber]: Well, I have a thought, because sometimes I have parents do this at workshops, and it’s so easy to do. Anybody who uses Facebook knows there are all kinds of conspiracy theories and fake information on Facebook, so if you have a child who is interested in this topic, have them sit down with you and scroll through your own Facebook feed to find something that makes you scratch your head, like, “Does that sound real?” And do this whole exercise with them. I would open something up and have you and your child give it the C.R.A.P. test: “Hey, is this a current article? Is it reputable?” This is where you could do your lateral reading, where you open a second tab and look to see what the website is and whether it’s a real website. Maybe you go to snopes.com and check it out. Look at the author: is the author somebody who should be writing about this topic? And finally, is there loaded language? Does it seem like it has a purpose or point of view? It’s kind of fun to do together with your kids, but the most important part (I do this all the time on Facebook) is, if it’s fake news, tell Facebook. It’s so easy to do. Hit the three little dots in the upper right-hand corner; a drop-down menu comes up that allows you to mark it as fake news, and it’s super effective. I think I irritate people because I do it so much, but I’ve noticed that in some cases, literally 10 minutes later, Facebook has marked the post as false information. We have to help the fact-checkers out, so teach your kids how to do it! They might find it fun and do it on their own social media networks. 

 

[Dr. Jevin West]: There are so many things, so pinpointing just one is always hard, but if you pinned me down to one, I would say this: when students see a news item or a post on Facebook, Twitter, Instagram, or TikTok, and it creates an emotional response in them, makes them excessively mad or happy or fearful, they should pause. Something we say a lot is “think more, share less.” Think a little bit more about it, and pause before sharing, because when we study this at the population level, it’s really us who are sharing it. There are fake news purveyors, bad actors, and propagandists out there injecting stuff in, but it’s ultimately us sharing it. If students recognize that and pause a little more, that helps, because a lot of what does get shared is the emotion-evoking news headlines; we’re human, again, remember. So that’s one thing I would say to take home. The other thing, and this isn’t even specific advice, is that anyone can get better at this. It just takes practice. We all need practice. I teach a class on calling BS, and I need to practice all the time too. Anyone can do it, anyone can get better, and we have to do it. I think democracy depends on it.

 

[Diana Graber]: Yeah.

 

[Dr. Joel Breakstone]: Yeah, I couldn’t agree more about the stakes at play here and the need for practice. Beyond the move of lateral reading, which I already described, the one other tip I would offer is a move we saw professional fact-checkers use: click restraint. When they did a search, they didn’t click on the very top results. Research shows overwhelmingly that most users click on the top result, and many people think the search engine algorithms are providing the best results at the top of the page, but that’s just not accurate. All sorts of search engine optimization goes into the ordering of results. Even just taking 15 or 30 seconds after you search to scroll down, look at the little snippets that accompany the results, and look at where each result is coming from can really determine the kind of information you’ll encounter, and hopefully you’ll find better information when you click through. So rather than impulsively clicking the top search result, pause for a moment and look for what might be a better starting place for your search on a given topic. We have another question here from Diana in Boston. She writes, “Teaching all of this is a huge undertaking. What policies, if any, are in place to be sure that schools are really taking on this critical issue in developmentally appropriate ways, especially with the increase in computer-based time students are spending as a result of the pandemic?” 

 

[Diana Graber]: I can probably take that one. There are some groups doing really important work in this area, passing legislation state by state requiring that media literacy be taught in the classroom. I was hoping Michelle from NAMLE would join us today; the National Association for Media Literacy Education does a lot of this work. So it’s happening, but it’s not happening fast enough. What I would say to parents is that so much of Cyber Civics happens because parents ask for it. Go to your school principal and say, “Look, my kid is online all the time. They need media literacy,” and have them integrate it into the curriculum, because it’s really important. We have to speak up and make sure that our states are legislating for it, that parents are asking for it, and that principals are making it happen. 

 

[Dr. Jevin West]: Yeah, what Diana said about parents and teachers going and making this happen is absolutely true. In the state of Washington, which I think was one of the first states (we haven’t done everything right in education in Washington, I promise you that, but this is one thing they’ve done quite well), it was basically a teacher who collaborated with a policymaker and said, “This has to happen.” That was many years ago, even before the fake news phenomenon had really taken off, at least as a movement; it’s certainly been around for a long time. Other states are now following and requiring this, and I’m glad to see that. It should have happened a lot sooner. I’m so glad you asked that question, because we are literally working on a report right now that we hope to release, it’s hard to say exactly when, but within months rather than years, hopefully within the next couple of months. We’ve gone through every single state and looked at the policies around this kind of education, from media literacy to digital literacy to digital citizenship, as Joel and others talk a lot about, and how they overlap with Common Core standards, and so on. We’ll be releasing this report covering all 50 states and the policies being implemented right now, and of course we’ll just give it out, like everything we put out at the center. You can follow us, contact me if you want more details, or just go to our center’s site at cip.uw.edu; we post things constantly, so you’ll know when the report comes out. There is a lot of activity; that’s the good news. We just need to keep making it happen, and I think parents and teachers are the ones who make it happen. 

 

[Diana Graber]: Yeah. One other thing, because I forgot their name when I was speaking: Media Literacy Now. I believe they have a map on their website that shows where your state is in terms of media literacy legislation. 

 

[Dr. Joel Breakstone]: There’s a follow-up question on this: “What about policy on a larger scale, e.g. should we be calling our senators about this?” I think you certainly can. That kind of advocacy can’t hurt at this point: pushing for greater intervention and trying to make this kind of instruction happen on a larger scale, so that we see these ideas implemented in instruction. That can only make it more likely to be integrated across the curriculum. We now have a live question as well. Go right ahead. 

 

[Dr. Anirban Banerjee]: Can you hear me?

 

[Dr. Joel Breakstone]: Yes.

 

[Dr. Anirban Banerjee]: So you touched on this, and my question is that it seems that everything that you’re saying is almost universal to—I mean, my WhatsApp group friends could actually listen to this and benefit—and my question is, how do kids react to misinformation differently because of their developmental state? You’ve touched on one, which is that they become overly cynical too early on, I think, compared to an adult. Are there other ways that they react to misinformation specifically because of the developmental state? Please help me understand how my kids would react to this that I cannot…

 

[Diana Graber]: I can touch on that, but what age of kids are you talking about? 

 

[Dr. Anirban Banerjee]: Eight and eleven. 

 

[Diana Graber]: Yeah, so, something that I mentioned earlier: I find it’s really hard teaching this to kids under the age of 12 or 13, and that’s simply because it takes until about 12 years of age for kids to develop the abstract thinking skills they need in order to do ethical and critical thinking. A lot of times with younger children, you’re teaching them yes and no, black and white. I wish every parent could stand in a sixth-grade classroom, because you can almost see the wheels starting to turn as they become able to see the gray. So I would say it’s really hard to teach kids that young to be critical thinkers and understand fake news, and that’s why I’m such a big proponent of holding them back: they shouldn’t be using the social media apps that are purveyors of this stuff until they’re old enough to really understand what they’re encountering. Until then, it’s really incumbent upon you to be with them, looking at things together, explaining them, and pointing out why something is not real and how you figured that out. I wish there were an easier answer, but it’s a spectrum. Kids take a little while to develop all that upstairs so they can do this for themselves. 

 

[Dr. Jevin West]: Yeah, I’ll just jump in. It’s a really hard question, and it’s not my area of expertise. I look to my colleagues on this call, and also to colleagues in my department in the Information School at the University of Washington, like Katie Davis and Jason Yip, who study this full time: kids’ digital worlds, how kids react, the psychology, the culture, everything around kids, actually from ages 0 to 8. I’ve been trying to entice them to think about the misinformation aspect of it, so if you check back with our digital youth group, hopefully they’re going to be doing more studies in this space over the next year. I think there needs to be a lot more research around this, and I think Diana’s intuition is right. At least my own personal intuition is that it’s hard for kids to understand all the complex elements that go into trying to discern truth. I remember when I was a kid reading textbooks in middle school and high school, I just thought everything I read was right. I shouldn’t have been thinking that, and then one day it hit me: “Oh my goodness! This person writing my chemistry book might not have everything right. Holy cow!” It’s sort of a seismic shift in how we know what we know about the world, and then you start to learn more about epistemology and all the different ways we go about thinking about that. These are tough concepts. That said, I do want us to do more research in this area and to engage it at some level, just because kids are going to be hitting it. 
I just personally don’t have—this isn’t my research area, so again, I will look to my colleagues, but I would say there are several really strong digital youth groups around the country at universities that are doing this work, and we have two colleagues on here that really are, I think, people to look for to stay on top of that kind of stuff. 

 

[Dr. Joel Breakstone]: Yeah, I’d just take up one piece of what Jevin said, which is this fundamental shift to thinking about information as coming from somewhere. I think we can begin to build that idea in younger students. We don’t necessarily have to teach young students how to engage in lateral reading and seek out other sources, but it’s really important to develop the idea that information comes from somewhere, that it is not free-floating. That’s a crucial idea online and in the world in general. It’s important in schooling, too: we want students to have a sense that textbooks are constructed narratives, that there’s somebody behind that information, and that we should take that into account. We have seen younger students do that work of grasping that information comes from somewhere, and the basic idea that “I saw it online” doesn’t mean it must be true. That’s a crucial idea to begin early, so that kids don’t have quite so many bubbles burst later: “Oh my gosh, why didn’t someone tell me this sooner? I’m 12 or 13.” Begin early to inculcate the notion that information comes from somebody, and that we should at least consider the authorship of that information. 

 

[Dr. Anirban Banerjee]: They’re spending so much time on virtual platforms, you know, particularly for the next months, but thank you very much. I really appreciate—

 

[Dr. Joel Breakstone]: Yeah, absolutely. I think in this particular moment, it is crucial for us to be preparing our students, given this digital landscape. Well, I want to thank everybody for joining us today. It’s been a pleasure to engage in this conversation. Diana, Jevin, thank you so much for all of your contributions, and I want to just pass it back to Pam to wrap us up. 

 

[Dr. Pamela Hurst-Della Pietra]: Thank you so much, all of you, for coming and participating in such a helpful and informative discussion. And thank you, Joel, Jevin, and Diana for sharing your insights and ensuring that we’ll all leave today more media-savvy than we were 90 minutes ago. Please share the YouTube video you’ll receive of today’s workshop with your fellow parents, teachers, clinicians, researchers, and friends, and please follow us on social media at the account shown on your screen. Our discussions about digital media use and children’s well-being will continue throughout the summer, fall, and winter, with weekly Wednesday workshops. Next week, on Wednesday, August 26th, we’ll host a conversation about the transition back to school: school’s back, now what? And the following week, on Wednesday, September 2nd, we will discuss cyberbullying and online cruelty. We hope you’ll join us. When you leave the workshop, you’ll see a link to a short survey. Please click on the link and let us know what you thought of today’s workshop. Thanks again, and everyone be safe and well.