Fake News: Finding It, Fighting It

Episode 9

Do you know how to identify fake news? MediaWise’s Katy Byron discusses teaching students how to determine what’s real on the internet, and Professor Gordon Pennycook exposes why people believe things that aren’t true.

 

Earn professional development credit for this episode!

Fill out a short form featuring an episode-specific question to receive a certificate. Click here!

Please note that because Learning for Justice is not a credit-granting agency, we encourage you to check with your administration to determine if your participation will count toward continuing education requirements.

Subscribe for automatic downloads using:

 

Apple Podcasts | Google Music | Spotify | RSS | Help

Resources and Readings

Katy Byron
MediaWise: Instagram | YouTube | Twitter | Facebook

What is MediaWise?

Lessons:

Gordon Pennycook
University of Regina | Website | Twitter

Recent research:

 

Transcript

Monita Bell: There are flying bat people living on the moon. It’s true—I have proof. I have pictures. It’s right there on the front page of the paper. But of course, that’s fake news. And fake news is nothing new. The New York Sun published the bat people story in 1835. Orson Welles’ radio account of invading aliens was so convincing that [allegedly] thousands of people ran screaming into the streets. 

There have always been people who want to fool us, especially as spreading this information helps them to get more money, or power, or both. There are always going to be people who fall for even the most out-there ideas. After all, somehow the National Enquirer still sells magazines. What is new is just how much fake news assaults us every single day.

And what’s changed, and seems to always be changing, is the many different ways that disinformation and misinformation get to us. People often share false news stories on social media far more widely than the fact-checked stories that debunk them. By the time fake information is out there in the world, it may be too late to get it back in the bottle.

So what’s a concerned citizen, much less a dedicated educator like you, to do about this? You’re listening to The Mind Online, the podcast for educators from Teaching Tolerance. I’m your host, Monita Bell, and this episode is about fake news—finding it and fighting it. We’ll talk to behavioral science professor Gordon Pennycook about why we believe things that just aren’t true. 

G. Pennycook: I study the science of human stupidity, I guess is one way to call it. I do research on misinformation and fake news, and why people believe the kind of weird things that people believe.

Monita Bell: But first, we’ll hear about MediaWise. Their goal is to teach a million students how to identify and expose fake news. 

Speaker 1: I reposted an article from a website that was not legit at all. I remember at that moment when somebody left a comment under that post saying, “You know this is not real, right?” I remember saying, “I know now.”

Speaker 2: The architecture of the social internet incentivizes the sharing of such information, regardless of whether it’s accurate. So like a lot of people online, I have at times ended up having my brain hacked, fearing stuff that doesn’t exist and knowing things that aren’t true.

Speaker 3: I really believe that one of the most impactful things that we can give to this next generation is the ability to be properly equipped with a discerning eye. 

Speaker 4: A new curriculum that helps kids learn how to learn is essential.

Speaker 2: I really believe that through online civic reasoning, we can learn better strategies for judging the reliability of the information that is inundating us all the time.

Speaker 3: This is a project that I really believe in, and I’m so proud to be a part of it. 

Speaker 1: Let’s get it started.

Monita Bell: Katy Byron works for the Poynter Institute, a nonprofit journalism organization. She’s the program manager for their media literacy project, MediaWise. 

Katy Byron: The main goal of the project is we want to teach students, middle school and high school primarily, the difference between what is real and what is fake on the internet. So, how to discern what’s accurate, what’s reliable, how to navigate this very confusing world of digital information that is just exploding exponentially.

Our main top-line goal is we want to reach 1 million students by 2020. This is a nonprofit project. Yeah, ambitious goal, but completely achievable in my view. It will be free for download, so attention all teachers, F-R-E-E free. Some of the lessons and assessments are available now on Stanford History Education Group’s website. They call it SHEG for short.

One of the primary reasons that it is free is because we want half of the students that we reach with this project, so half of those million, to be from underserved and low-income areas. We wanted to lower that barrier to entry to this information because it is something that everyone needs access to.

We’re also doing events across the country in middle schools and high schools with the help of the Local Media Association, another partner. They’re actually going into schools and teaching some of the simpler tips, just to give people a taste of the curriculum.

Speaker 4: Even as a professional fact-checker, I’ve been fooled by misinformation online, and I think that’s why it’s so important you guys are here. You don’t have to be a professional fact-checker. You don’t have to be a professional journalist to sort fact from fiction online.

Speaker 5: Florida man stabs his best friend for liking his meme instead of reacting with “ha ha.” This is not a real news story. This is satire. The biggest tip to that is actually the URL. 

Speaker 6: The URL will often have clues. If it doesn’t quite look like a URL you’re accustomed to seeing, that should raise some flags in your mind to make sure what you’re looking at is actually true.

Speaker 7: I think the first one is fake, because I Googled, “Shark on the freeway Hurricane Harvey,” and there were about three or four sources on Google from reputable sources that said that picture’s fake, this isn’t real. So I was like, okay that doesn’t seem right to me. 

Speaker 8: Every single time there’s a hurricane, there’s always that same photo of that shark swimming up a highway.

Speaker 9: This is a zombie claim. Every hurricane, this photo gets shared, but it’s not real.

Speaker 10: I feel like this event gave me basic journalist rules like fact-checking, and I think it’s going to help me be a better journalist in the future.

Speaker 11: My wife told me that I was bringing my son here tonight. I didn’t ask any questions. I didn’t want no trouble. Very happy that I brought him. Really informative. I have friends that send me stories all the time, and I’m like, I don’t think this is true. And now I know how to go about and check to see if it is or is not really true. Thank you.

Katy Byron: It’s pretty amazing. This part of the project has really taken off, much more so than we expected. We have more than 60 schools requesting events right now. We’re really teaching specific skills that are arming students with the ability to figure out what’s real and what’s not. So if they are going through their social media feeds, or they’re looking on YouTube, they can learn these skills to figure out, okay you know what, maybe this isn’t what it looks like on face value. 

Even if it went viral, that doesn’t necessarily mean it’s legit. That’s a big campaign for us: the #IsThisLegit hashtag is what we’ve been using for that.

Monita Bell: What are you actually doing during these events? What are students doing in this interactive way that you’re describing?

Katy Byron: One example that ... It’s very easy, but it’s something that people just don’t know about. It’s called a reverse Google image search. What that means is, let’s say you’re on Twitter, or Facebook or Instagram, and there’s a tweet with a photo that went viral. You right-click on the photo, and then it’ll open up a tab and you can say search Google for this image. And then you can see everywhere that that image has appeared on the internet in the Google search results.

A lot of times, if it’s something called ... what we call a zombie claim on social media, it means something that has been repeatedly debunked and proven wrong through fact-check organizations like PolitiFact, FactCheck.org, Snopes. Or there will be actual stories written about this photo that continually goes viral, even though it’s totally bogus. That’s why we call it a zombie claim. It rises from the dead no matter how many times you kill it. 

Monita Bell: I love it. 

Katy Byron: Yeah, I love it too. 

Monita Bell: How do students respond when you take them through that? Does it blow their minds? 

Katy Byron: Some of them have heard of it. Some of them have not. I would say most of the time the teachers have not. I find that really interesting too. The curriculum that Stanford is creating and testing is called Civic Online Reasoning, based on research into how three different groups consume information online.

There were the students from Stanford, right, and then professional fact-checkers and professional historians. The fact-checkers were significantly better at figuring out what was real and accurate, and what was not.

Monita Bell: I think that’s a great segue into the teen fact-checking network, that you’re involved with. Can you tell me more about what that is?

Katy Byron: Yeah, the teen fact-checking network is basically a six- to eight-week program. We’ve been doing it with about 20 students. We work with them to teach them how to do fact-check videos for Instagram. They will write a script, and we’ll teach them also about journalism, and fact-checking, and that process. They’ll record their own Instagram videos and we’ll post them to our Instagram account. 

It’s like by teens, for teens. They’re not only fact-checking things they find online, which is a service in itself, but they’re also teaching teenagers in these stories how to do it and come to those same conclusions on their own. 

Speaker 17: February 12th, a picture of a rainbow and a tornado colliding was posted to a Facebook account called Courtside Happenings. They claimed that this phenomenon was called a tornabow. To figure out if this is true or not, you can search the term “tornabow” in Google. Unfortunately, no. While the concept is extremely cool, you won’t be seeing any of these around any time soon.

Right away, you’ll see an article by PolitiFact that debunks the post. According to the article, the image of the tornabow is actually from a website for a former e-publication called PrimalUrgeMagazine.com, where it’s described as a photoshopped manipulation and digital artwork by Corey Cowan. 

Katy Byron: I’d really like to pilot with a journalism teacher, or any kind of teacher really, if they have students who want to get involved and want to make this a project for their class. I’d love to experiment. If any of your listeners want to try that out with us, check out our Instagram account @MediaWise, follow us and send us a direct message on there, and myself or someone on my team will get back to you.

Monita Bell: Okay, everybody, pay attention to that, @MediaWise. Get at Katy and her team. She wants to work with you. I love it. Thank you for that. 

Katy Byron: No problem. 

Monita Bell: Okay, so you’ve also done these videos with John Green, the author of The Fault in Our Stars. Can you tell us about those videos, and what’s in them, and how teachers might use them?

Katy Byron: As you mentioned, John Green is a best-selling author who is very popular with young adults, but he also has a YouTube channel called Crash Course, which I think has something north of 9 million subscribers by now, which is crazy. They produced a 10-part series on Crash Course hosted by John called Navigating Digital Information. It’s basically a preview of the curriculum.

So you can use it as a teaching tool in your classroom. They’re all around 15 minutes long and really go in depth on various topics like social media, Wikipedia. 

John Green: Look, the internet is different for each of us, and never more so than in this era of endlessly personalized and customized information flow. I don’t know if we’re going to be able to figure out how to fix the internet in the next 10 weeks, but I do believe that each of us can improve our approach to information on the internet.

To do this, Crash Course is working with MediaWise, a project from the Poynter Institute designed to help students evaluate the accuracy of digital information. MediaWise, and therefore indirectly this series, is funded by Google, which owns YouTube. Google also loaned Crash Course its initial funding way back in 2011, although we eventually paid them back.

I’m saying all of that, and I will say it again throughout this series, because it’s really important to understand where funding comes from when evaluating the accuracy of digital information, including when you’re evaluating the accuracy of digital information about evaluating the accuracy of digital information. It’s evaluating the accuracy of digital information all the way down.

Katy Byron: Let’s say you see an Instagram meme, and you’re like, oh this is really funny, but it’s totally wrong and inaccurate. And then you share it. You’re kind of part of the problem. You’re misleading and you’re spreading misinformation, and that’s what we’re trying to prevent, because misinformation is really rampant.

Monita Bell: So you’ve got all this research showing how bad we all are at fact-checking and distinguishing between what’s real and what’s fake. You’ve got these awesome events that you’re doing, engaging directly with teens, and you’ve also got your network of teen fact-checkers. So through all of this work and all of this engagement, what are some of the most important things you have been learning about how young folks are receiving and using this information? 

Katy Byron: I would say the most important thing that I’ve been learning is how desperately they need it. I mean, the resounding feedback I got from the students was actually that they understand without being told that this is really important stuff, that it does have a real impact, that it is something that they should be paying attention to. They recognize that.

I thought that was really interesting because I think there’s also a very personal connection that teens have with this material. While we’re not trying to address issues with misinformation and bullying on the more localized level, a lot of these skills are applicable, you know? And that’s the stuff that kids are so scared about, is misinformation about themselves being spread and going viral in their school community online.

That really struck a heart chord for me. It was so sad to see these kids talking about that. I mean, my biggest takeaway right now is that it’s so hard to be a teenager in today’s universe with digital info. It’s really tough. I’m happy that we’re making an impact and teaching them things that will make it hopefully a little bit easier for them, and make them feel more confident knowing when something’s real or something is completely bogus.

Monita Bell: Is there anything that our educator listeners should know, that we haven’t covered yet? 

Katy Byron: I would say one other thing: during the presentation, we are actually using some of the platforms that these kids are spending all their time on to teach them. So we actually tell them, “Okay, get out your phone and open up Instagram,” and we have them vote in live polls about whether they think something is real or fake on our Instagram account.

Then they can see the results, because it’s native to how they’re consuming this stuff. It’s something teachers have always kind of raised an eyebrow at, like, “Oh, should I be using Instagram to teach my kids?” But I would say the answer is yes, because if you want to teach them about digital literacy, you’ve got to go where they are and look at a fake Instagram account, look at a bad piece of information that went viral in the app, you know?

There’s actually … in general, teenagers don’t love Facebook, right? Everyone kind of knows that. Teenagers on the lower income side of the scale, actually more of them use Facebook than other platforms, which was a surprise to me. So we actually keep Facebook as part of where we push our content and make sure we’re servicing that audience too because we don’t want to leave anyone out of getting access to this information. 

On that note, I should mention the Is This Legit campaign. This is the other thing any teacher can do and promote with their students: if you see something on the internet and you think, okay, I don’t know if this is legitimate or not, comment with #IsThisLegit or share it with #IsThisLegit and tag @MediaWise. And my team and I will help you figure out if it’s real or not.

We’ll walk you through the process. Okay, this is the first step of how you can figure this out on your own. We get a lot of requests through Facebook, our DMs, all the time about that. On Instagram, we get tags all the time. Twitter, lots of commentary on that. That’s a way you can kind of directly engage with us, and we can also kind of teach you along the way and help students.

Monita Bell: Look for this episode of The Mind Online at tolerance.org/podcasts. You’ll find links to the MediaWise videos and social media accounts, Civic Online Reasoning curriculum from the Stanford History Education Group, and John Green’s Crash Course on navigating digital information. All free and ready for the classroom.

Now, a quick break.

Did you know that Teaching Tolerance has other podcasts? We’ve got Teaching Hard History, which builds on our framework, Teaching Hard History: American Slavery. Listen as our host, history professor Hasan Kwame Jeffries, brings us the lessons we should have learned in school through the voices of leading scholars and educators. 

It’s good advice for teachers, and good information for everybody. We’ve also got Queer America, hosted by professors Leila Rupp and John D’Emilio. Joined by scholars and educators, they take us on a journey that spans from Harlem to the Frontier West, revealing stories of LGBTQ life that belong in our consciousness and in our classrooms. 

Find both podcasts at tolerance.org/podcasts, and use them to help you build a more robust and inclusive curriculum. 

MediaWise can teach us how to find and fight fake news, but what makes us fall for these phony stories in the first place? A prominent group of researchers is dedicated to finding out, including social scientists like Gordon Pennycook. 

G. Pennycook: I study the science of human stupidity, I guess is one way to call it. I do research on misinformation and fake news, and why people believe the kind of weird things that people believe.

Monita Bell: Gordon Pennycook is a professor of behavioral science at the University of Regina in Saskatchewan, Canada.

G. Pennycook: For me, when I say “fake news,” I mean something really specific: a news headline that is entirely made up, just made up by somebody, and spread online. That’s people believing things that are untrue, and often they’re highly political things, things that might impact who one might vote for, or how they engage with people and so on.

Monita Bell: What have you found in your research that points to why we believe false information? 

G. Pennycook: If you expose someone to a fake news headline, the simple act of having read it makes it seem more likely to be true subsequently. The reason for that is that when you’re processing something, when you’re kind of figuring out what something means, if you’ve done it before it’s easier and that makes it seem more true. Even people who see headlines that are inconsistent with their political ideology, things that they have a kind of strong reason to reject, they still have this increasing belief based on exposure. That’s kind of one piece of the puzzle.

The other piece, the one that I focus more on, is we want to see whether reasoning helps or hurts, because it might be that the reason that it spread so much is that people are just intense partisans. They see a fake news story, and they kind of want it to be true. So they convince themselves, using their reasoning, to actually believe these things. If that was true, what that would mean is that the tendency to reason, the kind of high-level thing that separates us from the other animals, is also hurting us.

That would be really scary, because if it’s the case that we are using our reasoning to kind of bolster our partisan tendencies, then what we have to do to kind of solve the problem is make people less partisan. That’s maybe not impossible, but certainly not easy. But the good news from my research is that that idea, that people are intense partisans, just does not hold up to the data.

In the context of fake news at least, people are more reasonable than we thought. They can actually use their reasoning to facilitate accurate belief formation. If you show them fake news stories, even ones that they’re kind of ideologically predisposed to believe, for the most part, people are pretty good at rejecting them. People who are better at reasoning, people who are more analytical, are less likely to believe fake news regardless of whether it’s consistent or inconsistent with their political ideology. Political ideology is just not that important in the context of fake news. What’s important is whether you’re willing to think about things. 

The reason that people fall for fake news is because they’re just being lazy. They’re not paying attention to the things that they’re seeing, and that certainly is more amenable to intervention than making people less partisan. That’s what education is for, to teach people to be critical thinkers. What that suggests is that we don’t have to go into the kind of quagmire of trying to make people less partisan, which also might be an ethical problem. We can’t just change people’s beliefs, but we can change the way they think. 

Monita Bell: I know you’ve also studied how smartphones have changed the way we think and process information. Can you speak to the ways that smartphones might affect how we process fake news and misinformation in general, and maybe in ways that we haven’t experienced before the Technology Revolution, you might say? 

G. Pennycook: It’s unclear if technology is per se changing the way that we think. That is, specifically changing the kind of mechanisms in the way that I would be thinking about it. It’s changing what we see, and therefore what we think. Do you know what I mean?

Monita Bell: Yeah, that makes sense. 

G. Pennycook: So what the research I did before showed is that the same people who are better at recognizing fake news, people who are more analytical, rely less on their gut feelings. They’re the people you might say are just kind of more logical, closer to what you’d expect from someone like Spock in Star Trek, if you’re interested in that kind of thing. People who question their intuitions, who question their beliefs, who value reason, not only are they less likely to believe in fake news and things like that, they’re also less likely to offload their thinking to their smartphone.

What that means is ... Let’s just imagine that you’re at dinner, and you can’t remember an actor’s name. Instead of thinking about, like, what movie she was in, you just look it up on your phone. Instead of thinking about it in your head, you think about it with your phone. People who are more analytical are just less likely to do that. They don’t use their phones to think in that sort of way, to the same degree at least. They think with their heads.

And so the question, which we don’t have an answer to, is what is the consequence of that? That is, what happens to us if we have smartphones for 50 or 60 years, as some people will, in which we’re getting easy answers all the time instead of thinking for ourselves? What happens when we come across a problem that isn’t so easy, that we can’t just Google? Are we going to just give up faster? Are we going to kind of forget about those questions, and just try not to approach them?

Monita Bell: Yeah, as someone who frequently looks up actors’ names when I can’t remember …

G. Pennycook: I should add that everybody does that. I do it too, because it is good to have that information. We are, on average, more accurate if we have access to the full accumulation of human knowledge. It takes two seconds to find answers. That’s a beautiful, just wonderful thing. But you can’t Google everything, so there are going to be cases where maybe it not only won’t help us, it might actually hurt us. There’s not enough research on this, basically.

Researchers are always catching up to technology, so we’re always going to be behind, just necessarily.

Monita Bell: I was going to ask if we can train ourselves and our students to kind of fight the effects of ... well, specifically what you were talking about earlier. So this idea of prior exposure. Like, I read this before, and so that repetition tends to, in our minds, equal truth. What can we do to train ourselves to fight the effects of that?

G. Pennycook: That one’s difficult. In fact, we found that people who are more analytical, those people who are better at detecting fake news, are as susceptible to that effect of exposure as people who are more intuitive. And so, that implies that this is not something that we can just change. It’s kind of an ingrained aspect of the way that our brain works, but we can change the way that it impacts our behavior, for example.

That can allow us to kind of be more skeptical about the things that we believe, the way that we question our own beliefs. If you think about it, people are actually really good at questioning other people’s beliefs. So we have the capacity to actually engage in that questioning process, but we don’t quite hold ourselves to the same standard. But, I think in theory, we could. 

Monita Bell: Mm-hmm (affirmative). 

G. Pennycook: This is not some boring thing. This is actually a super exhilarating, exciting, fun activity, to be able to use the powers of your brain to figure out what’s true in the world. That’s what the role of science is. We’re all kind of scientists in a sense. I think education should be focused on not just teaching, but revealing to people how fun it can be to engage our capacity to reason.

It’s also fun to sit there and watch Netflix with your mouth open, and just put on Game of Thrones. That’s fun too. But, we can also ... People do sudokus, people actually engage in tasks that are just about thinking. We can do that about what’s true in the world, and that’s a lot of fun too. We have a responsibility to do so, I think, and so that’s important.

Monita Bell: Thank you for taking the time to join me for this episode of The Mind Online. I’m your host, Monita Bell, managing editor for Teaching Tolerance.

I want to give a special thank-you to my guests, Gordon Pennycook of the University of Regina, and Katy Byron of MediaWise. This podcast was inspired by our Digital Literacy Framework, which offers seven key areas where students need support developing digital and civic literacy skills, and features lessons for kindergarten through 12th-grade classrooms. 

Each lesson is designed in a way that can be used by educators with little to no technology in their classrooms. The Digital Literacy Framework and all its related resources, including a series of student-friendly videos, a professional development webinar, and a PD module, can all be found online at tolerance.org/diglit. That’s tolerance.org/D-I-G-L-I-T.

This episode was produced by Barrett Golding, with help from Jasmin López. Thanks to New Record Studio in Jersey City, New Jersey, and CJTR Regina Community Radio in Saskatchewan for recording our guests. Our production supervisor is Kate Shuster. Our music is from Podington Bear. 

You’ll find links to all the resources we discussed in this episode at tolerance.org/podcasts. Just look for The Mind Online, and find this episode. If you like what you’ve heard, please subscribe and share with your colleagues and friends. When you share on Twitter or Instagram, use #TeachDigLit. 

See you next time.

Correction: The audio for this podcast identifies Gordon Pennycook as “a professor of behavioral science.” His official title is “assistant professor of behavioral science.”
