Researcher Cites Uptick In COVID-19 Vaccine Misinformation

MICHEL MARTIN, HOST:

Twenty-one percent of adults in the U.S. say they're, quote, "pretty certain" they will not get a COVID-19 vaccine and that more information will not change their minds. That's according to research out this week from the Pew Research Center.

Why might that be? Well, Facebook has been a hotbed of misinformation and disinformation about COVID-19 vaccines. And this week, the company said it would remove all false information about the virus that could, quote, "lead to imminent physical harm."

We wanted to understand what kind of misinformation people are being exposed to when it comes to COVID-19 vaccines, what's being said and what could be done to stop the spread of false narratives. Claire Wardle has been looking into this. She is the co-founder and U.S. director of First Draft. That's a nonprofit that's focused on research to address misinformation. Wardle co-authored a recent report called "Under The Surface: COVID-19 Vaccine Narratives, Misinformation And Data Deficits On Social Media." And Claire Wardle is with us now.

Thank you so much for being with us.

CLAIRE WARDLE: My pleasure.

MARTIN: So first of all, could you just tell us how prevalent COVID-19 vaccination misinformation is and the ways in which we're being exposed to it?

WARDLE: So we have seen misinformation about vaccines online for a number of years, but we've certainly seen an uptick this year, and we've seen an uptick in the last month since the new announcements around potential vaccines. And there's a whole host of different narratives out there about the vaccines, but there's also an absence of accurate information. People just have questions.

So talking about Facebook's decision, yes, there are false claims that they're taking down, but there are a lot of people who are just asking questions. Facebook can't take those down, but people have got genuine questions that we're just not answering adequately.

MARTIN: So do we know who's creating this content and why it's so successful?

WARDLE: Some of it is to make money. People are trying to drive clicks to their websites where they're selling health supplements. So there are those kinds of people. There are people who are just craving connection with a community. So in many ways, they have become part of some of these communities online where people are sharing information, and they want to be part of that community.

And so this idea that the government is controlling you - people's lives have been turned upside down this year. They're looking for explanations. And unfortunately, there are conspiracy communities providing what look like explanations, so people are kind of seduced. There's very much a social element to this.

And then some people are just doing this to cause trouble, to see what they can get away with. You know, the anonymity of the Internet allows them space to create these hoaxes and falsehoods, and they sit back and laugh. So there's a number of different motivations for people doing it. And there are, of course, different interventions that we need to roll out based on those different motivations.

MARTIN: OK, so let's parse the Facebook statement a little bit more - that it would, quote, "remove false information that could lead to imminent physical harm." Your reaction to that is - what? Does it go too far? Is it just too vague? What do you think of it?

WARDLE: So if I were to, you know, list a hundred examples, actually, very few of them hit that barrier. There are a lot of people posting information that goes right up to that boundary, but you'd be very hard-pressed to say that a false rumor is going to lead to imminent harm. And so, yes, if somebody says this vaccine was made from aborted fetus material, that is false, and they will take that down.

But if somebody's raising the question to say, I think the media is in cahoots with Big Pharma, you know, to control us - if you're a content moderator, do you take that down? If somebody says, you know, my family member was part of the vaccine trial and they had a terrible reaction, is Facebook going to take that down? Are they going to do the necessary fact-checking, pick up the phone and check? So there's a lot of noise right now - these policies sound great in a press release, but when you actually look at the examples, it's very difficult to say, would we take down all of that speech? Because that leads to these arguments of censorship, that nobody's allowing us to ask these questions.

MARTIN: I guess I'll also ask about the use of the word imminent. I mean, I would think of imminent physical harm as - you know, there have been some terrible stories recently of people in relationships, for whatever reason, encouraging each other to harm each other, goading each other into activities which they know could lead to harm. But I don't know how one would argue that preventing someone from taking a vaccine would ever lead to imminent physical harm.

Do you know what I mean? I don't know how you could even make that argument.

WARDLE: Exactly. And I think one of the things that we argue is that there's a lot of emphasis on individual posts. And what we're missing is the daily drip, drip, drip of low-level vaccine misinformation, none of which would break Facebook's barrier. But we don't know what it looks like if, over a couple of years, you see this kind of content that's questioning the government, questioning the CDC, questioning Dr. Fauci. None of those posts, you know, would pass that test. However, what does it look like if that's what you see every single day? And we have almost no research that allows us to understand that longitudinal impact of misinformation.

MARTIN: In your research paper, your team looked at misinformation in three different languages spanning 41 countries. Could you just tell us a bit more about that? What did you learn from that?

WARDLE: So some of the most dominant narratives in English were about liberty and freedom. There are a lot of narratives in this country and in Australia and the U.K. about this being about control. It's actually less about the safety or efficacy of the vaccine.

In Spanish, it's much more about morality or religiosity, concerns about the ingredients in the vaccine. So there's a lot to this that's beyond, is this post true or false, which unfortunately, I think, is where the conversation has got stuck, and we've missed the long-term implications of misinformation online.

MARTIN: I don't know if this is in the scope of your research, but I was just wondering, is there something about vaccines and vaccinations in particular that stimulates this kind of false information?

WARDLE: So I think we have to go back to recognizing we can't just tackle the misinformation if we don't also provide quality, accurate information. If I walk down the street and ask people, tell me about the science of a vaccine, how does it work, most people would actually struggle to explain that. And so we have to also recognize what we have failed to do.

And I think the idea that you're injecting a piece of a virus into somebody to make them stronger at defending against it - that's hard to get your head around if you haven't had it explained properly. So there's this idea of a vaccine being a kind of foreign body entering, and it seems counterintuitive. So I think, actually, there's a lot of educating that we have failed to do over decades - giving people an adequate understanding of why vaccines are such incredible examples of science.

MARTIN: That is Claire Wardle. She is the co-founder and U.S. director of First Draft. That's a nonprofit focused on addressing misinformation and disinformation.

Claire Wardle, thank you so much for talking with us.

WARDLE: Thank you very much.

Transcript provided by NPR, Copyright NPR.