Increased Social Media Usage Creates Perfect Conditions For Spread Of Misinformation

AILSA CHANG, HOST:

Our new socially distant reality has drastically changed how people are getting their information. Specifically, people are online more than ever before, and that could open the door to even more of the kinds of election interference that we saw in 2016. We are spending this week looking at the different ways the pandemic could affect the democratic process in the 2020 election. And joining us now for the latest installment is NPR's Miles Parks. He covers voting and election security.

Hey, Miles.

MILES PARKS, BYLINE: Hi there.

CHANG: Hi. OK, so people are stuck at home. They are online a whole lot more. Can you just explain - what might be the consequences of that when it comes to elections?

PARKS: Right. So specifically, people are on social media more than ever before as well. Now, usage on social media was already trending up over the last couple years, but this pandemic has kind of supercharged those gains. Facebook and Twitter have both announced huge year-over-year first quarter user gains, and the amount of time people are spending on these platforms is also increasing.

We also know that social media manipulation was a key component of Russia's interference efforts four years ago. So with people online more and getting more of their information on these sites, experts who study this stuff who I've talked to are kind of sounding the alarm that a few months away from the election, this could be a real problem again.

CHANG: OK. So given that there is a whole lot more use of social media right now, does that mean that there's more misinformation or disinformation online as well?

PARKS: The short answer is yes. I talked to Steve Brill, the CEO of NewsGuard, a company that tracks false information. And he told me the number of sites peddling hoaxes about the coronavirus specifically has more than tripled just since they started tracking a couple months ago.

He also said he expects the sites that are peddling conspiracy theories about the coronavirus right now to switch, at some point in the next couple months, to peddling conspiracy theories about political information once the country starts talking and thinking about the election more.

CHANG: Interesting - I guess not surprising. Well, social media companies like Facebook and Twitter are saying that they have done a lot to fight manipulation on their platforms. So is that panning out? I mean, how much better is the landscape since 2016?

PARKS: You know, these companies have poured millions of dollars into changing things about their interfaces to hopefully make this less of a problem this time around. Mark Zuckerberg, Facebook's CEO, had an investor call just a couple weeks ago where he talked about how fact-checkers had marked more than 4,000 pieces of content related to the coronavirus as false.

But a lot of critics of these platforms are just not satisfied at this point. For one, there's research that says these sorts of fact-checking efforts don't work, because the very act of repeating a lie or a misleading statement - even to correct it or to tell somebody it's false - still makes it more likely that a person will believe it just by seeing it. I talked to Ann Ravel, who's a former commissioner on the Federal Election Commission, about this.

ANN RAVEL: We know that when false statements are placed on platforms, they spread rapidly. And so dealing with it later is never going to be good enough.

PARKS: What Ravel says and what a lot of experts I talk to say is that the only way this problem's really going to get fixed is with government regulation but that that's definitely not going to happen this year, considering all the free speech issues that go along with the government getting involved in this space.

CHANG: OK. But what about our role in this as the public? Is there any sign that all this talk about fake news the past few years has meant people are just better at spotting it now?

PARKS: I wish I could say that's the case, Ailsa, but it does not seem like that. The problem is that these platforms reward a sort of tribal mentality, and there's a lot of research that shows people can be super educated, super critically minded and still ignore any...

CHANG: OK.

PARKS: ...Evidence that goes against their sort of identity. There's also research that shows these sites polarize the public. So when you think about all of that and you think about what...

CHANG: Right.

PARKS: ...That could mean for the election, people are really worried about it.

CHANG: That's NPR's Miles Parks.

Thank you, Miles.

PARKS: Thank you.

Transcript provided by NPR, Copyright NPR.

Miles Parks is a reporter on NPR's Washington Desk. He covers voting and elections, and also reports on breaking news.