A case for holding tech companies responsible for their algorithms

MICHEL MARTIN, HOST:

As we just heard, both Democrats and Republicans have started talking about regulating Facebook and other Big Tech companies. We want to talk more about what that could look like. During the Senate hearing, whistleblower Frances Haugen said critics should focus on the company's algorithms. That's the artificial intelligence social media companies use to rank or promote content. Roddy Lindsay agrees. He's a former data scientist who worked on algorithms at Facebook, which, we'll say again, has been a financial supporter of NPR. He wrote an op-ed for The New York Times about how to regulate algorithms. And he began by explaining why they're so dangerous.

RODDY LINDSAY: There are some real risks associated with these algorithms, which can amplify content that once might have been relegated to a fringe corner of the internet. This type of content can take hold and spread very quickly. I think as a society, we're grappling with, you know, whether we want these algorithms in our lives and whether they're compatible with democracy.

And the good news is that there's a relatively straightforward fix that Congress can make, which basically says: if you're in the business of using these algorithms to distribute content, you as a platform need to take responsibility for that, and you could be held liable for it. And I think what that would do is change the incentives for these companies - they'd change the way they create these feeds and their products and move toward a model that puts control back in the hands of users, makes the way content is ordered understandable, and gets it out of the hands of these sort of black-box AI algorithms.

MARTIN: You've described it as a straightforward reform. If it's straightforward and simple, why do you think it hasn't happened yet?

LINDSAY: You know, Congress has grappled with how to do this in a way that doesn't run afoul of the First Amendment. You don't want Congress saying, this is the type of speech you can host, and this is the speech you can't host. That would obviously run into First Amendment issues.

But if you just focus on the amplification part of it, there's some historical precedent for this. In the 1940s, cities around the U.S. were plagued with sound trucks that would drive around with huge horns and amplify music and commercials and things like that. So some cities started passing laws regulating these sound trucks, saying, you know, you can't drive around and blast this speech into people's homes at this very loud level. And the Supreme Court actually upheld Trenton, N.J.'s law in 1949. So there is precedent for the Supreme Court saying there are ways in which we can limit the amplification of speech - not the speech itself, but how it's amplified and distributed.

MARTIN: But what about the business side of it? I mean, the point that critics like Frances Haugen have made is that one of the reasons Facebook doesn't do this, or hasn't done this on its own, is that this business has been extremely profitable. And I know you don't speak for Facebook - that's not why we called you - but what effect would this have on their business model?

LINDSAY: You know, there's no incentive for them to get rid of these algorithms themselves, which is why Congress needs to step in with comprehensive regulation that limits these for everybody, not just for one site or another. Even if Facebook or YouTube wanted to do the right thing, they really can't today because of market pressure. They have shareholders, of course, to report to. So if you were to say, you know what, we're going to unilaterally decrease our revenue by 10 or 20%, your shareholders would yell at you.

But if it's Congress that steps in, then everyone is impacted the same, and there's a level playing field. All the companies can get back to building products that actually benefit their users without relying on these algorithms as the primary driver of engagement. So while the companies might say this would be harmful for them, I think that harm would be very short term.

MARTIN: Before we let you go, do you mind if I ask you a question - a personal question?

LINDSAY: Absolutely.

MARTIN: Knowing everything that you know about the way these algorithms work, how do you manage your own social media diet?

LINDSAY: Well, actually, on Twitter, I use their chronological feed - the non-algorithmic feed, which is one of their options. And, you know, what I find is that it lets me discover things that may not be the most polarizing or engaging content. It may be sort of more boring. But I find that it's actually a better reflection of the people in our lives and what's happening out there.

You know, not everything that you see on social media needs to be the most engaging content. It might be beneficial to have a more, sort of, boring feed. It may not be what other people would prefer, but I've been using it myself, and I think it's totally fine.

MARTIN: Roddy Lindsay is a former Facebook data scientist. His New York Times op-ed is titled "I Designed Algorithms At Facebook. Here's How To Regulate Them." Roddy Lindsay, thanks so much for talking with us.

LINDSAY: Thanks for having me.

(SOUNDBITE OF MUSIC)

Transcript provided by NPR, Copyright NPR.