
How AI could perpetuate racism, sexism and other biases in society

AI, or artificial intelligence, is the phrase of the moment, right? From politicians to scientists to tech billionaires, it seems like everyone is scrambling to better understand how this new technology can impact all of our daily lives - for better or for worse. And right now, there does seem to be more questions than answers. For example, one concern - how the massive amount of information gathered by AI and incorporated into other technologies could perpetuate racism, sexism and other biases. To better understand that issue of racial bias, I spoke with Safiya Noble, an internet studies scholar and professor of gender studies and African American studies at UCLA. She's the author of the book, "Algorithms Of Oppression: How Search Engines Reinforce Racism." And I asked her how she thinks artificial intelligence can be used to sustain racist systems.

SAFIYA NOBLE: So the thing that keeps me up at night most is things like machine learning, which is really when we give instructions to a computer to look for whatever patterns it can find - this kind of pulling in of lots of different kinds of data, looking for patterns and making predictions about a set of outcomes. This is really the way that machine learning works. It's where a machine is pulling in more information than the human mind can process in a...

CHANG: Yeah.

NOBLE: ...Very short period of time and then kind of culling through that and spitting out predictions. Now, we see this kind of machine learning in the topic of the day, which is ChatGPT.

CHANG: Right.

NOBLE: That's all predictive text analytics. It's not a human being or a set of people or experts culling through the material with real understanding to help us make sense of what we're going to receive as an output from those types of technologies. I'm concerned about that, and I'm concerned about the way in which machine learning and predictive analytics are overdetermining certain kinds of outcomes.
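A minimal sketch can make Noble's point concrete. The toy bigram model below predicts the next word purely from frequency counts in its training text. This is emphatically not how ChatGPT is implemented - modern systems use large neural networks trained on vast corpora - but it illustrates the principle she describes: output is chosen by statistical pattern, not understanding. The training text and all names here are invented for the example.

    from collections import Counter, defaultdict

    # Toy training text - invented for illustration only.
    training_text = "the model predicts the next word the model repeats patterns"

    # Count which word follows which in the training data.
    follows = defaultdict(Counter)
    words = training_text.split()
    for current, nxt in zip(words, words[1:]):
        follows[current][nxt] += 1

    def predict_next(word: str) -> str:
        # Return the most frequent follower seen in training:
        # pure pattern-matching, no grasp of meaning.
        return follows[word].most_common(1)[0][0]

    print(predict_next("the"))  # prints 'model' - the statistically likeliest next word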

CHANG: Give us some specific ways that biased AI is affecting people's lives right now.

NOBLE: Well, I will start with what I think of as the most life-or-death dimension of AI - biased AI, discriminatory AI - which would be its use in things like criminal sentencing: determining whether a person is likely to be a risk and, on that basis, keeping them in prison or in jail, releasing them on bail or releasing them entirely.

We saw from the very important research done by Julia Angwin and her team around the COMPAS recidivism prediction software a couple of years ago how Black people who were charged with crimes were more than four times as likely to be sentenced to very severe punishment, as opposed to white offenders who had committed violent crimes and were much more likely to be released on bail.

CHANG: And why is that? What kind of information is being fed into AI to determine these sentences that would be inherently biased against Black people?

NOBLE: What these kinds of predictive AIs are built on are things like histories of arrests in a certain zip code. So if you live in a zip code that has been overpoliced historically, you are going to have overarresting. And we know that the overpolicing and the overarresting happen in Black and Latino communities. That's just a fact. So if a main factor in predicting whether you're likely to commit another crime is that lots of people in your zip code have been arrested - say, in South Central LA, where I live, versus in Beverly Hills - then you are more likely to be considered a risk. That has nothing to do with you. That has to do with the history of structural racism in policing in the United States.
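A small sketch illustrates the mechanism Noble is describing. COMPAS itself is proprietary, so the model below is entirely hypothetical - the features, weights and numbers are invented - but it shows how a single zip-code feature can produce different risk scores for two people who are identical in every individual respect.

    # Hypothetical risk score - NOT the COMPAS model; all features and
    # weights are invented to illustrate the zip-code proxy effect.
    def risk_score(prior_arrests: int, age: int, neighborhood_arrest_rate: float) -> float:
        """Toy linear risk score. Higher means 'riskier' to the model."""
        return (
            0.5 * prior_arrests                # individual history
            - 0.02 * (age - 18)                # younger defendants score higher
            + 4.0 * neighborhood_arrest_rate   # zip-code proxy: where you live
        )

    # Two defendants identical in every individual respect...
    overpoliced = risk_score(prior_arrests=1, age=30, neighborhood_arrest_rate=0.30)
    lightly_policed = risk_score(prior_arrests=1, age=30, neighborhood_arrest_rate=0.05)

    print(f"over-policed zip code:    {overpoliced:.2f}")     # 1.46
    print(f"lightly policed zip code: {lightly_policed:.2f}")  # 0.46
    # The gap comes entirely from historical policing patterns in the
    # neighborhood, not from anything the individual did.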

CHANG: What do you think is needed to ensure that AI systems do not further contribute to systemic racism? What would you suggest?

NOBLE: I think we need a very robust digital civil rights agenda in this country. And, of course, many other scholars and activists have been concerned about the way in which our civil and human rights are constantly encroached upon by these different kinds of technologies, along with other kinds of rights - you know, copyright protections, intellectual property rights. We also have a massive problem when it comes to racial and ethnic diversity in the tech workforce - a near-total exclusion of Black and Latino/Latina workers, Indigenous workers and women of color. So there's no question that, if you had a more diverse workforce, you might detect some of these things. You might ask different questions. You might ask some of the harder questions. But fundamentally, we have to have a robust human and civil rights framework for evaluating these technologies. They shouldn't be allowed into the marketplace to propagate harm, leaving us to find out after the fact that they are dangerous and to do the work of trying to recall them. That would be a very powerful start.

CHANG: Safiya Noble, internet studies scholar and professor of gender studies and African American studies at UCLA. Thank you so much for this conversation.

NOBLE: Thank you, Ailsa.

(SOUNDBITE OF GNARLS BARKLEY'S "CRAZY (INSTRUMENTAL)")

Transcript provided by NPR, Copyright NPR.

