Search Engines Like Google Are Fueled By Racist, Misogynistic Algorithms, Says MacArthur Fellow

Safiya Noble burst into tears upon hearing the news of her MacArthur fellowship – when she finally answered the phone after a week of assuming the Chicago number was a robocall.

Noble studies prejudice on the Internet and how search engines like Google and Yahoo exacerbate racism and discrimination against women. She is the founder and co-director of the new Center for Critical Internet Inquiry at the University of California, Los Angeles.

Scholars of color, women and LGBTQ scholars have been trying to warn of the dangers lurking online for the past decade, she says, but it wasn’t until the 2016 presidential election that people began to understand. The Cambridge Analytica scandal drew attention to the ways online platforms can manipulate people’s thinking.

“There are a lot of technologies out there that watch us, collect data about us to manipulate us or to steer us in a certain way, or really to deny us opportunities,” she says. “We believe this is some of the most important civil rights work on the Internet.”

In her work, Noble examines how search engine algorithms reinforce racist stereotypes and prejudice against women. Algorithms are sets of instructions given to computers – but the public doesn’t know how they work on the platforms people use every day, she says.

Many women of color encounter racial bias and sexism in algorithms when they search for terms like “girls,” which can bring up pages full of pornography, she says.

“There were just a number of racist and sexist stereotypes, and that was part of the way the algorithm sorted information on the web,” she says. “Ten years ago, when I made an argument like this, people were like, ‘Safiya, it’s impossible, because algorithms are just math and math can’t be racist.’”

But Noble explains how, applied to millions of searches across the Internet, those mathematical formulas can produce deeply harmful results.

Many people assume that search results are based on what is most popular – a myth Noble set out to unpack in her book “Algorithms of Oppression.”

For years, Noble has studied particular phrases – “black girls,” “Latina girls” and “Asian girls” – that surfaced pornography without any mention of words like sex or porn.

“Girls of color were synonymous with pornography,” she says. “They are a small minority of the population, so as a community and as individuals they had no agency and no control over that pornographic representation.”

In the book, she says she also explored what it means when industries like pornography can “spend on and optimize keywords that belong to people and communities who have already faced a long history of false, racist and sexist representation in popular culture.”

People don’t realize that search engines are advertising platforms, not just knowledge spaces, Noble says. She wants people to understand that search engines work well for finding information like what time a cafe closes, but not for complex questions about people and history.

“These are nuanced types of queries that are actually fed into an ad engine,” she says, “which means those with the most money to pay will be more visible.”

One obstacle is that people implicitly trust big search engines like Google, she says.

Before Dylann Roof shot and killed Black worshippers in South Carolina in 2015, he found white nationalist websites through search engines. In this painful case, Noble says, Roof reported that he ended up on such sites because he was trying to understand the media coverage of George Zimmerman’s trial in the killing of Trayvon Martin, a key moment that ignited the Black Lives Matter movement.

“When Dylann Roof did this research to try to figure out ‘Who was Trayvon Martin?’ and ‘Who is George Zimmerman?’” she says, “very quickly he fell down a hole into white supremacist and white nationalist, anti-Black, anti-Semitic websites.”

Regulations that prevent search engines from collecting information about people over time and using it to direct their attention to certain types of content could help, Noble says. She envisions a digital future where people see Google for what it is – an advertising platform – alongside different kinds of search engines curated by librarians, teachers or other experts.

“We could differentiate more, so that certain types of queries maybe take longer to get a response instead of an instant one,” she says, “more like the difference between fast food and slow, organic food.”

Karyn Miller-Medzon produced and edited this interview for broadcast with Jill Ryan. Allison Hagan adapted it for the web.

Rosemary S. Bishop