After many years of frustration over racist search results and search suggestions, we still see this type of racism everywhere we turn on the internet. In our class reading, Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Umoja Noble, the author describes how Google would not take the blame for the matter, stating that these types of suggestions or results are “glitches.” That could be true, but if it were a glitch, then how does this type of racial bias turn up on search engines other than Google, such as Bing and Yahoo? I am going to suggest that it is not a “glitch,” but rather the result of poor choices by the people who build the search engines and write the websites’ SEO. It may be that many companies ultimately outsource web and search design to contractors in other parts of the world who will work for pennies on the dollar and who bring no particular racial sensitivity to the job; they are simply trying to get by and earn a small paycheck to support their families.

Facial recognition discrimination is the next area of racism and sexism that we are seeing in online technology. As if racist search results and racially biased SEO were not enough, we are now seeing companies whose AI facial recognition fails to identify women, and especially women of color, correctly. Joy Buolamwini, a graduate student at MIT, tackles this issue, stating that the software “mistook women for men 19% and dark-skinned females for men 31% of the time.” Many companies like Google and Amazon have taken steps to fix these issues in their software. Still, it seems odd that the same companies calling racist search results or suggestions a glitch are now also having problems with facial recognition, mainly with women of color. I do not believe this to be a coincidence, but rather the result of poor hiring when it comes to the people who write the algorithms and the data behind them.

With all of the racial bias on the internet that we have discussed, the biggest issue is the consequence of how information is portrayed. When people see this type of bigotry on the net over and over, it starts to give them negative ideas and stereotypes about certain ethnicities, amplifying what already exists in society. The targets of the majority of these search results are not just people of color, although they are at the forefront; women in general are also being attacked by poorly written SEO and search suggestions. Much of what I have seen while researching this project tends to associate women with pornography, and the search results I looked up often portray people of color, especially African American women, alongside animal imagery or pornographic results. The problem is that certain people start to look at human beings in this way. Women are not looked at as people but instead as sexual objects or something disposable. In this day and age, we cannot let this happen.

Change is what we need for the issue at hand. Companies such as Google and Amazon need to be held accountable for their actions when it comes to their algorithms. One way to do this is to standardize the terms they can use for SEO and to correct what has negatively impacted searching on the web. Government regulations most likely need to be put in place that fine these companies, much like the rules that censored what could be played on the radio or television when I was growing up in the 1980s. It obviously takes time to write algorithms and SEO for the internet, but employing teams that specifically tackle this problem would be a good start toward ending the racism and sexism we see on the net today.

Works Cited

Chulu, Henrik. “Let Us End Algorithmic Discrimination.” Medium, Techfestival 2018, 18 Sept. 2018, medium.com/techfestival-2018/let-us-end-algorithmic-discrimination-98421b1334a3.

Lapowsky, Issie. “Google Autocomplete Still Has a Hitler Problem.” Wired, www.wired.com/story/google-autocomplete-vile-suggestions/.

Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press, 2018.

Qu, Tracy, and Iris Deng. “Computer, Your Biases Are Showing.” South China Morning Post, 23 June 2020, www.scmp.com/tech/big-tech/article/3090032/sexist-and-racist-ai-can-result-wrongful-arrests-fewer-job.

Vincent, James. “Gender and Racial Bias Found in Amazon's Facial Recognition Technology (Again).” The Verge, 25 Jan. 2019, www.theverge.com/2019/1/25/18197137/amazon-rekognition-facial-recognition-bias-race-gender.