Algorithms of Oppression

I found the content of this book both truly fascinating and deeply upsetting. Most people assume that the internet, or Google, is free of bias, yet that is far from the truth. These are not harmless sites; they are programmed by people with their own prejudices and biases, which are transferred into the machinery of the internet and become something that affects all of us.

Noble used the search term “beautiful” and received pictures of mainly white people, while a search for “ugly” returned a much more racially balanced group of results. White women and men being propped up as more attractive than Black women and men is already rampant in society (in television, movies, and magazines, as we read last week), and it is not something we need constantly reinforced by a supposedly neutral party such as Google.

These biases in search results may be influenced by the individual biases of their programmers, but one would have to think they are also influenced, if not ordered or protected, by those much higher up the chain. As Noble writes, these search engines contain “decision-making protocols that favor corporate elites and the powerful, and they are implicated in global economic and social inequality” (Noble 29). These results directly influence everything from choosing what to buy and where to buy it, to political votes and donations. These technological practices (such as machine learning) are embedded into Google and the internet with their own set of values that affect all who use them socially, economically, and politically. With over 3.5 billion searches a day, Google’s biases and prejudices truly have a stranglehold on us. We must do our best to foster change and rid Google’s search of these biases. It can be done, as Noble wrote, and we must make a concerted effort to do so.


1 Response to Algorithms of Oppression

  1. andrewguglielmo says:

    Exactly. As Noble said, the algorithm was learning from what people searched, but Google should have acknowledged sooner that letting harmful and “pornified” results rise to the top, simply because that content exists on the internet and makes them money, is not the best way for a search engine to work. As we saw in the readings from the past two weeks, systems used by large numbers of people can have a major impact on collective memory and, as a result, can harm entire generations. Google, or even the federal government, as I stated in my post, needs to provide people with a safe and reliable search engine that prioritizes legitimate and useful information rather than relying on ad dollars to make a good product.
