Although all the readings we’ve done so far in this class have been enlightening and informative, I found that Algorithms of Oppression put the importance and relevance of Black Digital Humanities into perspective for me in a very meaningful way. The examples that Noble uses throughout the book to demonstrate the everyday adverse effects of commercially driven search engines and other information-circulation technologies were certainly eye-opening. I don’t think I truly understood the extent to which innovations like Google Search can structure the information we are exposed to until I read Noble’s book. Something that was on my mind while I was reading it was the controversy last fall over Twitter deleting all posts and pictures related to the hashtag “bisexual” and blocking users from utilizing it. Although Twitter eventually restored the ability to use the hashtag, its inability to reasonably explain why it had blocked it in the first place infuriated Twitter users and the bisexual community in general. Bisexual “erasure” is a major issue even within the LGBTQI+ community, and Twitter’s decision was yet another setback in bisexual visibility and inclusion.
Many users felt that Twitter blocked the hashtag because the search term “bisexual,” and bisexuality in general, is seen as (as Noble puts it) “pornified.” Bisexuals often feel that even people within the queer community see them as oversexualized, promiscuous, and “not really queer,” so to have a major social media outlet like Twitter further silence and trivialize bisexuals was devastating. Noble’s descriptions of the ways that black women and girls are sexualized by the structuring of Google’s search results immediately called the Twitter controversy to mind and made me realize just how strong an influence the Internet and media can have on community image and personal identity. I was not surprised to read that Google Instant, another Google search tool, initially excluded terms like “bisexual” and “Latina” because of the “pornified” top results associated with them. Yet as Noble points out, Google itself is responsible for these results, and it is also responsible for fixing such issues. The idea that the most “relevant” search results we are shown are created by our own online behavior and activity is only partially true; advertising money determines much of what we see at the top of the page. It’s far too late in the game for Google to naively claim that it is not responsible for what it shows us and how it affects us. If the future is online, then those who curate the media put in front of us have to take responsibility for its impact, whether it relates to black women and girls, the bisexual community, or the white supremacist websites that led Dylann Roof to commit mass murder in the name of racial hatred.
Dani Massaro 9/26/18