Although search engines such as Google can be great tools for accessing vast amounts of information useful in producing reports, projects, blog posts, and much more, they have a dark side: they can be used to promote racist ideologies and propaganda. The article "What Happened When Dylann Roof Asked Google For Information About Race?" highlights a deadly situation in which some of these racist ideologies were presented as search results to a teenage boy, Dylann Roof, when he searched for information on the Trayvon Martin and George Zimmerman tragedy that occurred in 2012. After reading about the case, he claims to have typed 'black on white crime' into Google, and racist ideologies and the websites that promote them appeared.

When NPR heard about the kind of search results presented to Dylann, they decided to investigate what happens when phrases like 'black on white crime' are searched. They came to the disturbing discovery that as they typed b-l-a-c-k o-n, Google's autocomplete function suggested searches such as 'black on white crime,' 'black on white violence,' 'black on white crime statistics,' and 'black on white racism.' Meanwhile, the top suggested searches when typing w-h-i-t-e o-n were 'white on white crime,' 'white on white,' 'white on white acid,' and 'white on white kitchen.' This struck me as an odd comparison: when the word 'black' was typed, the autocomplete suggestions were all related to crime, but as the word 'white' was typed, the suggestions were mostly related to the color. NPR questioned Google about this, and Google replied that they do their best to adjust the autocomplete function to avoid situations like these.

Further on in the article, NPR made another discovery: when starting searches such as j-e-w-s a-r-e and a-r-e m-u-s-l-i-m-s, suggestions such as 'are jews evil' and 'are muslims bad' appeared. At this point in the article I was shocked that this was even a thing. Autocomplete suggestions involving terms related to minority groups were all negative toward those groups, while the suggestions following the term 'white' were random and had no negative connotations. This being the case, it makes sense that Dylann Roof was given this kind of information when he searched 'black on white crime.' This problem with the autocomplete function is being taken advantage of by white supremacist groups looking to spread their propaganda.

After finishing the article and contemplating the information it provided, several thoughts crossed my mind. First, who is really responsible for this? The simple and easy answer is Google and the people responsible for optimizing the search engine, but is it really right to place all the blame on them? Google's autocomplete works by using an algorithm that takes popular searches into account, so if these really are the phrases people search most frequently, then the fault lies more with us as a society.
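To see why the blame is hard to pin down, here is a toy model of a purely popularity-driven autocomplete. This is only a sketch under a big assumption, not Google's actual system (which layers many other signals and manual filters on top): it ranks suggestions by nothing more than how often each query appears in a made-up search log, so whatever users search most is exactly what gets suggested back.

```python
from collections import Counter

# Hypothetical search log standing in for aggregate user behavior.
# (Made-up data for illustration only.)
search_log = [
    "black on white crime",
    "black on white crime",
    "black on white violence",
    "white on white kitchen",
    "white on white acid",
]

query_counts = Counter(search_log)

def autocomplete(prefix: str, k: int = 4) -> list[str]:
    """Return the k most frequent past queries that start with prefix."""
    matches = [(q, n) for q, n in query_counts.items() if q.startswith(prefix)]
    matches.sort(key=lambda item: -item[1])  # most-searched first
    return [q for q, _ in matches[:k]]

print(autocomplete("black on"))
# ['black on white crime', 'black on white violence']
```

Under this toy model, the suggestions are just a mirror of aggregate search behavior, which is the sense in which the fault would lie with society rather than with the engineers alone.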

Racism is an issue that has been around for ages, and although it has improved to an extent, it is still very much present. It may not always be obvious or something you hear or see every day, but as this article highlights, a lot of the time these groups have major 'underground' networks that they use to spread their ideologies, which occasionally reach the attention of our younger generation. Since it is human nature to believe just about everything we see online, it doesn't surprise me that every once in a while someone comes across this information, becomes radicalized, and commits a tragedy against a minority group.

This is definitely something that I hope Google and other popular search engines are working to fix, but with all the laws protecting freedom of speech and the like, it could be a very long time before we see any real progress in suppressing white supremacist ideology and its outreach.

Carlos Lima