Google Search May Show Inappropriate Bias When Displaying Controversial Figures To Users

Google Search’s autocomplete and results may not be worth relying on in certain circumstances, especially when looking up controversial figures and contentious topics.

Search engine and social media algorithms carry an air of secrecy, largely because the companies behind them want it that way. Opening up an algorithm’s inner workings is essentially an open invitation for the internet to ridicule or cancel the company (perhaps both). However, users aren’t exactly oblivious. Quite the opposite: the internet gives anyone with a connection access to vast amounts of information, which means people attentive enough to notice patterns will eventually figure out when something is off about an algorithm. A prominent example is Twitter’s automatic photo-cropping AI, which the community called out for racial bias almost immediately. When cropping images down to meet Twitter’s display requirements, the AI showed a tendency to crop out people of color in favor of white faces. Users took note and published their own side-by-side examples as evidence. Twitter eventually responded to the criticism and committed to addressing the problem. The episode is a clear, bold illustration of why massive corporations don’t want users digging too deeply into their algorithms and AI: since these systems are built by humans, they unintentionally inherit human faults and biases.

In fact, Google’s own search engine has also drawn criticism from individuals who cracked the code, at least to some extent. How did anyone reach such conclusions when the company notoriously offers no real insight into the algorithm’s workings? It comes down to a little concept called reverse engineering. Researchers took inspiration from an earlier experiment that set out to tease apart differences between Twitter’s and Instagram’s algorithms. That work involved the laborious process of collecting potentially controversial words and testing each one across both platforms. It revealed, for instance, that a word such as fentanyl was an acceptable hashtag on Twitter but a banned one on Instagram. Clearly, each algorithm has its own rules and tendencies. Researchers then decided to apply the same process to Google Search, but with famous figures.
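The probing approach described above can be sketched in a few lines. This is a hypothetical illustration, not the researchers’ actual tooling: the platform data here is hand-recorded mock observations (the real experiment relied on manual searches rather than any public API), and all names are invented for the example.

```python
# Sketch of the reverse-engineering idea: record how each platform
# treats a list of potentially sensitive terms, then diff the records
# to surface terms the platforms handle differently.

# Hypothetical, hand-recorded observations standing in for manual searches.
OBSERVED = {
    "twitter":   {"fentanyl": "allowed hashtag"},
    "instagram": {"fentanyl": "banned hashtag"},
}

def treatment_by_platform(term, observations):
    """Return each platform's recorded treatment of one term."""
    return {platform: record.get(term, "unknown")
            for platform, record in observations.items()}

def divergent_terms(terms, observations):
    """Keep only the terms that platforms treat differently."""
    divergent = []
    for term in terms:
        treatments = set(treatment_by_platform(term, observations).values())
        if len(treatments) > 1:  # platforms disagree on this term
            divergent.append(term)
    return divergent
```

The same diffing logic applies to the Google Search experiment: swap hashtag policies for the descriptive labels Search attaches to a person, keyed by language edition instead of platform.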

Alex Jones, the notorious conspiracy theorist whose harassment campaign forced a Sandy Hook victim’s family to move house on multiple occasions, is simply listed as a radio show host in Search results. Anders Breivik, a Norwegian terrorist, is labeled a convict. Jake Angeli might be the worst offender: the QAnon follower who wore horns and fur as he broke into the Capitol during the 2021 riot is labeled an American activist in Search’s English version, and an actor in its Arabic version. Richard B. Spencer, meanwhile, appears as a publisher in some searches and an American editor in others. Cute.

H/T: TheConversation.