Updated: Jun 3, 2021
By Suyash Kothari | United States
Contemporary issues often take a binary form, with one distinct side pitted against another: Leave or Remain, Republican or Democrat, religious or anti-religious, developed or developing. While taking a distinct side is useful for rallying around shared ideals, today’s extreme partisanship goes beyond productive debate. Both sides engage in identity politics, where the issues at hand take a backseat and denouncing the ‘other side’ as evil becomes the rule. Complex, nuanced issues are reduced to false dichotomies.
Social media platforms have made mass communication and the sharing of ideas free and easy, which should help opposing parties resolve their differences. The current structure of social media, however, makes users more likely to produce and consume extremist information.
Here, I draw upon the Canadian philosopher Marshall McLuhan, who reminds us that “the medium is the message”: the media over which information is relayed deserve as much study as the content of the messages themselves.
Take YouTube, for example. As Kevin Roose aptly put it in his 2019 New York Times article, YouTube’s suggestions feature can “steer you towards Crazytown”; in other words, it can create a feedback loop in which recommended videos grow steadily more radical and extreme. This feature, along with algorithmic updates that reward creators for maximizing watch time, was well intended and effective: flashy titles and clickbait intros no longer pay off on their own, so creators must focus on making their content genuinely engaging.
The unintended consequence, however, was that far-right groups made their content more inflammatory and hence more engaging. Compounded by the ‘up next’ feature, this content made alt-right channels more popular. Roose illustrates this with the example of Caleb Cain, who in his early twenties was drawn into the alt-right rabbit hole (‘red-pilled’) by YouTube’s suggestions. Later, videos from a new wave of left-wing YouTubers managed to pull him out, only to draw him into a left-wing rabbit hole instead.
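The feedback loop Roose describes can be sketched as a toy simulation: a recommender that greedily maximizes predicted watch time at every click, paired with the assumption that viewers watch slightly more extreme content a little longer than what they just saw, ratchets a session steadily toward the extreme end of the catalog. To be clear, this is not YouTube’s actual algorithm; the extremity scores and the engagement model below are invented purely for illustration.

```python
# Toy sketch of a watch-time-maximizing feedback loop.
# NOT YouTube's actual system: the extremity scale and the
# predicted-watch-time model are invented for illustration only.

def recommend_next(current_extremity, candidates):
    """Pick the candidate video with the highest predicted watch time.

    Illustrative assumption: predicted watch time peaks for content
    slightly more extreme than what the viewer just watched.
    """
    def predicted_watch_time(extremity):
        return 1.0 - abs(extremity - (current_extremity + 0.1))

    return max(candidates, key=predicted_watch_time)

def simulate_session(start=0.0, steps=10):
    """Follow 'up next' suggestions for a number of clicks."""
    catalog = [i / 20 for i in range(21)]  # extremity scores 0.0 .. 1.0
    history = [start]
    for _ in range(steps):
        history.append(recommend_next(history[-1], catalog))
    return history

path = simulate_session()
# Each suggestion optimizes only one click's watch time, yet the
# session drifts monotonically toward the most extreme content.
print([round(x, 2) for x in path])
```

Each individual recommendation looks harmless in isolation; the drift toward the extreme emerges only from iterating the loop, which is what makes the effect easy to miss when evaluating the algorithm one suggestion at a time.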
The problem spans both computer science and broader public policy. Managers and executives at tech firms like YouTube must ensure that new algorithms are evaluated for polarizing or brainwashing effects, even unintended ones. There is, however, no profit incentive to do so, which is why governments must impose regulations and audits that give executives a reason to weigh these political consequences. This, in turn, requires the U.S. Congress and other governments to build up their data and computer science expertise so that regulations are effective and keep pace with the technology.
Chimamanda Ngozi Adichie’s TED talk on the “Danger of a Single Story” is more relevant than ever in today’s social media-driven world. Check it out here: https://www.ted.com/talks/chimamanda_ngozi_adichie_the_danger_of_a_single_story?language=en
Filter bubbles confine us to opinions that reaffirm our own. They make us less willing to listen to one another and to find the common ground needed to tackle issues like climate change and gender inequality, which is why it is vital for companies to check themselves (or be checked by governments or outside consultants). Some of the best minds are building algorithms for the company, not solutions for society. There must be a mechanism that makes the market favor algorithms and solutions that reduce the hyper-polarization of today’s politics.
The views and opinions expressed in this entry are solely those of the author(s) and do not necessarily reflect the position of The Zeitgeist. Assumptions made in the analysis are not reflective of the position of any entity other than the author(s). The Zeitgeist does not verify the accuracy of any of the information contained in the entry. The Zeitgeist is not to be held responsible for the misuse, reuse, or recycling of content within this entry, or for cited and/or uncited copies of it.