Introduction & short definition
Social media makes up an essential part of our lives. Every time we log into networks like Instagram, a whole new world opens up, individually designed and tailored to our interests and needs. But even though we use it so frequently, many people may still be unaware of the phenomenon of polarisation.
Polarisation isn’t new; in fact, it has been shaking societies worldwide for so long that we can see real-life effects, such as the recent overturning of abortion rights in the US, which caused a massive backlash on social media. People on the pro-life side argued vehemently against those on the pro-choice side, and vice versa. It was such a polarised topic that the two opposing sides could not find anything they could agree on, which is one characteristic of polarisation.
But what exactly is polarisation?
In the Oxford Dictionary, polarisation is defined as “the act of separating people into two groups with completely opposite opinions on a topic”, and one big reason it has increased significantly over the past decade is social media.
History of Social Media
In the early days of social media, between 2003 and 2004, when platforms like Friendster and MySpace dominated, it was only about showing the world which bands you liked and who your friends were. No heated political discussion, no spread of outrage, a non-toxic environment. This changed over the following years. With the introduction of the Like button on Facebook in 2009 and the Retweet button on Twitter, everything you post can now be ranked and easily reach millions of people.
But what makes this so toxic, and how did services that promised to connect us become one of the strongest forces driving us apart?
Role of social media
Social media has become a major source of news and plays an ever-increasing role in political discussion. But even though the internet and social networks have made it easier than ever to access and spread different sources of information, the internet “has not made us better at finding viewpoints that are different from our own”, as Kiran Garimella said in his doctoral dissertation on “Polarisation on Social Media”.
Algorithms and echo chambers
One of the biggest reasons for this is algorithmic newsfeeds. Before they were introduced, the newsfeed was chronological. Now algorithms use your personal data to suggest the content you are most likely to enjoy. So if your browsing history suggests you like food, you will see loads of recipe reels on your feed. While that might sound like a good thing, because it is very easy to find your people and it’s nice to have a newsfeed perfectly tailored to your interests, we might have become a bit too comfortable in our “filter bubbles”.
The term “filter bubble” was coined by the internet activist Eli Pariser. In his viral TED Talk, he defined it as a “personal, unique universe of information that you live in online. And what’s in your filter bubble depends on what you do. But the thing is that you don’t decide what gets in. And more importantly, you don’t actually see what gets edited out”.
Filter bubbles are also often referred to as “echo chambers”, which is a fitting image, since hearing your own voice echoed back is exactly what happens on social media: users mainly consume content that reinforces their existing views.
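To make this feedback loop more concrete, here is a minimal, purely illustrative sketch in Python. It is not any platform’s real ranking code; the topics, scores and the simple counting logic are invented assumptions, but they show how ranking by past engagement can narrow a feed step by step.

```python
# A minimal, illustrative sketch (not any platform's real algorithm) of how
# interest-based ranking can narrow a feed over time. All names and numbers
# here are invented for illustration.
from collections import Counter

# Each post is tagged with a single topic for simplicity.
POSTS = [
    {"id": 1, "topic": "food"},
    {"id": 2, "topic": "politics"},
    {"id": 3, "topic": "food"},
    {"id": 4, "topic": "sports"},
    {"id": 5, "topic": "food"},
    {"id": 6, "topic": "politics"},
]

def rank_feed(posts, interest_profile):
    """Sort posts so that topics the user engaged with most come first."""
    return sorted(posts, key=lambda p: interest_profile[p["topic"]], reverse=True)

# Simulate a user who starts with a slight preference for food content.
interests = Counter({"food": 2, "politics": 1, "sports": 1})

for day in range(3):
    feed = rank_feed(POSTS, interests)
    top_post = feed[0]                    # the user mostly sees/clicks the top item
    interests[top_post["topic"]] += 1     # engagement feeds back into the profile
    print(f"Day {day}: top topic = {top_post['topic']}, profile = {dict(interests)}")

# After a few iterations the profile is dominated by one topic: the feedback
# loop this essay calls a "filter bubble".
```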
So in terms of political discussion and forming an opinion, it is really important to make an effort to get out of your filter bubble, to escape the ever-tightening cycle of only consuming content that agrees with your beliefs and to broaden the range of content you consume.
Outrage
But filter bubbles are not the only issue.
The introduction of the algorithmic newsfeed also turned social media into what experts call an “outrage machine”.
To understand this, it is essential to know that, contrary to what a lot of people might think, social media is not actually free. Instead of money, we pay with our personal data, which the social media companies then use to display targeted advertisements. Because of this business model, the designers do whatever it takes to keep us engaged, and what keeps us engaged are emotions. More specifically, strong emotions like anger, sadness or even outrage, a strong feeling of anger or shock at something that you feel is wrong or unfair. That’s why you see more tragic videos and posts, about war for example, than good news. The rise of video content and reels on Instagram is another example of this, because short reels are more engaging than plain photos.
So if social media exposes us to such emotions at an abnormal level, it becomes harder and harder to identify what is really worth the outrage and what is not, healthy discourse and conversation become even more difficult, and that again results in polarisation.
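The mechanism behind the “outrage machine” can be sketched in a few lines. The post titles and scores below are made up for illustration, and real ranking systems are far more complex, but the core idea is the same: if the objective is engagement and outrage predicts engagement, outrage rises to the top.

```python
# A toy illustration (purely invented scores, not a real platform metric) of why
# an engagement-optimised ranking tends to surface outrage: if predicted emotional
# intensity correlates with time spent, the sort puts the angriest content on top.

posts = [
    {"title": "Local bakery donates bread",   "predicted_engagement": 0.2},
    {"title": "Cute dog learns a trick",      "predicted_engagement": 0.3},
    {"title": "Politician insults opponents", "predicted_engagement": 0.9},
    {"title": "War footage goes viral",       "predicted_engagement": 0.8},
]

# The platform's objective is (roughly) to maximise attention, so it ranks by
# whatever signal best predicts engagement, regardless of how it makes us feel.
feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

for post in feed:
    print(post["title"])
# The calm, positive items end up at the bottom of the feed.
```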
Example BLM discussion
One example where polarisation and the effects of algorithmic filtering became particularly visible was the Black Lives Matter discussion in June 2020. After a white police officer had brutally killed George Floyd, a Black man, by kneeling on his neck for almost 10 minutes, the internet erupted. The video of Floyd’s death went viral within hours, and on June 2nd, your feed was probably full of black square images tagged with the hashtags #blackouttuesday, #blacklivesmatter or #BLM. If you saw it, it might have given you the impression that everyone agrees that racism and police brutality are disastrous and need to end instantly, but of course, that is not the case. If it were, there would be no need for this debate. That is exactly why it is so important not to be fooled by your social media bubble, because only sharing your thoughts and opinions with like-minded people will not cause the change you demand.
Possible Solutions
So what can we do to get out of our bubble on an individual as well as on a societal level?
First of all, it is crucial to spread and raise awareness of this topic. If people don’t know they are in their own perfectly curated social media bubble, how can they get out of it? In addition, being aware that we are mostly exposed to agreeing content will make us more open and receptive to other opinions. So the second step we can take is to diversify our feed by following as wide a range of people with as wide a range of experiences as possible.
Thirdly, we should always keep in mind that cutting ourselves off from people who don’t completely agree with us isn’t helpful at all. Instead of shaming other people for different views or their lack of knowledge, we should ask them why they think this way or share an alternative perspective. In the example of the BLM debate, this could mean sitting down with your parents and talking to them about white privilege.
But what about the social media platforms themselves? As already said, with their wide range of information, the increased social feedback through the Like and Retweet buttons, the spread of outrage to keep us scrolling and their growing filtering power, which lets us avoid reading conflicting information, they are a huge catalyst for polarisation.
So platforms like Instagram, Facebook, Twitter and TikTok are good starting points for decreasing polarisation. There are several approaches and ideas for addressing it by restructuring social networks. One is to ban fake accounts and require users to give some proof of identity, so that people feel less shielded by anonymity and are therefore less likely to engage in polarising activity. Another is to alter the algorithms so that they no longer confront users with content that triggers an intense emotional response. Lastly, Kiran Garimella suggests exposing people more to opposing viewpoints while showing the credibility of a source or the expertise of a user, since that increases the chances of other users believing the content.
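To illustrate the last two ideas, here is a hedged sketch of a diversity-aware re-ranking. The field names, stances and credibility numbers are invented assumptions, not a documented platform API; the point is simply to show that a feed can be sorted by source credibility and then interleave opposing viewpoints instead of maximising engagement.

```python
# A hedged sketch of the kind of counter-measure described above: instead of pure
# engagement ranking, weight posts by source credibility and alternate viewpoints.
# Field names and numbers are illustrative assumptions, not a documented API.

posts = [
    {"title": "Op-ed A", "stance": "pro",    "credibility": 0.9, "engagement": 0.6},
    {"title": "Rant B",  "stance": "pro",    "credibility": 0.2, "engagement": 0.9},
    {"title": "Op-ed C", "stance": "contra", "credibility": 0.8, "engagement": 0.5},
    {"title": "Rant D",  "stance": "contra", "credibility": 0.3, "engagement": 0.8},
]

def rerank_for_diversity(posts):
    """Rank by credibility rather than engagement, then interleave opposing stances."""
    by_stance = {"pro": [], "contra": []}
    for p in sorted(posts, key=lambda p: p["credibility"], reverse=True):
        by_stance[p["stance"]].append(p)
    feed, turn = [], "pro"
    while by_stance["pro"] or by_stance["contra"]:
        if by_stance[turn]:
            feed.append(by_stance[turn].pop(0))   # take the most credible post of this stance
        turn = "contra" if turn == "pro" else "pro"  # alternate viewpoints
    return feed

for post in rerank_for_diversity(posts):
    print(post["stance"], post["title"])
# Credible posts from both sides appear near the top, rather than whichever
# side produced the most emotionally engaging content.
```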
In conclusion, it can be said that social media, with all its features, is a huge catalyst for polarisation, but if we understand how it works, we can address it and try to make a change. Above all, we should keep the discussion going.