Content is everywhere, and it bombards us all day, every day: from the first moment in the morning, when we open our eyes and turn on the TV or check our phones for the latest news on social media, to the last moment at night, when we check once more just before we go to sleep. We believe we are up to date and soaking up a diversity of opinions, but in reality we are trapped in a ‘filter bubble’.
It is true: we are trapped in a bubble. Not a soap bubble that pops as soon as we touch its surface, but a big, information-filtering bubble that feeds us only news selected just for us. The term “filter bubble” was coined by internet activist Eli Pariser in his book The Filter Bubble: What the Internet Is Hiding from You. According to Pariser, people are getting less and less exposure to conflicting viewpoints and are intellectually isolated in their own informational bubble. Filter bubbles focus our attention on the same feelings, the same people or phenomena – such as Donald Trump, or refugees crossing borders.
It must be algorithms!
It does not take long, scrolling down our favourite news pages or our Facebook feed, to come to the conclusion that we are being shown the same pattern of information over and over. We have always surrounded ourselves with people we agree with and sought out information we agree with. Our favourite online services such as Google, Facebook, and YouTube did not need long to realize that the more they adapt their content to our preferences, the longer we will keep using them.
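To make that mechanism concrete, here is a deliberately simplified Python sketch of engagement-driven ranking. It is not any platform’s real code; every function, field, and data point below is invented for illustration. The point is only that a feed ranked by past clicks narrows by itself:

```python
from collections import Counter

def rank_feed(candidates, click_history, top_n=5):
    """Toy engagement-driven ranking (illustrative only): score each
    candidate post by how often the user has already clicked its topic
    tags, so the feed drifts toward what the user already likes."""
    clicked = Counter(tag for post in click_history for tag in post["tags"])
    return sorted(candidates,
                  key=lambda post: sum(clicked[tag] for tag in post["tags"]),
                  reverse=True)[:top_n]

# A user who keeps clicking one topic sees ever more of it.
history = [{"tags": ["politics", "trump"]}, {"tags": ["politics"]}]
candidates = [
    {"title": "Border policy debate", "tags": ["politics", "refugees"]},
    {"title": "New telescope images", "tags": ["science"]},
    {"title": "Trump rally roundup",  "tags": ["politics", "trump"]},
]
for post in rank_feed(candidates, history, top_n=2):
    print(post["title"])  # the science story never makes the cut
```

Run it once and the two politics stories crowd out everything else; feed those clicks back in as new history and the narrowing compounds with every refresh.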
This is where SEO comes in. SEO stands for ‘Search Engine Optimization’: the practice of tailoring content to how search engines work. It considers what people search for, analyses the actual search terms or keywords typed into search engines, and optimizes pages so that we find exactly what we are searching for. That way we get the instant feeling of having gotten exactly what we wanted, but we forget that there is always a broader topic in the background. It is like driving to the seaside: you are so eager to finally catch a view of the sea that you forget to enjoy the scenic greenery along the way.
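As a rough illustration of why that happens (real search engines use far more signals than this), here is a toy relevance scorer in Python; the pages, the query, and the scoring rule are all invented for the example. A page written around the searcher’s exact keywords outranks a broader treatment of the same topic:

```python
def relevance(query, page_text):
    """Toy term-overlap score (illustrative only): count how often the
    query's words appear in the page. Pages that repeat the searcher's
    exact keywords score highest, the basic signal SEO optimizes for."""
    page_words = page_text.lower().split()
    return sum(page_words.count(word) for word in query.lower().split())

pages = {
    "keyword-stuffed": "trump election trump election latest trump news",
    "broader context": "an overview of the campaign, both candidates and the wider political moment",
}
query = "trump election"
best = max(pages, key=lambda name: relevance(query, pages[name]))
print(best)  # "keyword-stuffed" wins; the broader page never surfaces
```

The broader page scores zero on this query even though it covers more ground: that is the ‘scenic greenery’ the optimized result never shows us.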
Does that mean that the ‘filter bubble’ helped Trump win the election?
How do we observe the phenomenon of the ‘filter bubble’ in practice? The clearest and most obvious example was the US election and the race between Hillary Clinton and Donald Trump. People shared their beliefs all over social media, engaged in endless conversations about who was the more suitable candidate, and disagreed with anyone who thought differently. To top it all, we read only news that confirmed what we already believed. Newsrooms and media houses might be perceived as objective, but in the end their reporting is far from that.
Everyone is talking about Facebook’s personalized news stream and the possibility that it pushed Trump to victory. How did this happen? According to researchers, Facebook hosts a huge portion of the political conversation in America. In a 2015 study run by Facebook, researchers tested the ‘filter bubble’ hypothesis by “looking at ten million de-identified Facebook users who self-reported their ideological affiliation over a six-month period” (Bakshy, Messing, and Adamic 2015). They concluded that Facebook’s algorithms account for only about 1 percent of the filtering of what we see; the major influences are our friends and the social community we establish.
However, trouble arises when we think we are getting a representative view of the world when we clearly are not – and we don’t know it. During the US election we were exposed to ‘fake news’ produced by random people on social media and picked up by serious newspapers. Skilfully fabricated articles tapped into pro-Trump and pro-Clinton prejudices and spread at the speed of sound. As usual, the most controversial pieces were shared fastest, without anybody thinking twice about whether the news they had just read was based on fact. Living in a ‘filter bubble’ created such personalized perspectives that people became blind to other points of view. Trump’s ‘unpopularity’ somewhere along the way lost its ‘un’ and allowed him to win the election.
It’s not only the ‘filter bubble’, it’s us, too
As already mentioned, it is not only the algorithm that leads us to think the news is created just for us. The lack of diverse political views in our Facebook feeds is mainly the consequence of self-censorship. The Facebook research mentioned above offers strong evidence that people are actually exposed to a great deal of diversity on the platform. But we tend to create our own ‘filter bubbles’ by avoiding political conflict, reporting posts we find inappropriate, or even unfriending people whose opinions we disagree with.
Who, then, should be believed? If you read Pariser’s book, you might come to the conclusion that the world we live in is wrapped in a giant ‘filter bubble’, where our lives are pre-planned by factors we cannot influence, although we can at least learn to notice them. But maybe you should start with yourself: check which friends you have stopped following on Facebook, visit a website you have never visited before and, above all, step out of your comfort zone to meet people with different, but definitely interesting, perspectives.