Counteracting the Opaque Filter Bubble

Ella Barnett, May 14, 2018

The personalization of technology has been the biggest ongoing trend of the last five years. We have come not only to expect personalized services, but to demand ever higher levels of personalization in our possessions, our experiences, and our internet use.

But what is the cost of having our lives tailored specifically to us as individuals? Our desire to surround ourselves with an online world of tailored news and entertainment has caused us to increasingly perceive society through ‘filter bubbles’. Filter bubbles color our perception of the world, not only showing us exactly what we want to see, but, more importantly, depriving us of information that might otherwise shape our worldview.

So how do we begin to counteract this? First, we must understand and accept that filter bubbles exist and play an often hidden yet significant role in shaping our habits.

Filter bubbles have been around since long before the invention of social media, and even the internet. Psychologist Leon Festinger described these tendencies in his theory of cognitive dissonance as far back as 1957. According to Festinger, humans are always looking for people they agree with, whom they like, and who share their worldviews. Social media and personalization algorithms have simply exacerbated this process.

The problem with these algorithms is that they are hidden from the user, meaning that people are often completely unaware that this is happening. A 2015 study suggests that upwards of 60% of Facebook users do not realize that the content in their feed is specifically curated for them.
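To make that hidden mechanism concrete, here is a deliberately simplified sketch of how engagement-based ranking can narrow a feed. It illustrates the general principle only, not the actual algorithm of Facebook or any other platform, and the source names are invented:

```python
from collections import defaultdict

# Toy model of an engagement-driven feed: every click is treated as a
# preference signal, and the feed is re-ranked around those signals.
clicks = defaultdict(int)

def record_click(source):
    """Count each click on a source as a signal of preference."""
    clicks[source] += 1

def rank_feed(articles):
    """Order (source, headline) pairs by past engagement with the source.

    Sources the user never clicks sink to the bottom; in a real feed
    that shows only the top few items, they effectively disappear.
    """
    return sorted(articles, key=lambda item: clicks[item[0]], reverse=True)

articles = [
    ("left_leaning_daily", "Policy X is working"),
    ("right_leaning_post", "Policy X is failing"),
    ("neutral_wire", "What the data says about policy X"),
]

# Simulate a user who only ever clicks one source.
for _ in range(5):
    record_click("left_leaning_daily")

for source, headline in rank_feed(articles):
    print(source, "->", headline)
```

The loop is self-reinforcing: the more a source is clicked, the higher it ranks, and the higher it ranks, the more it gets clicked. Nothing in the code ever tells the user that the other sources have quietly dropped away.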

So where does that leave us? This is not to say that filter bubbles are inherently dangerous; their effects can be mitigated as long as they are transparent, we are aware of their existence, and we can choose how our bubbles are built. However, in today’s world, where communication is increasingly dominated by social media and the internet, filter bubbles have become increasingly opaque, and with that opacity come real costs: less awareness of other views, and less empathy for the people who hold them.

But this isn’t new. You’ve heard it in Obama’s farewell address, from comedians such as John Oliver, and in one of the ever-increasing number of articles circulating the web. But how many of you have actually acted on what you’ve read? How many of you have taken the next step?

Although social media has been a key ingredient in the increased opacity of filter bubbles, it can also offer a solution. Tools and algorithms have been developed, or are being developed, to show people the skew in their reading and news preferences. Google has also admitted that its results tend to align with how a question is phrased, and is now trying to resolve that problem by providing first-page links that approach a topic from both sides.
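As a rough illustration of what such a skew-measuring tool might compute, here is a minimal sketch. The lean scores below are invented for the example; a real tool would draw them from a media-bias dataset rather than hard-coding them:

```python
# Hypothetical lean scores from -1.0 (left) to +1.0 (right). These values
# are made up for illustration, not taken from any real ratings service.
SOURCE_LEAN = {
    "left_leaning_daily": -0.8,
    "neutral_wire": 0.0,
    "right_leaning_post": 0.8,
}

def reading_skew(history):
    """Average lean of the sources in a reading history; 0.0 is balanced."""
    known = [SOURCE_LEAN[s] for s in history if s in SOURCE_LEAN]
    if not known:
        return 0.0
    return sum(known) / len(known)

# Eight visits to one outlet, two to a neutral wire service.
history = ["left_leaning_daily"] * 8 + ["neutral_wire"] * 2
print("skew: %+.2f" % reading_skew(history))  # prints: skew: -0.64
```

A score near zero suggests a balanced reading diet; a score pinned near either extreme is the quantified shape of your bubble.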

I know what you’re thinking, and yes: for now it is the individual’s responsibility to deviate from the personalized path. This is partly because, although you can encourage people to read something from ‘the other side’, it is entirely up to them to actively listen and take on board what is being said.

VICE has recently released a tool that helps you figure out just how much of a bubble you have cocooned yourself in. Why? So that you get a more balanced, and therefore more nuanced, perspective of the world. Being shown the things you would rather avoid forces you to at least acknowledge that those opinions exist, and to try to understand that perspective. As the old cliché goes, every story has two sides. You might be pleasantly surprised at what you find.

I conclude with some actions and applications that can help make your filter bubble more transparent by identifying where you, and your information sources, get your biases.

  1. Consider using Incognito Mode in Google Chrome when you are researching a new topic. Without your Google profile to fall back on, the personalization bias in your results will be lessened.
  2. Habitually delete your cookies. The more cookies you accumulate, the more data sites have to decide what to show you next.
  3. Go into Facebook and see the profile it has built for you; this is the information used to target ads at you. Notifications >> Settings >> Ads.
  4. Familiarize yourself with the biases behind your information sources. Consider who they are run by, who supports them, and who sponsors them.
    (When libraries publish guides on how to identify source bias, you know it’s a real thing.)
  5. Look beyond where you’re simply being fed information, and actively seek out sources that might contradict your worldview. This could include other news sources or people with different backgrounds and opinions.
  6. Start to actively listen to people who disagree with you.