
The danger of the filter bubble
A look into the filter bubble, and what we should do about it.
The internet is a breeding ground for confirmation bias: the tendency to search for, interpret, favor, and recall information in ways that confirm our preexisting beliefs or hypotheses. We tend to think of the web as an expanse. The world is at our fingertips! We have all of the knowledge we could dream of! Unfortunately, while the amount of information we can access is enormous, the amount we actually seek out and parse through is incredibly small. We tap into only a fraction of what is out there.
The filter bubble
Internet activist Eli Pariser coined the term filter bubble to describe this phenomenon: each of us is exposed to a contained ecosystem of information, shaped by our pre-existing beliefs. Thanks to advanced search, filter, and recommender algorithms, users get “less exposure to conflicting viewpoints and are isolated intellectually in their own informational bubble.”
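To see how little it takes for this to happen, here is a minimal Python sketch, with a made-up two-dimensional “topic space” and invented item names (no real platform works exactly this way): a recommender that ranks content purely by similarity to your click history will, by construction, keep handing you more of the same.

```python
# Toy sketch: items live in a 2-D "topic space"; a user's profile is the
# centroid of what they already clicked, and recommendations are simply
# the items closest to that profile.
import math

items = {
    "flat-earth blog":  (0.9, 0.1),
    "conspiracy forum": (0.8, 0.2),
    "science podcast":  (0.2, 0.8),
    "astronomy course": (0.1, 0.9),
}

def profile(history):
    """User profile = average position of previously clicked items."""
    xs, ys = zip(*(items[name] for name in history))
    return sum(xs) / len(xs), sum(ys) / len(ys)

def recommend(history, k=2):
    """Rank every item by distance to the profile; closest wins."""
    px, py = profile(history)
    return sorted(items, key=lambda n: math.hypot(items[n][0] - px,
                                                  items[n][1] - py))[:k]

# One fringe click, and the slate is already all fringe:
print(recommend(["flat-earth blog"]))  # ['flat-earth blog', 'conspiracy forum']
```

Notice that nothing here is malicious; the narrowing falls straight out of optimizing for “more like what you clicked.”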
Yikes.
You might be thinking that this is simply a byproduct of the game: of course, if you are looking at a book on Amazon, you will be recommended similar titles. What’s so wrong with that? In some cases, this can be helpful. Much like browsing the fiction shelf at a library, you may stumble upon more books to read. However, it becomes problematic depending on the context in which it occurs.
For example, let’s take a simple Google search. If I search for “why the earth is flat,” the results I get back simply confirm my belief.

The results build upon my biased search terms. After all, the terms themselves assume that the earth is indeed flat; now I just want to know why. Although that fundamental assumption is faulty, Google does not know or care. It provides the best-matching results in less than a second. What is problematic is that even if I never click into a false article, its headline still sits there in the results, and that headline alone may be enough corroboration for the claim in someone’s mind.
Eli Pariser calls this the dangerous “you loop”: you get stuck consuming content that supports the views you already hold.
Why the filter bubble is destructive
This is much different from how information used to be experienced. If I wanted to learn more about the shape of the earth, I would have to go to a library, find a book on Earth science, check it out, and then read it. All of that friction would stop me from confirming a baseless assumption within seconds, or from answering a logical question with an emotional response. The sheer effort of physically going to the library rules out emotion-packed, ad-hoc searches.
The stakes are not always high, but they shape the way our lives unfold so insidiously that we do not even notice. Our views can become stagnant, even on harmless matters like our taste in movies, and that can cut us off from new experiences and ideas. Today, what we see is curated in every possible way to hold our attention and turn a profit. Our attention has become the product, and what captures our attention? Content that speaks to our feelings, fulfills our longing to fit in, and reinforces our larger self-identity. This makes us comfortable, which is detrimental, because growing often means being challenged and being uncomfortable.
Still, feeling safe is a human desire, and like everything valuable, it is being sold.
Netflix is sending us in loops, too
Consider Netflix thumbnails, for example. You may think Netflix’s thumbnail design is universal, but you are not seeing the same Netflix that every other user sees; you are seeing a curated version, down to each thumbnail shown. In this compelling article, Vox breaks down how Netflix tailors thumbnails to your watch history. If you have watched romance in the past, a movie’s thumbnail may adjust to spotlight a loving shot within it. If you have watched action in the past, that same movie’s thumbnail would adjust to spotlight a heist scene instead. It is meant to resonate with you and excite you.

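The logic behind this kind of artwork personalization could be as simple as the following hedged sketch (the title, file names, and genre tags are all invented; Netflix’s real system is far more sophisticated):

```python
# Toy sketch: each title ships with several candidate artworks, keyed by
# the genre they are meant to resonate with; show the one matching the
# genre the user has watched most.
from collections import Counter

thumbnails = {
    "Ocean Heist": {
        "romance": "ocean_heist_couple.jpg",
        "action":  "ocean_heist_explosion.jpg",
        "comedy":  "ocean_heist_banter.jpg",
    },
}

def pick_thumbnail(title, watch_history):
    """Return the artwork matching the user's most-watched genre,
    falling back to an arbitrary variant if nothing matches."""
    variants = thumbnails[title]
    for genre, _ in Counter(watch_history).most_common():
        if genre in variants:
            return variants[genre]
    return next(iter(variants.values()))

# Two users, one movie, two different posters:
print(pick_thumbnail("Ocean Heist", ["romance", "romance", "action"]))
# -> ocean_heist_couple.jpg
print(pick_thumbnail("Ocean Heist", ["action", "action", "drama"]))
# -> ocean_heist_explosion.jpg
```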
Of course, this may seem harmless, but over time I can’t help but wonder: is this shrinking my worldview? Is it pushing me into an infinite loop of my own beliefs and “likes”? Maybe my taste in movies and music is no longer mine at all, but whatever an algorithm serves me based on decisions I made in the past.
[Technology such as social media] lets you go off with like-minded people, so you’re not mixing and sharing and understanding other points of view … It’s super important. It’s turned out to be more of a problem than I, or many others, would have expected.
-Bill Gates
How we can burst the bubble and grow
There is no perfect solution to the filter bubble problem. As long as these algorithms exist, they will tailor themselves to our likes. But what if they were designed more intentionally, to juxtapose opposing viewpoints? What if a different kind of recommender system exposed us to unfamiliar content? Maybe that will never happen, but at the very least we can spread the word that the filter bubble exists. Most people view the Internet as diverse and vast, and do not realize that the way we use it limits us.
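One speculative direction, sketched below in the same toy “topic space” as the earlier example (my own illustration, not a deployed system): reserve part of every recommendation slate for the items farthest from the user’s profile, so an opposing viewpoint always surfaces alongside the familiar ones.

```python
# Toy counter-design: half the slate is the usual "more like this";
# the other half is deliberately drawn from the far end of the ranking.
import math

items = {
    "flat-earth blog":  (0.9, 0.1),
    "conspiracy forum": (0.8, 0.2),
    "science podcast":  (0.2, 0.8),
    "astronomy course": (0.1, 0.9),
}

def diversified(history, k=2):
    px = sum(items[n][0] for n in history) / len(history)
    py = sum(items[n][1] for n in history) / len(history)
    ranked = sorted(items, key=lambda n: math.hypot(items[n][0] - px,
                                                    items[n][1] - py))
    near = ranked[: k // 2]                     # familiar picks
    far = ranked[len(ranked) - (k - k // 2):]   # deliberately opposing picks
    return near + far

# The fringe reader now also sees the other end of the spectrum:
print(diversified(["flat-earth blog"]))
# -> ['flat-earth blog', 'astronomy course']
```

Whether anyone would ship this is another question; deliberate discomfort is a hard sell when attention is the product.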
I, for one, am going to try wandering through different sections of a bookstore more often, or simply putting myself in spaces where my views could be challenged. That is what makes for a more understanding and empathetic environment, something exceedingly hard to cultivate on the Internet as it scales beyond our control. What we can do as users is intentionally seek out disconfirming evidence and try to disprove our own theories, even small ones like our taste in music. Maybe there is more out there that we are missing.
Thank you to Cliff Lampe for the lecture material that inspired this article! This material was discussed during the Online Communities class taught by Dr. Lampe at the School of Information.
Stay in touch!
Follow me on Medium! Ambika Vohra
Follow me on Instagram! https://www.instagram.com/onbeinghumancomics/
Email: ambikav@umich.edu