The Bubbles of Social Media

It is no secret that web companies tailor their services, news and search results to our personal tastes. The consequence is that we become trapped in “filter bubbles” and are not exposed to information that could broaden our world view.

For example, Facebook checks which friends’ links you click most and edits your profile and news feed accordingly. It decides for you which friends are most likely to interest you, based on your past interactions with them. Facebook is not the only one doing this algorithmic editing of the web: Yahoo News, the Washington Post, eBay and Amazon are just a few of the websites that personalize on some level.


In his TED talk, Eli Pariser explains how different people get different results when searching for the same topic, without being aware that their results differ. The internet is showing us what it thinks we want to see, without our knowledge.

This “filter bubble” defines and dictates our online world. It filters according to our demographics, gender, profession, lifestyle and interests, and this filtering process decides what gets in and what stays out. We are surrounded by “information junk food,” as Eli Pariser calls it. These algorithms control our flow of information, and users are slowly becoming passive recipients of it.

We usually click on what interests us most, so we keep getting more and more of the same topics. In effect, we miss any information outside our interest bubble and may be getting a one-sided story. But should the right to filter information rest in our hands, or in the hands of corporations, web companies and algorithms?
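The feedback loop described above — clicks shape the feed, which shapes future clicks — can be sketched in a few lines. This is a minimal illustration of the idea, not any platform’s actual ranking code; the story list, topics and click history are all invented for the example.

```python
from collections import Counter

def rank_feed(stories, click_history):
    """Rank stories by how often the user clicked each topic before.

    stories: list of (title, topic) tuples
    click_history: list of topic strings the user previously clicked
    """
    topic_counts = Counter(click_history)
    # Stories on frequently clicked topics rise to the top; unfamiliar
    # topics sink, so the feed narrows toward past interests over time --
    # the "filter bubble" effect.
    return sorted(stories, key=lambda s: topic_counts[s[1]], reverse=True)

feed = rank_feed(
    [("Election recap", "politics"),
     ("New phone review", "tech"),
     ("Match report", "sports")],
    click_history=["tech", "tech", "sports"],
)
# "tech" stories lead; "politics", never clicked, lands last.
```

Each pass through the loop reinforces the last one: the user mostly sees (and therefore clicks) familiar topics, so unclicked topics drift further out of view.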

Is excessive personalization a cause for concern? Yes, in a way. By collecting information about users, it functions as a Big Brother and creates user-privacy issues. In some countries, like China, it can lead to censorship, since a government-friendly algorithm decides which news users will read. Pariser calls Google and Facebook the new gatekeepers, and in a way they are. As two of the leading companies in the field, they have a responsibility to keep users safe from the dark side of personalization: biased information and big corporations’ financial interests.

We should have the right to control what gets in and what stays out, because we need to be able to encounter new ideas and perspectives. Can we, and should we, control these filters? The greatness of the internet is that it gives us access to far more information than in the past. Because we now get our information from many different sources, we can gather a diversity of perspectives and a more objective overall view of events. But how reliable is the information we get?

It is important to remember that the internet functions both as a feedback mechanism and as a marketing tool for companies. Our online consumer and personal profiles are constantly being targeted. One example is Hunch, a company with a mission to build a “taste graph” connecting every person on the web with their preferences for cameras, cars, clothing and so on. Its “taste algorithm” predicts whether users will like a certain item, mapping out every user through taste technology. Interestingly enough, in research conducted by 3digitlminds, 75% of Asian consumers turned out to prefer customized products and to support personalization.
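Hunch’s actual algorithm is proprietary, but the general idea behind a taste graph — predict whether a user will like an item from the opinions of users with similar tastes — can be sketched with simple nearest-neighbor reasoning. All names and ratings below are made up for illustration.

```python
def predict_liking(target, others, item):
    """Predict whether `target` will like `item` from similar users' ratings.

    Each user is a dict mapping item names to +1 (liked) or -1 (disliked).
    """
    def similarity(a, b):
        # Agreement rate on items both users have rated, in [-1, 1].
        shared = set(a) & set(b)
        if not shared:
            return 0.0
        return sum(a[i] * b[i] for i in shared) / len(shared)

    # Weight each other user's opinion of `item` by how similar their
    # overall taste is to the target's; a positive total predicts "like".
    score = sum(similarity(target, u) * u[item] for u in others if item in u)
    return score > 0

alice = {"camera_x": 1, "car_y": 1, "jacket_z": -1}
others = [
    {"camera_x": 1, "car_y": 1, "phone_q": 1},       # tastes match alice's
    {"camera_x": -1, "jacket_z": 1, "phone_q": -1},  # tastes oppose alice's
]
likes_phone = predict_liking(alice, others, "phone_q")
# Both neighbors push the prediction toward "like": the similar user
# liked the phone, and the dissimilar user disliked it.
```

This is the same logic that powers personalization more broadly: the more a service knows about what you and people like you have liked, the more confidently it can pre-filter everything else.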

It is true that companies need to become more transparent about their filtering processes and to bring more diversity into their search results; on the other hand, companies also have the right to keep their methods and algorithms as trade secrets. Many believe there needs to be some kind of balance, especially when dealing with issues of users’ privacy and trust.

There are advantages to filtering and personalization: random information is filtered based on personal factors, and we get more personalized and relevant results. It suits the fast pace of modern life; we want instant food, instant downloads and instant, customized results tailored to our tastes and preferences. We want to save time and effort by filtering out what we don’t want, but is that a bad thing?

The downside is that many feel the internet should be a free zone. Sometimes users’ searches relate to work or business, and sometimes to personal life. Filtering algorithms are editorial in nature, yet an algorithm has no way of knowing the nature of a search; it will usually take the first searches users run and determine their preferences from those alone. We have become so used to living in our bubble that, if we’re not careful, we will end up not wanting to step outside it.

Things are very easy today: when we want an answer to a question, we simply Google it. There is less uncertainty. And yet, when you search for something and receive different results based on filters and personalized algorithms, it seems there is no certainty after all.


1) NY Times: “Your Own Facts”

2) Web Pro News: “Should Google and Facebook Be Filtering Our Content For Us?”

3) Reve News: “The Bad and the Good of Filtering Information”

4) Berfrois: “How Personalization Changed Society”

5) Socialistic: “An Interview with Hunch’s Hugo Liu on Personalization Technology and the Taste Graph”
