Technology is our friend: I still can't see the "Filter Bubble"
Back To Normal

August 16, 2014


Good job, Internet! A few years ago, everyone was worried about the "Filter Bubble" imposed on us mainly by Facebook and Google. Does anyone remember? Eli Pariser got famous with a book, a TED talk and all that, and the German Wikipedia cites Google and Facebook's newsfeed as the main examples of how algorithms bias your view of the world.

I was always skeptical about that, since I believe people should know how to google stuff just like they know how to treat an article in a newspaper - so that's a matter of education. And since my Facebook newsfeed consists not only of media, but also of people I befriended for a variety of reasons, I always believed that "people making me aware of stuff that I would not have searched for" was one of Facebook's greatest qualities.
During the recent Israel-Gaza war, this was exactly what happened to my timeline. The tweet below shocked me from a moral standpoint (which is why I kept the screenshot), but it is a perfect illustration:

(And by the way, I don't follow Lexi Alexander - I stumbled upon this tweet while engaging in a Facebook comment discussion on Israel-Gaza, and I don't even remember whether the post that sparked the conversation was pro- or anti-Israel/Palestine.)

Since the filter-bubble book came out, Facebook has worked a lot on EdgeRank, now simply called the newsfeed algorithm, and Google has played around with integrating Google+ contacts and content into newsfeeds - but honestly, I can't see the filter bubble effect happening in my internet. Maybe that's a freak accident that does not happen in your internet, but I doubt it. I think the filter bubble concept has become obsolete, and here are the reasons why:

1) Choice of sources
Just like at the newsstand or in my TV viewing behaviour, I get to choose my sources. There is a filter bubble when I constantly watch Fox News. Or when I constantly watch Russell Brand's Trews. Or when I only read The Guardian or only read the Daily Mail. The same applies to a Facebook newsfeed. When I only like (and comment, and share, and click) news from that one source, I will of course see more of it. And less of the ones I don't. But Facebook consists of more than just media sources: I have friends. People who share my views on football, others who share my views on politics, others who share my views on business, and so on. That does not keep those I discuss football with from posting a link about politics, and that makes my newsfeed rich. And somewhat unbiased. And when I google "Gaza", I am able to differentiate between the Jerusalem Post and Haaretz. Again, that's an education problem, not a technology problem.
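The engagement loop described above (like, comment, share, click → see more of that source) can be sketched as a toy ranking function. This is a minimal sketch with made-up names and weights; the real newsfeed algorithm is far more complex and not public:

```python
from dataclasses import dataclass

# Hypothetical weights for a toy engagement-driven feed.
ENGAGEMENT_WEIGHTS = {"click": 1.0, "like": 2.0, "comment": 3.0, "share": 4.0}

@dataclass
class Post:
    source: str
    text: str

def affinity(history: list[tuple[str, str]], source: str) -> float:
    """Sum of weighted past interactions with one source."""
    return sum(ENGAGEMENT_WEIGHTS[action]
               for src, action in history if src == source)

def rank_feed(posts: list[Post], history: list[tuple[str, str]]) -> list[Post]:
    """Sources you engage with float to the top - the self-made bubble."""
    return sorted(posts, key=lambda p: affinity(history, p.source), reverse=True)

# Liking and sharing one outlet pushes it above one you merely clicked once.
history = [("FoxNews", "like"), ("FoxNews", "share"), ("Guardian", "click")]
feed = [Post("Guardian", "Gaza analysis"), Post("FoxNews", "Gaza opinion")]
print([p.source for p in rank_feed(feed, history)])  # ['FoxNews', 'Guardian']
```

The point of the sketch is that the "filter" is fed entirely by the user's own choices - diverse engagement yields a diverse feed.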

2) The unfiltered newsfeeds
The concept of personalization is nearly as old as the internet, but many sites are simply not able to, or do not want to, tailor the experience to my very own needs and behaviour. When friends and I visit the sites of Spiegel, BILD or Focus, to name three big German news portals, we see exactly the same thing. Even my favourite sports destinations can't filter the news the way I would want from a filter bubble: with my favourite team's most insignificant news story on top. Because it is more significant to me than a 30-million-Euro transfer within the Spanish league. On top of that, networks like Instagram or Twitter do not use filters at all. I see what is posted, sorted by time (plus sponsored tweets from time to time), and the only additional mechanism is retweets (apart from Twitter's nerve-wracking notifications). Here, the choice of sources is basically the only determining factor. So if you want a filter bubble, you may get one. But if you don't, you won't.

3) Related-articles algorithms
There are not only algorithms that choose what is displayed to us, but also algorithms that propose stuff we may not be explicitly looking for. I think we have seen remarkable progress in this area. In the early days, the "relation" was always defined as "more of the same": you look at a printer on Amazon, you get a recommendation for more printers. But increasingly the algorithms work in a way that recommends matching toner, or paper, or simply well-fitting stuff that's just not another printer. Especially right after you bought one. And the same goes for journalism: do any of the older readers remember how, in print magazines, two editors took different stands on one and the same topic? Pro-and-contra pages? This is what "related articles" recommendations can achieve. The precondition is that you have enough articles to calculate relations on. Like Facebook, for example. When I captured the screenshot below, I knew that I could finally relax about the filter bubble. This happens all the time: some friends on Facebook raise an issue, mostly referring to strange blogs and "alternative news" - for example, whether my toothpaste will kill me or whether vaccinating children is an idiotic idea. Or, in this case, whether sunscreen causes cancer (which, a few days before a vacation in Turkey and Greece, suddenly was quite relevant to me). And either algorithms or simply Google will help me make up my mind. See this beautiful example from Facebook (which made me write this whole post anyway):

The upper article, the original post, was shared to my newsfeed by a friend (whom I know, so I have an opinion of him/her), refers to science and comes from "real farmacy com". It promotes the view that sunscreen does not help against skin cancer at all and in some cases is even counterproductive. The related-link recommendation comes from "I fucking love science" and directly answers the article above (showing that it is complete bullshit). I trust "I fucking love science" more. I have a relation to it, I have been following the Facebook account for years, and the article convinced me to keep on using sunscreen. Problem solved.

Of course, the job of understanding the relation between a content source and the actual content still has to be done by the recipient. It simply makes a difference whether the state department of health or my neighbour issues a warning about something. Everyone can still choose whom to believe. But as in this case, I can easily get both sides of the story. That's basic communication skills, and I sometimes think that everyone who complains about the filter bubble has the strange expectation that technology should solve that problem, too. It won't. Technology is our friend, as long as we make an effort to understand and apply it. It's as simple as that.