Blue Feed, Red Feed: Challenging Political Filter Bubbles

Many US citizens were shocked when Donald Trump was elected as the 45th President of the United States in 2016. Part of that surprise can be traced to their own Facebook news feeds, which showed them only negative posts about Trump and positive posts about Hillary Clinton. These Facebook users were living in politically divided filter bubbles, leaving no room to acknowledge information that opposed their personal beliefs. This not only makes the American nation even more politically polarised, but is also harmful for democracy. In response, the Wall Street Journal created the application Blue Feed, Red Feed, which juxtaposes liberal and conservative posts, thus challenging the damaging political filter bubble.

Blue Feed, Red Feed was developed in 2016 by Jon Keegan, and shows a blue (very liberal) and a red (very conservative) feed side by side. By selecting different topics, users can discover two distinct political views on the same subject. Sources are classified as liberal or conservative based on a 2015 study by Facebook scientists Eytan Bakshy, Solomon Messing and Lada Adamic, published in the journal Science.

Example of a Blue Feed, Red Feed timeline

The way the scientists classified articles as liberal or conservative is fairly straightforward. First, a data set of 226,310 “hard news” stories was created by classifying news stories and opinion pieces from 81 of the most-shared news sites on Facebook, using certain English keywords as indicators of what should count as “hard news.” Second, each news piece in this data set was given an alignment score, based on the average of the self-described political affiliations of the people who shared it. For example, if more than 50% of the Facebook users who shared a particular news piece identified as conservative, the piece was considered conservative in this framework. Based on this alignment score, each piece was then assigned to one of five categories: very liberal, liberal, neutral, conservative and very conservative. To appear in the actual blue/red feed itself, an article needed at least 100 shares and had to come from a source with at least 100,000 followers.
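The classification pipeline described above can be sketched in a few lines of code. This is an illustrative reconstruction, not the researchers' actual implementation: the affiliation coding (−1 for very liberal to +1 for very conservative) and the five-category cutoffs are assumptions made for the example, since the post does not give the exact thresholds Bakshy et al. used.

```python
# Hypothetical sketch of the alignment-score classification described above.
# Affiliation coding and category cutoffs are illustrative assumptions.

def alignment_score(sharer_affiliations):
    """Average self-described affiliation of a story's sharers,
    coded from -1.0 (very liberal) to +1.0 (very conservative)."""
    return sum(sharer_affiliations) / len(sharer_affiliations)

def classify(score):
    """Bucket an alignment score into one of the five categories.
    The cutoff values here are assumed for illustration."""
    if score < -0.6:
        return "very liberal"
    if score < -0.2:
        return "liberal"
    if score <= 0.2:
        return "neutral"
    if score <= 0.6:
        return "conservative"
    return "very conservative"

def eligible_for_feed(shares, source_followers):
    """Feed-inclusion filter stated in the post: at least 100 shares,
    and a source with at least 100,000 followers."""
    return shares >= 100 and source_followers >= 100_000
```

For instance, a story shared mostly by self-described conservatives yields a positive score and lands in a conservative bucket, while a story with an even mix of sharers scores near zero and is treated as neutral.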

By collecting, filtering and analyzing the content shared by 10.1 million Facebook users, the researchers found that, depending on their political preferences, people receive drastically different information about the same topics. Facebook users are separated into distinct filter bubbles depending on whether they lean conservative or liberal. This is why Blue Feed, Red Feed is so significant: it provides a relevant, easy-to-use tool that displays in real time how opposing news sources depict the same topics, such as guns, abortion and immigration. With this tool, people can escape their filter bubbles and gain a better understanding of opposing viewpoints that would otherwise have remained invisible.

One potential, though controversial, interpretation of Blue Feed, Red Feed concerns the manipulation of people through Facebook, which has a strong and undeniable power to create filter bubbles. A central element of Facebook's business model is the creation of a personalised online “habitat” in which one can calmly dwell, seeing only posts from like-minded friends. This proves problematic. When it comes to voting, for example, Facebook enables campaigns to connect individual behaviour to voter files and then target ads at those voters. Many people today get their news mainly from Facebook, so they can never be fully free from this bias, never really reaching an “objective” outlook on the political spectrum. Looked at from the perspective of Facebook as a business, this only amplifies the concern that Facebook may be using our data to manipulate us: its ultimate goal is to keep users on the website as long as possible and to foster a “safe” environment in which people are not triggered by the posts of their extremist friends.

In conclusion, Blue Feed, Red Feed illustrates the disparity in political news on Facebook by providing a side-by-side look at liberal and conservative perspectives on similar topics. Even though the Internet is rapidly changing and the underlying data is constantly shifting, Blue Feed, Red Feed is still updated every hour. Moreover, each source used for the project was carefully examined and classified into one of five categories. Overall, Blue Feed, Red Feed bursts Facebook's filter bubble by providing a unique lens on how current news stories are framed on Facebook to support a specific political agenda.

Written by: Amber Kouwen (11674105), Nanda Mohamed (11845910), Lucia Holaskova (11742321), Ivana Sramkova (11826711), Desislava Slavova (11832517) and Aidan Fahle (11788178)

Bakshy, Eytan, Solomon Messing, and Lada Adamic. 2015a. ‘Replication Data for: Exposure to Ideologically Diverse News and Opinion on Facebook’.
Bakshy, Eytan, Solomon Messing, and Lada A. Adamic. 2015b. ‘Exposure to Ideologically Diverse News and Opinion on Facebook’. Science 348 (6239): 1130–32.
Bilton, Ricardo. 2016. ‘The Wall Street Journal’s New Tool Gives a Side-by-Side Look at the Facebook Political News Filter Bubble’. Nieman Lab (blog). 18 May 2016.
Graff, Ryan. 2016. ‘How WSJ Used Data and Design to Show Americans Their Polarized Politics and Media’. Northwestern University Knight Lab. 21 June 2016.
Keegan, Jon. 2016. ‘Blue Feed, Red Feed’. WSJ. 18 May 2016.
