Group 4.5

Group Members:

Anete E.

Emma G.

Jil M.

Jonathan M.

Lara R.

Week 1

“Blue Feed/Red Feed” (06.03.2019)

On May 18, 2016, Jon Keegan published an interactive feature in The Wall Street Journal called “Blue Feed, Red Feed”. Two feeds are displayed side by side on one page and updated hourly. The blue feed on the left presents posts from Facebook sources categorized as ‘very liberal’, whereas the red feed on the right displays ‘very conservative’ ones, covering topics such as “President Trump”, “Health Care”, “Guns”, “Abortion”, “Immigration”, “ISIS”, “Executive Order” and “Budget”. The project is based on a study called “Exposure to ideologically diverse news and opinion on Facebook”, conducted in 2015 by the researchers Eytan Bakshy, Solomon Messing, and Lada A. Adamic, in which they categorized numerous posts and analyzed users’ exposure to ideologically determined Facebook news.

The researchers tracked and analyzed the top 500 shared sources as well as the content of 10.1 million Facebook users’ feeds; these users had themselves indicated their political views on their profiles. From there, a political ‘alignment score’ was calculated for each article in order to determine its political leaning and the larger category it was placed in, ranging from ‘very liberal’ to ‘very conservative’.
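To make the scoring concrete, here is a minimal Python sketch of how such an alignment score could be computed. It follows the paper’s idea of averaging the self-reported ideology of the users who shared an article, mapped to a five-point scale from -2 (‘very liberal’) to +2 (‘very conservative’); the cut-offs for bucketing scores back into categories are our own illustrative choice, not the paper’s.

```python
from statistics import mean

# Map self-reported political views to a five-point scale (our assumption of
# the paper's coding: -2 = very liberal ... +2 = very conservative).
AFFILIATION_SCORE = {
    "very liberal": -2,
    "liberal": -1,
    "neutral": 0,
    "conservative": 1,
    "very conservative": 2,
}

def alignment_score(sharer_affiliations):
    """Average the self-reported ideology of the users who shared an article."""
    return mean(AFFILIATION_SCORE[a] for a in sharer_affiliations)

def category(score):
    """Bucket a continuous score into the five labels (cut-offs are illustrative)."""
    if score <= -1.5:
        return "very liberal"
    if score <= -0.5:
        return "liberal"
    if score < 0.5:
        return "neutral"
    if score < 1.5:
        return "conservative"
    return "very conservative"

shares = ["very conservative", "conservative", "very conservative", "neutral"]
score = alignment_score(shares)
print(score, "->", category(score))  # 1.25 -> conservative
```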

Hence, the data used for this project are these numerous private Facebook profiles as well as the findings of the 2015 research paper. In short, the project was conducted to expose how reality, in this case a user’s Facebook feed, can appear drastically different for different users depending on their political views, due to potential filter bubbles or echo chambers. Consequently, users may fail to keep an open mind, which is essential for substantive political debate.

The researchers used Facebook’s Graph API, the primary tool for programmatically extracting information from Facebook. This ensures that the content relevant to the research is pulled from Facebook automatically. However, the developer tools and the ‘alignment score’ that the researchers employed to determine which posts finally appear on the two feeds are not explained in detail and need to be researched further by the reader.
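As an illustration (not the researchers’ actual pipeline), pulling recent posts from a public page through the Graph API can look roughly like the following Python sketch; the API version, access token, and page name are placeholders, and real use requires a registered Facebook app and a valid token.

```python
import requests

GRAPH_URL = "https://graph.facebook.com/v3.2"  # API version is illustrative
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"             # placeholder: requires a Facebook app

def fetch_page_posts(page_id, limit=25):
    """Fetch the most recent public posts of a Facebook page."""
    resp = requests.get(
        f"{GRAPH_URL}/{page_id}/posts",
        params={
            "access_token": ACCESS_TOKEN,
            "limit": limit,
            # Request only the fields relevant to the analysis.
            "fields": "message,created_time,shares",
        },
    )
    resp.raise_for_status()
    data = resp.json()
    # 'paging.next' in the response would give the URL of the next batch.
    return data["data"]

posts = fetch_page_posts("SomeNewsOutlet")  # hypothetical page name
print(len(posts), "posts fetched")
```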

Despite the positive sides of this project, one can detect several limitations as well as potential improvements that could make it even more beneficial. For instance, the researchers required that sources have at least 100,000 followers and that included posts have been shared at least 100 times by Facebook users, which excludes many sources that might still influence users’ political opinions (see the sketch below for how this inclusion rule works). In addition, using Facebook users’ self-described political orientation as the basis for a source’s political value might lead to distortions and imprecision. Furthermore, the project excludes sources and websites that have been shared by users from across a broader political spectrum, including The Wall Street Journal itself, as well as social media platforms like Twitter and YouTube, which might lead to a rather extreme simulation of a Facebook feed. Finally, one could argue that the five categories from ‘very liberal’ to ‘very conservative’ lead to a limited and oversimplified classification, especially considering that in the final project these are narrowed down further to two feeds. A possible extension of the project would be to include Facebook users’ opinions in the form of surveys or opinion polls in order to examine whether they are aware that they scroll within a filter bubble.
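In code, the inclusion rule amounts to a simple threshold filter like this sketch (the records and field names are hypothetical, not the study’s):

```python
# Hypothetical source records; the field names are our own, not the study's.
sources = [
    {"name": "Outlet A", "followers": 2_500_000, "times_shared": 480},
    {"name": "Outlet B", "followers": 40_000, "times_shared": 950},   # too few followers
    {"name": "Outlet C", "followers": 300_000, "times_shared": 12},   # shared too rarely
]

# The study's stated rule: at least 100,000 followers and 100 shares.
included = [
    s for s in sources
    if s["followers"] >= 100_000 and s["times_shared"] >= 100
]
print([s["name"] for s in included])  # ['Outlet A']
```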

To conclude, the project demonstrates that indicating personal political views on a Facebook profile influences which posts and sources appear on one’s News Feed. The delivery of the project resulted in a successful side-by-side visualization of both feeds. Nevertheless, other aspects of the project and the research are questionable and/or could have been done differently. For instance, the technique of collecting and categorizing data appeared limiting: certain posts and sources that could still strongly influence one’s political opinion were excluded from the research because they did not fit within the established data collection framework. Additionally, the Graph API was used to collect the data from Facebook; however, the execution and application of the tool were not thoroughly explained to the reader, making it challenging to learn about the process of the project’s realization. Nevertheless, the project still succeeded at drawing attention to, and informing Facebook users about, the threat of echo chambers and filter bubbles.

References

Bakshy, Eytan, Solomon Messing, and Lada A. Adamic. 2015. “Exposure to Ideologically Diverse News and Opinion on Facebook.” Science 348 (6239): 1130–1132. https://education.biu.ac.il/files/education/shared/science-2015-bakshy-1130-2.pdf

Keegan, Jon. 2016. “Blue Feed, Red Feed.” The Wall Street Journal, May 18, 2016. http://graphics.wsj.com/blue-feed-red-feed/#methodology.

Week 2

Facebook tracking exposed: Investigating algorithms (13.03.2019)

What did we do?

For this week’s experiment, we created a fake Facebook profile under the name Diego de Jong: a middle-aged man with a Colombian mother and a Dutch father, who was raised in Bogotá and currently lives in Amsterdam. We created his persona with the ongoing migration crisis in Colombia in mind, in which thousands of people migrate from Venezuela to Colombia due to severe inflation and other crises within the country.

What did we intend to show?

Our goal was to have the Facebook algorithm that is responsible for recommended content and tailored advertisements pick up on the anti-feminist and conservative political views of our profile. We reinforced those views by ‘liking’ several conservative newspapers and the current president, who has a rather right-leaning political agenda, as well as by sharing and commenting on numerous articles from those sources.

How did we do it?

We conducted this experiment for seven days, during which we shared, liked and commented on multiple posts at least five times a day and kept a diary summarizing each day’s interactions. In addition, we tried to make the profile look as authentic and real as possible by continuously developing Diego’s character: posting pictures and updates, reacting to posts, engaging with articles and friends, and filling out every detail of his profile. At this point, Facebook (or rather the browser extension facebook.tracking.exposed that we used for our experiment) sent us the following notification:

This notification is interesting, as it admits right away that our personal information is tracked and used to create tailored content! Furthermore, for the sake of the experiment, we accepted every friend request sent to us and tried to engage with our new Facebook friends as much as possible, for instance by reacting to their posts and replying to a few of the messages we received.

What did we observe?

After seven days of active engagement on the site, we noticed some peculiarities in what was appearing on the profile’s News Feed. The top posts and sources that appeared were those we had engaged with before, as well as ones similar in content to previously ‘liked’ or ‘shared’ posts.

For instance, on the second and third day, we engaged with some right-wing sources and posts by ‘liking’ and/or ‘sharing’ them. Consequently, when scrolling through the feed on the fourth day, we encountered many sources that covered political topics from a right-leaning viewpoint. Additionally, since we indicated on the profile that our character is from Colombia and engaged with content covering the country’s news stories, almost all of the posts appearing on the News Feed were in Spanish and concerned the country’s internal politics. This accentuates how one’s country of origin plays an important role in generating a filter bubble, the algorithmically personalized selection of content on Facebook that makes it easier for users to engage with like-minded people. By comparison, our own News Feeds also differ from one another, even though we all live in the Netherlands, because of our diverse origins and the languages we speak.

Furthermore, in line with our character’s explicitly conservative and hostile views on the migration crisis in Colombia, we observed a gradually evolving pattern in the recommended articles on the Facebook News Feed: it often showed news stories about conflicts between people with a migration background and locals, even from news outlets that are normally considered politically neutral. However, most of the news stories on the feed covering this kind of topic were shared by the newspapers we liked, which take a more or less biased view on topics such as migration.

To sum up the experiment: we observed that the comments and reactions of other users to these types of posts often coincided with our character’s views and political opinions. We also saw that the news posts we had shared or liked were being fed back to us, which might indicate that a filter bubble had already built up within the short duration of this experiment. By the end of the experiment, we had accumulated 342 friends in total and were messaged by some of them in different languages.

Week 3

Facebook’s Filter Bubble Exposed

by Lara Rittmeier, Emma Gasparin, Anete Ezera, Jonathan Matalon, Jil Meyers

An overview of the fake Facebook profile we created

AMSTERDAM, March 19, 2019. In the contemporary digital landscape, ‘filter bubble’ and ‘echo chamber’ are well-known terms describing a scenario in which a user is enclosed in a digital ‘bubble’ with like-minded users and posts that echo their beliefs and values rather than challenge them. These terms are mostly associated with Facebook and other social media platforms or news outlets feeding the content you have been engaging with back to you. To examine the workings of a ‘filter bubble’ and unfold the influence of algorithms, we set up a fake Facebook profile under the name Diego de Jong. We indicated his origin, his political interests, his profession and other details about his personal life in order to observe if and how his News Feed would differ from our own, and to keep track of the kind of content that would be shown and prioritized over the course of the experiment. We conducted this project in close relation to our persona’s views on the ongoing migration crisis in Colombia, which is therefore also the imagined country of birth in the profile we created.

After one week, we collected the data and gathered our observations about the peculiarities we noticed in the content on the user’s News Feed. Posts with political viewpoints similar to those previously ‘liked’ or ‘shared’ kept appearing on the News Feed, alongside posts from one particular newspaper, El Nuevo Siglo. The user’s origin was also a noteworthy factor, determining both the geographic focus of the political news and the language used. We discovered further significant patterns after gathering and analyzing the data with Facebook Data Extractor, from which we created a visual representation:

A visual representation (circle packing) of the data

This visualization displays the distribution of content types on the News Feed of the fake profile. As it shows, posts (50.92%) make up the greatest portion of the content appearing on the News Feed, followed by photos with 27.20% and videos with 21.88%. No events were displayed on the News Feed, which makes sense, as we did not engage with any events during the experiment. Conversely, since we mostly engaged with posts, this corresponds to the high percentage of posts appearing on the News Feed.
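For readers who want to reproduce such a chart, a circle packing of these three shares can be drawn in a few lines of Python. This sketch assumes the third-party circlify package (for computing circle positions) together with matplotlib, and uses the percentages reported above.

```python
import circlify                    # pip install circlify
import matplotlib.pyplot as plt

# Content-type shares observed on the fake profile's News Feed.
data = [
    {"id": "posts", "datum": 50.92},
    {"id": "photos", "datum": 27.20},
    {"id": "videos", "datum": 21.88},
]

# Compute packed circle positions (x, y) and radii r inside the unit circle.
circles = circlify.circlify(data, show_enclosure=False)

fig, ax = plt.subplots(figsize=(6, 6))
ax.set_aspect("equal")
ax.axis("off")
for circle in circles:
    ax.add_patch(plt.Circle((circle.x, circle.y), circle.r, alpha=0.4))
    ax.annotate(
        f'{circle.ex["id"]}\n{circle.ex["datum"]}%',
        (circle.x, circle.y), ha="center", va="center",
    )
ax.set_xlim(-1.05, 1.05)
ax.set_ylim(-1.05, 1.05)
plt.show()
```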

Examples of posts on Diego’s News Feed

To continue, most of the posts we engaged with were from one particular news outlet, El Nuevo Siglo, which correlates with Diego’s strong anti-immigration stance. The outlet is known for spreading rather conservative political news, which is why we chose it as the main news outlet for our persona to engage with on a regular basis. With that in mind, we noticed that other recommended sources and posts appearing on the News Feed had a very similar narrative. When browsing the News Feed, numerous posts are by El Nuevo Siglo, which was to be expected given our constant engagement with the source. Moreover, we engaged with quite a few posts about the Colombian football club Millonarios F.C. as well as the Dutch club AFC Ajax, which the algorithm picked up on, subsequently continuing to display stories about football on the Feed. It is worth mentioning that we explicitly stated Diego’s passion for football and the Millonarios in the ‘Intro’ section of his profile.

All in all, we argue that a vast number of the posts appearing on the Feed are closely related to the information we fed the algorithm. In other words, Facebook provided mostly stories about political incidents in Colombia, the two above-mentioned football clubs, and miscellaneous posts about either Diego’s home country or the Netherlands, which paints a more or less accurate picture of the persona we created. However, we also observe that the data and the general outcome of this project are not clear-cut enough to draw definitive conclusions, though that might change as the experiment develops further.