Group 3.1


Week 3: Chelsea Bergerman

Journalism can be understood as the construction and distribution of stories about current affairs, telling the media consumer what is going on in the world. The shift from consuming news through traditional media outlets to social media platforms has produced new ways of constructing and distributing news items. The generation and circulation of data online opens a new perspective on journalism in the digital age, as raw data can be turned into information and ultimately tell a story. Data journalism uses new ways of gathering data, processing it with tools, methods and algorithms, saving the results in databases, and visualizing them so that media consumers can recognize and connect with the data. The processed data, whether presented as numbers or graphs, can then be used to create a narrative.
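As a rough illustration of this pipeline (gather, process, store in a database, visualize), the sketch below works through a tiny made-up dataset; the records and field names are our own invention and are not tied to any particular newsroom tool.

```python
import json
import sqlite3
from collections import Counter

# Hypothetical raw data, as it might arrive from a scrape or an API.
raw = '[{"topic": "elections"}, {"topic": "elections"}, {"topic": "sports"}]'

# 1. Gather: parse the raw data into records.
records = json.loads(raw)

# 2. Store: save the records in a small database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE posts (topic TEXT)")
db.executemany("INSERT INTO posts VALUES (?)", [(r["topic"],) for r in records])

# 3. Process: count how often each topic appears.
counts = Counter(topic for (topic,) in db.execute("SELECT topic FROM posts"))

# 4. Visualize: here, a plain-text bar chart a reader could scan.
for topic, n in counts.most_common():
    print(f"{topic:10} {'#' * n}")
```

In a real project the final step would typically be a chart rather than text output, but the shape of the workflow is the same.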

Building on the idea that data can tell a story, we looked into Facebook's data gathering and algorithm with Tracking.Exposed (fbtrex), using a fake persona. fbtrex systematically records what Facebook shows a user, collecting only publicly available posts: group posts, posts by media pages, and posts users have made visible to everyone. The previous blogpost introduced the phenomenon of the Filter Bubble: an internet bubble shaped by what you like, what you do, and who you are friends with, continuously created and refined. Facebook's News Feed algorithm promotes the circulation of content that users find most interesting and relevant, meaning that the gathered personal information is fed into algorithmic processing to build a feed tailored to the user. Our investigation of how a Filter Bubble is created began with a blank page: the fake persona, a 25-year-old sales associate from Haarlem and a right-leaning conservative.

The creation of the persona started with interacting with posts related to her interests, such as Thierry Baudet and beauty pages like Glamour, and liking suggested posts, which initiated the rise of the bubble. The feed started leaning towards right-wing politics, including posts that were not necessarily related geographically but shared the same political ideology. In this sense, the filter bubble revealed that when a user shows interest in a certain political view, the algorithm will also surface posts from other countries that match the same ideology.

When analyzing the data that fbtrex had gathered, we calculated the percentage of each post format collected on the feed. Table 1 visualizes this in a pie chart and shows that the most common format was the standard post, followed by the group post. The group posts all came from the group “Forum voor Democratie”, which the persona joined to establish her political interests. The standard posts are more diverse: they show which authors and profiles are directly connected to these posts, and how often the algorithm chose to present each one to the user. Table 2, also a pie chart, shows the percentage of authors behind these standard posts. The main authors were beauty pages and, more remarkably, pages such as “Zwarte piet moet blijven” and Donald Trump. These two pages may have been picked up by the algorithm because they lean towards the same political beliefs, or because they are followed by the same users. This suggests that the algorithm filters and recommends posts sharing the same views, confirming that a filter bubble had formed. Using a fake persona does raise the caveat that the profile does not behave like a real personal profile: its activity may be very low and not particularly specific, giving Facebook a very wide spectrum of posts to present to the user.
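The percentage calculations behind the two pie charts can be sketched as follows. The sample records and field names below are our own illustration of the kind of metadata fbtrex collects, not the actual export format.

```python
from collections import Counter

# Hypothetical sample of collected post records (illustrative only).
posts = [
    {"format": "post", "author": "Glamour"},
    {"format": "post", "author": "Donald J. Trump"},
    {"format": "groupPost", "author": "Forum voor Democratie"},
    {"format": "post", "author": "Zwarte piet moet blijven"},
    {"format": "post", "author": "Glamour"},
]

def percentages(records, key):
    """Return the share of each distinct value of `key` as a percentage."""
    counts = Counter(r[key] for r in records)
    total = sum(counts.values())
    return {value: 100 * n / total for value, n in counts.items()}

print(percentages(posts, "format"))  # share of each post format (Table 1)
print(percentages(posts, "author"))  # share of each author (Table 2)
```

The same breakdown works for any field in the records, which is why both tables could be produced from one collected dataset.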

[Screenshots, 20 March 2019: the Table 1 and Table 2 pie charts]

This critical look into Facebook's data gathering, the analysis of how a user's activity shapes the recommendation feed, and the translation of data into visual representations is an example of how data journalism works in the modern world of journalism. Probing the inner mechanisms of Facebook reveals that the diversity of information and variety of ideologies are all bound up with the filter bubble, with the result that the fake profile becomes just another participant in Facebook's recommendation system, enabling the system to continue its machinations.







Week 2: 

Problematisation of this Journalistic Research

        Since the birth of the internet, users have been creating content online for a multitude of reasons, perhaps foremost personal expression. When Web 2.0 arrived and social platforms let us produce, share and rate our own content in a common space, the possibilities for the market, and later for politics, became evident. Social media outlets are quickly pushing aside traditional media as the primary way for a growing business or a politician running for office to reach an audience. Most of these platforms are free to use and generate profit through models other than subscriptions. As users we generate an incredible amount of content each day just by being active on one of these platforms; this data is of enormous value to parties looking to target certain audiences efficiently, which is where platforms like Facebook make their money. The questions then arise: can this dynamic lead to abuse, who are the victims, how can we track it, and once we have found it, how do we stop it?

Discussion of the Tool

        For our look into Facebook and what goes on behind the scenes of its algorithm, we turned to Tracking.Exposed, specifically Facebook.Tracking.Exposed. The web browser add-on tracks and collects the publicly visible posts on your Facebook feed; this data is then aggregated on a website along with the data of other users. In an attempt to increase transparency between users and the algorithm, the project provides data to researchers to help them better comprehend the inner mechanisms of online filtering, and to find ways to circumvent at least some of the harmful social effects of personalisation algorithms. The add-on aims to tackle, specifically, the phenomenon of ‘Filter Bubbles’. Due to Facebook's aggressive filtering and recommendation system, people often find themselves stuck in online echo chambers, seeing only what they already know, believe and agree with, which affects the individual's autonomy to make a well-informed choice. This phenomenon is what we hoped to test through our fake-account experiment.

Our Fake Account and Angle of Inquiry

        For our online personality, we created “Chelsea Bergerman”, a 25-year-old sales associate from Haarlem and a right-leaning conservative. With the upcoming elections, we decided to make Chelsea a vocal supporter of Thierry Baudet, the leader of the party ‘Forum for Democracy’ (Forum voor Democratie). We began building this personality with the general information: location, age, and ethnicity (through a stock photo of a caucasian woman). We then gave her interests, liking pages on matters we thought she would care about: the FvD, fashion magazines, and Thierry Baudet. Soon enough we saw the filter bubble arise. She saw posts from the topics we had signed up for, but was then recommended posts from the official page of Donald J. Trump. We were recommended articles that, while not from her country of residence, matched her political leanings perfectly. Meanwhile, on our personal Facebook feeds, not a single one of us was recommended anything from Trump himself or anyone in his cabinet; instead we saw posts from news sources critical of him.

        Additionally, an interesting recommendation on Chelsea's feed was a post about the death of actor Luke Perry; what was surprising, however, was the page that made the post: “Zwarte Piet moet blijven”, or “Black Pete must stay”. The topic of Zwarte Piet in the Netherlands is a controversial one. Stemming from the Sinterklaas tradition of people dressing up as the saint's helper, Black Pete the chimney sweep, it has long been called racist due to the costume's heavy history of blackface. Despite a few outliers, liberals have generally declared it racist and called for the tradition to be abolished, while conservatives argue that, despite it looking exactly like blackface, it is not. For Facebook to look simply at Chelsea's likes and shares regarding political positions and then recommend her a page with a definitive stance on this racially charged topic reflects how debilitating the filter bubble can be to the awareness and autonomy of its users.


        In recent elections Facebook has been accused of mishandling the visibility, removal and promotion of political posts. On top of that, more and more news stories claim that fake-news operations are actively producing and spreading disinformation to mislead voters and disrupt democratic elections. As Facebook's user interface only presents us with the stage on which we operate as users, we often forget the gears turning behind the curtain: the algorithms performing their magic. With this research we aim to gain a deeper understanding of the inner workings of these algorithms, so that it can give us a toolbox to better survive in a landscape where our online presence is used with only our ambiguous consent. The difficulty with our research lies in creating a new persona to feed the machine, thereby enabling the system to continue its machinations. If the aim of this research is to stop the spread of fake news, our road to that discovery is paved with enabling behaviour.

By Ashna, Eva, Jip, Rosa, and Ferdi