‘Mark Hooker’: Our Fake Facebook Personality to Tackle Filter Bubbles

Algorithms are the basis of all modern computing. Without them, Web 2.0 as we know it would cease to exist: they tell a computer how to execute a given task and work step by step towards a desired goal. Yet, for all their revolutionary potential, algorithms have evolved into something more corrupting than emancipatory. One of the best-known examples of this targeted use of algorithms is Facebook. Facebook’s news feed is designed to display posts that match a user’s likes and interests, in order to trigger a positive reaction. If I am a liberal, for example, the algorithm is unlikely to show me posts from conservative news sites. This creates ‘filter bubbles’ or ‘echo chambers’: safe havens of information catered to individual groups of users.
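
To make this dynamic concrete, here is a deliberately simplified toy model in Python. It is not Facebook’s actual ranking code, whose details are proprietary; it only shows how scoring posts by overlap with a user’s existing interests, and keeping just the top results, means cross-cutting sources never surface.

```python
from dataclasses import dataclass

@dataclass
class Post:
    source: str
    topics: set[str]

def score(post: Post, user_interests: set[str]) -> float:
    # Engagement proxy: fraction of the post's topics the user already likes.
    if not post.topics:
        return 0.0
    return len(post.topics & user_interests) / len(post.topics)

def build_feed(posts: list[Post], user_interests: set[str], k: int) -> list[Post]:
    # Keep only the k highest-scoring posts; low-affinity posts never surface.
    return sorted(posts, key=lambda p: score(p, user_interests), reverse=True)[:k]

user_interests = {"liberal politics", "climate", "tech"}
posts = [
    Post("LiberalDaily", {"liberal politics", "climate"}),
    Post("TechWire", {"tech"}),
    Post("ConservativeHerald", {"conservative politics", "religion"}),
]
for p in build_feed(posts, user_interests, k=2):
    print(p.source)  # ConservativeHerald scores lowest and is pushed out of the short feed
```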

To investigate these problems, Mark Hooker was born: a dog-loving American man who enjoys American football, going to church and technology. We created this fake persona in order to analyse the Facebook algorithm from the other side of the political spectrum, making him like things that differ from our own personal viewpoints. For that reason, we made him a flat-earth believer, an anti-abortion advocate and an overtly religious Trump supporter. We worked on his profile for more than a week, constantly sharing and liking new content in line with his beliefs and values. To do this, we used the FbTrex browser extension (facebook.tracking.exposed), which collected all the public data from Mark’s news feed and stored it on its server. Thanks to this extension, we were able to keep track of all the sponsored posts Mark was receiving and to compile statistics that served as evidence for the presence of a ‘filter bubble.’
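
As a rough illustration of the kind of tallying this enables, the sketch below assumes the collected timeline data has been exported to a CSV file; the filename and the ‘source’ and ‘sponsored’ columns are hypothetical stand-ins, not FbTrex’s actual schema.

```python
import csv
from collections import Counter

# Count sponsored posts per source from a (hypothetical) export of the collected data.
sponsored_by_source: Counter[str] = Counter()

with open("mark_hooker_timeline.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row.get("sponsored", "").lower() == "true":
            sponsored_by_source[row["source"]] += 1

for source, count in sponsored_by_source.most_common():
    print(f"{source}: {count} sponsored posts")
```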

These personalised information bubbles damage a user’s ability to critically evaluate information, which is a major threat to the political sphere, because people generally do not seek out information they feel they have already obtained. If users receive their daily news from Facebook, which shows them only information catered to their viewpoint, they will not go looking for it elsewhere, leading to gaps in what they know.

Posts that appeared on Mark Hooker’s Facebook timeline, sorted by subject

Posts that appeared on Mark Hooker’s Facebook timeline, sorted by form

There was strong evidence of a filter bubble on Mark’s profile. Keeping in mind all the personality traits, values and political opinions we assigned to him, we found that the largest share of posts on his timeline was about dogs: mostly pictures and some articles, 194 in total. This fits his interests, as he himself owns a St. Bernard rescue dog and shows a strong love for man’s best friend. The second most prominent category was pro-Trump posts, of which there were 150, with articles being the most frequent format. Mark had liked news pages covering Trump and his policies, such as Breitbart and Politico, so it makes sense that these posts would rank high. The third category was news about technology and inventions, mostly from Wired. We conceived Mark as a construction engineer, so it was only fitting that he would receive related posts in his news feed. The rest of the content mostly concerned Christianity, sports and anti-abortion activism, showing low variety overall. As predicted, Mark did not receive any liberal news or other posts that contrasted with his own worldview.
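
For illustration, a crude keyword-tagging script like the one below can reproduce this kind of subject breakdown; the category keywords, field layout and example posts are our own illustrative choices, not part of the collected data.

```python
from collections import Counter

# Hypothetical keyword lists used to tag each post with a subject category.
CATEGORIES = {
    "dogs": ["dog", "puppy", "st. bernard", "rescue"],
    "pro-Trump": ["trump", "maga", "breitbart"],
    "technology": ["tech", "invention", "wired", "engineer"],
    "christianity": ["church", "bible", "faith"],
    "anti-abortion": ["abortion", "pro-life"],
    "sports": ["football", "nfl"],
}

def categorise(text: str) -> str:
    text = text.lower()
    for category, keywords in CATEGORIES.items():
        if any(kw in text for kw in keywords):
            return category
    return "other"

def subject_counts(posts: list[str]) -> Counter:
    return Counter(categorise(p) for p in posts)

print(subject_counts([
    "Our St. Bernard rescue found a new home!",
    "Breitbart: Trump rally draws record crowd",
    "Wired: the invention changing construction engineering",
]))
# Counter({'dogs': 1, 'pro-Trump': 1, 'technology': 1})
```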

Posts that appeared on Mark Hooker’s Facebook timeline, sorted by source

However, Facebook’s algorithm also seems smart enough to see through the fake persona. From friend suggestions that included the creators themselves to posts on Mark’s timeline that matched the creators’ own interests, Facebook appears to collect enough signals to identify whether a profile is real or fake.

In conclusion, a lot of valuable information can be drawn from this experiment. We can now say with confidence that we are indeed constrained within our own filter bubbles. These bubbles have a strong outer layer, comparable to a cell membrane: this ‘information membrane’ lets almost nothing in or out. To leave, we have to push ourselves through it deliberately and actively look for information that counters our own viewpoints. Only in this way can we put information to the test and separate fake news from reality.

Written by: Amber Kouwen (11674105), Nanda Mohamed (11845910), Lucia Holaskova (11742321), Ivana Sramkova (11826711), Desislava Slavova (11832517) and Aidan Fahle (11788178)

References
facebook.tracking.exposed. n.d. ‘Facebook.Tracking.Exposed’. Accessed 16 March 2019. https://facebook.tracking.exposed.
