Most people view their Facebook feed as a window to the world, one that paints a bigger picture of what is happening around them. A “bigger” picture, however, does not mean a full one. Many users do not realize how limited their set of information sources is: filter bubbles mean that the content a user receives relates only to their own interests or those of their friends, while echo chambers, the reinforcement and repetition of the same views and beliefs, further shut out other ideas. The problem is compounded by the numerous advertisers trying to influence and change public perception.
To analyze this problem closely, our team compared two completely different Facebook profiles using FbTrex. One profile belongs to a real person and is in regular use; the second is a fake profile created specifically for this research. The two profiles differ in interests, beliefs and political views. With this research we want to outline the effects of filter bubbles, echo chambers and advertisers, and to raise awareness of the biased realities created on Facebook feeds.
FbTrex was created to counter the personalization algorithms found on many social networking sites, including Facebook, by highlighting the faults and problems of automated decision making on these platforms. The project also explores the benefits of publicly owned datasets, collects forensically valid evidence that users can rely on to exert their rights, and educates people about algorithmic influence. Furthermore, FbTrex wants to expose how Facebook affects people’s interpretation of information, for example through targeted political advertising, misinformation, echo chambers and algorithmic censorship. FbTrex enables citizens, authorities and communities to hold Facebook accountable for its decisions. This matters most to ordinary users, who are the most affected because they have minimal freedom and power on the platform. FbTrex wants all users to be able to decide how Facebook shapes their informative experience.
FbTrex works by having users share some of the data found on their Facebook account: not their personal information, but what Facebook generates for them. With the user’s consent, FbTrex first creates a copy of the user’s timeline and then reuses the collected data for analytics, comparisons and statistics. This may concern some people, since timeline data can feel sensitive. However, FbTrex imposes limitations on its own data collection: it observes only timelines, not individual profiles or pages, and it stores only public posts on its server. Users who install the extension retain full control over their data and can delete what they submit whenever they want. Lastly, no one has access to an individual’s data unless the owner grants it. With all of this in mind, FbTrex hopes to raise criticism of Facebook’s current data-exploitation model and to empower more people in this age of information.
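The data-handling rules described above can be sketched in code. This is a minimal illustrative model, not the real FbTrex implementation: all class and method names here are our own assumptions, chosen only to make the three self-imposed limitations (public posts only, owner-controlled deletion, owner-granted access) concrete.

```python
from dataclasses import dataclass

# Hypothetical sketch of FbTrex-style data-handling rules.
# Names are illustrative assumptions, not the real FbTrex API.

@dataclass
class TimelinePost:
    post_id: str
    author: str
    text: str
    is_public: bool

class TimelineStore:
    """Stores only public posts; the submitting user controls access and deletion."""

    def __init__(self, owner: str):
        self.owner = owner
        self._posts: dict[str, TimelinePost] = {}
        self._granted: set[str] = {owner}  # only the owner has access by default

    def submit(self, post: TimelinePost) -> bool:
        # Self-imposed limitation: non-public posts are never stored.
        if not post.is_public:
            return False
        self._posts[post.post_id] = post
        return True

    def delete(self, requester: str, post_id: str) -> bool:
        # Users can delete what they submitted whenever they want.
        if requester != self.owner:
            return False
        return self._posts.pop(post_id, None) is not None

    def grant_access(self, other: str) -> None:
        # The owner may share their data with a researcher or community.
        self._granted.add(other)

    def read(self, requester: str) -> list[TimelinePost]:
        # No one sees the data unless the owner has granted access.
        if requester not in self._granted:
            raise PermissionError("access not granted by owner")
        return list(self._posts.values())
```

In this sketch, a private post is silently rejected at submission time, mirroring FbTrex’s policy of never storing non-public content in the first place rather than filtering it later.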
Our fake account was “Mark Hooker”, a 32-year-old male born in Massachusetts, USA, and currently living in the Netherlands. Based on predetermined characteristics, Mark follows and engages with Facebook content connected to NFL football, Wired magazine, anti-abortion and religious publications, and Trump-positive content.
On Thursday, March 7th, 2019, Mark’s public Facebook profile was created. Over the course of one week, his online persona was strengthened daily by liking and sharing content matching his aforementioned likes and dislikes. On Thursday, most news stories recorded on Mark’s timeline came from the right-wing sources Politico and Breitbart. On Friday, Mark’s feed covered a very limited number of topics, revolving around abortion and Trump. On Saturday, the majority of posts on Mark’s feed concerned anti-abortion content and dog-related videos. To nudge Facebook’s algorithms, Mark started liking, sharing and commenting on more diverse content, such as political memes and Trump-supportive articles. On Sunday, which is church day for Mark, his news feed was notable for the variety of suggested religious (Catholic) groups. This was striking, because it showed that even a four-day-old account had already provided enough data for Facebook’s algorithms to build a news feed without a single article that even slightly opposed Mark’s views. To shift the account’s profile again, on Sunday Mark also focused on tech-related articles. As a result, on Monday Wired articles were prioritized on Mark’s feed instead of anti-abortion and political content. Additionally, Mark started receiving targeted ads on Monday. However, Facebook apparently saw past the VPN, because the ads matched the interests of the students who were using Mark’s account that day rather than what Mark himself likes.
In conclusion, by using FbTrex we were able to recognise the filter bubble each Facebook account lives in. Our two accounts had two different personas, and Facebook presented posts in each feed precisely tailored to their distinct interests.
Written by: Amber Kouwen (11674105), Nanda Mohamed (11845910), Lucia Holaskova (11742321), Ivana Sramkova (11826711), Desislava Slavova (11832517) and Aidan Fahle (11788178)
facebook.tracking.exposed. n.d. ‘Facebook.Tracking.Exposed’. Accessed 12 March 2019. https://facebook.tracking.exposed.