group 2.3

Fake News & False Flags – A Statistical Study

In the article ‘Fake News & False Flags’, journalists Abigail Fielding-Smith and Crofton Black used a range of sources and statistics first to explore and ultimately to reveal the role of the Pentagon and the American government in the propagation of misleading Iraq propaganda. Statistics appear as early as the article’s sub-heading: “How the Pentagon paid a British PR firm $500 million for top secret Iraq propaganda”. Most of the statistics in the article are monetary figures, such as media operations costing over “one hundred million dollars a year on average”, and the Bureau’s discovery of transactions worth “$540 million between the Pentagon and Bell Pottinger”. Since the reliability of these statistics plays a considerable role in both the article and the Bureau’s credibility, we looked into how the Bureau conducted its research and how it obtained its statistics on Bell Pottinger, the PR firm involved in information operations and US content creation during the Iraq invasion.

The investigation ‘Fake News & False Flags’ was published by The Bureau of Investigative Journalism in collaboration with the Sunday Times. The Bureau, a non-profit news organisation, researched these statistics itself. The underlying data, however, was created by the Pentagon, the US army and the Department of Defense. The Bureau “traced the firm’s Iraq work through US army contracting censuses, federal procurement transaction records and reports by the Department of Defense Inspector General”, and also drew on Bell Pottinger’s corporate filings and specialist military publications as sources. So the Bureau may have found and shared the statistics, but it did not create them; it simply used US contracts and written records to find its figures.

The data mostly documents a government paying an immense amount of money for top-secret propaganda work surrounding a war launched on false pretences. This data, largely made up of dollar figures, tells the extensive story of money travelling consistently from the pockets of the US Department of Defense into those of Bell Pottinger. The data clearly shows that while a military intervention dominated the media forefront during the Iraq invasion, an even bigger media propaganda intervention was taking place in the background.

More than just telling us, this data shows us that during the US invasion of Iraq, the US government was running an immense news campaign in the very country it was invading. The purpose of the campaign was to advertise democracy and spread a false sense of American ‘aid’ in a war-torn country of America’s own making. The campaign was largely outsourced to professional advertising and media companies, whose employees did not even know what content they were going to create until they arrived in Iraq, and who were unaware of the impact their work would have on Iraqi civilians’ lives and on US citizens’ distorted perception of the war.

The Bureau of Investigative Journalism is the party most involved in promoting these figures. The statements that Lord Bell – co-founder of Bell Pottinger – gave concerning the Pentagon, the CIA and the National Security Council, together with the documents discovered by the Bureau, are the main sources of the data outlined in this investigation. The Bureau made discoveries such as Bell Pottinger employing over “three hundred British and Iraqi people” and the identification of transactions worth $540 million. Overall, the Bureau provided most of the information included. Additionally, Crofton Black and Abigail Fielding-Smith, the two authors of the article, contributed to the promotion of the figures by including them in their article.

In our opinion, there is no other plausible interpretation of the data found in the article or of the message it expresses. The Bureau’s sole purpose was to expose Bell Pottinger’s and the Pentagon’s involvement in the war. They may have hoped for, but did not aim at, another outcome; nevertheless, following this investigation and the public exposure of its direct involvement in conflicts overseas, Bell Pottinger filed for bankruptcy.

Looking at the bold claims presented, we can see that they are supported by reliable data and trustworthy sources. The research relies on the data to conclude that yes, a substantial sum of money was indeed exchanged between the Pentagon and Bell Pottinger. The data unveils the many contracts between Bell Pottinger and the US government, documented in contracting censuses and transaction records. The reliability of this data, together with the boldly titled heading, is the reason this investigation and article gained so much ground. Because the journalists used this data to make their claims and built a solid foundation of discoverable sources, the conclusions of the article are indeed aligned with the data.

Authors: Malik Zarth, Natalia Dercz, Lorenzo Canci, Lana Al Zouheiri, Thomas De Boer and Marijn Snijders

Facebook, an algorithm case study

Problematization

The reason for creating a fake persona to investigate Facebook’s algorithm is to gain more knowledge of how Facebook targets people and how users’ personal choices affect the filter bubble in which they end up.

By creating a fake persona it is possible to experiment with a different filter bubble than those of our personal Facebook accounts; collecting that data and comparing it with the data of our own profiles makes it possible to investigate and analyse the differences.

Another possible research method is comparing two existing Facebook profiles that are hugely different. This method can be successful, but such profiles are not customisable. By creating a fake persona it is possible to investigate one particular filter bubble.

The research we are conducting is ethically questionable because, although we are trying to veer away from stereotypes, the persona we are creating still conforms to certain stereotypes about race and religion. Another reason the research is ethically questionable is the project itself: creating a fake persona to fool other users, and to fool Facebook’s algorithms into revealing their biases and preconceived notions of who a person should be according to their likes and shares, is research with both great benefits and controversial ethics. It also becomes problematic when Facebook users think our fake persona is a real user, and all the more so when our fake persona, Mohammed, has religious and political interests that some people may disagree with and use as a way to judge him. Lastly, there is always the possibility that real users will form an emotional connection to the fake persona, which is another unethical outcome.

Fbtrex

Fbtrex is a project managed as a GPL free-software community, and it is part of a larger project: facebook.tracking.exposed. It works as an extension for Google Chrome or Mozilla Firefox and makes it possible to track Facebook’s tracking and filtering systems. The main goal of the project is to create a fairer online space for both users and developers. Inspired by the peer-to-peer free internet of the past, tracking.exposed has written its own manifesto describing its main goals and ethics.

In particular, ‘fbtrex’ is a tool aimed at exposing Facebook’s biases and the unfair algorithmic systems that limit the information a user is exposed to on a daily basis. For media students like us, this knowledge about Facebook’s recommendations makes clear that the project aims to spread awareness and, eventually, to improve these algorithms and filtering systems in order to create a fairer and more open online world.

We used this tool to gather and harvest data from our own personal Facebook pages, and then created a fake profile in order to highlight how different and personalised Facebook feeds become.

Fake persona

The persona we created is Mohammed Abadi, a 25-year-old Iraqi Muslim man currently living in the German capital, Berlin. Mohammed was born and raised in Baghdad in a Muslim family, and obtained a bachelor’s degree in the Arts there before moving to Berlin in 2015. In Berlin he worked on his master’s thesis in ecological design, and after obtaining his degree he began to look for a suitable job in the design market.

Mohammed is an aspiring tattoo artist. Fascinated by his religion and by Arabic calligraphy, he aims to move away from the sometimes over-provocative and vulgar designs of other tattoo artists. He has also developed an interest in cars and how they have changed over the years. Mohammed is a proponent of heteronormativity and does not agree with the beliefs of the LGBT movement.

Furthermore, Mohammed has a passion for 19th-century Russian literature and new-school music, with a particular interest in artists such as Migos.

Mohammed is highly passionate about his roots. He has faith and is proud to be a Muslim. He is not radical, and he is highly pessimistic about global news coverage of Islam, finding it biased and unfair. Even though he has been living in Germany for a few years already, he still feels a very strong connection to the Middle East and to Iraqi culture, such as its food and history. Because of this strong connection to Iraq, Mohammed finds it hard to find his place in German society and to feel included.

The main goal of creating Mohammed was to find out how a Middle Eastern immigrant in his twenties would be treated within Berlin’s mainstream society. We wanted to see whether his faith and ethnicity would make him feel excluded, and how they might affect his Facebook timeline and, more importantly, his Facebook recommendations.

Authors: Malik Zarth, Natalia Dercz, Lorenzo Canci, Lana Al Zouheiri, Thomas De Boer and Marijn Snijders

Facebook Algorithm Exposed


Our Facebook profile is Mohammed Abadi, a twenty-five-year-old aspiring tattoo artist. Mohammed was born and raised in Iraq, where he obtained an Arts degree at the College of Fine Arts in Baghdad. After finishing his bachelor’s studies, Mohammed moved to Berlin, where he attended Bard College in order to write his master’s thesis. Currently he is trying to find a suitable job while pursuing his dream of becoming a professional tattoo artist. Mohammed is very close to his family and highly passionate about his roots. With his passion for religion and Arabic calligraphy, he aims to move away from the sometimes over-provocative and vulgar designs of other tattoo artists. He has also become very interested in cars and how they have evolved with developing technology over the years. Mohammed is a proponent of heteronormativity and does not agree with the LGBT movement and the beliefs it shares.

Process

We created the persona and fed it horizontally, meaning that we did not favor any subject over another. We kept feeding the profile for a week and wrote daily entries in a diary to keep track of what the profile liked and shared.

Most of the liked and shared content consisted of news posts from both Western and Middle Eastern media channels. Another category we focused on was pages with events and information about Berlin. We also fed the profile posts about Iraqi culture, such as history and food. Because our fake persona is religious, “he” paid a great deal of attention to posts about Islam and Muslim communities. Mohammed aspires to be a tattoo artist, so we also liked posts about tattoo art. Moreover, he is a fan of the Porsche brand and of cars in general, so we added him to numerous Porsche-lovers groups.

During the week of feeding our fake persona, we ran a data-collecting tool called fbtrex. The extension scraped all the public posts on the timeline of Mohammed’s profile and made them available for extraction.

After a few days of feeding and nourishing our fake profile, we collected the data on the public posts, downloaded it and inserted it into an Excel sheet. We then organised the data and filtered the important information out from the irrelevant pieces. Finally, we visualised the resulting data by creating graphs in Rawgraphs.io.
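The organising-and-filtering step above could be sketched in code. This is only an illustration: fbtrex’s actual export format may differ, and the field names and keyword rules below are hypothetical stand-ins for the categorisation we did by hand in Excel.

```python
# Hypothetical scraped posts; a real fbtrex export may use different fields.
posts = [
    {"source": "The Guardian", "text": "Latest headlines from Berlin"},
    {"source": "Berlin Ink Studio", "text": "New Arabic calligraphy tattoo designs"},
    {"source": "Porsche Club Berlin", "text": "Classic 911 meet-up this weekend"},
]

# Keyword rules approximating the categories we assigned manually.
RULES = [
    ("Tattoo", ("tattoo", "ink")),
    ("Cars", ("porsche", "car")),
    ("News", ("headline", "news")),
]

def categorise(post):
    """Return the first category whose keywords appear in the post, else 'Other'."""
    haystack = (post["source"] + " " + post["text"]).lower()
    for category, keywords in RULES:
        if any(keyword in haystack for keyword in keywords):
            return category
    return "Other"

for post in posts:
    post["category"] = categorise(post)
```

Real categorisation involved human judgement rather than keyword matching, but the principle is the same: each public post is reduced to one category label before counting.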

Findings

Analysing the raw data obtained by harvesting our persona’s Facebook feed produced results that revealed suspected profiling: the selection and consequent isolation of certain characteristics and attributes of our fake persona.

News            201   40.9 %
Infotainment     11    2.2 %
Culture          50   10.2 %
Religion         37    7.5 %
Food              1    0.2 %
Cars             27    5.5 %
Tattoo           30    6.1 %
School            2    0.4 %
Government        4    0.8 %
Brand             8    1.6 %
Personal        120   24.4 %
Total           491  100 %
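The share column can be recomputed directly from the raw counts; a minimal sketch, using the per-category post counts from the table above:

```python
# Post counts per category, as tallied from the scraped feed.
counts = {
    "News": 201, "Infotainment": 11, "Culture": 50, "Religion": 37,
    "Food": 1, "Cars": 27, "Tattoo": 30, "School": 2,
    "Government": 4, "Brand": 8, "Personal": 120,
}

total = sum(counts.values())  # 491 public posts in all

# Share of each category, as a percentage rounded to one decimal place.
shares = {cat: round(100 * n / total, 1) for cat, n in counts.items()}
```

News works out to roughly 40.9 % of the 491 posts, with Personal the second-largest slice at 24.4 %; the rounded shares sum to just under 100 %.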

The extracted data revealed that the Facebook algorithm focused specifically on certain attributes we had built into the profile. Our fake persona liked and shared posts from various Facebook pages, such as The Guardian and the personal pages of Berlin tattoo shops. Facebook registered this behavior through its algorithm and ‘fed’ back content that was clearly targeted according to Facebook’s profiling of our fake persona. The findings show that even though the created persona had lived, studied and worked in a major European capital, his Facebook feed was extremely targeted and filtered based on his Middle Eastern heritage and cultural connection. This is further reflected in the extracted data on the news pages that the Facebook algorithm served to our fake persona: while our persona liked both Middle Eastern and Western news, the data revealed that most of the presented content came from Middle Eastern news sources. Moreover, posts related to religion and culture clearly centred on the Middle East and were presented more often than pages related to Western interests such as cars and tattoos.

Conclusion

By doing this project we realized that, because of personalization algorithms, Facebook users indeed do not have much control over what is presented in their feed. The pages we like and the content we share are chosen by us, so what we see on our start pages depends on what we engaged with; however, we cannot regulate exactly what we see. Admittedly, there was not enough time for us to observe the real filter bubble and how it influences the content we are exposed to. On our profile, news and Middle Eastern posts were the most frequent in the feed, but we mostly liked content related to these subjects, so it was nearly impossible for the Facebook algorithm to show us anything else.

Authors: Malik Zarth, Natalia Dercz, Lorenzo Canci, Lana Al Zouheiri, Thomas De Boer and Marijn Snijders