The Deleted Russian Bot and other Facebook Experiences

Our plan for our Facebook bot was to set up a profile for Alexander Ivanov, a Russian expat who moved to the Netherlands and is currently learning Dutch. He is a 35-year-old programmer who is active on Facebook, liking and sharing posts about Russia and its army in particular. Our intention was for him to be ‘interested’ in events within the Netherlands in order to get to know the country better. For his background, we decided that he studied at Orenburg State University in Orenburg, a region that is more nationalistic than other parts of Russia, which would explain where his views come from. The bot was therefore designed with the intention of researching the filter bubble of Russian nationalism.

Despite our carefully thought-out profile, we did not succeed in creating it. Our bot was disabled by Facebook whilst it was verifying the picture, so we got no further than submitting general information such as his name, e-mail address and phone number. Because of this, we were unable to post or follow other people as we had intended. We consistently used the same IP address whilst using the bot, so the fault did not lie there. We also considered that the phone number we used to verify the account might already be in use, but after checking, we found that the number was not linked to any Facebook account at the time we made the bot. As a group, we discussed the cause of the unsuccessful Facebook page further and concluded that the fault lay in the picture, as it was retrieved from Shutterstock and a stock photo is presumably easy for Facebook to recognise as not depicting a real account holder.

First Step of Verification

The Result of the Verification

If we had succeeded in creating the Facebook account, we would have expected to see a completely different timeline, tailored to the specific interests of our bot, especially considering how different his life and views are from our own, and the difference in gender.

To us, our research suggests that it is now more difficult to set up a Facebook account than it used to be. Facebook's verification process appears to have become significantly stricter after events such as the 2016 United States presidential election.

Ethically, we initially thought that making the bot was not wrong per se, but the reality of creating a fake profile proved otherwise. For research purposes, we understand that this was a task we needed to complete. However, taking terms like ‘catfishing’ into consideration, we unanimously agreed that this research was not ethically justifiable, despite the photo coming from Shutterstock. Using someone else's photo and attaching it to a different identity without their consent is simply wrong.

Whilst discussing the fbtrex tool in class, we had our apprehensions about it. Now that we have used the tool for a week, we do not see it as a good fit for this kind of experiment, because it is suited to quantitative data research, which was not what our small group project required. Our research into these filter bubbles was aimed at the overall ‘experience’, meaning we focused primarily on qualitative data, such as the differences in our newsfeeds or general activities.

However, our group members noted that early in January Facebook changed its algorithm to favour content posted by friends rather than news outlets. Upon discussing and comparing our newsfeeds, we concluded that our feeds still primarily consist of friend activity, events around us, advertisements focused mainly on shopping, and random entertainment videos. One of our team members, Nadia, who is the most active on Facebook and has a clear-cut political inclination, had more news articles show up than the rest of us. The other team members did not notice any significant change that stood out. We expected that the people who are more active on Facebook would have a more tailored and distinct newsfeed than those who are less active, but we did not find this to be true.

In our opinion, for this research to be improved in further studies, rather than merely confirming the ideas about filter bubbles taught to us in lectures, this kind of tool and research must be approached from a quantitative perspective. In our set-up, multiple factors were not controlled, so the data do not allow reliable analysis and comparison between accounts.
