Bergendahl Olle and Rehn Andreas
Political filtering on Facebook?
ABSTRACT
In this bachelor's degree thesis we've conducted an experiment on Facebook using fictitious persons, also called personas. The experiment is based on the theory of the filter bubble and the filtering of information. Our main focus has been on answering the question
"Does Facebook filter the information that reaches a user, based on who the user is, which political party the user sympathizes with, and what kind of friends the user has?". We've also raised the following questions: Can information filtering be used in a political way? What effects could filtering have on a user? What effects could filtering have on society in general?
In the experiment we've created 17 personas with strong political profiles, divided between one left-wing party and one right-wing party. The personas then interacted with each other over a period of 16 days. At the end of the experiment we extracted information about which posts had been visible and which had been filtered out. We've also used the Facebook Friend Rankings script to further our understanding of the social connections among our personas.
Our results show that, in an artificial environment, a political filter bubble exists on Facebook. A persona with left-wing opinions is shown fewer posts from his right-wing friends than from friends who share his political beliefs. Although the results are quite clear, it's uncertain whether they can be applied on a more general level. Further studies need to be conducted, especially regarding the relationship between political filtering and the EdgeRank algorithm.
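Since the relationship between political filtering and EdgeRank is left to future work, a brief sketch may be useful. Facebook's exact implementation is not documented; the form below is only the commonly cited description presented publicly by Facebook in 2010.

$$\text{EdgeRank} \approx \sum_{\text{edges } e} u_e \cdot w_e \cdot d_e$$

Here $u_e$ is the affinity between the viewing user and the creator of edge $e$, $w_e$ is the weight of the edge type (e.g. comment, like, tag), and $d_e$ is a time-decay factor based on how long ago the edge was created. Under this description, a persona who rarely interacts with friends from the opposing party would accumulate low affinity scores for them, which could by itself reduce how often their posts are shown.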
Keywords: Facebook, filter bubble, Eli Pariser, EdgeRank, Facebook Friend Rankings, censorship, political censorship, social connections, personas, Jeremy Keeshin