Department of Information Science
Heinrich Heine University Düsseldorf, Germany
Mrs. Mechtild Stock
Echo Chambers and Filter Bubbles of Fake News in Social Media:
Man-Made or Produced by Algorithms?
Concerning fake news and deception in online media, some authors invoke the notions of “echo chambers” or “filter bubbles” to describe communities of people who believe the same (possibly false) propositions. In the popular press, such communities are said to be constructed by “bad algorithms.” But what is truth, and what are lies and deceptions? What role do algorithms play in forming filter bubbles and sustaining echo chambers? And what roles do individuals and their information behavior (posting fake content as well as reading, commenting, liking, or sharing it) play in this process? Are there human selection biases at work, or genuinely misleading algorithms? In this article, we analyze the interrelationship of knowledge, information, and truth; ranking algorithms whose side effects produce filter bubbles (using Facebook’s sorting algorithm as an example); and, finally and most importantly, the role of individuals in creating and cultivating echo chambers. We empirically study the effects of fake news on the information behavior of the audience through two case studies, applying quantitative and qualitative content analysis of online comments and replies. We describe the reactions of audience members to deepen our understanding of the patterns of users’ cognitive states. Do users really produce or live in echo chambers?