Filter Bubble

Definition:

A filter bubble is a personalised, one-dimensional information environment created by algorithms and other digital systems. These systems select content based on a user’s past online behaviour, preferences and interactions, which reduces the variety of information and perspectives the user encounters.
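To make this mechanism concrete, the following toy sketch shows how ranking purely by overlap with past behaviour narrows a feed. It is a minimal illustration in Python with invented data and a hypothetical score_items() helper; no real platform’s ranking system works this simply:

    # Minimal sketch (invented data; hypothetical score_items() helper):
    # candidate items are ranked by topic overlap with the user's click history.
    from collections import Counter

    def score_items(click_history, candidate_items, top_k=3):
        # Count how often each topic appears in the user's past clicks.
        topic_weights = Counter(t for item in click_history for t in item["topics"])
        # Score a candidate by the summed weight of its topics.
        def score(item):
            return sum(topic_weights[t] for t in item["topics"])
        return sorted(candidate_items, key=score, reverse=True)[:top_k]

    history = [{"topics": ["politics-left"]},
               {"topics": ["politics-left", "climate"]}]
    candidates = [
        {"id": "a", "topics": ["politics-left"]},
        {"id": "b", "topics": ["climate", "politics-left"]},
        {"id": "c", "topics": ["politics-right"]},
        {"id": "d", "topics": ["sports"]},
    ]
    # Only "b" and "a" surface; "c" and "d" never reach the user's feed.
    print([item["id"] for item in score_items(history, candidates, top_k=2)])

Items matching earlier clicks crowd out everything else, even though the user never asked to exclude the other viewpoints. Real systems use many more signals, but this is the basic narrowing tendency the filter-bubble argument points to.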

Discussion:

The term ‘filter bubble’ was coined by Eli Pariser in 2011 and has since attracted considerable attention in research and public debate. In contrast to the echo chamber, which is often understood as a consciously chosen or at least accepted environment, the concept of the filter bubble emphasises the often unconscious and technology-mediated nature of this information restriction.

Pariser (2011) argues that the personalisation algorithms used by search engines and social media platforms increasingly isolate users from information that contradicts their existing views. This can distort their perception of reality and limit their ability to engage critically with other perspectives.

Compared to the echo chamber, which is often seen as a social phenomenon, the filter bubble is primarily a technological construct. While echo chambers can be created through conscious decisions and social dynamics, filter bubbles are mainly generated by automated systems that aim to present users with ‘relevant’ content.

In a large-scale study on Facebook, Bakshy et al. (2015) investigated how the platform’s algorithm affects exposure to ideologically diverse news and opinion. They found that users’ own choices had a greater influence on exposure to opposing views than algorithmic ranking did. This suggests that filter bubbles and echo chambers are often intertwined in practice and can reinforce each other.

Zuiderveen Borgesius et al. (2016) argue that the impact of filter bubbles on news consumption may be overestimated. They emphasise that many people still use a variety of news sources and that the personalisation of news does not necessarily lead to a narrowing of perspective.

However, modelling work such as that of Geschke et al. (2019) suggests that filter bubbles can have real effects. In their agent-based simulations, the interplay of individual, social and technological filtering processes reduced agents’ exposure to opposing political views and fostered the emergence of segregated opinion clusters.

Unlike echo chambers, which are often perceived as static environments, filter bubbles are dynamic and continuously adapt to the user’s behaviour. This can lead to a creeping reinforcement of existing views without the user being aware of it.
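A small simulation can illustrate this feedback loop. The following sketch is a toy model in the spirit of the agent-based approach of Geschke et al. (2019), with assumed click probabilities and an assumed update rule, not a reconstruction of any real system:

    # Toy feedback-loop simulation (assumed probabilities and update rule):
    # the user slightly prefers viewpoint A; the feed is composed according
    # to learned weights, and every click feeds back into the next round.
    import random

    random.seed(1)
    prefs = {"A": 0.55, "B": 0.45}   # assumed probability of clicking each viewpoint
    weights = {"A": 1.0, "B": 1.0}   # the system's learned relevance weights

    for round_no in range(1, 6):
        share_a = weights["A"] / (weights["A"] + weights["B"])
        # Compose a 100-item feed in proportion to the learned weights.
        feed = ["A" if random.random() < share_a else "B" for _ in range(100)]
        for item in feed:
            if random.random() < prefs[item]:  # user clicks according to preference
                weights[item] += 1             # system reinforces what was clicked
        print(f"round {round_no}: viewpoint A fills {share_a:.0%} of the feed")

Starting from an almost balanced feed, the slight initial preference for viewpoint A compounds round by round; this is the creeping reinforcement described above, produced without any deliberate choice by the user.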

The debate about filter bubbles also has important implications for media literacy. Whereas with echo chambers the usual advice is to deliberately seek out other perspectives, dealing with filter bubbles requires a deeper understanding of the underlying technological mechanisms.

Future research should examine how filter bubbles and echo chambers interact and develop effective strategies that mitigate their potentially negative effects without abandoning the benefits of personalisation entirely.

Literature:

Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130-1132.

Geschke, D., Lorenz, J., & Holtz, P. (2019). The triple-filter bubble: Using agent-based modelling to test a meta-theoretical framework for the emergence of filter bubbles and echo chambers. British Journal of Social Psychology, 58(1), 129-149.

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin UK.

Zuiderveen Borgesius, F. J., Trilling, D., Möller, J., Bodó, B., De Vreese, C. H., & Helberger, N. (2016). Should we worry about filter bubbles? Internet Policy Review, 5(1).

Synonyms:
Filterblasen (German)
