Erin Schumaker thanks for your article!

As explained in the article, it’s probably one of the side effects of algorithms intentionally designed to make you consume more content, hooking you into a feedback loop through an endless stream of content that you'll most likely share, like, watch, or listen to.


It is interesting to note that this intentional use of our cognitive biases — which could be called manipulation — might even be positive if applied ethically.

Anyway, as it’s a more complex subject than it may seem, here is some further reading:

1) DuckDuckGo’s study of the filter bubble yields interesting results on how Google’s algorithms pick the content that appears in your search results. Of course, their motivation is questionable, given their competitive position against Google.

Measuring the Filter Bubble: How Google is influencing what you click

2) This experiment (in French) — “I tested Facebook’s algorithms and it quickly degenerated”, by journalist Jeff Yates at Radio-Canada — led to some impressive differences between the test and control groups. However, given the article’s age, the results should be interpreted with care.

J'ai testé les algorithmes de Facebook et ça a rapidement dégénéré

3) This well-documented paper explores how different theories of democracy frame the filter bubble problem. It also describes some tools and tactics used today to fight the filter bubble effect, along with their effectiveness and limitations.

“[…] viewpoint diversity is improved not only by aiming for consensus and hearing pro/con arguments, but also allowing the minorities and marginal groups to reach a larger public or by ensuring that citizens are able to contest effectively.”

Breaking the filter bubble: democracy and design

4) This article highlights a study published in Oxford’s Public Opinion Quarterly (2016) that concluded that algorithms play only a limited role in the “filter bubble effect”, stating that “the most extremely ideologically oriented users expose themselves to a high variety of information sources from the same ideology”. In other words, the study reminds us that the “filter bubble effect” exists even without algorithms, as we’re prone to confirmation bias and cherry-picking.

Academic research debunks the myth of filter bubbles

Note: mind your own confirmation bias here, as this doesn’t disprove the existence of the effect. In fact, it is not surprising to find contradictory studies (in both results and conclusions) in science; this is part of how scientific consensus is built.

5) This large study from the University of Amsterdam points out the lack of research conducted outside the US on the topic and highlights the links between context and certain aspects of the filter bubble effect.

Beyond the filter bubble: concepts, myths, evidence and issues for future debates (PDF)

Source: “Two IViR studies included in Parliamentary paper on the future of independent journalism in the Netherlands”

6) Finally, this study explores ways of influencing the filter bubble effect among a population, with the intent of reducing it. It shows interesting results, suggesting that awareness of the effect empowers users, improving both their understanding of the underlying mechanisms and their control over the content stream they are exposed to.

Understanding and controlling the filter bubble through interactive visualization: A user study

Anyway, it’s a very interesting topic!

Thanks again 👍