
Rabbit Holes and Echo Chambers: Experimenting with YouTube

Between Summer 2021 and Fall 2022, Debunk.org’s researchers conducted a series of experiments on disinformation and popular social media. We browsed through YouTube, TikTok, Facebook, and Instagram to see how interacting with disinformation and with legitimate pages would affect each platform’s recommendation algorithm. Here is what we found out. Spoiler alert: the results were almost always depressing.


Introduction: How social media works


First, keep in mind that almost all social media operate on the principle of engagement: They tend to serve you material to watch or read that keeps you hooked on the platform, regardless of whether the information it contains is reliable or not.
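
To make this concrete, here is a deliberately simplified sketch in Python of what ranking purely by engagement looks like. The video data and scoring are invented for illustration, and real platforms use far more complex and undisclosed models, but the key point is the same: reliability never enters the score.

```python
# Hypothetical illustration: rank videos purely by predicted engagement.
# The titles and watch-time figures are invented for this example.

videos = [
    {"title": "Measured explainer on vaccine safety", "predicted_watch_minutes": 4, "reliable": True},
    {"title": "SHOCKING truth THEY don't want you to see", "predicted_watch_minutes": 14, "reliable": False},
]

def engagement_score(video):
    # Only predicted watch time matters; the "reliable" flag is never consulted.
    return video["predicted_watch_minutes"]

for video in sorted(videos, key=engagement_score, reverse=True):
    print(video["title"])

# The sensationalist video comes first, simply because it keeps people watching longer.
```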


You can probably already guess that this does not bode well for the quality of information.


It also does not encourage diversity: In fact, many social media platforms are known to generate “echo chambers”, environments such as groups, pages, communities, or hashtag-centred discussions in which people interact exclusively with others who share the same radical perspectives, lose sight of alternative views, and grow more and more convinced of their own ideas.


But sometimes social media can also create “rabbit holes”: Instead of just feeding you more of the same extremist ideas you are already interested in, they make you discover them in the first place, showing you more and more extreme content.


This is good for engagement, and bad for society: The more extreme your views become, the more likely you are to see people who think differently as enemies. In reaction, you are likely to go looking for reassurance and confirmation of your opinions in echo chambers full of like-minded individuals. Interaction with those echo chambers confirms your views further, making you feel vindicated and “good”. As a result, you spend even more time on the website, getting outraged at the “enemy” and feeling good about yourself. And the more time you spend on the website, the more money the website makes by showing you ads.
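
As a rough illustration of that loop, consider the toy simulation below (our invention, not any platform’s actual code): if engagement is assumed to peak just beyond a viewer’s current tastes, and each recommendation nudges those tastes along, preferences drift steadily towards the extreme.

```python
# A toy, hypothetical simulation of the engagement feedback loop described above.
# The "extremeness" scale, the update rule, and the assumption that engagement
# peaks slightly beyond the viewer's current taste are all invented for illustration.

# Catalogue of videos scored from 0.0 (neutral) to 1.0 (extremely sensationalist).
catalogue = [i / 20 for i in range(21)]

user_preference = 0.1  # the viewer starts with fairly moderate tastes

for step in range(10):
    # Assumption: engagement is highest for content a bit more extreme than the current taste.
    target = user_preference + 0.1
    recommended = min(catalogue, key=lambda extremeness: abs(extremeness - target))
    # Watching the recommendation nudges the viewer's taste towards it.
    user_preference = 0.7 * user_preference + 0.3 * recommended
    print(f"step {step}: recommended {recommended:.2f}, user preference now {user_preference:.2f}")

# Within a few steps the recommendations sit well beyond where the viewer started.
```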


Some social media do this deliberately; others developed similar practices organically, by accident; accidental or not, the fact is that it happens more often than we would like. Let us now look at some examples.


YouTube and anti-vaxxer content


First steps of the experiment

We conducted our first experiment on YouTube at the height of the debate over COVID-19 vaccines. We began by watching a series of videos about vaccines in general, and about COVID-19 specifically.


In total, we spent around one hour browsing. To do so, we used an incognito window to make sure that our browsing experience would not be influenced by our previous viewing habits.
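
For readers who want to try something similar, here is a minimal sketch of how a clean, history-free session could be set up using Python and Selenium; this is our illustration of the general idea, not the exact tooling used in the experiment.

```python
# Minimal sketch: open a fresh incognito browser session so that no cookies
# or prior watch history influence what gets recommended.

from selenium import webdriver

options = webdriver.ChromeOptions()
options.add_argument("--incognito")  # no personalisation carried over from earlier sessions

driver = webdriver.Chrome(options=options)
driver.get("https://www.youtube.com")

# From here, videos would be opened (manually or via driver.get(video_url)),
# and the homepage recommendations inspected after each viewing.

driver.quit()
```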


We began with a balanced mix of neutral media coverage of vaccines, including videos debunking the old claim that “vaccines cause autism”. But we also started interspersing our viewing with occasional bits and pieces of ambiguous information: Neither information nor disinformation at first, just documentaries with “controversial” interviews with anti-vaxxers, alongside critics debunking their views.


Recommendations start changing

Initially we tried to keep things balanced and did not immediately jump onto the anti-vaxxer bandwagon, but YouTube’s algorithm had already guessed that we were up to something.

After viewing just a couple of videos, our recommendations page had already become severely skewed towards vaccine- and healthcare-related topics.


Keep in mind: YouTube’s homepage recommendations drive around 70% of everything we view! It is therefore incredibly important to know how the website decides what you will be shown there.

We carried on browsing, adding more views based on the recommendations. Again we tried to keep things balanced, watching a mix of pro-vaccine videos, neutral documentaries, criticism of healthcare companies (but not of vaccines), and assorted anti-vaxx videos of varying intensity.


All hell breaks loose

The more we browsed, the deeper the algorithm pushed us towards “weird” content. Within ten minutes it was already clear that the recommendations were veering more and more in the direction of sensationalist content. Not necessarily disinformation yet, but clearly aimed at getting people riled up and outraged – and, presumably, wanting more. Who doesn’t like a good scandal, after all?


After less than half an hour of browsing, the recommendations started filling up with more and more COVID-19 denialism, and then things got weird: We started getting recommendations for videos on impending apocalypses, satanic possessions, societal collapse, and hard-core anti-vaxxer disinformation.


By the final 15 minutes of our browsing session, the recommendations section barely had anything informative or reliable left to offer: The recommendations we got included videos on demonic invasions; content that falsely suggested deaths from a COVID-19 vaccine; numerous videos from notorious anti-vaxxer influencers; videos about protests against COVID-19 restrictions; videos warning of the (never-materialized) dangers of vaccines by a guy (deceitfully) claiming to be the inventor of the mRNA technology; “Orwellian dystopia” prophecies; videos about assorted mysteries; material from controversial alternative medicine proponents; and all sorts of sensationalist videos made to trigger strong reactions of scandal in the audience (Corrupt politicians! Free speech under threat! Uni professors saying outrageous things!).


Lessons learnt

At this point, we had had enough.


In our experiment we confirmed that YouTube leans very heavily on the “rabbit hole” mechanism. Echo chambers are, of course, less important for YouTube since communities are not a crucial component of its offering.


Communities do exist, but they form in comment sections and are “vertical”, centred on interaction with the content creator, with whom users can at best engage secondarily, through comments on the content or formats like AMA streams (which content creators can still control).

This is in contrast to other social media, where communities are “horizontal”: every participant can produce content (a Facebook post in a group, a Reddit comment, a tweet) that carries the same weight as any other participant’s.


Given the lower importance of communities and echo chambers, YouTube’s engagement can be driven by rabbit-hole mechanisms instead: Scandal makes you click, and keeps you wanting more.


Hope for the future?

In Fall 2021, not long after we conducted our experiment, YouTube announced it would be deleting anti-vaxxer videos.


The process was bound to take a while – not least due to the sheer amount of disinformation on vaccines at that time – but was, at least, a positive step.


How did that work out though?


YouTube was still alleged to be a repository of deceitful material on various topics, including COVID-19, as late as November 2022. As for vaccines, some of the videos containing disinformation or misinformation about vaccines and COVID-19 that we visited during our experiment had been removed, but many were still online as late as December 2022.


Overall, a disappointing picture.


Stay tuned for the following experiments, but know that if you are looking for uplifting news, you have come to the wrong place.
