
Rabbit Holes and Echo Chambers. Part 2: Experimenting with Facebook

In the first experiment we published, we looked at YouTube. We encourage you to check that one out first, as it also includes a general explanation of how social media works. In this experiment, we will present what happened when we delved into the depths of Facebook groups.


Facebook groups are just the worst


The importance of groups for Facebook’s engagement

If YouTube’s engagement is driven by addictive content, in Facebook’s case the “engine” is communities.


Since 2016, Facebook has shifted its focus towards building communities, and as part of this strategy, Facebook groups have become increasingly important vehicles for promoting users’ participation. You might have noticed this yourself: if you join a new group or interact with it a lot, your feed will often feature a disproportionately large amount of content from that same group.


Groups centred on ideas or discussions can quickly become echo chambers. And the interaction with groups that Facebook encourages can leave one’s feed dominated by the talking points that permeate them.


Aliens and space

For our experiment, as we were rather tired of looking at vaccine disinformation, we decided to fly into space instead.


We created a brand-new Facebook account and used it in an incognito window to make sure there would be no interference from previous browsing habits.


Then, we started joining public and private groups about space exploration (NASA, SpaceX, and the like), alternating them with groups about space conspiracies: Moon landing hoax groups, alien appreciation communities, and so on.


We tried to “tell” Facebook that we were equally interested in legitimate material about space and in disinformation on the same topic. In the end we had a good mix of information groups and disinformation ones; the latter included absolutely insane ones about the “imminent danger” of the Earth’s poles being about to shift and bring about the apocalypse, and goofier ones about such evergreen topics as “ancient alien civilizations” or crop circles.


What happens when you engage

After joining these groups, we went back to our feed and noticed that content from the various groups was being presented in a rather balanced way. We would get some content from legitimate groups, followed by some disinformation material, then ads, then legitimate material again, and disinformation again.


We then spent around 20 minutes interacting with a few conspiracy theory groups. If you have ever used Facebook, you know that 20 minutes of browsing is not much. And yet, the results were quite interesting.


After just 20 minutes of interacting with the groups geared towards disinformation and conspiracy theories, the recommendations for additional content in our feed immediately skewed accordingly.


What you see in the feed

Keep in mind: The feed is supposed to tell you what’s going on in the pages and groups you follow, but in practice, it prioritizes content in a way that is outside of your control.


This has little to do with the amount of content a page or group produces, its quality, or the amount of engagement it generates, and much more to do with the likelihood that a given post will keep you, specifically, engaged.


In other words: half of the world might be heavily engaging with a group that shares high-quality information about the latest astronomical discoveries, but if you have shown that you might stay on Facebook longer interacting with Flat Earth videos, that is what you are likely to get.
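To make that concrete, here is a deliberately simplistic sketch in Python of how an engagement-first ranking can differ from a chronological one. This is our own toy model, not Facebook’s actual algorithm (which is proprietary and vastly more complex); the weights, the group names, and the predicted_engagement function are invented purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Post:
    group: str
    age_hours: float  # how old the post is
    affinity: float   # 0..1: how much you have engaged with this group lately

# Hypothetical scoring: your recent engagement with a group outweighs freshness.
def predicted_engagement(post: Post) -> float:
    freshness = 1.0 / (1.0 + post.age_hours)
    return 0.2 * freshness + 0.8 * post.affinity

posts = [
    Post("Astronomy news", age_hours=1.0, affinity=0.1),    # fresh, rarely engaged with
    Post("Flat Earth group", age_hours=30.0, affinity=0.9), # stale, heavily engaged with
]

# "Top posts" order: the stale post from the group you binge on comes first.
for post in sorted(posts, key=predicted_engagement, reverse=True):
    print(post.group)

# A chronological "Most Recent" order would instead be:
# sorted(posts, key=lambda p: p.age_hours)
```

Even in this crude model, a day-old post from a group you engage with heavily outranks fresh, high-quality content from a group you rarely interact with.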

Unless you specifically ask to see the “Most Recent” content via the top-left list of buttons, you will be shown what the platform decides you should see. And even if you click on “Most Recent”, Facebook will immediately place a “Back to top posts” button at the top of your feed to encourage you to return to its preferred order.


In practice, Facebook will encourage you to engage even more with the groups that it knows you like. As you can imagine, this means you can get stuck in echo chambers incredibly fast.


Anyone who has joined numerous groups on Facebook can see this with a quick experiment: look at your list of groups and scroll down; you will probably find some that are quite active (as indicated by the group’s last-activity information) but that you cannot recall ever having seen in your feed recently. Perhaps you do not even remember joining them.


Instead, you will recognize from your feed the same handful of groups that you have engaged with more regularly and more recently. Facebook actively pushes you into a repetitive pattern of interaction.


What does this mean when you enter a disinformation space?


Echo chambers build echo chambers

In our experiment, we noticed that this echo-chamber-encouraging pattern ensured that, as soon as we engaged with disinformation material, we were pushed to consume even more.


Not only was our feed rapidly taken over by the disinformation and conspiracy theory groups we had joined: even our recommendations for additional groups were skewed towards more and more disinformation. “Official Flat Earth” and “Unsolved Mysteries & Paranormal Activities” content started populating the recommended posts in our feed.


The algorithm picks groups we have never interacted with before based on the time we have spent browsing related topics. As long as you consume legitimate information, this is not too bad.
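As a rough illustration of the principle (again our own sketch, not Facebook’s actual recommender; the topic tags, time figures, and the “Astronomy Lovers” group are made up, while the other two group names come from our feed above), think of it as ranking candidate groups by how much time you have spent on their topic:

```python
# Toy proxy for interest: minutes spent browsing content on each topic.
time_spent = {
    "flat earth": 12,
    "paranormal": 8,
    "space exploration": 5,
}

# Candidate groups the account has never interacted with, tagged by topic.
candidates = {
    "Official Flat Earth": "flat earth",
    "Unsolved Mysteries & Paranormal Activities": "paranormal",
    "Astronomy Lovers": "space exploration",
}

total_minutes = sum(time_spent.values())
affinity = {topic: minutes / total_minutes for topic, minutes in time_spent.items()}

# Recommend first the groups whose topic you have dwelled on the most.
ranked = sorted(candidates, key=lambda group: affinity[candidates[group]], reverse=True)
print(ranked)
# ['Official Flat Earth', 'Unsolved Mysteries & Paranormal Activities', 'Astronomy Lovers']
```

With numbers like these, a mere 20 minutes of conspiracy browsing is enough to push conspiracy groups to the top of the recommendations.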

The problem is when one stumbles across groups that spread lies. One would be forgiven for thinking that Facebook does not care at all whether its users get stuck in echo chambers of disinformation or extremism.


A depressing picture

If YouTube browsing led us down a rabbit hole within one hour, 20 minutes of Facebook browsing was enough to skew our browsing experience. Overall, the picture is not very encouraging. Within a surprisingly short time, the wave of disinformation material was well on its way to overshadowing the legitimate one.


We may wonder what would happen if one were to go along with this and follow Facebook’s suggested browsing pattern.


In fact, we do not need to wonder. Facebook whistleblowers have already alleged that the platform’s algorithm is designed in a way that dangerously facilitates the creation of echo chambers of radical, extremist content. The media are now rife with reports of how some of the conspiracy theories that Facebook and other social media allegedly encouraged users to engage with have strained or destroyed people’s relationships, quality of life, and sanity.


Keep watching this space for the next experiments. Will Instagram and TikTok do any better?

