Social networks and algorithms: confinement, polarization… or opportunity?

Recommendation algorithms, those little mathematical marvels that decide what we see on social media, are supposed to make our lives easier. In theory, they serve us relevant content aligned with our interests. In practice, they trap us in a gilded cage where we only see what reinforces our preconceived ideas! But not everything is bleak—when used wisely, these same algorithms can also be a powerful tool for openness.

Algorithms: friends or foes?

If you’ve ever been sucked into a three-hour YouTube rabbit hole about “Were the pyramids built by aliens?”, you’ve experienced the algorithmic tunnel effect. Social networks operate on a simple principle: maximizing your engagement. The more you interact with content, the more similar content you are shown.

Facebook, Instagram, Twitter, TikTok… all use ultra-sophisticated formulas based on your browsing history, interactions, and even the time you spend on a post (yes, Instagram knows when you stare at that otter video for eight seconds too long). The goal? To keep you glued to the screen for as long as possible!
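
To make that concrete, here is a minimal sketch in Python of what an engagement-maximizing feed ranker could look like. Everything in it is invented for illustration (the signal names, the weights, the affinity profile); real platforms combine far more signals with machine-learned models.

```python
from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    likes: int
    comments: int

# Hypothetical profile built from past behavior: how strongly this
# user has engaged with each topic so far (values are made up).
topic_affinity = {"otters": 0.9, "cycling": 0.6, "politics": 0.1}

def engagement_score(post: Post, dwell_seconds: float) -> float:
    """Toy ranking score: interactions and dwell time, weighted by
    the user's affinity for the post's topic. Weights are invented."""
    affinity = topic_affinity.get(post.topic, 0.05)
    raw = 0.5 * post.likes + 1.0 * post.comments + 0.1 * dwell_seconds
    return affinity * raw

posts = [
    Post("politics", likes=300, comments=80),
    Post("otters", likes=120, comments=30),
]

# Sorting by this score means content resembling past behavior
# floats to the top: the feedback loop described above.
for post in sorted(posts, key=lambda p: engagement_score(p, 8.0), reverse=True):
    print(post.topic, round(engagement_score(post, 8.0), 1))
```

Note how the otter post outranks the far more popular politics post purely because of the topic-affinity weight: that is the tunnel effect in miniature.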

However, these algorithms can also be leveraged for good: they allow us to discover new interests, access inspiring content, and educate ourselves on various topics. The key? Learning how to manipulate your own digital environment!

Filter bubbles: between isolation and discovery

Filter bubbles, a concept Eli Pariser popularized in his 2011 book The Filter Bubble, are the parallel universes created by our online preferences.

Let’s take a concrete example:

  • You like electric bikes? The algorithm will flood your feed with reviews, comparisons, and ads for the latest models.
  • You like conspiracy theories? In just a few clicks, you’ll be watching a video about reptilian overlords ruling the Earth (with ominous background music, of course).

The result: you no longer see alternative perspectives. Your online reality shrinks to an ideological ecosystem that reinforces your beliefs. But there’s another side to the coin:

  • If you follow diverse content, the algorithm will provide a more enriching mix.
  • You can force it to expose you to more varied sources by actively engaging with them, as the sketch after this list shows.
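
A toy simulation, again in Python with invented numbers, shows why deliberate engagement works: if every click nudges the affinity profile that biases future recommendations, then clicking on underrepresented topics gradually rebalances the mix you are served.

```python
# Toy feedback loop: every click shifts the user's topic affinities,
# which in turn bias what gets recommended next. All numbers invented.
affinity = {"cycling": 0.8, "science": 0.1, "cooking": 0.1}
LEARNING_RATE = 0.2

def register_click(topic: str) -> None:
    """Nudge the profile toward the clicked topic, then renormalize."""
    affinity[topic] += LEARNING_RATE
    total = sum(affinity.values())
    for t in affinity:
        affinity[t] /= total

# Deliberately engaging with topics the feed rarely shows...
for clicked in ["science", "cooking", "science"]:
    register_click(clicked)

# ...flattens the profile, so a score-based recommender like the one
# sketched earlier would now serve a noticeably broader mix.
print({topic: round(value, 2) for topic, value in affinity.items()})
```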

Echo chambers: amplification and responsibility

An echo chamber forms when a group of individuals shares only similar ideas, with no exposure to other viewpoints. The more an idea is repeated within the group, the truer it seems. This is how extreme movements gain momentum online.

However, this phenomenon can also be used positively: activists, researchers, and experts in science and ecology manage to create communities that advance knowledge. The real question is whether we use the echo chamber to evolve… or to confine ourselves.

Recent developments in algorithm regulation

2023: Implementation of the Digital Services Act (DSA)

Adopted in 2022, the Digital Services Act began applying to the largest online platforms in 2023. This legislation requires major platforms to provide detailed information about how their recommendation and moderation algorithms work. The goal is to ensure a better understanding of their impact and promote a more balanced distribution of information.

2024: Investigations and legal actions

  • October 2024: The European Commission demanded that platforms like YouTube, Snapchat, and TikTok disclose detailed information about their recommendation algorithms. The request aims to assess the transparency of these algorithms and their impact, particularly regarding the protection of minors and the prevention of misinformation.
  • November 2024: Seven French families filed a lawsuit against TikTok, accusing the app of promoting content that encourages eating disorders, self-harm, and suicide through its algorithms. This case highlights growing concerns about the impact of algorithms on young users’ mental health.

2025: Continued transparency efforts

  • January 2025: The European Commission expanded its investigation into X’s recommendation algorithm, requiring the company to hand over internal documentation so that the algorithm’s impact on content distribution and the moderation of hate speech can be analyzed.

How to regain control and reduce polarization

  • Diversify your sources of information: Follow media outlets with opposing viewpoints to step out of your comfort zone.
  • Adjust social media settings: Some platforms allow you to disable content personalization (yes, really—dig into the settings).
  • Use fact-checking tools: Before sharing a sensational post, verify it (AFP Factuel, Hoaxbuster, etc.).
  • Cultivate curiosity: If a piece of content seems too good (or too outrageous) to be true, investigate rather than clicking mindlessly.
  • Promote media literacy: Understanding how algorithms work and their effects helps navigate the digital ecosystem more effectively.

If I had to conclude this article, I would say that we are all responsible for how we consume information. Yes, algorithms trap us, but we have the power to unlock the door. It takes some effort, but that is the price of a richer, more nuanced worldview. If you can change the channel when a TV show is really bad, or stop after one episode on Netflix to try a different series (which you’ll probably end up binge-watching anyway), you can do the same on the web! 😉
