The many faces fighting disinformation

“Algorithms not designed to have quality information in their objective functions will naturally favour disinformation”

Guillaume became particularly aware of the “snowball effect” boosting conspiracy theories on the platform. “I realized the algorithm was promoting disinformation, very often more disinformation than truth. I looked at topics which were clearly disinformation, like flat earth theories, and realized that the algorithm was producing more flat earth than round earth videos.” Guillaume had identified a fundamental design flaw playing out at a massive scale. “We can only do so much fact checking. If the algorithm decides 70% of the views, it’s a losing battle.”
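The mechanism behind the quote above can be illustrated with a toy simulation. This is a hypothetical sketch with made-up titles and engagement scores, not YouTube’s actual system: a recommender that ranks purely by predicted engagement, with no quality term in its objective function, will surface sensational (false) content whenever that content draws higher engagement.

```python
import random

# Hypothetical catalog: each video has a "truthful" flag and an engagement
# score. We assume sensational flat-earth content draws higher engagement
# on average than sober round-earth content.
random.seed(42)
videos = (
    [{"title": f"flat-earth-{i}", "truthful": False,
      "engagement": random.gauss(8.0, 1.0)} for i in range(50)] +
    [{"title": f"round-earth-{i}", "truthful": True,
      "engagement": random.gauss(6.0, 1.0)} for i in range(50)]
)

def recommend(catalog, k=10):
    """Rank purely by engagement -- no quality term in the objective."""
    return sorted(catalog, key=lambda v: v["engagement"], reverse=True)[:k]

top = recommend(videos)
false_share = sum(not v["truthful"] for v in top) / len(top)
print(f"Share of misleading videos in the top {len(top)}: {false_share:.0%}")
```

Because nothing in `recommend` rewards accuracy, the misleading videos dominate the top of the ranking; fact-checking individual videos cannot offset a ranking function that systematically favours them.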

Unable to convince his colleagues to intervene – or even to recognize the problem – he left the company and founded AlgoTransparency in 2017, with the objective of exposing what content recommendation algorithms are showing people on different platforms. The project currently looks at Facebook, Google Autocomplete, Twitter Trending, and of course YouTube, where it monitors over 800 top information channels. It runs on small grants and is now supported by Guillaume’s Mozilla fellowship.

At the moment, AlgoTransparency works primarily with journalists. The functioning of the tool is a bit complicated, and it takes time to explain how it works. Guillaume would like the tool to be useful to more people and broader in what it surveys. In general, he still sees a huge need for transparency into algorithms. “People either don’t know how to do it because they weren’t insiders, or are afraid because they don’t want to face big tech companies, or are discouraged because there isn’t a real business in it,” he says. Access to data is still an issue, and it is not enough to let platforms decide for themselves what information they share publicly. What’s needed is external pressure.

Ideally, Guillaume would like to build out a team and expand monitoring across a range of platforms, then be able to work with fact checkers, analysts, and journalists to make this data digestible for different stakeholders: individuals, researchers, and the employees at these tech companies themselves – many of whom don’t want to look at their own problem. “Instead of small investigations, I want to look at the general issue. I want to make the data available and enable everyone to use it as they want.”

Guillaume Chaslot
Inspiring change

AlgoTransparency has had enormous impact. Guillaume himself has achieved media recognition as a whistleblower of sorts – he recently appeared in the film “The Social Dilemma” – as the policy discussion has increasingly turned towards the need for algorithmic transparency. The impact on the platforms themselves is more difficult to assess. “YouTube changed their algorithm 30 times to address the exact issues I was talking about,” Guillaume notes, referring to the company’s actions in 2017 on the amplification of terrorist content. But they did not take action regarding disinformation like the flat earth content that he had brought to their attention.