December 15, 2022

By Gary Machado, Maria Giovanna Sessa, Rita Jonušaitė and Alexandre Alaphilippe, EU DisinfoLab

Discussions around online content moderation often turn into a blame game. Part of this blame game involves public authorities complaining that platforms are not doing enough to limit the visibility of “awful but lawful” content. The work by our partner Check First helps us better understand whether public authorities have been active in notifying one particular online platform about possibly illegal content. Our assumption is that much can be learned from this to understand the present and future activity, or inactivity, of public authorities in flagging “awful but lawful” content to platforms. Hopefully, this work can serve as input for the enforcement of the Digital Services Act and contribute to the debate around freedom of expression, possible risks of censorship, access to information, hate speech, and the role of governments and platforms in these matters.

In October 2022, the video-sharing platform Odysee, which Dr. Eviane Leidig from the Global Network on Extremism and Technology described as “The New YouTube for the far-right”[1], came under the spotlight for continuing to broadcast RT France’s livestream[2]. Our partner, Check First, obtained a list of the access restrictions active on the Odysee platform, requested by various entities. These 3,000+ access restrictions prevent users from watching certain videos or livestreams from specific locations. Their technical paper, including how they gained access to the list of active restrictions on Odysee, is available here. The searchable list of requests is available here. It should be noted that our article focuses only on restrictions active on Odysee. It does not cover access restrictions applied by LBRY, “a blockchain-based file-sharing and payment network that powers decentralized platforms, primarily social networks and video platforms”[3], including Odysee.

Almost the entire list of requests concerns content likely considered illegal in most European countries, such as the promotion of Nazism and profoundly anti-Semitic content, the promotion of terrorism or the Islamic State, and so on. It should also be noted that such requests can easily be submitted to Odysee using this link. One would therefore expect that, when it comes to such illegal content, Member States would have acted and sent geo-blocking requests to Odysee regularly.

On the contrary, our analysis of the received list shows that, with the exception of Germany, Member States are far from doing enough to enforce the law, even when it comes to illegal content. This is deeply worrying in light of the provisions of the Digital Services Act, under which very large online platforms will have to address systemic risks they identify on their platforms, including disinformation, and will need to take mitigation measures such as specific content moderation practices.

When we first accessed this list on October 19th, 2022, German authorities had sent close to 3,000 requests, while all other EU Member States combined had sent zero requests, and the EU itself had sent only 2.

Interestingly and worryingly, even “EU-Google” (Google’s European offices) had sent more requests than the EU institutions, France, Italy, Spain, Poland, and all the other Member States combined (except Germany). Australia, too, sent more requests than all these Member States combined. This makes EU-Google the only “pan-European” entity to have effectively made requests to Odysee.

After the controversy over the continued broadcasting of RT, France eventually sent requests to geo-block these livestreams. Because only France made this request, the content remains accessible from all the other Member States.

Sadly, our assessment is in line with what is often said about the EU: it is good at making law, but often poor at enforcing it. As a result, the key upcoming challenge of the DSA will be its enforcement. While much attention has been given to access to data, this example shows that data access, however useful, will not be sufficient on its own without proper enforcement. Here are our recommendations:

  • EU Member States’ administrations need to put the necessary resources into enforcing the law, including through the upcoming national Digital Services Coordinators. Each Member State should have personnel to verify that platforms respect the law and to take action when they do not comply. It is hardly acceptable that large Member States currently devote virtually no resources to limiting the visibility of illegal content on some platforms.
  • EU-level action is needed. The EU needs to be capable of properly enforcing the law on illegal content and other content that falls into the systemic risks category in the DSA. Leaving enforcement entirely up to the Member States or private companies would be more costly and far less efficient.
  • For public rather than private enforcement. The example of “EU-Google” taking more action than the EU and its Member States is worrying. While we should not complain about Google taking the necessary measures to apply the law, we believe that public institutions should be making such requests, both to avoid the privatisation of enforcement and to gain efficiency.
  • Access to data should not be a smokescreen. On its own, it will not suffice: what matters is what is done with the data once accessed. We call on the EU and its Member States to reflect seriously on how the DSA can be enforced, as transparency goes hand-in-hand with enforcement.