2 February 2026

by Katrīna Luīze Ašmane & Joe McNamee, EU DisinfoLab

In our last blog post, we discussed the malicious semi-compliance exercised by platforms, apparently to avoid meaningful compliance with the law.

The issue of platforms complying in an obstructive way is beginning to gain more attention in the European Parliament. When it comes to reporting illegal content, platforms have designed lengthy forms that include unnecessary steps, even though the Digital Services Act (DSA) clearly states that reporting mechanisms need to be user-friendly. Recently, MEP Pascal Arimont (EPP, Belgium) tabled a parliamentary question pointing out the various obstacles to reporting illegal content. His question asks the European Commission what specific steps it intends to take to ensure that platforms provide a user-friendly notice mechanism for reporting illegal content.

Pascal Arimont’s question is based on a report published by HateAid – a non-profit organisation for human rights in the digital space. The report examines how very large online platforms (VLOPs) implement the key obligations under DSA Articles 16 (Notice and action mechanisms), 20 (Internal complaint-handling system) and 21 (Out-of-court dispute settlement). It found that the process of reporting illegal content is slow and burdensome, seemingly designed to make users give up or report content as terms of service (community guidelines) violations instead, which imposes fewer obligations on platforms when reviewing the report.

In a day-to-day sense, what Pascal Arimont is referring to in his question could be described as “sludge” – friction that discourages people from doing something, the opposite of a “nudge”, which encourages them. Sludge can be characterised as the imposition of too many administrative steps, or an excessively complicated procedure, designed to make people feel like they are wading through sludge until they give up. In other words, it “deters you from getting what you are owed.” In this case, what you are owed is a user-friendly reporting tool that facilitates the protection of a safe online space – and big tech platforms deter you from exercising your rights in this space. (The podcast 99% Invisible explains the concept of “sludge” in the context of customer service call centres here.)

Sludgy mechanisms for reporting illegal content

Article 16 of the Digital Services Act requires platforms to implement “user-friendly” mechanisms to report illegal content. The article also sets out that any report of illegal content should include an “adequately substantiated” explanation of why the content is allegedly illegal. This wording comes from the Court of Justice of the European Union (CJEU) case L’Oréal SA and Others v eBay International AG and Others (C-324/09), and ensures that a platform’s general awareness that its service stores illegal content is not enough to make it liable for failing to act against that content. As this wording exists for the benefit of the platforms, it is especially malicious to make “adequately substantiated” reporting difficult.

Our “Malicious semi-compliance” blog post mentions the Commission’s preliminary findings, which assert that Meta’s notice and action mechanisms include unnecessary steps. While the publicly available information on the findings does not provide further details, the complexity is easy to see by trying to report a piece of content yourself. Before being presented with the option to report content as illegal, users encounter a long list of other categories, including hate speech, harassment, bullying, etc. While these categories cover many types of illegal content, all of these channels lead only to reporting a terms of service violation.

If you do manage to find the right section to report illegal content (it appears only after scrolling down), you are directed to a sludgy web form that requires extensive information about the legal violation in question and may also ask whether you are sure you do not want to report the content as a terms of service violation instead – because that, surely, is significantly easier. According to HateAid, reporting content as a terms of service violation takes 5 clicks, whereas reporting it as illegal content takes at least 15.

Facebook and Instagram are not the only platforms that try to sludge users away from reporting illegal content. YouTube and TikTok subject their users to similar hurdles, including asking for the same information again by email or imposing character limits that restrict the length of explanations.

The overwhelming complexity of reporting illegal content can lead users to report it as a community guidelines violation or to give up on the process altogether – which seems to be precisely why platforms have designed this sludge.

Under Article 6(1) of the DSA, platforms are only liable for the illegal content circulating on their services if they fail to act after receiving a valid notification of such content. Moreover, reviewing terms of service violations does not require platforms to fulfil the same obligations as reviewing illegal content, where the platform has to confirm receipt of the notification and inform the user of the decision it takes.

What about enforcement?

With this in mind, it is interesting to note that the Digital Services Board, in its work programme, has opted to approach illegal content by looking at the challenges posed by certain specific types of content, rather than at generic problems with illegal content – such as platforms getting away with sludgy notice and action mechanisms. When it comes to illegal content, the Board’s annual work plan prioritises financial scams, child sexual abuse material, the sharing of non-consensual intimate material, and pirated content. While these are critical issues, addressing illegal content horizontally could arguably be more efficient.

In relation to sludge, the closest thing to enforcement from the Commission’s side so far is the preliminary findings in the investigation against Meta. How that investigation develops remains to be seen.

The question tabled by Pascal Arimont frames the core issue, asking the Commission what concrete steps it intends to take to ensure effective enforcement of the obligations laid down in the DSA.