September 8, 2022

The EU DisinfoLab welcomes the new Code of Practice on Disinformation (the Code) that was revealed in June this year. So far, it has been signed by 34 organisations that joined the revision process of the 2018 Code. We would like to congratulate all organisations that have worked relentlessly to update the Code, especially our civil society partners, who dedicated time and energy to improving it by drafting Commitments and measures together with industry stakeholders. We would also like to take this opportunity to commend the European Commission for leading the revision process. 

The process itself left room for improvement, specifically regarding the engagement of third-party experts, as raised by a number of civil society organisations. However, we acknowledge and praise the ambition to make the Code part of a broader regulatory framework, in combination with the upcoming Regulation on Transparency and Targeting of Political Advertising and the Digital Services Act (DSA), as well as the AI Act currently on the table, and the GDPR. It is notable that, for Very Large Online Platforms (VLOPs), the Code aims to become a mitigation measure for the systemic risks linked to disinformation that they will have to identify and address under the DSA. It should become a co-regulatory mechanism once the DSA becomes law and starts applying. However, the success of both the Code and the DSA will depend on strong enforcement: the two instruments should mutually reinforce each other. This will require sufficient resources as well as clarity in many areas, especially regarding possible underperformance by signatories.

Will the EU DisinfoLab become a signatory of the Code? 

After some deliberation, the EU DisinfoLab has for now decided not to become a signatory of the Code. This decision is primarily based on the organisation's limited resources. The EU DisinfoLab strongly believes that one should honour one's commitments as best as possible and make every proactive effort to ensure that objectives are met. Being a signatory of the Code would require dedicating resources that the organisation currently lacks, and we would not be able to engage with the Code at the level of quality we consider appropriate. 

Other reasons support this conclusion. Signatories of the Code would need to sign non-disclosure agreements. For EU DisinfoLab, this requirement is overly restrictive and potentially limits our ability to critically assess the implementation of the Code and present our views to the public. We also do not see why the application process appears to be the same for all signatories, including civil society organisations (Preamble). There is a resourcing issue across civil society, and we fear the Code might make this issue worse before it gets better. 

While we are not joining the Code, the EU DisinfoLab, given its expertise in monitoring and analysing disinformation, is keen to engage with and support the implementation and further development of the Code in other capacities. 

The Code provides for the Task-force and, hopefully, the various subgroups it foresees to engage with third-party experts as observers. We would therefore be happy to provide our insights and expertise in this or other ways, and we remain open to dialogue and cooperation with signatories and the European Commission.

What do we like about the 2022 Code? 

  • The Code of Practice on Disinformation becoming a Code of Conduct under the DSA (DSA article 35). Once the DSA enters into force, the Code could serve as a risk mitigation measure on disinformation for VLOPs. This would also mean mandatory audits (DSA article 28), which will be a critical tool to ensure compliance. While there is a commitment from VLOPs to do better, past experience has demonstrated that this is not always enough. The audits will, among other things, look into how VLOPs adhere to their commitments under the Codes of Conduct they are part of. Where an audit opinion is negative, auditors will issue operational recommendations and a recommended timeframe to achieve compliance. This is promising, as the Code has no other enforcement mechanisms. 
  • The framework for data sharing with vetted researchers, including civil society researchers, which is in line with the DSA. We welcome the creation of a third-party body (Commitment 27) that will provide support to make this possible. We specifically appreciate the commitment that signatories will grant vetted researchers real-time, searchable, CrowdTangle-type access to public data (Commitment 26). This mirrors the DSA (recital 46 & article 31 4d), which asks VLOPs to provide access to real-time data where technically possible. And while we see a lot of positives in the framework for data sharing, some major questions still need to be answered. From the language of the Code itself (SLI 27.3.1), it is not clear whether all vetted research projects will be implemented, as “relevant signatories will disclose” only how many of them they engaged with or provided access to data. That leaves us wondering: if a signatory decides not to provide requested access to data, would there be an explanation? All of this remains fuzzy and not clearly spelled out. This matters not only for the possibility of challenging such a decision but also for improving research proposals in the future. In addition, it is not clear whether requests for access to real-time data would go through the foreseen independent third-party body or would be left entirely up to a “relevant signatory” to decide.
  • The Code mentions the availability of independent funds for researching disinformation (Commitment 28). Resources are essential, especially for civil society organisations, which are key stakeholders in researching and improving knowledge on disinformation but often struggle with very limited means. 
  • The work on harmonised reporting templates by the Task-force (Commitment 37). Similar to practices in the financial sector, regulators should be able to use these reports to spot patterns and understand compliance failures. Our research on previous reporting shows that it is difficult to assess platforms individually and collectively. This will be an important improvement and we look forward to the first reports. 
  • Efforts foreseen in the Code to contextualise the information provided in signatories' baseline reports (additional quantitative information necessary for context (Commitment 31)). Experience with the previous Code has shown that, without meaningful context, reports on the implementation of the Code do not give a full picture of the impact of the actions taken by signatories. Consequently, the reporting ends up being a “tick-the-box” exercise more than a useful tool. Providing more contextual information is a step in the right direction. 
  • Commitment 31's assurance that fact-checking covers all Member States of the EU is commendable. Considering how content moderation differs across countries and languages, this is another good step towards ensuring that the first line of defence against disinformation is established everywhere. 

What are we concerned about? What are some other questions that still have to be answered? 

  • Our key concern about the 2022 Code lies in a potential inconsistency with the DSA on the user appeals mechanism under article 17 (Internal complaint-handling system). Commitment 24 of the Code, on appeal mechanisms, is limited in scope and does not align with the DSA. The DSA covers both over-moderation and under-moderation in its internal complaint-handling system, whereas the Code only refers to appeals linked to possible over-moderation of content. This should be re-examined in the Code to avoid any confusion or uncertainty regarding the relationship between the DSA and the Code. Moreover, for the counter-disinformation community, addressing under-moderation cases is equally important, if not more so, to ensure that disinformation does not slip through the cracks of the content moderation system.  
  • Some actors whose services contribute significantly to the dissemination of disinformation, and who should be signatories of the Code, are missing. The voluntary nature of the Code means that some non-VLOPs will continue to play an outsized role in the dissemination of disinformation with limited scrutiny (e.g. Odysee, Patreon, GoFundMe, Telegram). The Code does, however, include some Commitments aiming to address disinformation on messaging apps: signatories providing private messaging applications recognise the importance of working on technical features that help users identify and flag disinformation, and of integrating the work of fact-checkers into such services (Part V(h)). This is a good addition considering the significant amount of disinformation that spreads on messaging apps. It is all the more disappointing, then, that services such as Telegram have not signed up to the Code. 
  • It seems that the key body responsible for monitoring the Code will be the European Commission (Commitment 39). We are concerned about the capacity that effective monitoring will require, and we wonder whether the European Commission has the necessary resources to achieve this objective. 

Other things to note

  • The Code includes a specific commitment to minimise the risk of the viral spread of disinformation through the adoption of safer design practices, including designing recommender systems to improve the prominence of authoritative information and reduce the prominence of disinformation (Commitment 18). However, the Code gives little guidance on how this could be implemented. Quality assurance standards for recommender systems should be established to fill that gap; these standards should specifically prohibit the algorithmic amplification of disinformation.
  • Resourcing for collective enforcement. Expectations towards civil society signatories and observers are enormous. This is not the only Code of Conduct foreseen under the DSA, nor the only way civil society will be engaging, so there is a risk the ecosystem will be stretched thin. The European Commission should discuss with civil society partners which needs should be covered and how best to meet them; specific enforcement grants for civil society could be envisioned. As noted above, we are happy to see dedicated funding foreseen for disinformation research (Commitment 28), but this will be nowhere near enough. More targeted support could also be foreseen to strengthen parts of the implementation of the Code, for example by supporting relevant third-party approaches aimed at giving advertising buyers transparency on the placement of their advertising campaigns. 
  • Making sure that the data provided through the Code's Transparency Centre gives a meaningful picture of efforts to counter disinformation across the EU. Based on experience with the last Code, we can expect the Transparency Centre to show the numerator (for example, “we took down 1,000 pieces of disinfo this month”). However, researchers need to know the context of the data provided. While, as previously mentioned, we see improvements in the Code acknowledging this need (Commitment 31), it will be crucial to work together with researchers, including civil society, on what this contextual data should look like (e.g. views, engagement, volume of content, reach, etc.). It is also important to note that compliance-related data (the Transparency Centre would provide information related to the implementation of the Code's Commitments) is NOT the same as real-time data about what is happening inside a platform at a given time. Service level indicators should be included in data sharing, along with aggregated statistics on how much money is being made from disinformation, how harmful content is being distributed, and who is being exposed (demographics, traffic, targets).
  • Signatories choosing their own Commitments will leave gaping holes. Of course, not all Commitments are relevant to all signatories, but there should be a more objective method for assigning Commitments, perhaps through the Task-force as a whole. For instance, the data access Commitments apply to “relevant signatories” without any concretisation, an approach that risks neglecting important actors and data sets. While it is widely understood that the relevant signatories would be VLOPs, this does not seem to be explicit and should be clarified. 
  • Structural indicators still need to be developed. It will be important to maintain country-level granularity there and to make sure that all Member States are covered (Commitment 41). 
  • Non-compliance has different implications for different signatories: for VLOPs it seems to be rooted in the DSA, while for non-VLOPs the process does not seem to have been designed yet. It is also not clear whether any steps are foreseen to help non-VLOPs remedy non-compliance. There are ways this could be addressed; for example, the Task-force could issue joint recommendations. As for the audits for VLOPs, they must be performed well (Commitment 44). Are there, as of now, any auditing standards or guidelines for these disinformation audits, or will they come together with those mentioned in the DSA, through a dedicated delegated act and voluntary standards on auditing? From our perspective, there is a need to look beyond existing auditing practices and accreditations. Audits could involve third-party independent auditors working together with disinformation research experts to examine how platforms comply with the Code: measuring whether platforms' staffing is proportionate to their user base, evaluating whether policies are comprehensive, and assessing whether policies are being enforced. 
  • Some terminology in the Code continues to raise eyebrows and pose questions, for example the term “harmful disinformation”. It is not clear what distinction this is trying to make or how it will be drawn. We suggest linking it to the potential impact that a particular piece of disinformation could have; tools such as our recently published Impact-Risk Index could come in handy here.

Closing remarks 

The 2022 Code of Practice on Disinformation is certainly a promising step forward, especially once it becomes a co-regulatory mechanism with the DSA. However, many questions still need to be answered, and parts of the Code still have to be fully developed. In many respects, the relationship between the DSA and the Code still needs to be clarified. We look forward to the elaboration and clarification expected from the Vademecum, especially on the role and engagement of “third parties”. 

The EU DisinfoLab remains keen on engaging with the Code in a way that allows us to perform the role of an external watchdog and provide relevant input. Above all, we hope that the Code and the DSA will align in all necessary ways and be mutually reinforcing, creating no uncertainties or ambiguities that hamper rather than help the fight against disinformation.