April 25, 2022

PRESS RELEASE

Brussels, Monday, April 25 – The Digital Services Act (DSA), the EU’s groundbreaking law on internet safety and accountability that will introduce sweeping changes to our online environment, was agreed on Friday, April 22. It provides a set of powerful tools that can be used to tackle disinformation. With this regulation, platforms will be held accountable for their role in disseminating mis- and disinformation. But ensuring this accountability will require further effort. 

The law will strengthen incentives for platforms to tackle illegal and harmful content, and presents a new way of thinking about content moderation that is especially valuable for the counter-disinformation community. Platforms will have to be more transparent, but most importantly, they will finally have to be accountable. 

As Alexandre Alaphilippe, EU DisinfoLab Executive Director, announced: “For the community countering disinformation, the Digital Services Act ends the era of privileges for the few and will now ensure accountability for all. We’ve been fighting against sector exemptions, we’ve been fighting for users to know why the disinformation they flag is not removed, and we’ve been fighting to open the black box of content moderation decisions.”

The regulation is definitely a milestone to celebrate. It is the result of the dedication and cooperation of many civil society voices, as well as the work with decision makers who have been engaging with us throughout this process. 

Key take-aways for the counter-disinformation community

Risk Assessments 

Very large online platforms (VLOPs) like Facebook will be required to conduct a ‘Systemic Risk Assessment’ once a year, which will include the risk of disseminating or amplifying disinformation. Researchers will be able to verify risks through new data access and scrutiny mechanisms. VLOPs will then have to mitigate these risks appropriately and successfully, or face fines. 

User Appeals

We know too well the frustration of reporting harmful content and having that complaint go ignored, or simply being told that it doesn’t breach the platform’s terms and conditions. We pushed hard to ensure that these non-responses and lack of action from platforms can be challenged through an internal complaint system, with an additional opportunity to access an out-of-court dispute settlement body.

No Media Exemption

Along with other civil society organisations and members of the counter-disinformation community, we fought relentlessly against any kind of ‘media exemption’ that could have given media outlets a free pass from content moderation even when they spread disinformation. The regulation would have been useless against disinformation with such a loophole. Media should be equally accountable when they disinform. No exceptions. 

Future clarification on the implementation will be needed

In addition, we have a few concerns and things to watch out for: 

  • It is still unclear how the revamped Code of Practice on Disinformation will fit together with the regulation, and how effective this combination will be. 
  • The precise definition of “very large online platform”, and the scope of obligations for other digital services that also spread disinformation, like messaging channels, remain vague. 
  • DSA enforcement will vary across member states, which may affect the fight against disinformation in different countries. 
  • We hope for the timely implementation and swift adoption of delegated acts, so as not to delay the impact that the DSA can have on creating a safer information space for all. 
  • Finally, we also hope that platforms meet the DSA requirements with reporting and appeal processes that are user-friendly and that empower people, rather than confuse them.

What’s next? Make sure these new rights are implemented, and used.

Regulation is a process, however, not just a product. While the text is agreed, it will be useless if not properly enforced by the regulatory authorities on the one hand, and if the new tools and user rights are not invoked by individuals and entities on the other. EU DisinfoLab will work on empowering the counter-disinformation community to use these tools and on ensuring regulators hold platforms accountable. 

“We welcome this long-awaited regulation expanding the toolbox for civil society and regulators to hold platforms to account. But a toolbox is only a toolbox. Only the collective empowerment of the counter-disinformation community to grasp these tools in the DSA will tackle disinformation,” commented Mr. Alaphilippe.

About EU DisinfoLab

EU DisinfoLab is an independent non-profit research organisation specialised in analysing disinformation. We uncover and expose sophisticated disinformation campaigns. We seek to amplify the voices of our community of counter-disinformation experts across the EU and to contribute our collective expertise to policy making. You can find more information about our work on our website https://www.disinfo.eu/.

Contact details

For further information, please contact Rita Jonusaite, Advocacy Coordinator, rj@disinfo.eu