Submit your feedback on the EDAP Roadmap

A reminder that the deadline for giving feedback on the European Democracy Action Plan (EDAP) roadmap is this Thursday, 27 August. More information about the roadmap, as well as how to submit a contribution, can be found via this link.

In addition, you still have until the 15th of September to submit your contribution to the EDAP public consultation.

Facebook’s algorithm & the threat to public health

Avaaz has released a report detailing how Facebook’s efforts to tackle health misinformation are being seriously undermined by its own systems. Among other things, the report highlights that:

  • Global health misinformation networks spanning at least five countries generated an estimated 3.8 billion views on Facebook in the last year.
  • Content from the top 10 websites spreading health misinformation had almost four times as many estimated views on Facebook as equivalent content from the websites of 10 leading health institutions, such as the WHO and the Centers for Disease Control and Prevention.
  • Only 16% of all health misinformation analysed had a warning label from Facebook. Despite their content being fact-checked, the other 84% of articles and posts sampled in Avaaz’s report remain online without warnings.

In response, EU Commission Vice President Věra Jourová affirmed that the report demonstrates the need for “stronger action to improve transparency, access to data, users’ awareness,” which will be addressed in the European Democracy Action Plan and the Digital Services Act.

Read more

Killing the switch

Whilst on the topic of Facebook and content curation, Facebook is reportedly piloting a new method to check viral posts for mis- and disinformation before they spread too far, The Verge’s Casey Newton reports. Following a recommendation by the Center for American Progress, this kind of “virality circuit breaker” would work by temporarily withholding viral content from algorithmic amplification in newsfeeds and from trending topics until moderators have had a chance to review the content’s veracity. At the moment, Facebook shares information about viral news articles with its fact-checking partners, who then decide what to fact-check, but this pilot would allow Facebook’s own teams to evaluate the content against its community standards. The new process is expected to be rolled out soon.

In the news

Good reads

  • A new, inspiring long read by The New Yorker details the trials and tribulations of Marietje Schaake’s move to Silicon Valley as a European policymaker taking up a top position at the Stanford Cyber Policy Center, affirming that America has a lot to learn from Schaake’s approach to regulating tech and safeguarding democracy.
  • Repress/redress: What the “War on Terror” can teach us about fighting misinformation – This piece offers a new perspective on fighting misinformation, arguing that instead of relying on a single strategy of repressing misinformation, we should also redress the sociopolitical grievances that cultivate our receptivity to it, e.g. the loss of public trust in institutions.
  • The Institute for Strategic Dialogue has released an investigation analysing the extent to which Holocaust denial content is readily available on mainstream online platforms, finding that such content is actively recommended by Facebook’s algorithm. If you’re looking for a concise write-up of the report, The Guardian has it.
  • Fighting the ‘Infodemic’: Legal Responses to COVID-19 Disinformation – This study examines the various legal responses taken by public authorities to stem the flow of COVID-19 disinformation, providing a timely discussion of the intended and unintended consequences of such responses and reflecting on the safeguards needed to preserve trust in the online information ecosystem.

Events and Announcements