Webinar Announcement

Join us on Wednesday 27th May at 15:00 CEST for a webinar where ERGA Board Member Lubos Kuklis will present the key takeaways from ERGA’s Assessment of the Implementation of the EU Code of Practice on Disinformation.

REGISTER HERE

The Many Faces of Disinformation

The COVID-19 pandemic has reoriented the disinformation ecosystem around itself. With actors capitalising on the infodemic for many different reasons, we felt there was a need to further explore the intentions behind the use of disinformation: from foreign influence, political, and for-profit motives to issue-based disinformation. Most disinformation campaigns are fluid in that they sit at the crossroads of multiple categories, which illustrates just how blurred the lines are when it comes to attribution. Looking at intent alone does not give us all the answers, but it can help us better grasp the bigger picture.

READ MORE

Regulatory pressure continues to build

Facebook has announced the first members of its Oversight Board. Writing for Wired, Siva Vaidhyanathan pointed to shortcomings in the Board's design, which he argues will leave it overburdened. This comes amid growing regulatory momentum in Europe. In this context, an independent assessment of the implementation of the EU Code of Practice on Disinformation, on which we collaborated, has been released. Additionally, ERGA published its own assessment of the Code last week. Taken together, both assessments point to the limitations of self-regulation and emphasise the need for harmonised definitions. Such assessments will inform the EU's next actions to fight disinformation online. In light of the infodemic, the EU Commission is now expected to release a Communication on "Disinformation in the context of COVID-19" on 10 June.

The Plandemic

One conspiracy video caught the attention of many last week. Titled "Plandemic," the slick production features a prominent conspiracy theorist who seeks to tell an alternative story of the pandemic. Despite its removal from the major platforms, copies are still circulating online, sometimes with titles removed or reworded, making them difficult for AI moderation to detect. If you're looking for something concise, CNN's Donie O'Sullivan has all the details in a special report. All of this comes as YouTube reports a 75% increase in news viewership on its platform, underscoring how crucial it is for platforms to remain vigilant in removing harmful content.

Good reads

  • For The Atlantic, Renee DiResta argues that health experts do not understand how information moves online, which prevents them from effectively filling COVID-19 knowledge gaps and instead feeds into the popularity of unreliable (often misinformative) sources.
  • According to commentators for Slate, the battle authoritative sources face in providing accurate information on COVID-19 is complicated by the fast-changing landscape of scientific findings, facts, and uncertainties; the novelty of the virus makes accuracy hard to attain. "When today's facts can easily become tomorrow's fictions, it is difficult to even define 'misinformation,' much less to 'correct' it," write the authors.

Insights

  • In a co-investigation by BBC Click and the Institute for Strategic Dialogue, researchers reveal how both extremist political and fringe medical communities have tried to exploit the pandemic online.
  • Google and the Cost of ‘Data Voids’ during a Pandemic: This piece explores the impact of data voids — a lack of relevant information for search terms — which can increase users’ chances of exposure to mis/disinformation.

Events and Announcements