Dear Disinfo Update readers,

Welcome to this edition of our newsletter, your compass through the turbulent sea of disinformation!

Want to support #Disinfo2024? Let us know!

Mark your calendar: after our successful 2023 annual conference, #Disinfo2024 awaits you on 9-10 October in Riga, Latvia.

We’re looking for potential donors and partners for #Disinfo2024. Would you be in a position to offer financial support for organising the conference? Your contribution would play a crucial role in enabling us to keep the participation fee affordable for the counter-disinformation community. Please get in touch!

We look forward to your involvement in making #Disinfo2024 a resounding success!

New colleague on board: welcome, Joe!

We’re delighted to welcome Joe McNamee, our new Senior Policy Expert, to the team. Joe brings a rich background in policy development as the former director of the digital rights organisation EDRi and a former policy advisor in the European Parliament. His deep understanding of internet policy, content moderation and cybercrime aligns perfectly with EU DisinfoLab’s objective of mobilising the counter-disinformation community to drive policy change. Joe’s expertise in digital rights and first-hand experience with the negotiations of the Digital Services Act (DSA) will be crucial in monitoring the DSA’s implementation during the EU elections and in crafting effective policy responses to disinformation challenges. We’re excited about the knowledge and experience he adds to our team. Please join us in warmly welcoming Joe!

Enough preamble – scroll on to uncover the curated insights waiting for you in the following sections!

Disinfo news & updates

  • To open 2024. Misinformation and disinformation are listed as the most severe global risk anticipated over the next two years in the World Economic Forum’s new Global Risks Report 2024.
Platforms under scrutiny
  • Code of Practice on Disinformation. A recent review by the European Fact-Checking Standards Network (EFCSN) reveals that major tech platforms are failing to meet their fact-checking commitments under the EU Code of Practice on Disinformation, putting their compliance with the Digital Services Act (DSA) at risk. Read more in this article.
  • First infringement proceedings under DSA. The European Commission has initiated formal proceedings, the first of their kind, against X (formerly Twitter) under the Digital Services Act (DSA) to assess a potential breach of the regulation.
  • Restricting information access. TikTok has quietly restricted its Creative Center tool, previously used by critics to scrutinise content related to geopolitics and the Israel-Hamas war. The tool allowed users to track popular hashtags, but the search function and links for certain hashtags have stopped working, and data related to politically sensitive topics seems to have disappeared.
New episodes of Russian interference
  • In France. Kremlin documents reveal a focused effort to undermine French support for Ukraine by creating political discord through social media and far-right political figures. Read more in this article.
  • More on Doppelganger. This report delves into the technical infrastructure of Doppelganger and highlights ongoing malign influence activities associated with the operation. Its key findings include the identification of over 800 social media accounts in a Ukraine-focused campaign, the ongoing activity of over 2,000 inauthentic social media accounts, and the likely use of generative AI to produce content. The report calls for enhanced online literacy and awareness initiatives to address these challenges.
  • On TikTok. This case study describes a covert Russian influence campaign on TikTok, revealed by a joint investigation by the DFRLab and BBC Verify, that targeted the former Ukrainian defence minister with false corruption accusations spread through videos in multiple languages. The campaign likely generated hundreds of millions of views and led TikTok to remove over 12,800 accounts.
  • All over Europe. This publication by Project:Polska dissects and explains the extent of Russia’s influence across Europe. The initiative brings together twenty experts from across the continent to articulate how Putin’s actions and ideologies are impacting our daily lives.
  • Beyond Europe. Advertisements featuring a green “RT” logo have appeared in Mexico, accompanied by the message “Information has no borders”. This article dives into Russian propaganda in Latin America.
  • For a deep understanding. This article offers insights into Russian disinformation and propaganda narratives, both domestic and foreign.
Protecting elections
  • From FIMI. The CrossOver Finland project is investigating recommendation algorithms on social media platforms to monitor and counter potential election interference and manipulation in the upcoming Finnish presidential and European parliamentary elections. Read more here.
  • From AI. This article discusses the threat posed by generative AI to electoral processes and democracies, and how to safeguard the upcoming US elections from AI-powered misinformation.
The impact of disinformation
  • Tales of January. On the one-year anniversary of the coup attacks in Brasília (DF), the news agency Lupa released a mini-documentary titled “8 de janeiro”, which reports on how misinformation and hate speech were key drivers of the anti-democratic acts.
  • Economic impact. A false post from the compromised Securities and Exchange Commission (SEC) account on X (formerly Twitter), announcing the approval of a Bitcoin exchange-traded fund (ETF), triggered a rapid spike in Bitcoin’s price. The incident underscores the vulnerability of social media accounts to hacking and the immediate economic consequences of misinformation in the cryptocurrency market.
The state of content moderation and fact-checking
  • Stepping in. Stephan Mündges, formerly of ZDF and the Institut für Journalistik (TU Dortmund), announced his new role as the Coordinator of the European Fact-Checking Standards Network (EFCSN), leading the efforts to build and nurture a network of fact-checking organisations across Europe.
  • New and better fact-checks. According to this article, the conventional approach to fact-checking through published articles on websites and social media is insufficient, and falls short of reaching the audience that needs fact-checks the most. The article suggests a reconsideration and reimagining of fact-checkers’ objectives and methodologies in 2024.
  • Misinformation warning labels. This study offers a review of the effects of misinformation warning labels and the features that moderate them.

What we’re reading

  • The power of social. This long read by DFRLab examines the influence of platform design and content choices on perceptions of the Israel-Hamas armed conflict.
  • Sneaky AI. This study explores the persistence of deceptive behaviour in large language models (LLMs) even after safety training. Researchers created proof-of-concept examples where LLMs exhibit deceptive behaviour, like writing secure code for one condition and exploitable code for another. The findings highlight challenges in detecting and eliminating deceptive behaviour in AI systems. 
  • Lessons learnt. The EDMO Task Force on 2024 European Parliament Elections released a report that analyses disinformation narratives during the 2023 elections in Europe, based on over 900 fact-checking articles related to eleven elections in ten European countries. The report emphasises the critical importance of robust fact-checking and awareness initiatives to safeguard electoral integrity and democratic values.
  • Identity, a tool for disinformation. This report discusses how identity can be exploited as a tool for disinformation to achieve strategic objectives and destabilise target societies. The effectiveness of disinformation is heightened when it targets existing social divisions rooted in individual and societal assumptions related to identity markers, and these identity-based divisions often play a central role in wedge issues or societal conflicts.

This week’s recommended read

This week, we suggest exploring the topic of artificial intelligence (AI). The EU considers AI a strategic priority. Regulating it through the AI Act is a first step, and last November, the European Commission opened access to EU supercomputers to speed up AI development in support of startups, SMEs, and the AI community. This article sheds light on the Commission’s plans to support small businesses and startups in the context of artificial intelligence, and offers a glimpse into the future of supercomputing, including hybrid computational resources.

The latest from EU DisinfoLab

  • Open call: #Disinfo2024. Our next annual conference will bring us (and hopefully you, too!) to Riga, Latvia, on 9-10 October 2024. Do you have an idea for a talk, a publication to present, insights to share, or a theme in mind that we shouldn’t miss? Submit your suggestions through this form by 25 February.
  • Research and content moderation policies. Explore our newly released factsheets for an in-depth look into the policies of two platforms: X (formerly Twitter) and Facebook. Check them out, and stay tuned – there’s more to come!
  • Doppelganger. EU DisinfoLab contributed to the Spanish Experts Forum against Disinformation Campaigns in the Field of National Security, by explaining the Doppelganger case to the Spanish audience. Read our contribution in this report (in Spanish, p. 60-63)!

Events & announcements

  • 17 January: ESU and Alliance4Europe will organise an online workshop on media literacy and disinformation, addressing the theme of the 2024 elections.
  • 19 January: Our Alexandre Alaphilippe, together with Kalina Bontcheva (University of Sheffield, EDMO Advisory Council, and vera.ai project), will introduce the guidelines for public interest open source investigations and the ObSINT resource hub for the OSINT community in this EDMO online training.
  • 27 February – 1 March: The European Digital and Media Literacy Conference, organised in Brussels by Mediawijs together with EDMO BELUX and several other partners, will put the many faces of digital and media literacy in the spotlight, showcasing and exchanging initiatives, tools, projects and practices, and strengthening European cooperation.
  • 14-16 March: Voices, the European Festival of Journalism and Media Literacy, will bring together citizens, journalists, and media professionals to celebrate the pivotal roles of journalism and an informed public in society, while fostering critical thinking around disinformation.
  • 9-10 October 2024: This year’s EU DisinfoLab annual conference will take place in Riga, Latvia. Submit your ideas for the #Disinfo2024 programme here!

Jobs

  • The Centre for Information Resilience (CIR) has several open positions, including Programme Manager for Open-Source Research (remote within the UK or EU) and Editors (remote).
  • The Maldita.es Foundation is looking for a Madrid-based, Spanish-speaking specialised Editor for its Migration section to create data- and fact-based content that helps debunk hoaxes and provides verified information about migration, human rights, and immigration legislation.
  • The University of Urbino (UNIURB) is looking for a postdoctoral researcher to join their team to work on the vera.ai project that is developing trustworthy artificial intelligence solutions to combat advanced disinformation techniques. Apply by 20 January here.
  • Access Now is looking for a Director of Policy and International Programs.
  • Logically Facts is looking for a Deputy Editor to lead a team of fact-checkers and a Senior Editor to review and edit content and to contribute to fact-checking initiatives. Both roles are remote, operating on GMT or CET hours.