Dear Disinfo Update readers,

Welcome to the latest edition of our newsletter!

Although October is still some way off, we’ve been working on the preparations for our next annual conference for quite some time. And we’re excited to announce that everything is almost ready to roll, with registrations opening on 27 March 2024! #Disinfo2024 will follow the same format as our previous conferences, with a few small changes that we’re sure will further improve the event: 

  • Firstly, our open call resulted in an overwhelming number of brilliant ideas and suggestions, compelling us to add a third track to the programme. We’re also working on including more cultural activities. Walking tours are back, and we’re aiming for a movie screening.
  • Secondly, we will introduce a nominal registration fee to help maintain the quality and independence of the event. This decision was not made lightly – for more information, please read here. We sincerely hope you will understand and support this approach!

Our webinar series has really taken off – five webinars already organised since the start of February, with plenty more in the pipeline. Scroll down for a list of our upcoming webinars, as well as links to recordings of past sessions you may have missed.

In this Disinfo Update, we’ve also packed in a bunch of news, essential reads, and resources about elections, AI, Russian and Chinese disinformation efforts, the latest happenings in the world of platforms, climate change disinformation, EU policies, and lots more. The journey continues below.

Our webinars

Upcoming – register now!
Past – watch the recordings!

Disinfo news & updates

Elections and/or AI
  • Voter/AI-generated disinfo. Supporters of Donald Trump – with no direct links to Trump’s election campaign – have been acting as sources of AI-generated disinformation, creating and sharing fake images of Black voters to encourage African Americans to vote for Republicans.
  • Unsuited for voting queries. This article reports that several major AI services performed poorly in a test of their ability to address questions and concerns about voting and elections.
  • Gemini defers election Qs. Google appears to have taken heed, announcing restrictions on all election-related queries made to Gemini AI (formerly known as Bard) in any country where an election is taking place. Instead of answering questions about political parties, candidates, or politicians, the chatbot will direct users to Google Search.
  • Fake image factories. The Center for Countering Digital Hate (CCDH) tested four AI image generators using 40 text prompts related to the 2024 US presidential election. Result: in a startling 41% of the tests, generated images constituted election disinformation. Read the full report here.
  • Stirring the pot. Over the past weeks, Russian state media and Kremlin-connected social media profiles have spread and amplified deceptive content about US immigration and border protection. The campaign appears to aim at provoking outrage and polarisation ahead of the US elections, and at weakening support for Ukraine.
  • AI & media. OpenAI is now partnering with Le Monde and Prisa Media, using their content to train its models. In the wake of some controversies regarding unauthorised use of articles for training AI systems, the partnership could be a good example of a mutually beneficial future model for collaborations between AI entities and media.
Russia & China
  • Red Cross manipulation. This investigation uncovered how the Putin administration has co-opted the Russian Red Cross into Russia’s propaganda framework, violating the International Red Cross Movement’s principles of neutrality and independence.
  • Doppelganger returns. The DFRLab revealed a new iteration of the Doppelganger operation, with over five hundred Facebook ads redirecting users to domains impersonating Ukrainian and French media outlets.
  • Deepfakes. A deepfake video of Maria Ressa, Nobel laureate and CEO of Rappler, was circulated to promote a fake cryptocurrency endorsement. Originating from a Russian scam network, the video altered an earlier interview, using AI-generated audio to fabricate her endorsement. It had gathered over 22,000 views on Facebook, and an ad for the page hosting the video was seen on Microsoft Bing.
  • Kremlin leaks. This series of investigations dives into how the Kremlin has been working to ensure Putin’s election victory, and to build a pan-Russian propaganda network and internet censorship machinery. 
  • China & Russia leaks. This article examines two recent leaks that expose Russian and Chinese cyber and information operations, shedding light on their efforts to manipulate and control the internet.
  • Merry-go-round. Is Chinese state media amplifying Russian narratives? This research reveals eye-opening similarities in the two countries’ media tactics, as well as patterns in spreading disinformation about the war in Ukraine, Finland and Sweden joining NATO, and conspiracy theories about the USA.
Platforms
  • RIP CrowdTangle. Meta announced that CrowdTangle will be discontinued on 14 August 2024. For years, researchers have used this tool to monitor activity on Facebook and uncover malicious and disinformation campaigns.
  • TikTok. US House lawmakers voted in favour of a bill that would force ByteDance to sell its popular social media app TikTok. Now, the UK government is coming under increasing pressure to follow suit and toughen its approach to TikTok.
  • Ring ring. Audio and video calls are now available to all users on X. But… the new feature is enabled by default, and it exposes your IP address. Follow these instructions to change the settings.
Climate change disinformation
  • Food disinformation. This report sheds light on misinformation and disinformation campaigns by the meat and dairy industries, and on how they influence policy-making and public perception around climate change and alternative proteins – at a time when reducing emissions is paramount and climate change is a key election issue around the world.
  • AI and climate. Generative AI has the potential to turbocharge climate disinformation, including climate change-related deepfakes, ahead of a historic election year where climate policy will be central to the debate, according to this investigation.

Brussels corner

  • Elections, AI & platforms. In an effort to intensify its oversight of major platforms and the risks associated with generative AI ahead of the upcoming elections, the European Commission has issued formal requests for information to the very large online platforms (VLOPs) Google, Meta, Microsoft, Snap, TikTok, and X (formerly Twitter) on the measures they have implemented to mitigate these risks.
  • No SLAPPs. The European Parliament has voted in favour of new legislation to protect individuals and organisations working on matters of public interest from Strategic Lawsuits Against Public Participation (SLAPPs). Approved with strong support, the law targets cross-border cases, enabling early dismissal of baseless lawsuits and requiring claimants to potentially cover legal costs and damages.
  • AI Act. Capping a lengthy preparation process, the European Parliament has adopted the AI Act. The increasing integration of AI technologies into daily life is met with optimism that the new regulation will play an important role in ensuring transparency and safeguarding against potential abuses.

Reading & resources

  • Influencing Western audiences. This Hybrid CoE working paper dives into measuring the often-underestimated influence of Kremlin disinformation in Western societies.
  • Policy guide. The OECD report ‘Facts not Fakes: Tackling Disinformation, Strengthening Information Integrity’ proposes a comprehensive policy framework encompassing enhanced transparency, accountability, and diversity of information sources, building societal resilience against disinformation, and improving governance and institutional measures to maintain a reliable information environment. And last but not least, it focuses on the benefits of prebunking and inoculation.
  • AI & UK. This white paper examines the impact of AI-generated disinformation on British elections, society, and national security, offering a detailed framework for UK policy-makers, focusing on regulatory measures, technological solutions, and joint governance approaches.
  • Disconcerted public. This study by the Bertelsmann Stiftung examines the increasing polarisation and fragmentation of public opinion in Europe, covering the key aspects of the role of misinformation and disinformation in shaping public discourse, the impact of algorithm-driven content on political polarisation, and the strategies employed by various actors to manipulate public opinion.
  • DISARM v1.4. DISARM, the open framework for those cooperating in the fight against disinformation, has been updated to version 1.4.
  • Fact-checking 101. This article provides a guide on basic skills and tools for fact-checking.

This week’s recommended read

This time, our Executive Director Alexandre Alaphilippe recommends this article by David Aaronovitch, which delves into Peter Pomerantsev’s new book ‘How to Win an Information War’. The book examines the impact of nationalism and identity, warning against ignoring the deep roots of how disinformation and lies create an alternative reality. It suggests that confronting these lies directly is only part of the response; another crucial aspect is ensuring that people do not disengage from their own social environments. In his book, Pomerantsev revisits World War II, highlighting that effective counter-propaganda efforts primarily targeted the corruption within the Nazi regime, aiming to preserve a plurality of opinions inside a totalitarian state.

Peter’s new book is certainly poised to join the EU DisinfoLab’s library, alongside his previous works. These, too, come highly recommended – most notably ‘Nothing Is True and Everything Is Possible’.

The latest from EU DisinfoLab

  • PAPERWALL. In February 2024, the Citizen Lab unveiled a Chinese-originated influence operation that had built a network of 123 dummy media outlets worldwide, with a strong presence in Europe. We took a closer look at the Belgian and Luxembourg websites included in the exposed assets.

Events & announcements

  • 22 March: This online panel ‘Ukraine – Taiwan: Exploring Shared Experiences in Countering Disinformation’, hosted by the Ukraine Crisis Media Centre (UCMC) and Doublethink Lab, brings together legislators and experts to discuss the similarities in dealing with Russian and Chinese disinformation, focusing on the Russian portrayal of Taiwan’s elections and broader implications for Ukraine-Taiwan bilateral relations and global propaganda counter-strategies. 
  • 2-3 April: Les Journées Infox sur Seine 2024, the annual meeting of the francophone community interested in fake news and disinformation, will take place on 2-3 April in Paris. Attendance is free, but registration is compulsory.
  • 9-11 April: The Hybrid Intelligence Foundations online event explores methods, techniques, and research approaches related to hybrid intelligence, focusing on integrating machine and human intelligence to combat disinformation and abusive language.
  • 18-30 April: The ‘Investigating the Influence Industry in Elections’ mastercourse will provide journalists, researchers, and civil society investigators with investigation methods, resources, and skills for covering upcoming or recently held elections. Applications are open until 7 April, and accepted on a rolling basis.
  • 22 May: Media & Learning Wednesday Webinar ‘Promoting MIL and Youth Citizen Journalism through Mobile Stories’ will dive into youth and media literacy.
  • February-October 2024: The Youth Climate Lab, organised by the Education for Climate Coalition, comprises six sessions focusing on the intersection of AI and climate change within learning communities. It will delve into practical strategies to combat disinformation, explore AI tools, and cultivate sustainability skills crucial for our green transition.
  • 9-10 October: #Disinfo2024 will take place in Riga, Latvia. Registrations opening soon – stay tuned!

Jobs
  • Bellingcat is looking for an experienced editor to join their team. Deadline to apply is 31 March.
  • European Digital Rights (EDRi) has two open positions in their policy team: a Policy Advisor and a Senior Policy Advisor, working on digitalisation, society, and climate justice. Apply by 24 March!
  • The European Fact-Checking Standards Network (EFCSN) is seeking a dedicated Project Manager for an EU-funded project that builds the capacity of European fact-checking organisations to tackle climate change disinformation.