Dear Disinfo Update Readers,

In this monumental election year – more than half of the world's population will have cast their votes by the end of 2024 – it comes as no surprise that our newsletter is packed with articles, studies, and resources dedicated to the topic of election disinformation.

Today, the European Commission announced the opening of investigations against Meta’s platforms Facebook and Instagram for suspected infringement of the Digital Services Act (DSA). The investigation concerns the spread of disinformation, the dissemination of political ads, questions over notice-and-action mechanisms, and the announced discontinuation of CrowdTangle. In a response in Politico, Meta declared the company has “a well-established process for identifying and mitigating risks on our platforms.”

Paul Bouchaud from AI Forensics, which recently published research on the persistence of Doppelganger ads on Facebook, said on LinkedIn that “Meta’s approval of pro-Russian ads, reaching tens of millions, highlights their failure in combating covert political campaigns”. CheckFirst, a Finnish company that investigated a scam operation labelled Facebook Hustles, also welcomed the investigation. Replicating its methodology, and with “two hours of research”, the organisation was “able to measure that at least 3 million Facebook users were reached in 12 European countries in the last 3 weeks” by this fraudulent campaign.

At EU DisinfoLab, we call for strong enforcement of the DSA. It is clearer than ever that voluntary commitments and declarations of intent are not enough for platforms to comply with current European regulations. But with only five weeks left before the EU elections, the investigation will have to demonstrate whether the DSA can deliver results to protect the elections from foreign interference and disinformation spread through online platforms.

Continuing along this crucial theme, we are excited to announce that on 7 May, Margot Fulde-Hardy will join us to explore foreign information manipulation and interference (FIMI) campaigns that target elections. Additionally, in our 30 May webinar, Sophie Murphy Byrne from Logically will examine the challenges posed by artificial intelligence in relation to election disinformation. We invite you to register for these online sessions in preparation for the upcoming EU elections!

That’s all folks. Please continue reading for the latest news and insights!

Our webinars

Upcoming – register now!
Past – watch the recordings!

Disinfo news & updates

  • True lies. Despite exposure, ‘Pravda’ continues to expand. As revealed in a Viginum report, this disinformation network spreads pro-Russian propaganda through copycat websites across the EU. Initially targeting German speakers, the operation now spans several European languages, aiming to influence EU discourse, particularly ahead of the June 2024 EU elections.
  • Ads fuelling Russian disinformation. European Parliament policy-makers call on major international brands to cease their advertising on pro-Kremlin media outlets operating in the Balkans, to avoid inadvertently funding platforms known for spreading disinformation.
  • Overwhelmed by propaganda. France is grappling with heightened disinformation as the European election approaches, according to French Minister for European Affairs Jean-Noël Barrot. In the past weeks, the country’s security authorities have faced waves of pro-Russian disinformation campaigns.
  • Counterstrike. As Russia seems to be gearing up its disinformation activities ahead of the upcoming European parliamentary elections, France has proposed tougher restrictions on individuals and entities involved in Russia-backed influence operations.
  • Counterstrike II. In response to increasing threats of disinformation and foreign interference, the Belgian presidency of the Council of the EU has activated the Council’s Integrated Political Crisis Response (IPCR) arrangements in information sharing mode, to enhance cooperation and coordination among EU member states and institutions.
  • Kremlin kinks in euro dream. Bulgaria’s aspirations to join the Eurozone by January 2025 face significant hurdles. Senior EU officials suggest that Russian disinformation campaigns have contributed to the lower-than-average public support, one of the factors complicating the path to membership.
  • Double agent drama. An assistant to Maximilian Krah, the top candidate of the Alternative for Germany (AfD) party in the upcoming European Parliament election, has been arrested on suspicion of espionage for China. This incident follows closely on the heels of the Voice of Europe scandal, where Krah and other AfD members were accused of accepting funds to promote Russian interests in Europe. 
  • Content integrity. Microsoft extends its ‘Content Integrity’ tools – which allow adding “content credentials” to media – to political parties and campaigners from the EU and global news outlets with the aim of helping tackle the spread of misleading AI-generated content.
  • AI ethics on trial. Meta is under scrutiny after two incidents involving explicit AI-generated images posted on its platforms. Its independent Oversight Board is examining whether the company’s enforcement policies surrounding AI-generated deepfakes are effective and consistent.
  • Snapchat stamps AI. Snapchat plans to introduce watermarks on images generated using its AI-powered tools. The company stated that removing these watermarks would breach the platform’s terms of use, but it remains uncertain how it plans to monitor and enforce this policy.
  • Not just online. In Mozambique, nearly a hundred people died fleeing the mainland for an island on an overcrowded makeshift ferry, after false reports of a cholera outbreak caused widespread panic. The incident serves as a stark example of the real-life impact of disinformation.
  • Partners in truth. The European Fact-Checking Standards Network (EFCSN) has launched a training and mentorship program targeted at young fact-checking organisations. The initiative pairs new organisations with experienced mentors from the EFCSN’s network to share experiences and best practices.

Reading & resources

  • Election integrity in India. Concerns around disinformation are prominent in the context of India’s general elections, held between 19 April and 1 June 2024 in seven phases, with 968 million eligible voters. Issues such as government control of digital platforms and the manipulation of information on major social media platforms risk distorting public perception and influencing the electoral process, raising concerns about the integrity of the election and the broader impact on India’s democracy.
  • AI, disinformation, and elections – in charts. This article examines public perceptions through various surveys conducted by Ipsos and UNESCO, reflecting global perceptions about AI, disinformation, and their potential effects on elections, presented in detailed charts.
  • Election AI watch. The 2024 Elections Tracker by Rest of World provides an ongoing analysis of how artificial intelligence-generated content is impacting elections globally.
  • Conspiracy cash flow. This scientific article examines the business model of Telegram’s main conspiracy channels, showcasing the different strategies they use to monetise the spread of misinformation.
  • Search and verify. This study examines Google Web Search’s response to disinformation, analysing over 825 URLs linked to 50 fact-checked keywords. It reveals that fact-checks generally rank higher than the problematic content they address, especially in topics such as COVID-19 and US politics, yet fact-checks and problematic content achieve comparable visibility in searches related to the Ukraine war.

This week’s recommended read

Raquel Miguel, Senior Researcher at EU DisinfoLab, recommends reading the most recent report published by AI Forensics: ‘No Embargo in Sight: Meta Lets Pro-Russia Propaganda Ads Flood The EU’, an insightful and in-depth analysis of Meta’s Ad Library, a repository updated since August 2023 to meet the Digital Services Act’s requirements. The researchers uncover alarming gaps in regulatory compliance and ineffective moderation of ads that do not adhere to Meta’s guidelines on political advertising.

In addition, the research shows how the Russian influence operation Doppelganger (exposed by EU DisinfoLab back in 2022) exploited Meta’s shortcomings and achieved significant reach – over 38 million users in France and Germany, far more than previously thought.

Most ads were not identified by Meta as political in time, according to the report. The results ring alarm bells just over a month before the European elections and highlight the risks that gaps in regulatory compliance can pose for democracy.

Events & announcements

  • Access Now is seeking a Finance Manager – Belgium to manage all aspects of the financial operations of their Belgian entity, including bookkeeping, budgeting, and grants management. Apply by 10 May.
  • The Institute for Strategic Dialogue (ISD) is looking for a US Digital Policy Intern for a part-time position based in Washington D.C. with the possibility of flexible on-site, remote, or hybrid work arrangements. Applications are open until 1 May.
  • The second round of the Kempelen Institute of Intelligent Technologies (KInIT) PhD applications is open. Express your interest by 10 June.
  • The European Endowment for Democracy (EED) has several open positions.

This good X!