Dear Disinfo Update reader,

Glad to have you with us for another round-up from the ever-shifting disinformation landscape. Spring is in the air, and so are new stories, timely investigations, and compelling research. This newsletter is here to keep you informed and connected.

Behind the scenes, work is progressing for our annual conference. #Disinfo2025 is heading to Ljubljana this October, and thanks to your outstanding proposals, the programme is shaping up to be packed with a diverse range of voices and fresh perspectives. Registration will open very soon – places are limited, so keep an eye on your inbox and follow us on social media to secure yours.

In this edition, we’re tracing everything from clickbait health ads still running unchecked, to corporate greenwashing, covert influence operations, and disinformation cases ranging from the Vatican to Wellington. Our recommended read takes a sobering look at how short-sighted alliances with extremist movements can spiral far beyond the intentions of those who enabled them – a stark lesson from the past, with clear relevance for the challenges we face today.

A big shout-out to the whole EU DisinfoLab team who helped shape this edition. From tracking trends to curating resources, it’s a collective effort we’re proud to share with you.

So scroll on and enjoy the read!

Our webinars

UPCOMING – REGISTER NOW

  • 27 March: Clickbait cures: how Meta & Google fail to stop fake meds ads | Join us as we unpack, together with Aleksandra Atanasova (Reset Tech), an investigation revealing how Meta and Google continue to host deceptive health ads across Europe. This session examines why the platforms’ efforts to curb misleading claims fall short and why stronger enforcement is needed.

PAST – WATCH THE RECORDINGS!

  • Faster & more trustful – fact-checkers and X’s Community Notes | This webinar examined how fact-checking organisations influence X’s Community Notes, based on Fundación Maldita.es’ analysis of 2024 data. It explored which sources are most cited, how quickly notes appear, and their visibility alongside posts. Watch the replay of the session, and read more in this article discussing potential pitfalls Meta might face with its Community Notes implementation, drawing parallels to challenges previously observed on X.
  • Melodies of Malice: AI’s role in extremist music propaganda | This webinar explored how far-right groups use AI to create and spread extremist music, drawing from GNET’s “Melodies of Malice” report. It examined the role of AI in music propaganda, its impact on disinformation, and policy responses.

Disinfo news & updates

Ads in question

  • Celebrity scan. Meta has launched a facial recognition tool to combat fraudulent ads on Facebook and Instagram, targeting scams that misuse celebrity images. The tool will roll out in the EU, UK, and South Korea. 
  • Camouflaged ads. A study by Boston University researchers examines how fossil fuel companies use “native advertising” – sponsored content designed to mimic real news articles – to shape public perceptions of climate change. These ads often appear on major news outlets with subtle disclaimers that many readers overlook.
  • Planetary scale greenwashing. A senior employee at the AMV BBDO advertising agency claims she was harassed after raising concerns about “greenwashing” in Mars’ ads for Galaxy chocolate and Sheba pet food. Polina Zabrodskaya argued the campaigns misrepresented sustainability, citing issues like child labor and deforestation. After voicing her objections, she was allegedly sidelined.

Russia being Russia

  • Infiltration. A Moscow-based disinformation network called “Pravda” has influenced Western AI tools, spreading Russian propaganda through manipulated data. An audit revealed that the top 10 generative AI chatbots repeated false Kremlin-backed narratives 33% of the time. Another investigation reveals how Pravda sources are frequently cited on Wikipedia across multiple languages, X Community Notes, and AI-generated content, allowing content laundering and increasing their visibility. 
  • Cyber tensions. The Pentagon has denied reports claiming Defense Secretary Pete Hegseth ordered a halt in offensive cyber operations against Russia, asserting that no stand-down order was issued. The denial follows media coverage suggesting the move was linked to attempts to initiate peace talks with Russia.
  • ‘Not Our War’ pro-Kremlin demonstration. On 9 March, a demonstration was held in Madrid under the slogan “Peace and neutrality. This is not our war,” protesting Spain’s involvement in the war in Ukraine. While the message seemed pacifist, it was promoted by accounts and websites aligned with the Kremlin, known for spreading pro-Russian disinformation. 
  • Doppelganging Down Under. A Kremlin-linked disinformation campaign is targeting New Zealand audiences, using sites promoted by Russian state media, including one in te reo Māori – the indigenous Māori language – to mimic local news and sow division.

Tall tales from around the world

  • The Chinese connection. A former Meta director claims that CEO Mark Zuckerberg worked closely with Beijing to develop a censorship system for the Chinese market, including hiding viral posts and sharing user data. The whistleblower’s allegations come ahead of her book release and a whistleblower complaint filed with the US Securities and Exchange Commission (SEC). Meta denies the claims, stating it no longer operates in China.
  • Scam empire. A massive network of call centers operating from Israel, Cyprus, Ukraine, Bulgaria, and Georgia has scammed 247 million USD from 27,000 victims through fake cryptocurrency investments. The operation relies on affiliate marketing, exploitative call centers, and money laundering to funnel illicit profits across the globe.
  • Secretary of Health (Disinformation). Robert F. Kennedy, Jr., the US Secretary of Health, has spread misleading theories about the deadly measles outbreak in the US, linking infections to poor diet and health, suggesting that the measles vaccine harmed children in Texas, the epicenter of the outbreak, and promoting alternative treatments like cod liver oil. The Children’s Health Defense, the anti-vaccine nonprofit founded by Kennedy, recently updated a 2021 book in which Kennedy claims that “measles outbreaks have been fabricated to create fear”.
  • The Pope’s voice. The Vatican is actively debunking false claims about Pope Francis’ health, following widespread conspiracy theories that he has died. As the pontiff continues treatment for pneumonia, social media platforms have been flooded with misinformation. Despite the falsehoods, the Vatican has consistently provided updates and even released an audio message from the pope himself to counter these rumors.
  • Twisting Syria’s reality. Amid a weekend of renewed violence in Syria, the country experienced a coordinated surge in false narratives, with AI-generated content and private messaging used to incite fear among minority communities. Researchers warn that external actors are amplifying these efforts to undermine the new government and influence international perceptions.
  • Infrastructure & interference. The US Federal Communications Commission (FCC) is launching a new Council on National Security to coordinate responses to cyber threats from foreign adversaries targeting the telecoms sector – including disinformation campaigns aimed at undermining public trust and national stability.

Brussels corner

  • Bluesky settles under grey sky. Bluesky has established its EU presence in Belgium, resolving earlier uncertainty about which national authority would oversee its compliance with the Digital Services Act (DSA). The platform has yet to publish its exact EU user figures, but remains below the 45 million monthly users needed for designation as a Very Large Online Platform (VLOP).
  • Spy ring. A European media investigation reveals that many of the 220 Russian diplomats at the embassy in Brussels, posing as commercial attachés, advisers, and technical staff, have been expelled by Belgian authorities for being spies linked to Russian intelligence agencies, including the SVR, GRU, and FSB.
  • Risk assessments, assessed. The DSA Civil Society Coordination Group, coordinated by the Centre for Democracy & Technology Europe, has released an initial analysis of Big Tech’s first Risk Assessment Reports under the DSA. The brief highlights key trends, useful practices, and areas for improvement.

Reading & resources

  • The quiet corners of YouTube. While YouTube pushes viral, attention-grabbing videos to the forefront, most of its vast catalogue of 14.8 billion videos remains virtually invisible, untouched by the algorithm-driven spotlight. This BBC article explores what these forgotten corners reveal about YouTube’s role as both a social archive and a gatekeeper of visibility, offering insight into how algorithmic choices shape the information ecosystem, not just through what they amplify, but also through what they leave behind.
  • Conspiratorial memes. This study explores the cultural function of internet memes within online conspiratorial communities, focusing on 544 COVID-19-related memes shared across two Reddit subreddits. It demonstrates how memes, as “cultural representations,” stabilise community culture by reinforcing themes of deception, delusion, and superiority, helping maintain group cohesion and supporting conspiracist worldviews.
  • Standing for journalism. The Europe Press Freedom Report 2024 highlights the ongoing risks faced by journalists reporting on the Ukraine war, as well as persistent threats to media freedom across Europe. The report also warns of growing concerns over disinformation, digital surveillance, and media capture, urging stronger legal protections and policy reforms to safeguard journalists’ safety and independence.
  • No fact-checking, no free speech. This article criticises Meta’s decision to end third-party fact-checking on Facebook, Instagram, and Threads in the US, arguing that it prioritises profit over responsible discourse. It highlights how this move, alongside similar actions by other tech companies like X, undermines free speech by allowing misinformation to flourish. The piece defends fact-checking as essential for maintaining trust in public discourse and stresses that social media platforms, given their influence, must take responsibility for the content they circulate.
  • Boosters against misinformation. A new study shows that misinformation training can help people spot falsehoods for as long as a month. Led by Dr. Rakoen Maertens at the University of Oxford, the study involved over 11,000 participants who tried different training methods, such as reading an article, watching a video, or playing a game to identify misinformation tactics. Researchers stress the importance of memory retention, motivation, and consistent “boosters” to maintain the ability to reject misleading information.
  • Double down on denial. At the 2025 Conservative Political Action Conference (CPAC) in the US, President Donald Trump and key conservatives like Elon Musk rallied against climate action, promoting fossil fuel expansion and deregulation. Trump mocked climate policies, falsely claiming to have cancelled non-existent electric vehicle mandates and dismissing the Green New Deal. The event marked a shift in US right-wing discourse from climate denialism to outright anti-environmental rhetoric.
  • Frontman of the anti-climate agenda. At the 2025 Alliance for Responsible Citizenship (ARC) conference in London, Jordan Peterson and other influential conservatives rallied against climate action, promoting a vision rooted in climate denial and traditional values. With nearly 3,000 attendees, including political figures and business elites, the event positioned itself as an alternative to mainstream climate and social discourse – framing climate action as part of a broader ideological threat, and advocating instead for fossil fuel expansion, traditional values, and cultural homogeneity.
  • Unified data, effective response. The NATO Strategic Communications Centre of Excellence explores how standardising communication protocols and data integration can boost interoperability, resilience, and responsiveness across strategic information environments. The report outlines practical steps to support more efficient collaboration, faster response times, and stronger counter-disinformation capabilities.

This week’s recommended read comes from our Managing Director, Gary Machado: Timothy W. Ryback’s article in The Atlantic, The Oligarchs Who Came to Regret Supporting Hitler, examines how key German industrialists and media figures, including Alfred Hugenberg, played a decisive role in Hitler’s rise. Believing they could harness his movement for their own political and economic advantage, they provided crucial financial and media support. Hugenberg, in particular, pursued what he called “Katastrophenpolitik” – a strategy of deliberate polarisation through incendiary and often fabricated news stories, aimed at hollowing out the political centre and collapsing democratic consensus. But this strategy ultimately backfired: rather than shaping the political landscape to their advantage, figures like Hugenberg found themselves sidelined – or worse – once the Nazi regime consolidated power.

This article offers a stark reminder of how short-sighted alliances with extremist movements can lead to disastrous consequences. Ryback traces how these industry leaders, initially seeking to protect their interests, ended up enabling a dictatorship that ultimately consumed them as well. Through meticulous research, he unpacks the dangers of underestimating the forces they helped unleash, making this a compelling read for anyone interested in the unintended consequences of political opportunism.

The latest from EU DisinfoLab

  • Another AI Hub-date. We’ve just added new content to our AI Disinfo Hub – your source for tracking how AI shapes and spreads disinformation, and how to push back. Here’s a snapshot of what’s new:
    • Polluting AI. This NewsGuard report exposes how a Moscow-based disinformation network called Pravda is manipulating AI chatbots by flooding the internet with pro-Kremlin propaganda. In 2024 alone, it published 3.6 million articles, embedding false claims into AI training data. Another investigation by DFRLab and CheckFirst confirmed the dissemination of content from the Pravda network in popular AI chatbots, and found its reach extending to Wikipedia source links and X Community Notes.
    • Citation gaps. According to the Columbia Journalism Review, generative AI search tools are rapidly gaining popularity, with one in four Americans now using AI instead of traditional search engines. Despite partnerships with publishers, AI often fails to cite content accurately, fabricating links, misattributing articles, and bypassing publisher preferences. This poses significant issues for both content producers and consumers, highlighting the need for improved transparency and citation practices in generative search tools.
    • Hyping the fake. Futurism reports that Google’s algorithm has been boosting an AI-driven site called Science Magazine, which churns out bizarre, error-ridden articles filled with AI-generated images. These articles often feature fake writers with invented bios and lack credible sources or factual reporting. Recent examples include distorted coverage of SpaceX’s Starship, featuring AI-generated spacecraft images and poetic, vague narratives. This underscores the risks of AI-generated content cluttering search results and overshadowing real, fact-checked news.

Events and announcements

  • 17-21 March: FPS BOSA hosts the European AI Week 2025: innovation versus regulation?, exploring the balance between innovation and regulation in the context of the societal, technological, and industrial challenges of the AI era.
  • 19-20 March: The third edition of Les Journées Infox sur Seine will gather the francophone community working on disinformation and fact-checking to meet and present their current work in Paris, France.
  • 20 March: The Disinformation and Democracy Forum in Sydney, Australia, will bring together experts, lawmakers, and advocates to discuss the impact of disinformation on democracy and strategies to address it in the digital age.
  • 27 March: The online event Navigating Digital Democracy: Challenges and Future Directions organised by three EU-funded projects will explore how technology is transforming democratic participation and the associated challenges.
  • 27 March: The ATHENA project will organise a webinar “From Memes to Missiles: Information Warfare in Ukraine”.
  • 6 April: EDMO BELUX has launched a call for abstracts for an English-French bilingual special issue, “Untangling the knots between disinformation and inequalities”, in the journal Recherches en Communication.
  • 8 April: EDMO BELUX workshop & networking event “Mitigating disinformation in Belgium and Luxembourg: Where are we now, where are we going?” will take place in Brussels. Register by 2 April.
  • 9-13 April: The International Journalism Festival will be held in Perugia, Italy. The second edition of the Centre for Information Resilience (CIR) Open Source Film Awards ceremony will be organised as part of the event.
  • 10 April: The DSA Online Dispute Resolution Conference, hosted by ADROIT, will explore the evolving ecosystem under the Digital Services Act. Engage with industry leaders and gain insights into DSA compliance.
  • 10-11 April: AlgoSoc’s international conference The Future of Public Values in the Algorithmic Society will take place in Amsterdam, The Netherlands.
  • 23-25 April: The 2025 Cambridge Disinformation Summit will discuss research regarding the efficacy of potential interventions to mitigate the harms of disinformation.
  • 22-25 May: Dataharvest, the European Investigative Journalism Conference, will be held in Mechelen, Belgium. Tickets are sold out, but you can still join the waiting list.
  • 17-18 June: The Summit on Climate Mis/Disinformation, taking place in Ottawa, Canada, and accessible remotely, will bring together experts to address the impact of false climate narratives on public perception and policy action.
  • 18-19 June: Media & Learning 2025 conference under the tagline of “Educational media that works” will be held in Leuven, Belgium.
  • 27-28 June: The S.HI.E.L.D. vs Disinformation project will organise a conference, “Revisiting Disinformation: Critical Media Literacy Approaches”, in Crete, Greece.
  • 30 June – 4 July: The Summer Course on European Platform Regulation 2025, hosted in Amsterdam, will offer a deep dive into EU platform regulation, focusing on the Digital Services Act and the Digital Markets Act. Led by experts from academia, law, and government, the course will provide in-depth insights into relevant legislation.
  • 15-16 October: #Disinfo2025 is heading to Ljubljana, Slovenia. Mark your calendars – and don’t miss the opening of registrations!
  • 22–23 October: 17th Dubrovnik Media Days, under the theme “Generative AI: Transforming Journalism, Strategic Communication, and Education”, will take place in Dubrovnik, Croatia.
  • 29-30 October: The 2nd European Congress on Disinformation and Fact-Checking, organised by UC3M MediaLab, will take place under the subtitle “Beyond 2025: Emerging threats and solutions in the global information ecosystem” in Madrid, Spain, with the possibility to join remotely.
  • 20-24 November: The 2025 Global Investigative Journalism Conference will be held in Kuala Lumpur, Malaysia.
  • Mapping the silencing of women in the public sphere. Forced to Quit is a crowdsourced project to map women who left the public sphere due to gender-based violence. Collective action is necessary to identify and report the names of women expelled from their activities – the initiative invites you to add new cases and evidence (news, scientific articles, fact-checks) via this survey.
  • Boosting fact-checking activities in Europe. The European Media and Information Fund (EMIF) is launching a call for proposals to support independent fact-checking projects that combat disinformation and protect democratic processes. This 11th funding round, with total funding of EUR 4,350,000, is open until 30 June 2025.

Jobs

Did you find a job thanks to the listing in this newsletter? We’d love to know – please drop us a message!