Dear Disinfo Update readers,

This edition of the Disinfo Update covers Russian influence operations, electoral disinformation, and EU DisinfoLab’s recent blogpost presenting our new response-impact framework. The framework assesses different responses to influence operations and their impact.

Artificial intelligence is transforming the (dis)information landscape, making it easier to manipulate and spread deceptive content. To address this growing challenge, we’ve launched a dedicated hub offering reliable research, tools, and insights. Whether you’re a researcher, policymaker, or concerned citizen, this resource is designed to help you understand and combat AI-driven disinformation. Check out our new AI Disinfo Hub and join us in the fight!

Our webinars

UPCOMING – REGISTER NOW

PAST – WATCH THE RECORDINGS!

  • Building FIMI Resilience through Models of Practice: Taiwan’s 2024 Election | This webinar explored Taiwan’s resilience to China’s FIMI campaigns ahead of the January 2024 election. Ben Graham Jones from Doublethink Lab shared insights on fostering whole-society resilience and strategies for global FIMI resilience.
  • AI and Disinformation: A Legal Perspective | Noémie Krack, legal researcher at KU Leuven, examined the EU’s evolving legal framework to tackle Gen AI-driven disinformation. The session highlighted key provisions from the AI Act, the Digital Services Act, and findings from the AI4Media project.
  • Building Latvian Resilience Against Disinformation | In this session, Dr. Rihards Bambals, Director of Strategic Communication at Latvia’s State Chancellery, explained how public authorities in Latvia are strengthening societal resilience through initiatives such as a handbook, an online platform, and a podcast to counter disinformation and protect information integrity.

Disinfo news & updates

  • Deepfakes and elections: In this article, the authors explore the use of deepfakes in the US presidential election, their impact, and their implications for Australia’s elections. Describing the potential impact, they state that “As Australia faces its own election, this technology could profoundly impact perceptions of leaders, policies, and electoral processes.”
  • CAR targeted by Russian propaganda: A former propagandist discusses his experience spreading Russian disinformation on behalf of the Wagner Group. Ephrem Yalike, a journalist from the Central African Republic, worked for an unofficial Russian organisation known as Africa Politology that spread Russian disinformation in the country.
  • Russian malware: Human rights groups, private companies, state institutions, and educational institutions in Central Asia, East Asia, and Europe were targeted with custom malware by a Russia-linked campaign. Insikt Group, who identified the campaign, stated that it is “Similar to other recent Russian state hacker campaigns affecting the region, the group is likely seeking to acquire intelligence to bolster Russia’s military efforts in Ukraine and gather insights into geopolitical events in neighboring countries, especially as Moscow’s relations with its neighbors have suffered following its invasion of Ukraine”. 
  • Musk and the UK: Chi Onwurah, chair of the UK Parliament’s Science, Innovation and Technology Committee, wants to summon Elon Musk to testify before Parliament. This comes after the spread of mis/disinformation on social media led to violent riots across the UK over the summer.
  • Misinformation bill dropped: The Australian government withdrew a misinformation bill under which platforms could have been fined up to five percent of their global revenue for failing to stop the spread of misinformation. The bill faced opposition from both Australian government officials and Elon Musk.
  • Google disrupts Chinese influence operation: Google has removed hundreds of news sites from Google News; the sites and domains were traced back to several companies that are allegedly part of a Chinese influence operation. Mandiant and Google’s Threat Analysis Group published a report on this influence operation, known as Glassbridge.
  • UN initiative to counter climate disinformation: Together with UNESCO and the Brazilian government, the UN has launched the Global Initiative for Information Integrity on Climate Change. The project aims to strengthen research and measures to address all disinformation which has the effect of delaying or derailing climate action.
  • COP 29 and climate disinfo: Top officials participating in this major event in Baku say that the spread of false climate narratives is undermining the annual climate talks. A new study by CAAD reveals that Big Tech is once again allowing thousands of bot accounts to promote Azerbaijan’s propaganda during the UN climate negotiations. Meat, dairy, and pesticide lobbyists have also returned in high numbers to the summit. While we already knew that more than 1,700 oil, gas, and coal lobbyists are registered attendees of COP 29, attempting to convince the world that the fossil fuel industry can be part of the climate solution, it has also been found that the PR companies helping them promote that message are not far behind.
  • Social media age restriction: The Australian government has passed legislation that will ban children under 16 from using social media platforms. The ban will apply to Snapchat, TikTok, Instagram, and Facebook.

Reading & resources

  • Fact-checkers in Africa: Deutsche Welle’s Africa Link explores disinformation campaigns in Africa and a coalition of fact-checkers in Ghana working to combat them. Two fact-checkers are interviewed about the campaigns and the attempts to undermine democracy in Africa.
  • Election misinformation slides: The News Literacy Project has published a resource for understanding election misinformation. The resource contains 70 slides intended for middle school and high school students, covering topics such as fabricated content, manipulated content, and tricks of context. 
  • RSF on Russian Propaganda: This interview with Antoine Bernard, the Advocacy Litigation Assistance Director at Reporters Without Borders, presents the organisation’s project The Propaganda Monitor, its focus on Russian disinformation, and the rise in Russian propaganda following the invasion of Ukraine.
  • Assault on research: In this interview, Philip M. Napoli talks about his recent paper In pursuit of ignorance: The institutional assault on disinformation and hate speech research with Justin Hendrix, the CEO of Tech Policy Press. They discuss the political targeting of media research and its implications for misinformation and disinformation research.
  • Election disinformation across the globe: This episode explores the many elections that took place in 2024 and the effect misinformation had on these elections. ProPublica journalist Craig Silverman and Gabrielle Lim from Citizen Lab discuss electoral disinformation and its potential implications. 
  • Disinformation about the media: IPI has launched the Observatory of Disinformation Narratives Against the Media, a platform mapping key disinformation narratives targeting the media across multiple countries. Some of the disinformation narratives explored include xenophobic rhetoric, political bias, and describing media outlets as “traitors” or “foreign agents”.
  • Operation Undercut: Recorded Future’s Insikt Group explains Operation Undercut, an influence operation run by Russia’s Social Design Agency (also responsible for the Doppelganger operation). The operation aims to undermine Ukraine, sow discontent with Western aid, and foment polarisation.
  • OSINT & Bluesky: Craig Silverman shares multiple tools for open-source investigations; these tools and guides cover geolocation, flight tracking, and more. In addition, Silverman includes multiple tools and tips for Bluesky.

This week’s recommended read

This week’s recommended read comes from our managing director, Gary Machado:

On my flight back from the DisinfoLab conference in Riga this October, I noticed my seatmate reading Educated by Tara Westover. Curious, I looked it up—and I’m so glad I did.

This memoir tells the incredible story of Tara, who grew up in a family steeped in conspiracy theories, isolated from the world, and without formal schooling. Despite these challenges, she pursued education and eventually earned a PhD from Cambridge.

What’s truly remarkable about the book is Tara’s ability to doubt herself and continually question her own beliefs while navigating the extraordinary steps needed to change her life. Her story is a powerful testament to the transformative power of education. I can’t recommend it highly enough.

If you’re reading this newsletter and happen to have encountered Tara Westover, please don’t hesitate to contact us. We would love to invite her to speak at the next DisinfoLab conference.

The latest from EU DisinfoLab

  • AI Disinfo Hub. The rise of AI technologies has brought unprecedented challenges to the disinformation landscape, enabling faster content manipulation and spread. Our new AI Disinfo Hub is here to help you navigate these complexities. From the latest Neural News to research reports, podcasts, and tools, the hub offers insights and resources for researchers, policymakers, and the public. Explore it here!
  • Response-impact framework. How do we respond to evolving disinformation threats and FIMI campaigns? In the fight against disinformation and Foreign Information Manipulation and Interference (FIMI), countermeasures often follow a closed circuit: uncover a campaign, implement a response, and move on. But is this enough? We’ve developed a response-impact framework to go beyond individual countermeasures and assess the broader impact of responses. How could this framework be applied to an IO such as #Doppelganger? Find out here.

Events and announcements

  • 4-6 December: SciBeh will be hosting a virtual workshop on psychological manipulation.
  • 5 December: Forum Europe will host the International AI Summit 2024 in Brussels.
  • 6 December: ThinkYoung will be hosting a Hackathon in Brussels; the application deadline is 6 November.
  • 10 December: Les Surligneurs will be hosting a debate on disinformation and democracy with Gérald Bronner and Bertrand-Léo Combrade.
  • 10 December: Open Eyes Institute will be hosting a webinar on climate and migration.
  • 6-10 January 2025: The Digital Methods Initiative (DMI), Amsterdam, is holding its annual Winter School on ‘Chatbots for Internet Research?‘.
  • 24-27 February 2025: RightsCon 2025, hosted by Access Now, will be held in Taipei, Taiwan, and will bring together leaders and activists to discuss key digital rights issues.
  • 22-25 May 2025: Dataharvest The European Investigative Journalism Conference will take place in Mechelen, Belgium.
  • 15-16 October 2025: Following this year’s brilliant edition, the EU DisinfoLab annual conference #Disinfo2025 will bring the community together again, this time in Ljubljana, Slovenia. Mark your calendars!
  • Due to unforeseen circumstances, we are cancelling our policy event initially planned for March 2025. However, our commitment to fostering collaboration around policy work remains unwavering. In 2025, we will continue to engage actively in policy discussions through a series of dedicated webinars – more updates coming soon! We look forward to your involvement and participation.

Jobs

Check out this post from Bluesky Safety!