Dear Disinfo Update readers,

A lot happened during the summer break. In case you missed anything, you can catch up here. This instalment’s readings cover the recent UK riots, the narratives surrounding them, and the actors fuelling those narratives. They also discuss recent and upcoming elections, along with efforts to address election-related disinformation. We’ve included updates on research into the Russian Doppelganger campaigns conducted by various organisations, and we’ve updated our own Doppelganger operation resource page, which gathers a timeline of this disinformation operation and brings together additional knowledge about other operations allegedly led by the same operators. Finally, the articles discuss the use of AI in relation to disinformation.

We are honoured to have Carl Miller as a guest author. Check out his recent blog post, Directing Responses Against Illicit Influence Operations (D-RAIL), in which he discusses shifting from combatting disinformation to disrupting the infrastructure, funding, and personnel behind illicit influence operations.

We are also welcoming our newest addition to the EU DisinfoLab team: Quinton Walsh. He is currently a Master’s student at Binghamton University, where he studies Genocide and Mass Atrocity Prevention, and he obtained his Bachelor of Arts from George Mason University, where he studied Conflict Analysis and Resolution. Quinton will be working with us this fall to help prepare the #Disinfo2024 conference, run webinars, and assist with newsletters.

And last but not least, a reminder: It’s not too late to register for #Disinfo2024! The two-day conference will take place in Riga, Latvia, on October 9-10. As places are limited, we have a vetting process to ensure that professionals in the field can attend. However, no invitation is needed – everyone is welcome to request a ticket!

Our webinars

Upcoming – register now!
Past – watch the recording!

Your message here?

Want to reach the 10,000+ readers of this newsletter?

Disinfo news & updates

  • Doppelganger revamped. Reset Tech has published an article with updates on its investigation of the Doppelganger operation. It describes a development on X (formerly Twitter) in which popular hashtags are hijacked and used to promote pro-Kremlin messages. Check out the updates to our Doppelganger resources in the ‘Latest from EU DisinfoLab’ section.
  • UK riots & social media. Rioting broke out across the UK after the murder of three children in Southport on 29 July. A Telegram channel called “Southport Wake Up” was used to organise protests that later turned into riots, while online rumours about the murder and about the immigrant community fuelled the violence.
  • Musk fuels tensions. Following the riots in the UK, Elon Musk made inflammatory comments on X, including stating that “civil war is inevitable” and claiming that Muslims are treated more leniently than individuals involved in the rioting. These claims fuel misinformation and extremist narratives, and Musk’s large following and status as the owner of X amplify their visibility. Learn more about tensions with Musk in our ‘Brussels corner’.
  • Women, the universal scapegoat. Following the attempted assassination of former President Donald Trump, claims circulated that DEI (Diversity, Equity and Inclusion) initiatives and the presence of women in the Secret Service interfered with the protection of the president. These claims have been shared on social media platforms as well as by right-wing pundits.
  • AI safety approach. OpenAI has developed a new safety approach known as Rule-Based Rewards (RBRs) that does not require human intervention. RBRs allow safety teams to provide rules for AI models as well as a process for rating compliance with those rules. While removing human intervention may be efficient, it is easy to imagine the risks associated with reduced human oversight.
  • TikTok Lite’s limited protections. Mozilla has investigated “TikTok Lite”, a version of the app intended for lower-bandwidth contexts. Compared to the classic TikTok app, it lacks basic protections such as labels for graphic content or misinformation. In addition, video descriptions are arbitrarily shortened, risking the loss of crucial context.
  • Funding climate disinformation. According to DeSmog, a group operating in North and South America, the UK, Europe, and Africa, ExxonMobil has provided funds to the Swedish right-wing think tank Timbro to spread climate disinformation. The article details the relationship between ExxonMobil and Timbro.
  • Gender hysteria. Olympic boxer Imane Khelif was the centre of controversy after false claims circulated about her gender. Several celebrities, including JK Rowling, Elon Musk, and Logan Paul, shared the false claims.

Brussels corner

  • DSA & election protections. The European Board for Digital Services has published a post-election report on the EU elections. It describes the work done under the Digital Services Act (DSA) by the European Commission, the Digital Services Coordinators (DSCs), and the European Digital Media Observatory (EDMO) Elections Task Force to assess and mitigate risks to civic discourse and electoral processes.
  • EU Elections Task Force. EDMO has published a report from its EU Elections Task Force. The report describes the presence of disinformation related to the European Union, the main disinformation narratives, cases of coordinated and foreign interference, and the major events that triggered EU-wide waves of disinformation.
  • Future EC guidelines. The European Commission has published the political guidelines for the next European Commission (2024-2029). The proposed European Democracy Shield aims to counter Foreign Information Manipulation and Interference (FIMI) online, while the implementation of the European Media Freedom Act is intended to increase protections and support for independent media and journalists.
  • EU probes Meta compliance. The European Commission has asked Meta to provide information about the measures it has taken to give researchers access to publicly available data, as required by the DSA, and about the platform’s plans to update its election and civic discourse monitoring mechanisms. The request follows Meta’s discontinuation of CrowdTangle, a public insight tool used to explore public content on social media. An open letter from the Mozilla Foundation calling on Meta to keep CrowdTangle functioning has been signed by numerous organisations.
  • Musk and the EU. Thierry Breton, EU Commissioner for the Internal Market, wrote an open letter to Elon Musk regarding his livestreamed conversation with former President Donald Trump. Breton warned Musk about the potential spread of harmful content; Musk responded with a meme containing expletives.

Reading & resources

  • AI influencing elections. Access Now discusses the use of generative AI and the role it plays in election disinformation. The article addresses both deepfakes and audio content used as methods of spreading disinformation.
  • Foreign interference’s impact. The EU Institute for Security Studies (EUISS) explores the role of foreign interference in its latest Chaillot Paper, discussing interference across the social, political, digital, economic, and security domains.
  • Refuting disinformation critiques. This article in the journal Humanities and Social Sciences Communications addresses criticism of misinformation research, including claims that it seeks to censor conservative voices and that misinformation is not widespread enough to warrant attention. The article refutes these claims and demonstrates that the spread of misinformation poses a threat to public health.
  • Recommendations for combatting FIMI. Reporters Without Borders (RSF) has published five recommendations for combatting foreign interference in EU elections: setting up an obligation for digital platforms to provide reliable sources of information; ensuring the same rules for pluralism and accuracy apply to all broadcast media in the EU; establishing a system to protect the EU information space based on the principle of reciprocity vis-à-vis third countries; supporting the resettlement and continued work of foreign journalists and media outlets that fled to the EU after being persecuted for their journalism at home; and making reliable news sources available to more people living in countries that oppose press freedom.
  • Health, disinformation, and the EU. The European Parliament explores health-related disinformation in a recent article, highlighting measures proposed or implemented at the national, European, and international levels to address it, and offering recommendations to reduce its impact.
  • Dangers of disinformation confirmed. Tech Policy Press has published an article that dives into the politicisation of disinformation research, countering the narrative that such research is used to silence conservative voices. The article also explores the harmful impact of disinformation on society.
  • Counteracting Russian “sharp power”. This blogpost from the ATHENA project examines the concept of FIMI as a tool for understanding Russian disinformation campaigns, exploring the history of Russian disinformation as well as the modern use of these tactics.
  • Algorithms fuelling disinformation. The European Digital Media Observatory’s (EDMO) new report discusses the amplification of disinformation enabled by platform algorithms, business models, and monetisation programs. The report proposes metrics for measuring the spread of disinformation through these methods.

This week’s recommended read

This week, our recommended read, proposed by EU DisinfoLab’s Research Manager Maria Giovanna Sessa, is Doublethink Lab’s new report on Taiwan’s resilience to foreign information manipulation and interference (FIMI) during elections, and the Taiwan POWER model, which describes the main characteristics of the country’s overall approach to countering FIMI in its democratic processes.

The POWER model was used by Taiwan to counter Chinese FIMI operations that targeted Taiwan’s elections. The model provides a clear sense of purpose rather than a shared strategic plan, while utilising a bottom-up approach. Rather than focusing on one segment of civil society, the model benefits from a united approach by engaging universities, CSOs, media outlets, and social platforms. Designed to be flexible, the model allows different organisations to focus on issues relating to their specialisation. 

This purpose-driven, organic, whole-society, evolving, and remit-bound model can support other countries’ election preparedness efforts.

The latest from EU DisinfoLab

  • D-RAIL – Directing Responses Against Illicit Influence Operations. In this guest post, author, speaker, and researcher Carl Miller dives into D-RAIL, a proposal to better guide responses to harmful influence operations. D-RAIL represents a strategic shift from combatting disinformation to disrupting the infrastructure, funding, and personnel behind illicit influence operations (IIOs), while enhancing interoperability and connecting existing frameworks such as DISARM or the Kill Chain.
  • CIB detection tree. In 2021, we created the Coordinated Inauthentic Behaviour (CIB) detection tree to help the counter-disinformation community identify CIB and develop effective strategies to mitigate its impact. Now, through the veraAI project, we’ve enhanced this resource with a strong focus on AI’s role.
  • Maybe Russians aren’t that good; maybe we just suck. Qurium’s recent publication presents yet another set of evidence on the Russian Doppelganger disinformation operation and dives deep into the internet infrastructure established to support it. In this post, we call on the EU to finally act together, decisively and comprehensively, to counter illegal operations like Doppelganger.
  • Update of Doppelganger resources. Still searching desperately for the light at the end of the Doppelganger tunnel, we’ve updated our resource page. Recent additions include, among others, a quarterly update to Meta’s threat report (including a specific Doppelganger section), a HarfangLab report on new Doppelganger assets, a technical report from the EEAS on European election-related Doppelganger activity, a report on Operation Matriochka by Viginum, multiple reports on the infrastructure from Bavarian intelligence, and a report on Operation Overload by CheckFirst and Reset Tech.

Events & announcements

  • 25-26 September: The European Commission Joint Research Centre (JRC) Annual Disinformation Workshop 2024, themed ‘How the EU Tackles Disinformation,’ will take place in Ispra, Italy. Remote participation is available. Registration closes on 2 September.
  • 26 September: DisinfoCon 2024 – Taking stock of Information Integrity in the Age of AI is a conference for civil society, policymakers, journalists, and AI practitioners. The event will take place in Berlin – if you can’t join in person, you can attend virtually.
  • 1 October: The Tech and Society Summit will bring together civil society and EU decision-makers in Brussels, Belgium, to explore the interplay between technology, societal impacts, and environmental sustainability.
  • 3 October: Adapt Institute will be hosting their conference Disinformation & Democracy, a virtual event bringing together experts from the public and private sectors, academia, and non-governmental organisations.
  • 9-10 October: Our #Disinfo2024 conference will take place in Riga, Latvia. The two-day event will bring together scholars, journalists, government officials, and the private sector.
  • 10 October: News Impact Summit: Fighting climate misinformation in Copenhagen, organised by the European Journalism Centre, will address how climate misinformation undermines public trust in climate policies and stalls progress toward a green transition.
  • 16 October: UNESCO will organise a webinar “Countering climate disinformation: strengthening global citizenship education and media literacy.”
  • 29 October: The Coordinated Sharing Behavior Detection Conference will bring together experts to showcase, discuss, and advance the state of the art in multimodal and cross-platform coordinated behaviour detection.
  • 14 November: ARCOM, the French Audiovisual and Digital Communication Regulatory Authority, is calling for research proposals around the themes of information economics, digital transformation, audience protection, and media regulation for its 3rd Arcom Research Day. Submit your proposal by 1 September.
  • 18-22 November: Exposing the Invisible will be hosting a week-long Digital Investigation Residency in Berlin. The residency is intended for experienced investigators, and the application deadline is 30 August.
  • 22 November: CYBERWARCON will take place in Arlington, Virginia, US, bringing together attendees from diverse backgrounds to identify and explore cyber threats.
  • 2-3 December: European Media and Information Fund (EMIF) will be hosting its two-day winter event in Florence, Italy, to discuss disinformation trends and provide opportunities for networking.
  • 24-27 February 2025: RightsCon 2025, hosted by Access Now, will be held in Taipei, Taiwan, and will bring together leaders and activists to discuss key digital rights issues.
  • 22-25 May 2025: Dataharvest: The European Investigative Journalism Conference will take place in Mechelen, Belgium. Save the date!

Jobs