Dear Disinfo Update readers,

Did you miss #Disinfo2024, or are you feeling nostalgic? Check out the photos and presentations from the conference — you can find them using this link.

In this edition of our newsletter, we will discuss the US presidential election and foreign interference. With the elections underway across the pond, our Executive Director, Alexandre Alaphilippe, shares his insights on the topic:

FIMI is bad, but domestic disinformation is worse

At #Disinfo2024, we recently heard an interesting remark: “Why don’t we rebrand our work solely as FIMI and drop the use of the word ‘disinformation’?” After all, FIMI (Foreign Information Manipulation and Interference) is widely discussed these days, especially in Brussels, and there are genuine cases of “FIMI FOMO,” with an increasing amount of reporting on efforts led by the Kremlin and others.

However, there is a risk of losing sight of the real issue. Last week, the US investigative outlet OpenSecrets revealed that Progress 2028, an apparently Democrat-aligned Political Action Committee, targeted American voters with pro-Kamala Harris content. The problem? None of the proposals it promotes is genuine, and Progress 2028 is actually part of a broader $100 million effort to disinform American audiences. According to OpenSecrets, it’s not Russia funding this initiative, but Elon Musk, the CEO of X.

It can be tempting—and sometimes easier—to blame foreign actors for what is happening within our own borders. But we must remain vigilant, remember that domestic disinformation has a far greater impact on how our own democracies function (or malfunction), and hold those responsible to account.

…and before you dive into this newsletter, we have one more announcement: If you are looking for an opportunity to learn about disinformation and want to work with a passionate team, here’s your chance—apply to be our next intern!

Our webinars

UPCOMING – REGISTER NOW

  • 7 November: Generative AI and disinformation: a legal perspective – The impact of Generative AI (Gen AI) on the information ecosystem has made the headlines, especially with 2024 being a critical year for elections worldwide. This talk with Noémie Krack will explore the legal considerations surrounding Gen AI and disinformation. It will discuss how the EU legal framework has evolved to respond to the challenges raised by Gen AI, and how new provisions of the EU legal arsenal, including the AI Act and the Digital Services Act (DSA), can help address the use of Gen AI for disinformation purposes. The talk will also share insights from the EU-funded AI4Media project and recent collaborative research publications.
  • 14 November: Latvian resilience to disinformation (with the support of the German Embassy in Latvia) – Countering disinformation requires public authorities to evolve in their missions, especially in educating key audiences on how disinformation spreads. In 2022, the Latvian State Chancellery published a guide for local public authorities on how to counter disinformation, and it has since launched a YouTube podcast with regular guests. Dr Rihards Bambals, from the Latvian State Chancellery, will discuss this strategy and its successes.
  • 28 November: Building FIMI resilience through models of practice: Taiwan’s 2024 elections – Foreign information manipulation and interference (FIMI) is a shared challenge for democracies across the globe. Faced with China’s well-resourced FIMI campaigns against Taiwan’s 13 January 2024 election, the island’s resilience has been closely studied. To equip others to readily learn from Taiwan’s experience, Taiwanese civil society organisation Doublethink Lab worked with Ben Graham Jones to document the key characteristics of Taiwan’s FIMI resilience. The ultimate aim is to reach beyond the information integrity community and provide strategic fuel for decision-makers to envision goals for whole-society resilience. This webinar will discuss the findings and consider the long-term direction of global FIMI resilience.

PAST – WATCH THE RECORDINGS!

Disinfo news & updates

  • LLMs and political discourse: This essay presents a protocol for validating the use of large language models (LLMs) in analysing political content on social media, focusing on their versatility, narrative depth, and human evaluation limits. The study applies the LLM to Facebook data from Italy’s 2018 and 2022 elections. The protocol supports the classification and clustering of political links.
  • FIMI in Canada: Canadian Prime Minister Justin Trudeau testified that conservative parliamentarians are involved in foreign interference. He reportedly instructed the Canadian Security Intelligence Service (CSIS) to inform the Conservative Party leader, Pierre Poilievre, of these instances of foreign interference.
  • Israel-Gaza and content moderation: The Intercept discusses the moderation of pro-Palestinian content. The article explores the over-representation of views from Israel and the lack of representation of Palestinian viewpoints; content from Students for Justice in Palestine has reportedly been flagged for moderation numerous times.
  • Disaster response disinfo: Following Hurricanes Milton and Helene, Russia has launched an effort to undermine trust in disaster relief agencies and in support for Ukraine, including allegations of administrative failure and of foreign aid being prioritised over disaster relief.
  • Georgia and Facebook ads: The Georgian Dream Party (Georgia’s ruling party) has used Facebook ads to push the narrative that the West was trying to interfere in the 26 October Georgian election. Anti-Western rhetoric has reportedly increased in Georgia since the Russian invasion of Ukraine.
  • Georgians on our minds: We’re very concerned about recent incidents targeting Georgian members of the counter-disinformation community, such as the recent raids on the homes of Eto Buziashvili and Sopo Gelava, the seizure of their computers, and the freezing of their bank accounts. The work accomplished by civil society in Georgia in exposing disinformation cannot be subject to any intimidation or threats and should be supported as widely as possible.
  • UK sanctions Russian Doppelganger: The UK government has sanctioned several Russian agencies and individuals for their involvement in spreading disinformation about the Princess of Wales. The disinformation has been attributed to the Doppelganger operation; the group promoted rumours about the princess that were dispelled when she revealed her cancer diagnosis.
  • Azerbaijan and COP29: The UN’s COP29 climate summit, taking place in Azerbaijan, has been the subject of conversation on X. An investigation by Global Witness has uncovered a network of suspicious accounts using hashtags to amplify official Azerbaijani government messages.
  • Can you identify electoral misinformation?: The Washington Post has published an article where you can test your skills at spotting fake news. The article uses social media posts about the US presidential elections from various sources. At the end of the quiz, you can see how you compare to others who have taken the quiz. 
  • Deepfakes and misinformation: Russian documents obtained from European intelligence services show an effort to create deepfakes and circulate misinformation about US presidential candidate Kamala Harris. According to these documents, a former Palm Beach County sheriff was given funds by a GRU (Russian Military Intelligence Service) officer to create and disseminate this disinformation. Newsguard has created a repository of articles that discuss the disinformation shared by the former sheriff.

Brussels corner

  • DSA public consultation: The European Commission has opened a public consultation on access to online platform data for vetted researchers under the DSA (Digital Services Act). The consultation concerns a draft delegated act on researcher access to online platform data, which is currently open for feedback; follow the link to have your say.
  • The Transparency Onion: The European Commission has published an implementing regulation outlining the templates for transparency reporting obligations for VLOPs (Very Large Online Platforms) and online intermediaries. Disinformation is included as a category that must be reported biannually or annually, respectively. However, upon closer inspection, we’re concerned with risks of inconsistent reporting. For instance, the template identifies “Gendered disinformation,” “Disinformation, Misinformation and foreign interference,” and “Inauthentic accounts” as three distinct categories. Yet platforms and online intermediaries must only report each instance of moderated content once. As a result, there is a likelihood that, for example, fake accounts operated by Russian IOs (Information Operations) falsely claiming that Kamala Harris is not a woman may be reported inconsistently over time or across different platforms, raising further issues around how this data can be analysed.

Reading & resources

  • Social media and disinformation: This podcast, Marianna in Conspiracyland, explores social media and disinformation. The podcast is hosted by Marianna Spring, the BBC’s social media and disinformation correspondent. Some recent episodes explore the Taylor Swift fandom, the Israel-Gaza war, and the assassination attempt against Donald Trump.
  • Mis/disinformation platform: This report from the Institute for Strategic Dialogue discusses the Irish Channel, a website with associated social media accounts that spread misinformation. The website shared falsified quotes alleging election interference in June 2024. Additionally, the Irish Channel spreads content that supports far-right ideologies such as anti-immigrant narratives.
  • Counter disinformation network: Women vs. Disinfo is a campaign organised by NATO: a network of women in the counter-disinformation field. Videos of counter-disinformation practitioners discussing their experiences and expertise will be shared as part of this series. You can find these videos on the Instagram page of Marie-Doha Besancenot, the NATO Assistant Secretary General for Public Diplomacy.
  • Prebunking with AI: This article discusses the use of AI as a pre-bunking tool. The authors conducted a study in August 2024 with 4,293 registered voters in America, exploring AI as a tool for prebunking electoral misinformation. The study found that the use of LLM-assisted prebunking was effective in reducing belief in election myths across party lines. 
  • Health and mis/disinformation: The Lancet discusses the trend of polarisation around health and the role that mis/disinformation plays in it. The article covers sensitive healthcare topics that are often politicised; these relate to bodily autonomy, the interests of large corporations, and areas historically neglected by mainstream healthcare (for example, women’s health).
  • Climate disinformation: The Global Covenant of Mayors for Climate and Energy has published a report about responding to climate disinformation. The report discusses what disinformation is, the trend of climate denial, and principles of effective communication.
  • Mis/disinformation tracker: Newsguard has created a database that tracks election misinformation, covering websites and social media accounts that publish false or misleading election information. The database also tracks partisan websites that claim to be politically neutral.
  • Foreign interference: DFRLab has published “Interference 2024”, a resource that tracks instances of foreign interference in the US elections. The resource provides the date of attribution, the source that identified it, the nation behind the interference (if it has been determined), and the campaign it is linked to.

The latest from EU DisinfoLab 

Events & announcements

  • 1-29 November: All relevant stakeholders and interested individuals are invited to participate in the public consultation and thus contribute to the finalisation of IWA 44 – Unique Media Identifier (UMId) for distribution channels and brands. Check out the recording of our previous webinar on unique media identifiers.
  • 12 November: Les Surligneurs will be hosting a debate on internet regulation; the event will take place in Paris.
  • 14 November: EEAS will be hosting an event on identity-based disinformation. The event, Identity-Based Disinformation in FIMI: Countering the Weaponization of Who We Are, will take place both online and in person (in Brussels). Our researchers Maria Giovanna Sessa and Raquel Miguel Serrano will give a workshop on how to carry out an OSINT investigation to document cases of Identity-Based Disinformation/FIMI. If you can’t attend the event, we will host a webinar on 12 December on the same topic. Stay tuned for more information. 
  • 14 November: ARCOM, the French Audiovisual and Digital Communication Regulatory Authority, will be hosting Arcom Research Day. The event will be held in Paris and streamed online.
  • 15-21 November: The C+ Pocket Project and DeSmog will be hosting the Climate Consciousness Summit.
  • 20 November: The European Liberal Forum will be hosting the Techno-Politics 2024 Forum in Brussels.
  • 21 November: Zevedi will be hosting a workshop about the DSA and what it means for fighting disinformation in Germany.
  • 22 November: CYBERWARCON will take place in Arlington, Virginia, US, bringing together attendees from diverse backgrounds to identify and explore cyber threats.
  • 2-3 December: The European Media and Information Fund (EMIF) will host its two-day winter event in Florence, Italy, to discuss disinformation trends and provide networking opportunities.
  • 5 December: Forum Europe will be hosting the International AI Summit in Brussels.
  • 6 December: ThinkYoung will be hosting a Hackathon in Brussels; the deadline for applications is 6 November.
  • 6-10 January 2025: The Digital Methods Initiative (DMI), Amsterdam, is holding its annual Winter School on ‘Chatbots for Internet Research?’.
  • 24-27 February 2025: RightsCon 2025, hosted by Access Now, will be held in Taipei, Taiwan, and will bring together leaders and activists to discuss key digital rights issues.
  • 22-25 May 2025: Dataharvest, the European Investigative Journalism Conference, will take place in Mechelen, Belgium.
  • 15-16 October 2025: After this year’s brilliant outcome, the EU DisinfoLab annual conference #Disinfo2025 will be bringing the community together again, this time in Ljubljana, Slovenia. Mark your calendars!

Jobs

  • ProPublica is hiring for several positions including assistant director of leadership gifts, freelance pitch form, data editor, and communications manager.
  • Debunk.org is hiring digital literacy trainers in Lithuania, Estonia, Latvia, North Macedonia, Poland, Armenia, Bosnia and Herzegovina, Georgia, Moldova, Montenegro, Ukraine, Belarus, Kazakhstan, Uzbekistan, Kyrgyz Republic, and Azerbaijan. 
  • Newsguard is hiring for several positions, including director of marketing, staff reporter on media and misinformation, contributing analyst (UK), and contributing analyst (Ukraine).
  • DFRLab has several openings including a Moldova research assistant, a publications editor, and an Armenia research assistant.
  • The Center for Countering Digital Hate has several openings including a UK government and parliamentary affairs officer, senior press manager, director of advertising industry advocacy, senior US policy associate, and US programs associate.
  • The Centre for Information Resilience has two job openings, engagement manager and events coordinator.
  • Internews has an opening for a civic defenders program associate.
  • The Global Internet Forum to Counter Terrorism is hiring for several positions, including memberships and programs associate, memberships and programs coordinator, and operations associate.
  • EU DisinfoLab is looking for an intern to join our team.

This good X! 

Check out this thread from Mathias Vermeulen where he talks about Article 40 of the DSA.