Dear Disinfo Update readers,
Welcome to this edition of the newsletter. Our previous update was just three weeks ago, but given the significance of recent events for our community, it feels more like three years. Here’s a message from our Executive Director Alexandre Alaphilippe:
What we have been witnessing in the United States every day since President Trump took office is a severe attack on the public interest and on an accountable information ecosystem. This assault is multifaceted and includes the removal of mentions of climate change or reproductive rights from several government websites. Another striking example is the U.S. National Transportation Safety Board’s decision to stop sending media briefings by email and to deliver them exclusively through Elon Musk’s platform, increasing the risk of private capture of public information distribution.
Additionally, the dismantling of numerous programmes supporting the fight against disinformation worldwide – previously funded by now-jeopardised U.S. development aid – is a concerning trend that already impacts civil society in the European Union. American Sunlight Project’s CEO Nina Jankowicz calls on those affected to take the floor, because “staying quiet is not going to do you any good.”
In such moments of shock, the importance of checks and balances cannot be overstated. Independent justice and the rule of law are the foundation of any democracy. We must never take for granted that the executive and legislative branches will always act within legal and constitutional limits. These events also highlight how fragile democracy is and how quickly it can be deeply wounded. It is up to institutions, civil society, and the press to speak up, hold political and financial powers accountable, and ensure that fundamental rights and freedoms are preserved. It is our responsibility and mission not to give up, not to lose hope, and to hold the line, as Maria Ressa said.
Democracy does not die in darkness; all of this happens in plain sight. Democracy dies in silence.
Our webinars
UPCOMING – REGISTER NOW
- 20 February: Clearing the air: building trust against climate misinformation storms | This session with Jodie Molyneux from Resolver will explore how false or misleading narratives about climate change spread across social media platforms, their tangible impacts on society, and the critical harm they can cause in the real world.
- 6 March: Melodies of malice: understanding how AI fuels the creation and spread of extremist music | In this session, we will explore “Melodies of Malice”, a report by the Global Network on Extremism and Technology (GNET), with Heron Lopes, uncovering how far-right online communities harness AI to craft music propaganda and spread disinformation across platforms like Facebook, Instagram, and YouTube.
PAST – WATCH THE RECORDINGS!
- New approaches to measuring the impact of counter-disinformation campaigns | In this webinar, Sviatoslav Hnizdovskyi, Founder and CEO of OpenMinds, talked about his work on measuring the impact of disinformation and counter-narratives on public attitudes and behaviours.
- Elections at risk? Disinformation campaigns observed in Romania and the role of the DSA in risk mitigation | In this session, we examined some of the organised disinformation campaigns that targeted the Romanian social media space ahead of the presidential elections in November 2024, explored how policy tools like the DSA can mitigate these risks, and discussed actionable steps to effectively assess and respond to emerging risks related to elections.
Disinfo news & updates
Disinfo & democracy
- The podcaster effect: boosting Trump. A Bloomberg analysis of over 2,000 videos reveals how popular YouTubers and podcasters helped mobilise young male voters in support of Donald Trump’s 2024 presidential win. By blending political messages with entertainment, the influencers shaped discussions and positioned themselves as an alternative press corps for Trump’s agenda.
- Vanishing public records. US government websites are quietly removing or archiving key information, raising alarms among researchers and transparency advocates. As federal agencies undergo leadership changes, data on critical topics like climate change, health, and civil rights is disappearing in real time.
- Kremlin games: Poland edition. Poland has exposed a Russia-linked disinformation campaign aimed at disrupting its upcoming presidential election in May, according to the country’s digital affairs minister. The campaign seems to go beyond simple online propaganda and new reports reveal that Russian intelligence agencies have been recruiting Polish citizens to spread disinformation in exchange for hefty rewards.
- A fair price? Meta has agreed to pay $25 million to settle a 2021 lawsuit in which Donald Trump accused the platform of censorship and political bias after it suspended his accounts following the Capitol attack.
- Flooding Spain with disinformation. An investigation by Graphika reveals that a Chinese influence operation, “Spamouflage,” targeted Spain with disinformation campaigns aimed at destabilising the government. The operation exploited the Valencia floods, amplifying false narratives that blamed the Spanish government’s response and promoting anti-government sentiment.
- Influence by design. A joint report by AI Forensics, Check First, and Reset Tech reveals how Meta accepted payments from the Kremlin-linked Social Design Agency (SDA) to run over 8,000 political ads, despite international sanctions.
Election watch: Germany
- Safeguarding the vote. Germany is calling for a tougher EU response to Russian hybrid threats, as lawmakers warn of disinformation campaigns aimed at manipulating the upcoming elections.
- Stress-testing for disinfo. Ahead of the elections, German authorities are ramping up efforts to counter potential social media interference, combining stress tests of major platforms under the EU’s Digital Services Act (DSA) with broader monitoring strategies.
- Ads under scrutiny. German authorities are investigating whether the Alternative for Germany (AfD) received illegal party donations after ads promoting the party appeared on X without proper disclosure. The ads could be considered in-kind contributions, potentially violating the country’s campaign finance laws. Read more here (in German).
- AI & the usual suspect. A joint investigation by NewsGuard and Correctiv uncovered a network of 102 Russia-linked websites spreading AI-generated disinformation ahead of Germany’s election. The sites push false narratives targeting pro-NATO politicians while favoring Russia-friendly parties like the far-right AfD.
- Resources for researchers. The Center for Monitoring, Analysis, and Strategy (CeMAS) offers a dedicated platform for tracking disinformation related to the elections. It provides resources for researchers, focusing on election interference, including pro-Russian and extremist narratives. Correctiv complements these efforts with similar research.
Unchecking the facts
- Boost to state-sponsored disinfo? Meta’s decision to ditch its fact-checking programme is raising concerns about the spread of state-sponsored disinformation. Experts warn that without robust fact-checking, authoritarian regimes and other actors will have more freedom to manipulate narratives, amplify false content and undermine democratic discourse globally.
- …and blow to advertiser trust? Meta’s decision has also raised concerns among advertisers over potential harmful content and disinformation which could force brands to reconsider their ad spending. The overhaul, seen as part of Mark Zuckerberg’s broader shift toward a “free speech” approach, risks alienating brands that rely on Meta’s advertising platforms.
- Targeted truth. Fact-checkers have become prime targets of populist firebrands and online abuse, as debates intensify over content moderation, censorship, and political bias. While some accuse platforms of silencing conservatives, fact-checkers warn that reducing their role could weaken global efforts against disinformation.
- Same same but different. Meta’s fact-checking programme flagged only 14% of disinformation posts from Russia, China, and Iran before its shutdown, according to research highlighted in this NewsGuard report. The low rate was attributed to limited fact-checking coverage and Meta’s policy of not labelling similar narratives when the wording was slightly altered.
Climate vs. profit
- Toxic plastic spin. This Le Monde investigation (in French) reveals how the plastics industry has spread misleading narratives and cast doubt on scientific evidence to downplay the environmental and health risks of PFAS, aka “forever chemicals”, aiming to shape public perception and delay regulatory action.
- Exporting climate denial. The Heartland Institute, a group known for its climate denial, is actively collaborating with far-right allies in opposition to EU environmental policies, including the Nature Restoration Law. Its methods include leveraging far-right political networks and amplifying narratives that downplay climate science, while using misleading tactics to shape public and political opinion. These efforts aim to delay green policies across Europe.
- More transatlantic anti-green efforts. This Global Witness investigation highlights how AI chatbots risk promoting fossil fuel-friendly narratives through bothsidesism, the false balance of presenting climate science and fossil fuel defences as equally valid. This issue aligns with findings from a DeSmog investigation, which mapped a growing transatlantic anti-green network working to undermine climate policies and promote fossil fuel interests.
Artificial Influence
- In too deep(seek). This NewsGuard investigation reveals that the AI-powered chatbot DeepSeek failed to detect disinformation 83% of the time when tested on topics targeted by state-backed campaigns from China, Russia, and Iran. Instead of flagging false claims, the chatbot often amplified them.
- Lower literacy – higher receptivity. A study suggests that individuals who know less about artificial intelligence are more open to integrating it into their daily lives, and that a lack of knowledge often leads to lower perceptions of risk and fewer concerns about AI’s societal impact.
- AI breaks the news. Apple has disabled the AI-powered news summary feature in Apple Intelligence following several high-profile errors, including a false news summary attributed to the BBC. Read more here (in French).
Brussels corner
- No place for illegal hate. Major online platforms have agreed to tougher measures against illegal hate speech through a strengthened Code of Conduct integrated into the framework of the Digital Services Act (DSA).
- Probing X’s algorithms. The European Commission has ordered Elon Musk’s X to hand over internal documents on its recommendation algorithm as part of an expanded probe into the platform’s influence in European politics. The investigation follows complaints from German politicians that X is amplifying far-right content ahead of key elections.
- Google: No fact-checks. Google has informed the European Commission that it will not integrate fact-checking into its search results or YouTube rankings, despite obligations under the EU’s Code of Practice on disinformation. Citing concerns over effectiveness, the company plans to withdraw from its fact-checking commitments and will instead continue to rely on its current content moderation practices, including contextual tools and AI-driven transparency efforts.
- Standing up to Russian disinfo. In a resolution, the European Parliament condemns Russia’s disinformation campaign aimed at justifying its war in Ukraine and calls for sanctions against the media outlets spreading false narratives, as well as stronger international cooperation.
- United to shield elections. France, Germany, and ten other EU member states urge the European Commission to leverage the Digital Services Act (DSA) to safeguard European elections from foreign interference.
Reading & resources
- When do parties lie? This study examines how radical right populist politicians strategically amplify disinformation to reinforce anti-establishment messaging, erode trust in democratic institutions, and mobilise voters against mainstream parties. The authors argue that disinformation should be studied as a political strategy embedded in party politics, not just as a media phenomenon.
- Influencers for hire. This study analyses the covert use of social media influencers during the 2022 Philippine elections, revealing how candidates leveraged influencer-led campaigns across platforms.
- Delaying climate action. This study examines how fossil fuel companies use discursive delay tactics to obstruct climate action. By analysing corporate messaging, it identifies key narratives that shift blame, minimise urgency, and sustain dependence on fossil fuels.
- Media literacy vs politics. This study challenges the emphasis on digital literacy as the main factor driving vulnerability to misinformation. Instead, it argues that standard political variables — such as political interest and calcified partisanship, which increase with age — play a larger role.
- The common solution. This article questions whether interventions against misinformation in diverse formats are genuinely distinct or share fundamental similarities across different contexts. The author argues that most interventions target common cognitive vulnerabilities and psychological processes, and suggests that focusing on these shared mechanisms can help design more scalable and effective counter-disinformation strategies.
- Criminal networks of disinformation. This article argues that disinformation should be tackled using strategies similar to those used against organised crime. By framing disinformation networks as criminal enterprises, it highlights the need for coordinated responses, including stronger regulation, intelligence sharing, and legal action.
- Disinformation tops the global risk list. The World Economic Forum’s 20th Global Risks Report highlights misinformation and disinformation as top short-term risks for the second consecutive year. The report warns of their ongoing threat to societal cohesion, governance, and international stability by eroding trust, deepening divisions, and undermining collective action in addressing global challenges.
- Systemic risks monitoring for elections. This report by VIGINUM (in French) analyses disinformation tactics observed during Romania’s 2024 presidential election, in which the first-round results were annulled due to irregularities, and warns of the risks they pose to France. The report highlights how malicious actors manipulated TikTok’s recommendation algorithms to amplify content and covertly enlisted influencers to shape public opinion. (Watch also the recording of our recent webinar on the same topic.)
- Expanding media efforts. Maldita has launched a new content agency aimed at helping media outlets and organisations combat disinformation more effectively. The agency will provide articles, multimedia content, and training, building on their expertise in countering false narratives and promoting digital literacy across Spain. Read more here (in Spanish).
This week’s recommended read
This week, our recommended read, proposed by our Project Officer Inès Gentil, is a newly revised report by NATO StratCom COE on Kremlin information influence operations (IIOs) in Egypt, Mali, Kenya, South Africa, and the UAE – countries selected for their strategic significance and ties to key issues like food and energy security, military influence, and trade relationships with Russia.
The findings highlight how historical narratives, particularly colonial memory and perceived Western hypocrisy, are leveraged to strengthen pro-Kremlin sentiment. Russia’s messaging pushes three core narratives – strength, salvation, and sovereignty – to build support for Russia, foster distrust of the West, and position the country as a defender of “traditional values” against Western influence. The report also details how digital platforms are used to spread disinformation, with Kremlin-backed content often outweighing Western strategic communications.
Additionally, the report outlines key Tactics, Techniques, and Procedures (TTPs) used in Kremlin IIOs, such as astroturfing, doxing, media capture, and information laundering. It contrasts these efforts with Western strategic communications, which, despite their commitment to universal values, often struggle with volume, local language barriers, and credibility gaps.
The report shows that countering these operations requires a deep understanding of local contexts, innovative approaches, and engagement with local voices to ensure effectiveness and avoid perceptions of paternalism or neocolonialism.
The latest from EU DisinfoLab
- Visualising CIB. Our new report “Visual assessment of Coordinated Inauthentic Behaviour in disinformation campaigns”, developed under the EU-funded project veraAI, examines three FIMI campaigns. Using a structured visual approach, it assesses how these cases align with key indicators of Coordinated Inauthentic Behaviour (CIB).
- The human side of FIMI. As part of our work in the ATHENA project, we’ve prepared a deliverable that explores the behavioural and societal aspects of Foreign Information Manipulation and Interference (FIMI). Check out the newly released executive summary.
- Update: AI Disinfo Hub. Our AI Disinfo Hub remains your go-to resource for understanding AI’s impact on disinformation and strategies to combat it. In addition to the AI news in this newsletter, you can find the latest updates here.
Events and announcements
- 6 February: Eliot Higgins, founder and director of Bellingcat, will deliver a public lecture titled “Information Disorder” at Leiden University in The Hague, The Netherlands.
- 12-13 February: Two EU-HYBNET workshops will take place in Brussels. The 5th Annual Workshop will showcase key findings from the project, and the Future Trends Workshop will apply the JRC ESPAS methodology to explore emerging risks, compare analyses, and discuss the project’s future transition with the Hybrid CoE.
- 24-27 February: The 13th edition of RightsCon, hosted by Access Now, will take place in Taipei, Taiwan, and online.
- 28 February – 1 March: The European Festival of Journalism and Media Literacy will take place in Zagreb, Croatia.
- 19-20 March: The third edition of Les Journées Infox sur Seine will gather the francophone community working on disinformation and fact-checking to meet and present their current work in Paris, France.
- 9-13 April: The International Journalism Festival will be held in Perugia, Italy. The second edition of the Centre for Information Resilience (CIR) Open Source Film Awards ceremony will be organised as part of the event.
- 23-25 April: The 2025 Cambridge Disinformation Summit will discuss research regarding the efficacy of potential interventions to mitigate the harms from disinformation.
- 22-25 May: Dataharvest, the European Investigative Journalism Conference, will take place in Mechelen, Belgium.
- 18-19 June: The Media & Learning 2025 conference, under the tagline “Educational media that works”, will be held in Leuven, Belgium. Submit your proposal for a session by the end of January.
- 27-28 June: The S.HI.E.L.D. vs Disinformation project will organise the conference “Revisiting Disinformation: Critical Media Literacy Approaches” on Crete, Greece. The call for papers is open until 10 March.
- 15-16 October: See you at our annual conference #Disinfo2025 in Ljubljana, Slovenia! You have until 21 February to submit your proposal.
- 22–23 October: The 17th Dubrovnik Media Days, under the theme “Generative AI: Transforming Journalism, Strategic Communication, and Education”, will take place in Dubrovnik, Croatia. The call for submissions is open until 16 March.
- 20-24 November: The 2025 Global Investigative Journalism Conference will be held in Kuala Lumpur, Malaysia.
- Funding for independent media. Civitates has opened a call to support independent public-interest media outlets in Europe, and welcomes proposals from media organisations registered in Hungary, Poland, Italy, Spain, Bulgaria, Slovakia and Croatia.
Jobs
- The BBC is on the lookout for a Senior Journalist to join their Verify team in London, UK. Applications are open until 4 February.
- Data & Society Research Institute is hiring a Senior HR Manager for a remote position, with candidates required to be eligible to work in the United States.
- European Digital Rights (EDRi) is looking for an Executive Director.
- The Network of European Foundations (NEF) is recruiting a Programme and Grants Officer to support two of its flagship grant-making programmes: the European Democracy Fund (Civitates) and the European AI & Society Fund.
- The European Commission DG CONNECT is seeking to hire a Security analyst: Cyber and hybrid threats.
- The Center for Data Innovation is hiring a Policy Analyst in London, UK.
- The European Policy Centre (EPC) is seeking a Policy Analyst to join its Social Europe and Wellbeing Programme in Brussels, Belgium. Applications are open until 14 February.
- The Defend Democracy Foundation is looking for a Managing / Operational / Business Director a.k.a. COO for a temporary pro bono position, remote with occasional meetings in Brussels and The Hague, with the prospect of a paid long-term contract. Apply by 25 February.
Did you find a job thanks to the listing in this newsletter? We’d love to know – please drop us a message!
This good post!
This thread by Laurent Cordonier (in French) highlights his research revealing that opposition to COVID-19 vaccines differs from traditional antivax beliefs. While the latter typically focus on vaccine safety and health concerns, COVID-related antivax sentiments are more rooted in distrust of institutions, conspiracy theories, and political polarisation.
