Dear Disinfo Update readers,
In this edition of the Disinfo Update, we will explore developments regarding the Doppelganger operation, specifically the crackdown on Russian influence operations. In addition, we will dive into other instances of FIMI and the US presidential debate.
How can we improve this newsletter? We want to hear from you! If you haven’t already, please take a few minutes to fill out our reader survey. Your feedback is invaluable to us as we strive to create a newsletter that reflects your interests, preferences, and passions. Whether it’s the topics you enjoy the most, or new ideas you’d love to see, your insights will help us tailor our content to better serve you. Together, we can make sure every issue is packed with the stories, updates, and resources that resonate with you.
Our webinars
Last week, we welcomed Polina Malaja, Policy Director at CENTR, to dig into domain name dispute resolution in our webinar “Cracking down on media impersonation: Harnessing the power of domain name dispute resolution”. Watch the recording to see the insightful discussion between her and our Senior Policy Expert Joe McNamee.
This year, we have held an impressive 22 webinars. In case you missed some of them, you can find the recordings here. Dive in!
Disinfo news & updates
- X banned in Brazil: The Brazilian Supreme Court has upheld a ban on X, the social media platform, after it failed to appoint a new legal representative in the country. In a column for The Guardian, John Naughton discusses how tensions are growing between democratic states and tech platforms, citing events such as the recent arrest of Telegram’s CEO Pavel Durov or ongoing discussions over breaking up Google in the United States.
- Chinese FIMI and the US, #TheAmericans: A new report from Graphika highlights the latest avatars of an influence operation known as “Spamouflage”, which has been linked to the Chinese state. The report describes how the creation of fake American personas is being used as part of efforts to influence the 2024 US Presidential election.
- Southport and social media: A report from the Institute for Strategic Dialogue (ISD) discusses the rise of Islamophobia and xenophobia following the Southport attack. The publication describes how X and Telegram were used as avenues for spreading anti-Muslim and anti-migrant narratives, and how the spread of xenophobic disinformation led to acts of violence and riots.
- Media smear campaign: A coordinated smear campaign targeting the Hungarian independent media outlet 444.hu and its partners in The Eastern Frontier Initiative (TEFI) – including Bellingcat and Gazeta Wyborcza – has raised concerns about press freedom in Hungary. The campaign, led by pro-government media, accuses the group of conducting “information warfare” and being influenced by foreign intelligence services. The Media Freedom Rapid Response (MFRR) condemned the attacks, warning of potential official harassment of critical voices in Hungarian media.
- Australia threatens fines: Legislation has been introduced to Parliament in Australia to hold platforms accountable for the spread of online misinformation. If passed, social media platforms will be fined up to 5% of their global revenue if they fail to curb the spread of misinformation that could harm election integrity, public health, or infrastructure.
- Mis-/disinformation at the debate: The US presidential debate between Donald Trump and Kamala Harris was marked by many false claims spanning topics such as the Israel-Gaza conflict, the war in Ukraine, crime rates, abortion policies, Project 2025, immigration (more specifically, the claim about immigrants eating pets), unemployment figures, the 2020 election, the economy, and the Central Park Five case. Both The Guardian and Al Jazeera have analysed these statements in depth, offering clarifications on the misinformation presented throughout the debate.
- Training AI on user data: Meta has publicly acknowledged that it uses content from Instagram and Facebook – text and images alike – to train its AI, tapping into data published as far back as 2007. While this raises privacy concerns, users in the EU are able to opt out thanks to strict regulations.
- Big day in court: TikTok told a US court that forcing its sale or banning it would harm free speech for its 170 million American users. The law stems from concerns about data misuse by China, which TikTok denies. A panel of three judges debated at length if the law targets TikTok specifically or broader foreign control.
Russia’s pillars of disinformation
- US fighting FIMI: The Biden administration has taken significant action to counter a Russian influence operation attempting to disrupt the 2024 US elections. These measures include imposing sanctions, filing criminal charges against two Russian nationals, and seizing 32 internet domains linked to the operation. A key player behind this campaign is the Social Design Agency (SDA), previously spotlighted by EU DisinfoLab for its involvement in the Doppelganger operation, a disinformation effort targeting Western democracies. Learn more about this here.
- The Kremlin’s social media playbook: A recent DOJ (Department of Justice) indictment alleges that Tenet Media was funded by Russia and employed several conservative influencers to create divisive content promoting pro-Kremlin narratives. Russia’s influence operations are shifting away from bots and trolls, focusing instead on social media influencers with massive followings to spread disinformation. Through these “superspreader” accounts, Russia aims to polarise the US and weaken its foreign policy.
- Doppelganger operation and the US: DFRLab’s report gives background on how Russian influence operations like the Doppelganger campaign are being used to influence the 2024 US elections through sophisticated tactics such as cloning legitimate news sites, creating fake news articles, and laundering false stories through credible platforms.
- RT Alert: The US Department of State has issued a fact sheet about RT (formerly Russia Today) and its global covert activities, which include attempts to influence public opinion and interfere in democratic processes. The alert highlights RT’s extensive use of misinformation and propaganda, as well as its cyber capacity.
Brussels corner
- Compliance investigation: The Irish digital services coordinator, Coimisiún na Meán, has launched an investigation into online platforms’ compliance with the Digital Services Act (DSA) obligations to provide points of contact for users (Article 12) and “easy to access and user-friendly” methods to report allegedly illegal content (Article 16). The investigation consists of a Request for Information (RFI) sent to a range of platforms, including both designated “Very Large Online Platforms” and non-designated platforms.
- AI global treaty: The Council of Europe has launched its first global treaty on AI, inviting countries to sign the agreement. The treaty aims to ensure AI systems uphold human rights, democracy, and the rule of law. The framework has been signed by Andorra, Georgia, Iceland, Norway, Moldova, San Marino, the UK, Israel, the US, and the EU.
- Google’s major fine: An EU court has ruled that Google must pay a €2.4 billion fine. Google was fined for “abusing the market dominance of its shopping comparison service”.
Reading & resources
- Online hate and offline violence: This study from the Stimson Center explores how online false narratives lead to offline violence and disrupt social cohesion, using the United States as a case study. The project explores systematic tracking and data collection. It also provides recommendations to better identify early warning signs of harmful online content and mitigate its impact.
- Geolocation tool: Geospy is an AI tool from Graylark that can be used to assist with geolocation.
- Making OSINT easier: “Information Laundromat” is a tool that allows users to do a “Content Similarity Search” and a “Metadata Similarity Search”. This online resource could be useful for those conducting OSINT investigations.
- OSINT advice for oil spills: Bellingcat has published a guide for detecting oil spills with open source techniques.
- Protecting against election misinformation: New York State Attorney General Letitia James has published an AI electoral misinformation guide. It covers the nature of deepfakes and chatbots, and offers strategies for identifying content produced with these tools. In addition, the guide provides several links to reliable election information sources.
- X’s AI gets tested: The Center for Countering Digital Hate has published a report on X’s Grok AI revealing that the technology was exploited to produce election disinformation. The report includes examples of images generated by Grok AI that depict Donald Trump sick in a hospital bed and Kamala Harris with illicit substances. An article from The Guardian reports the efforts of a group of secretaries of state (and the national association of secretaries of state) who encouraged X to direct Grok users to trusted voting information sites. As a result, Grok now directs users to vote.gov when they inquire about elections.
- Telegram and misinformation: Telegram has been used as a staging ground for misinformation about the Russia-Ukraine war. Newsguard reports that 42% of the content they debunked regarding the conflict originated on Telegram.
- Algorithm education: An article from Nieman Lab argues that educating people about algorithms can be a useful way to combat misinformation. The article points to this recent study that explores the “algorithmic knowledge gap” in the United States, the United Kingdom, South Korea, and Mexico.
- Social media and extremism: This report from PBS includes interviews with Katie McHugh, former writer and editor for Breitbart, and Chris Bail, the founder of Duke University’s “Polarization Lab”. The video discusses extremism, misinformation, and how they are amplified by social media.
- AI combating conspiracy theories: A recent study tests an AI, called DebunkBot, trained to engage with people who believe in conspiracy theories. According to the study, engagement with DebunkBot can reduce an individual’s belief in these theories by promoting critical thinking and providing factual counterarguments.
This week’s recommended read
Paris 2024 is officially over, and while some of us are eagerly replaying the best moments (we see you, Alex), it’s also time to reflect on the attempts at information manipulation that occurred during this global event. To address this, VIGINUM has published a report on the FIMI (Foreign Information Manipulation and Interference) attempts observed during the Paris Olympics and Paralympics. If you’re not up for translating it, or if your French is a bit rusty, here are some key takeaways:
- Forty-three attempts were identified, summarised into eight TTPs (Tactics, Techniques, and Procedures): false flag operations, amplification of alleged physical events, doxxing, non-transparent use of influencers, creation of original video content (including impersonation), hashtag creation, video decontextualisation, and the management of inauthentic accounts.
- The narratives largely focused on claims that Paris was not a safe or suitable location for athletes, the impact on local residents, and smear campaigns against specific national competitors.
- Interestingly, most documented attempts were not carried out in French, but in other languages (Mandarin, Spanish, Hebrew, etc.), raising many questions about the intended target audience of these operations.
Events & announcements
- 20 September: Bellingcat will launch a training on command line tools for open source researchers (for absolute beginners). This training will be held online.
- 25-26 September: The European Commission Joint Research Centre (JRC) Annual Disinformation Workshop 2024, themed ‘How the EU Tackles Disinformation,’ will take place in Ispra, Italy.
- 26 September: DisinfoCon 2024 – Taking stock of Information Integrity in the Age of AI is a conference for civil society, policymakers, journalists, and AI practitioners. The event will take place in Berlin – if you can’t join in person, you can attend virtually.
- 30 September: In partnership with Birmingham City University, Meedan has announced a call for doctoral research applications focused on electoral misinformation and technology-facilitated gender-based violence.
- 1 October: The Tech and Society Summit will bring together civil society and EU decision-makers in Brussels, Belgium, to explore the interplay between technology, societal impacts, and environmental sustainability.
- 2 October: Climate Action Network (CAN) Europe has put out a call for registration for the Western Balkans Youth Climate FORUM 2024. The forum will have an online and in-person component; it begins on 2 October and continues until the end of November.
- 3 October: Adapt Institute will be hosting their conference Disinformation & Democracy, a virtual event bringing together experts from the public and private sectors, academia, and non-governmental organisations.
- 8 October: NATO StratCom will be hosting a one-day event in Riga, Latvia, titled Emerging Trends in Social Media.
- 9-10 October: We will be hosting our annual conference in Riga, Latvia. Register now! #Disinfo2024.
- 10 October: News Impact Summit: Fighting climate misinformation in Copenhagen, organised by the European Journalism Centre, will address how climate misinformation undermines public trust in climate policies and stalls progress toward a green transition.
- 14 October: Politico will be hosting an event on AI and elections. Potential viewers will be able to register to watch it online as well as apply to attend in person.
- 16 October: UNESCO will organise a webinar “Countering climate disinformation: strengthening global citizenship education and media literacy.”
- 29 October: Coordinated Sharing Behavior Detection Conference will bring together experts to showcase, discuss, and advance the state of the art in multimodal and cross-platform coordinated behaviour detection.
- 14 November: ARCOM, the French Audiovisual and Digital Communication Regulatory Authority, will be hosting Arcom Research Day. The event will be held in Paris as well as streamed online.
- 22 November: CYBERWARCON will take place in Arlington, Virginia, US, bringing together attendees from diverse backgrounds to identify and explore cyber threats.
- 2-3 December: European Media and Information Fund (EMIF) will be hosting its two-day winter event in Florence, Italy, to discuss disinformation trends and provide opportunities for networking.
- 24-27 February 2025: RightsCon 2025, hosted by Access Now, will be held in Taipei, Taiwan, and will bring together leaders and activists to discuss key digital rights issues.
- 23-25 April 2025: A call for papers has been announced for The Cambridge Disinformation Summit, the deadline for application is 25 October.
- 22-25 May 2025: Dataharvest. The European Investigative Journalism Conference will take place in Mechelen, Belgium. Save the date!
Jobs
- Internews has a job opening for an Information Management Officer.
- US Senator Brian Schatz is hiring a tech policy legislative assistant/counsel.
- Science Feedback is looking for an applied researcher to study online misinformation (Paris-based).
- The Center for Democracy & Technology Europe is hiring for two positions, a Legal and Advocacy Officer as well as a Policy and Research Officer.
- The Institute for Strategic Dialogue has several job openings: Executive Director (Germany/hybrid), AI Research Analyst (US/hybrid), Communications and Research Intern (US/hybrid), Communications Coordinator (US/hybrid), Business Development Manager (UK/hybrid), and Program Manager for Strong Cities (US).
- Barcelona Supercomputing Center is accepting applications for Group leader of the Social Link Analytics unit – biases and evaluation of AI tools in health / disinformation in social networks / Bioinfo4Women.
- Check First has an opening for a Front End Developer.
- The Center for Countering Digital Hate (CCDH) has multiple job openings, including Director of Creative Industry Engagement (US), Head of Communications (US), Research Officer (UK), Data Scientist (flexible, with the option of using the London office), and HR Coordinator (UK).
- Debunk.org has multiple openings including Researcher and Analyst for Disinformation Analysis, Media Literacy Expert on Disinformation, Researcher and Analyst for Disinformation Analysis in Mongolia, Administrator/Project Coordinator for Countering Disinformation, Voice Actors for Digital Literacy Program, and Translators for Digital Literacy Program.
- Logically has several vacancies, including Assistant Editor (India), Fact Checker – Telugu (India), Media Literacy Trainer (India), Senior NLP Engineer (Spain), and VP of Business Development (US).
- The International Committee of the Red Cross is hiring an Adviser on addressing harmful information (Cairo).