Dear Disinfo Update readers,

As we count down the weeks to #Disinfo2024, we are delighted to see that our community is growing – over 350 participants have already registered! Make sure to grab one of the remaining tickets so you don’t miss out on this opportunity to learn, connect, and engage. There are also a few slots still open for sponsoring the conference, so if you’re interested, get in touch ASAP!

In this edition of our newsletter, we explore disinformation in the context of the US elections, the use of AI for foreign interference and election disinformation, and recent cyber attacks.

We want to make sure our newsletters are as relevant and engaging as possible. To do that, we need your help! By taking a few minutes to complete our reader survey, you’ll play a crucial role in helping us understand what you enjoy and what you’d like to see more of.

Our webinars

UPCOMING – REGISTER NOW!

12 September: Domain name dispute resolution to fight media impersonation | This session tackles the growing issue of disinformation networks such as Doppelganger hijacking domain names to impersonate legitimate media outlets. We’ll explore practical strategies for identifying and resolving domain name disputes, equipping media organisations with the tools to protect their brand and credibility in an often deceptive online landscape.

PAST – WATCH THE RECORDINGS!

Advancing synthetic media detection: introducing veraAI | This session delves into the latest advancements in synthetic media detection, with a strong focus on the innovative work conducted within the veraAI project. Symeon (Akis) Papadopoulos shares useful insights from the project.

Watch our other past webinars here.

Disinfo news & updates 

  • UK riots and disinfo dilemma: Two articles from the BBC describe developments relating to the Southport attack and the subsequent spread of disinformation. A man was arrested in Pakistan, accused of cyberterrorism for allegedly creating the false information that fuelled riots across the UK. The charges against him were later dropped after authorities failed to find evidence that he was the source of the disinformation.
  • France-Soir, the price of disinformation: The website France-Soir has lost its official status as an “online press service”. The decision follows allegations that the site promotes misleading medical advice and conspiracy theories, notably during the COVID-19 pandemic. Despite the status change, the website remains active (at the time of publication of this newsletter).
  • POLADA cyber attack: A cyber attack targeting POLADA, Poland’s anti-doping agency, resulted in the release of false doping allegations against top athletes. POLADA quickly debunked the claims, stating that the leaked data was entirely fabricated.
  • AI and public services: A set of AI tools used by the UK government will be published on a public register. The move comes after various campaigns challenged the use of AI systems accused of amplifying ‘racist and biased’ content. While only a few records have been published so far, the register is part of a broader effort to ensure AI is used responsibly and fairly.
  • Bangladesh, Hasina, and disinformation: Following the resignation of former Prime Minister Sheikh Hasina, disinformation in Bangladesh is exacerbating tensions and hampering stabilisation efforts. The false narratives particularly target religious minorities: claims of genocide against the Hindu community have been accompanied by manipulated footage.

Interference, Lies, and AI Misuse in the US Elections

  • Foreign influence and AI: OpenAI has revealed the role of generative AI in an Iranian influence operation known as Storm-2035. This covert campaign exploited ChatGPT to create content aimed at swaying public opinion on various issues, including the US presidential elections. The operation used ChatGPT to generate content for 12 X accounts and one Instagram account; OpenAI has since shut down the associated ChatGPT accounts.
  • Trump campaign cyber attack: Iran has hacked the Trump campaign and leaked internal communications, and has also conducted a spear-phishing attack on campaign officials. Similar attempts were made against Kamala Harris’ campaign, but were ultimately unsuccessful.
  • Elections, Photoshop, and AI: The News Literacy Project has investigated the use of fake celebrity endorsements involving both photoshopped and AI-generated images. Former US President Donald Trump shared manipulated images on Truth Social depicting Taylor Swift and her fans in apparent support of him, captioning the photos “I accept!”. In addition, Ryan Reynolds was photoshopped to appear wearing clothing supporting Vice President Kamala Harris. The News Literacy Project offers useful resources, including a tool known as RumorGuard.
  • Fined over fake robocalls: Lingo Telecom has agreed to pay a $1 million fine imposed by the Federal Communications Commission (FCC) for transmitting fake robocalls mimicking President Joe Biden’s voice. The robocalls were generated using AI voice-cloning technology and aimed to dissuade voters from participating in New Hampshire’s Democratic primary, as part of a disinformation campaign orchestrated by political consultant Steve Kramer.

Reading & resources

  • Digital marketing and AI: A report from NATO StratCom examines the use of AI as a tool for digital marketing and targeted persuasion, including its manipulative use in precision persuasion campaigns.
  • New tools from Bellingcat: Bellingcat’s new Shadow Finder tool offers a powerful boost to OSINT investigations by helping determine where a picture was taken. Given the date and time the picture was taken, plus the height of an object in the frame and the length of the shadow it casts, the tool can narrow down the possible location to a handful of countries and regions (a short sketch of the underlying solar geometry follows this list). In addition, Bellingcat offers another resource called the Smart Image Sorter, which helps with classifying images.
  • Atrocity prevention and disinformation: The United States Agency for International Development (USAID) has updated its Atrocity Prevention Practitioners Guide. The document provides programmatic recommendations for supporting independent media and mitigating risks from disinformation and hate speech; the relevant sections can be found on pages 42-45.
  • OSINT training: Janes has updated its OSINT training, offering both in-person and online sessions.
  • ‘Operation Overload’ podcast: This podcast episode from the International Press Institute discusses ‘Operation Overload’, a disinformation campaign targeting journalists and fact-checkers with pro-Russian content. Guillaume Kuster, co-founder and CEO of Check First, discusses his investigation of the campaign. Watch our past webinar on the same topic!
  • User-led media assessment: MIT Technology Review discusses Trustnet, a new browser extension that enables users to label the accuracy of content and view the assessments a source has received. The tool gives users the opportunity to actively participate in the fact-checking process.
  • You are what you eat: “How Disinformation Spreaders Use Food Topics to Manipulate Europeans”, a report from the Climate Facts Europe project, explores how food disinformation intensifies when tied to international crises.
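
For readers curious about the geometry behind shadow-based geolocation, here is a minimal, illustrative Python sketch of the principle. It is not Bellingcat’s actual Shadow Finder code or API; the function names, grid resolution, and tolerance below are our own assumptions. The idea is that the object-to-shadow ratio implies the sun’s elevation angle, which can then be compared against the computed solar elevation across a coarse latitude/longitude grid for the given UTC time.

import math
from datetime import datetime, timezone

def solar_elevation(lat_deg, lon_deg, when_utc):
    # Approximate solar elevation (degrees) at a location and UTC time,
    # using standard textbook approximations for declination and the equation of time.
    day = when_utc.timetuple().tm_yday
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day + 10)))      # solar declination
    b = math.radians(360.0 / 365.0 * (day - 81))
    eot = 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)   # equation of time (minutes)
    utc_hours = when_utc.hour + when_utc.minute / 60.0 + when_utc.second / 3600.0
    solar_time = utc_hours + lon_deg / 15.0 + eot / 60.0                    # local solar time (hours)
    hour_angle = math.radians(15.0 * (solar_time - 12.0))
    lat, dec = math.radians(lat_deg), math.radians(decl)
    sin_elev = math.sin(lat) * math.sin(dec) + math.cos(lat) * math.cos(dec) * math.cos(hour_angle)
    return math.degrees(math.asin(max(-1.0, min(1.0, sin_elev))))

def candidate_locations(object_height_m, shadow_length_m, when_utc,
                        tolerance_deg=1.0, grid_step_deg=2.0):
    # The object-to-shadow ratio implies the sun's elevation; keep every grid point
    # whose computed solar elevation matches it within the tolerance.
    implied_elev = math.degrees(math.atan2(object_height_m, shadow_length_m))
    matches = []
    lat = -60.0
    while lat <= 70.0:
        lon = -180.0
        while lon < 180.0:
            if abs(solar_elevation(lat, lon, when_utc) - implied_elev) <= tolerance_deg:
                matches.append((lat, lon))
            lon += grid_step_deg
        lat += grid_step_deg
    return implied_elev, matches

# Example: a 2 m pole casting a 3 m shadow at 12:00 UTC on 15 August 2024.
elev, points = candidate_locations(2.0, 3.0, datetime(2024, 8, 15, 12, 0, tzinfo=timezone.utc))
print(f"Implied sun elevation: {elev:.1f} degrees; {len(points)} candidate grid points")

Real tools refine this idea with finer grids, more accurate solar-position models, and map-based visualisation of the matching band.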

This week’s recommended read

This time, our Managing Director Gary Machado suggests reading “Opération spéciale: Dix ans de guerre entre Russie et Ukraine, vus et vécus depuis le Donbass” (“Special operation: Ten years of war between Russia and Ukraine, seen and experienced from Donbass”) by Paul Gogo. This book offers an in-depth and personal perspective on the ongoing conflict. One of the key aspects highlighted is the pervasive influence of propaganda on the Russian-speaking population and the profound challenges in countering this narrative. The book is written in French.

Events & announcements  

  • 25-26 September: The European Commission Joint Research Centre (JRC) Annual Disinformation Workshop 2024, themed ‘How the EU Tackles Disinformation,’ will take place in Ispra, Italy. Remote participation is available. Registration closes on 2 September.
  • 26 September: DisinfoCon 2024 – Taking stock of Information Integrity in the Age of AI is a conference for civil society, policymakers, journalists, and AI practitioners. The event will take place in Berlin – if you can’t join in person, you can attend virtually.
  • 1 October: The Tech and Society Summit will bring together civil society and EU decision-makers in Brussels, Belgium, to explore the interplay between technology, societal impacts, and environmental sustainability.
  • 3 October: Adapt Institute will be hosting their conference Disinformation & Democracy, a virtual event bringing together experts from the public and private sectors, academia, and non-governmental organisations.
  • 8 October: NATO StratCom will be hosting a one-day event in Riga, Latvia, titled Emerging Trends in Social Media.
  • 9-10 October: #Disinfo2024, our annual conference, will take place in Riga, Latvia. Register now! We still have a few slots open for sponsoring Disinfo2024 – get in touch!
  • 10 October: News Impact Summit: Fighting climate misinformation in Copenhagen, organised by the European Journalism Centre, will address how climate misinformation undermines public trust in climate policies and stalls progress toward a green transition.
  • 16 October: UNESCO will organise a webinar “Countering climate disinformation: strengthening global citizenship education and media literacy.”
  • 29 October: Coordinated Sharing Behavior Detection Conference will bring together experts to showcase, discuss, and advance the state of the art in multimodal and cross-platform coordinated behaviour detection.
  • 14 November: ARCOM, the French Audiovisual and Digital Communication Regulatory Authority, will be hosting Arcom Research Day. The event will be held in Paris as well as streamed online.
  • 22 November: CYBERWARCON will take place in Arlington, Virginia, US, bringing together attendees from diverse backgrounds to identify and explore cyber threats.
  • 2-3 December: European Media and Information Fund (EMIF) will be hosting its two-day winter event in Florence, Italy, to discuss disinformation trends and provide opportunities for networking.
  • 24-27 February 2025: RightsCon 2025, hosted by Access Now, will be held in Taipei, Taiwan, and will bring together leaders and activists to discuss key digital rights issues.
  • 23-25 April 2025: A call for papers has been announced for The Cambridge Disinformation Summit; the submission deadline is 25 October.
  • 22-25 May 2025: Dataharvest: The European Investigative Journalism Conference will take place in Mechelen, Belgium. Save the date!

Jobs 

  • Barcelona Supercomputing Center is accepting applications for Group leader of the Social Link Analytics unit – biases and evaluation of AI tools in health / disinformation in social networks / Bioinfo4Women 
  • Check First has an opening for a Front End Developer
  • Access Now is accepting applications for a Security Officer
  • European University Institute has openings for Director of Communications Service and Chair in Political and Social Sciences
  • Bellingcat is accepting applications for an Open Source Analyst Consultancy
  • The PhilTech Research Center for the Philosophy of Technology within the Department of Philosophy at the University of Milan has an opening for a 2-year postdoctoral position within the Project Techne. 
  • The Center for Countering Digital Hate (CCDH) has multiple job openings: Head of Communications (US-based), Research Officer (UK, flexible working in London office or fully remote), Data Scientist (location flexible), UK Parliamentary and Policy Affairs Officer (London-based), and HR Coordinator (London-based).
  • The Centre for Information Resilience (CIR) is hiring an Open Source Investigator – Eyes on Russia
  • The Global Disinformation Index is hiring for several positions, including Chief Impact Officer (remote; based in London, Berlin, or Texas).
  • Debunk.org has multiple openings including Researcher and Analyst for Disinformation Analysis, Media Literacy Expert on Disinformation, Researcher and Analyst for Disinformation Analysis in Mongolia, and Administrator/Project Coordinator for Countering Disinformation
  • Logically has several vacancies including Account Director (UK), Assistant Editor (India), Business Development Representative (UK), Principal Front End Developer (UK), Senior NLP Engineer (Spain), Technical Programme Manager (Bangalore), and VP of Business Development (US).