Dear Disinfo Update readers,

Disinfo2024 is right around the corner, and we couldn’t be more excited! Next week, we’ll be gathering with experts, policymakers, and thought leaders to tackle the most pressing challenges in disinformation, with engaging discussions, networking opportunities, and fresh perspectives. We can’t wait to see you there!

In this edition of our newsletter, we discuss AI and its potential for generating misinformation, climate-related disinformation, and the upcoming US presidential election.

If you haven’t already done so, please take our quick reader survey. It only takes a few minutes, and your feedback can help us improve this newsletter. Don’t miss the chance to have your say!

Our webinars

With 22 webinars already organised since the start of 2024, we’ve been dedicated to providing expert insights on the most pressing issues. Catch up on some of our most insightful ones! Explore how patterns in past elections can offer clues for the future, and examine the emerging AI-related risks that could impact electoral integrity. Dive into the latest AI tools and techniques for digital investigations, uncover Russian influence operations (Doppelganger, CopyCop/False Façade, Overload…), and learn more about how platforms are addressing EU regulations to combat climate disinformation. Don’t miss these recordings, along with all the others.

We’ll be continuing our webinar series after our annual conference, #Disinfo2024, bringing you even more discussions on the latest trends and challenges in the fight against disinformation.

Disinfo news & updates

  • FIMI and the US elections: According to research from Microsoft, a Russian disinformation operation originated a false claim about Vice President Kamala Harris’ involvement in a hit-and-run. The group reportedly paid an actor to pose as the supposed victim, created a video, and published the claim on a website for a nonexistent news outlet.
  • LLMs and disinformation: VeraAI discusses the potential for large language models (LLMs) to generate disinformation. The study explores five disinformation narratives: COVID-19, the Russo-Ukrainian war, health issues, the US election, and regional topics.
  • FBI crackdown: The FBI conducted a joint operation to take down Flax Typhoon, a Chinese botnet that hijacked routers and Internet of Things (IoT) devices like cameras and video recorders. The FBI identified thousands of infected devices and removed the malware from them. The Chinese government has denied the accusations of cyber operations, claiming instead that they are part of a US disinformation campaign aimed at framing China.
  • Musk fined: X was briefly accessible in Brazil recently despite being banned in the country. As a result, the Brazilian government has fined Elon Musk.
  • AI and disinformation: “Relax and enjoy the ride. There is nothing we can do to stop climate change, so there is no point in worrying about it.” This is what Google chatbot Bard told researchers in 2023. DeSmog talks with AI researchers about the risks of widespread fake climate content and how to combat it. 
  • Social media warning labels: A bill introduced in the US Congress would require social media platforms to carry warning labels. These labels, intended to mitigate risks to children, would be created by the US Surgeon General and the Federal Trade Commission (FTC).

Reading & resources

  • OSINT toolkit: Bellingcat has published a new toolkit for online investigations. It provides tools for maps/satellites, geolocation, image/video, social media and more. The toolkit describes the tools, provides the URL, and indicates whether or not the tool is free.
  • Election protections: The Institute for Strategic Dialogue (ISD) has published a policy brief on electoral integrity in the digital age. The document reviews steps taken to mitigate online risks to elections and provides recommendations for potential strategies to protect elections.
  • Atrocity and tech: This article explores AI, mis/disinformation, and weaponised internet access as key features of contemporary mass atrocities. The article provides recommendations for evolving atrocity prevention strategies to better respond to the use of technology in the facilitation of mass atrocities.
  • Greenwashing athletics: This investigation by the New Weather Institute reveals that sport is increasingly one of the areas oil and gas companies are using to greenwash their reputation, betting billions on sponsorships that allow them to paint themselves as public-spirited, generous corporations. The findings in this report suggest that a combination of state-owned and private fossil fuel companies are spending at least $5.6 billion on sponsorship of global sport across 205 active deals. 
  • Tenet media & climate disinformation: A recent investigation reveals that Tenet Media, currently under scrutiny by the US Department of Justice for alleged connections to Russia, has disseminated climate denial messages to more than 16 million followers. Their misleading climate content has garnered over 23.5 million views on X and YouTube, where these platforms still profit from the spread of such harmful information.
  • Election integrity under threat: As the 2024 US presidential election approaches, researchers have sounded the alarm about the growing threat of coordinated online disinformation campaigns. A recent study has exposed a network of inauthentic accounts on social media that are working together to spread misleading information and propaganda.
  • New books: Check out these books about disinformation, conspiracy theories, and algorithms. The Quiet Damage: QAnon and the Destruction of the American Family by Jesselyn Cook tells the stories of five families and the effects of conspiracy theories on them. Invisible Rulers: The People Who Turn Lies into Reality by Renée DiResta explores how certain actors leverage algorithms to spread propaganda. The Lie Detectives: In Search of a Playbook for Winning Elections in the Disinformation Age by Sasha Issenberg discusses the research and strategies being developed to help political campaigns deal with disinformation. An article from the NY Times interviews each of these authors about AI and misinformation’s effect on the US presidential election.

This week’s recommended read

This edition’s recommended reading comes from our Researcher Ana Romero Vicente. She was captivated by the topic of Amy Westervelt’s Climate Week NYC panel, “The Mad Men of Big Oil,” which took place on September 23 and examined the ‘Don Draper’ role of climate disinformation, shedding light on how Hollywood, entertainment, and advertising have shaped the narrative of Big Oil over the years. Although she couldn’t attend, the panel inspired her to revisit an insightful article: “Mad Men: Meet the spin masters who created corporate propaganda to help polluting industries get around democracy.” This compelling read delves deep into the history of corporate disinformation, drawing connections between early public relations strategies and the current state of climate denialism.

The article is a powerful exploration of how disinformation has evolved over the last century. It begins by tracing the origins of PR, highlighting key figures like Ivy Lee and Edward Bernays, who pioneered methods to manipulate public opinion on behalf of powerful industries. The piece reveals how these tactics, initially employed to defend companies like Standard Oil and American Tobacco, laid the groundwork for today’s climate denial and disinformation. It also draws fascinating connections between the disinformation campaigns surrounding tobacco, climate change, and contemporary movements like vaccine hesitancy and political extremism. Understanding this history is essential to grasp the broader context of today’s information pollution, and the article provides a thorough, engaging breakdown.

Events & announcements

  • 3 October: Adapt Institute will be hosting their conference Disinformation & Democracy, a virtual event bringing together experts from the public and private sectors, academia, and non-governmental organisations.
  • 8 October: NATO StratCom will be hosting a one-day event in Riga, Latvia, titled Emerging Trends in Social Media.
  • 9-10 October: Our annual conference #Disinfo2024 will take place next week in Riga, Latvia.
  • 10 October: News Impact Summit: Fighting climate misinformation in Copenhagen, organised by the European Journalism Centre, will address how climate misinformation undermines public trust in climate policies and stalls progress toward a green transition.
  • 14 October: Politico will be hosting an event on AI and elections. Attendees can register to watch it online or apply to attend in person.
  • 16 October: UNESCO will organise a webinar “Countering climate disinformation: strengthening global citizenship education and media literacy.”
  • 29 October: The Coordinated Sharing Behavior Detection Conference will bring together experts to showcase, discuss, and advance the state of the art in multimodal and cross-platform coordinated behaviour detection.
  • 14 November: ARCOM, the French Audiovisual and Digital Communication Regulatory Authority, will be hosting Arcom Research Day. The event will be held in Paris as well as streamed online.
  • 22 November: CYBERWARCON will take place in Arlington, Virginia, US, bringing together attendees from diverse backgrounds to identify and explore cyber threats.
  • 2-3 December: The European Media and Information Fund (EMIF) will be hosting its two-day winter event in Florence, Italy, to discuss disinformation trends and provide networking opportunities.
  • 24-27 February 2025: RightsCon 2025, hosted by Access Now, will be held in Taipei, Taiwan, and will bring together leaders and activists to discuss key digital rights issues.
  • 23-25 April 2025: A call for papers has been announced for The Cambridge Disinformation Summit; the deadline for applications is 25 October.
  • 22-25 May 2025: Dataharvest, the European Investigative Journalism Conference, will take place in Mechelen, Belgium. Save the date!

Jobs

  • EU DisinfoLab is looking for an intern starting in early 2025. Apply to join our team!
  • Interface is looking to hire a Senior Policy Researcher.
  • The Center for Countering Digital Hate has job openings for the Director of Creative Industry Engagement, Head of Communications, HR Coordinator, and Senior Press Manager.
  • ProPublica has several job openings for a visuals editor, temporary newsletter writer, data editor, engagement reporter, senior editor for the local reporting network, and senior engineer.
  • The National Education Association is looking to hire a counter-disinformation intern.
  • Internews is hiring a senior director of global internet & technology initiatives and a finance manager.
  • Check First is hiring a front-end developer.
  • Logically has several job openings, including assistant editor (India), fact checker (India), media literacy trainer (India), and software engineering manager (UK).