Dear Disinfo Update readers,

Welcome to our fortnightly newsletter on disinformation that offers you a curated selection of news, events, and announcements in the disinformation field from around the world.

Summer has been an incredibly busy period for the EU DisinfoLab team, as we’ve been engrossed in preparations for our upcoming publications and the annual conference. And *drum roll* we’re thrilled to announce that the final programme is now ready and available for viewing here! While the agenda already features some surprises, stay tuned – there may be even more excitement in store!

We’re expecting a robust turnout for the #Disinfo2023 conference in Krakow, with over 350 participants slated to join us. We’re grateful for the overwhelming response and want to let you know that only a limited number of seats remain available. Secure your spot now to avoid landing on the waiting list!

We appreciate your understanding as our entire team focuses on crossing the finish line. With final preparations underway, responses to queries may be slightly delayed. Thank you for your patience and ongoing support.

Looking forward to seeing you in Krakow!

Disinfo news & updates

  • Acting on cancer treatment misinformation. In a major update to its health misinformation policies, YouTube is revamping its stance on medical misinformation, refining its approach into three core categories: prevention, treatment, and denial. Read about its long-term vision here.
  • X files lawsuit. The Center for Countering Digital Hate (CCDH), a non-profit combatting hate speech and misinformation online, recently faced Elon Musk’s legal wrath after alleging that hate speech has increased on Twitter/X under his ownership. CCDH suggests that the lawsuit aims to drain its resources. The clash raises questions about the role of transparency, accountability, and shared responsibility in curbing online misinformation and hate.
  • Xtremisstep. X (ex-Twitter) recently introduced new tools to help advertisers control their ad placement and ensure brand safety by avoiding adjacency to unwanted keywords and handles. Despite a promised 99% efficacy rate for the feature, several major brands found their ads displayed alongside content promoting fascism and antisemitic harassment, and suspended their advertising on the platform.
  • Election integrity checklist. In the run-up to the European elections next June, 56 organisations are calling on European leaders to meet the challenge of safeguarding the integrity of the election information environment, using their powers under the Digital Services Act (DSA). Find the five key demands for the online platforms’ election plans here.
  • Election heatwave. Conspiracy theories and misinformation, notably around London’s Ultra Low Emission Zone (ULEZ) policy, played a role in a recent UK local election. There are concerns they could also affect next year’s UK general election, with climate issues becoming a potential culture war wedge issue. Read more here.
  • New ploys in the pipeline. The Kremlin appears to have shifted its climate change disinformation tactics: from portraying climate change as a Western invention designed to hinder Russia’s development, to accepting the scientific consensus while accusing the EU of corruption, hypocrisy, incompetence, and neo-colonialism in its climate actions – all ultimately to promote Russian gas. Further insights in this article.
  • Qur’an quandary. Russian state-controlled media outlets published false articles in Arabic claiming that the Swedish government supports the Qur’an burnings that took place in the country in June, likely in an attempt to disrupt Sweden’s NATO membership process, which is pending formal approval from Turkey and Hungary.

What we’re reading

  • Siren call. When listeners in this study were presented with genuine and deepfake speech audio, they correctly identified the deepfakes only 73% of the time. As speech synthesis algorithms improve, detecting them will likely become even harder.
  • Disinformer Barbie. The impersonation tactic used in a “covert operation” aiming to draw attention to the environmental impact of toy manufacturing was undoubtedly a success – but it could backfire if used for malicious purposes such as spreading election misinformation. Read more here.
  • Confining the algorithmic genie. The AI Act, the EU’s (and the world’s) first comprehensive regulatory framework for artificial intelligence, is being negotiated between the EU institutions and member states. Read the summary of the discussion between experts in AI, law, and policy, or watch the full panel.
  • Eh, AI? What is that again? This explainer sheds light on how the machine learning models behind artificial intelligence work, and what their potential dangers are.
  • Fighting disinformation with pride. This site offers resources for LGBTQI campaigners – useful for other campaigners too! – including a self-learning guide for understanding the mechanics of disinformation and taking fast, effective action.
  • Telegram’s trenches. The “War in Ukraine: Military Bloggers dashboard” offers analysis of narratives and the reach of prominent Russian military bloggers on Telegram. Take a look.
  • Summit old, summit new. This Graphika report investigates two influence operations conducted by Russia-linked actors to shape online conversations around the July 2023 NATO Vilnius summit, one of which is likely connected to Doppelganger.

This week’s recommended read

Maria Giovanna Sessa, Research Manager at EU DisinfoLab, recommends taking a look at a recent study published in the Harvard Kennedy School Misinformation Review. The study finds a connection between Elon Musk’s Twitter/X acquisition and increased engagement among contentious far-right actors, while the blue-tick verification did not seem to contribute significantly to the boost in engagement. It sheds light on the complex interplay between platform ownership changes, verification mechanisms, and engagement dynamics within contentious user networks.

The latest from EU DisinfoLab

We’re honoured to have a thoughtful group of speakers, including Nina Jankowicz, Ben Nimmo, Lindsay Hundley, and Imran Ahmed, for our upcoming #Disinfo2023 annual conference. If you’re interested in learning from others who are also navigating the challenges of countering disinformation, we’d love for you to join us! A few remaining seats are available – consider reserving yours today!

Events

  • 13 September: DiTox 2023, 1st International Workshop on Disinformation and Toxic Content Analysis, will take place in Vienna, Austria, as part of LDK 2023.
  • 14 September: Sign up here for the next EDMO BELUX Lunch Lecture to hear about how scientists can cope with negativity on social media.
  • 20 September: How do you adapt media literacy resources to your local context? Register here to join this interactive online event, which will offer you tips for localising educational resources.
  • 22 September: SANS OSINT Summit 2023 will take place online. Register here.
  • 28-29 September: EDMO Hubs will meet in Dubrovnik, Croatia.
  • 11-12 October: Last call! Register for #Disinfo2023.
  • 13-14 November: The Conference ‘Tackling disinformation: Strengthening democracy through information integrity’ will take place in Paris. Register here.
  • 20-21 November: The first edition of the European Congress on Disinformation and Fact-Checking will take place in Madrid and online, focusing on the theme of “Disinformation Across the EU-Ukraine Media Landscape”. Register here.
  • 26-27 February 2024: EDMO Scientific Conference 2024 will be organised in Amsterdam. Keep a close eye on the call for papers that will be launched in September.
  • Be a Digital Sherlock. Today is the last day to submit your application for the 360/Digital Sherlocks Fall 2023 programme, here.
  • EDMO Training Programme. Six more online training sessions are planned for 2023, on topics ranging from YouTube to the economics of disinformation, OSINT, and the InVID-WeVerify toolbox. Check them out here!
  • Food for thought. Watch the recording of this autumn’s first EDMO BELUX Lunch Lecture ‘Facebook Hustles’ with Guillaume Kuster.
  • Fellow up! Applications are open for the Center for Democracy & Technology (CDT)’s 2024 cohort of Non-Resident Fellows, spanning a wide range of academic fields and disciplines. Details here.

Job opportunities

This good X!