Dear Disinfo Update readers,
Welcome to another instalment of our newsletter, where we sift through the noise to bring you the latest insights into the world of disinformation.
In this edition, we observe a mysterious new player in the Indian digital arena: ‘The DisinfoLab’! 👀 Although it claims no official connection to our authentic entity, it unmistakably appears to be a copycat attempt, riding on the coattails of our credibility.
Also in this newsletter: a wealth of pertinent reading recommendations addressing climate disinformation in light of COP28, a timely update to our factsheet on AI content policies across platforms, yet another stroll around the varying European disinformation landscapes, and a jump into a new project, ATHENA, to tackle foreign information manipulation and interference.
A few weeks ago, we embarked on an initiative to unite the counter disinformation community on Bluesky and Mastodon. The ones most bravely stepping away from certain platforms that are getting crazier by the day (no names mentioned) are already securely on the list. But if there’s anyone else out there … there, there, there – here’s your final call! Fill in this form.
As we tie a bow on this edition, consider this our last newsletter before the festive break. We promise to return with renewed energy in the new year. Happy reading and happy holidays!
Disinfo news & updates
- Who’s behind the Indian DisinfoLab? In the wake of our groundbreaking EP Today and Indian Chronicles investigations, a new and enigmatic entity has emerged in the Indian digital arena. Known as “The DisinfoLab”, the organisation has been actively championing the Modi government, raising eyebrows and suspicions alike. Citing sources including former staff of this Indian clone, The Washington Post reports that the Indian DisinfoLab is run by an Indian intelligence officer. While its official description insists there are no links with our organisation, a former staff member states that the group mimicked organisations like ours or the Digital Forensic Research Lab (DFRLab) to piggyback on their credibility in Western media. This development not only raises serious questions about the motives and methods of “The DisinfoLab”, but also highlights the increasingly murky waters of information warfare and propaganda in the digital age. As we continue to delve deeper, we remain committed to bringing the truth to light, ensuring transparency and integrity in the flow of information.
- The Amazing Library. This study by CheckFirst and AI Forensics sheds light on Amazon’s recommendation algorithm, revealing concerns about content amplification and user entrapment in misleading narratives. The study suggests that Amazon may be violating the EU’s Digital Services Act (DSA), and its findings aim to contribute to discussions on creating a safer online environment within the framework of the DSA.
- Dublin riots: platforms discuss their responses. Meta, TikTok, and Google appeared before the Oireachtas media committee to discuss their response to the Dublin riots, highlighting the steps taken to tackle disinformation. Criticism was directed at X (formerly Twitter) for not attending the meeting, with concerns raised about Elon Musk’s comments on Ireland.
- Meanwhile at X. Elon Musk has reinstated InfoWars founder and conspiracy theorist Alex Jones on X, despite previously stating that he would remain banned. Musk ran a poll on whether Jones should be allowed back; the vote went in Jones’s favour, and Musk reinstated him. The decision may appeal to far-right supporters but is unlikely to sit well with advertisers, who might prefer to distance themselves from endorsements of conspiracy theories.
- AI-powered harassment. This ISD article discusses a network of X/Twitter accounts using content generated by ChatGPT – generally highly authentic-looking, with some exceptions – to run a targeted harassment campaign against Alexey Navalny and his Anti-Corruption Foundation (ACF), aiming to undermine their support among pro-Ukraine Western audiences.
- Fake celebrity cameo. A new disinformation tactic by the Russian government involves repurposing celebrity videos from platforms like Cameo to create false narratives denigrating Ukraine’s president, Volodymyr Zelensky. Additionally, a separate campaign involving posts on Facebook and X features photos of global celebrities with block quotes echoing Kremlin propaganda messages. This article dissects further Russian cyber and influence operations aiming to demoralise Ukrainians, degrade external support, and wreak havoc, including phishing campaigns impersonating European Parliament staff.
- Ads. Ahead of the upcoming European Media Freedom Act (EMFA), the Slovenian Government recently adopted ‘Recommendations for transparent financing of advertising with public funds’. These guidelines aim to ensure transparent and non-discriminatory allocation of advertising funds, addressing previous concerns about public money supporting media outlets that spread disinformation. New rules now govern advertising campaigns by ministries, governmental agencies, and services, emphasising principles like legality, truthfulness, transparency, objectivity, cost-effectiveness, and respect for media freedom. Additionally, campaigns exceeding €50,000 require government approval, and annual campaign reports will be published online, starting with 2023 campaigns. Read more here (in Slovenian).
- Ticking the boxes. Without making much of a fuss about it, X has implemented a solution to comply with the Digital Services Act (DSA), offering researchers access to public data on its platform. Check out this X thread to discover how you can be granted access.
Brussels corner
- DSA databases. The European Commission published its ‘Digital Services Terms and Conditions Database’, which encompasses 790 terms and conditions from more than 400 services provided by over 290 distinct service providers. This database follows the launch of the DSA Transparency Database, which contains all content moderation decisions of online platforms.
- Meaningful DSA transparency! The European Commission is seeking input to shape transparency reporting under the Digital Services Act (DSA). Contribute your insights on the reporting rules and templates via this consultation before the 24 January deadline.
What we’re reading
- Radical transparency. In this Seattle Times article, disinformation researcher Kate Starbird reflects on her experience during the 2020 US elections and draws lessons for the next election cycle.
- Hamilton 2.0. The Hamilton 2.0 Dashboard, developed by the Alliance for Securing Democracy, provides a summary analysis of the narratives and topics promoted by Russian, Chinese, and Iranian governments and state-backed media across various platforms.
This week’s recommended reads
With the climate future under discussion these last two weeks at COP28 in Dubai (ending today), and climate disinformation posing an important challenge before and during the event, we’re not limiting ourselves to just one recommended read! Our researcher Ana Romero Vicente has prepared for you, dear readers, an array of must-reads on the matter – our humble contribution to countering climate disinformation through information and research.
Ahead of COP28, the Climate Action Against Disinformation (CAAD) coalition published a report warning that diplomats from Russia and China would attend the event. These nations are considered the most substantial sources of false or misleading information about the global climate, and have been accused of using climate disinformation as a geopolitical tool. The research also denounces other sources of disinformation, such as companies that extract fossil fuels. An important note in this regard: 2,456 fossil fuel lobbyists were granted access to this year’s conference. Along the same lines, Amnesty International claimed in this article that “fossil fuel companies seek to shape public opinion through greenwashing and disinformation”. Additionally, suspicions arise from Dubai hosting the event, given the United Arab Emirates’ significant role in oil exports. Leaked documents suggest that the UAE was planning to use its host role to pursue oil and gas deals around the world.
And there is more. This article from Deutsche Welle gives a good summary of how climate-denying bots on X, along with Facebook ads from Big Oil and right-wing news sites, create an echo chamber of disinformation that clouds efforts to reduce emissions during COP28.
Additionally, this Desmog publication highlights that beneath the conflicts of interest and lobbying at this year’s conference lies a minefield of strategic language and misleading messaging, built on terms like “unabated” fossil fuels, “operational” emissions, or being “part of the solution”. The use of these words is a distraction tactic deliberately employed for their ambiguity, according to this RTVE article.
To conclude, do not miss the replay of this insightful event by the European Commission’s Joint Research Centre (JRC), which took place in parallel to COP28. It advises those countering disinformation to lead the conversation with a positive narrative: don’t let disinformers set the agenda, provide a pre-bunking structure, and make facts more convincing than falsehoods!
The latest from EU DisinfoLab
- Upping our AI policy game. Recent technical developments and the growing use of generative AI systems by end-users have exponentially increased the challenges linked to disinformation. In September 2023, we published a factsheet on how some of the main platforms approach AI-manipulated or AI-generated content in their terms of use, and how they address its potential risk of becoming mis- and disinformation. In November 2023, YouTube and Meta announced additional measures regarding AI content. To keep up, we’ve updated our factsheet – version 2 of the ‘Platforms’ policies on AI-manipulated and generated misinformation’ factsheet is now live here!
- Connecting the dots… Between March and December 2023, the EU DisinfoLab released 20 factsheets as part of the series titled ‘The disinformation landscape across Europe’. These factsheets form the basis of our new comparative analysis, ‘Connecting the disinformation dots: insights, lessons, and guidance from 20 EU Member States’, that highlights consistent challenges faced by the European Union in combatting disinformation campaigns, and presents our ten recommendations addressed to policymakers and the general community. Download the study here!
- …and more. In our recent webinar ‘Connecting disinformation dots across Europe – lessons from the Slovak and Polish elections’, we presented key findings derived from our new comparative study. The webinar not only delved into the specific impact of disinformation during the national elections in these two countries, but also explored insights and lessons that could be applied in anticipation of the upcoming EU elections. If you were unable to join us for the live webinar, catch up by watching the replay!
- Shielding Europe from FIMI. In early November, together with 13 partners, we set in motion ATHENA, a Horizon Europe project that contributes to Europe’s defence against foreign information manipulation and interference (FIMI). We’re curious and excited to start this promising journey! Explore ATHENA and discover our involvement in various other ongoing and past projects we’ve had the privilege to be a part of. If you’re seeking a committed partner for your future projects, don’t hesitate to reach out!
Events & announcements
- 12 December: The Heinrich Böll Foundation’s ‘Climate lies’ webinar series aims to build resilience to climate disinformation. The second instalment, ‘How right-wing populists and conspiracy groups use climate lies in the German debate’, takes place this afternoon, and the third, focusing on best practices to protect ourselves and take action against disinformation on the climate crisis, will air in January. If you missed the inaugural edition, with insights into the threats climate lies pose to our democracy, you can watch it here.
- 14 December: This in-person event in Brussels tackles the crucial issue of disinformation in the Western Balkans, and offers insights into the results of the citizens’ dialogues organised to find collaborative strategies to counter disinformation.
- 15 December: This online conference, ‘MLA4MedLit: Standards and Best Practices in Media Literacy in Europe’, organised by an EDMO Working Group, discusses standards and best practices to help evaluate the effectiveness of interventions.
- 18 December: The next webinar in the series of the EDMO BELUX Lunch Lectures will focus on how democracies are affected by polarisation, with our guest speaker Kamil Bernaerts, Vrije Universiteit Brussel (VUB) and University of Warwick, who will present findings of a large-scale comparative study on the topic. Sign up here for this occasion to learn and exchange!
- 27 February – 1 March 2024: To explore and exchange on initiatives, tools, projects and practices around media literacy, apply to participate in the European Digital and Media Literacy Conference in Brussels.
- Advance research! The International Observatory on Information and Democracy invites global collaboration to collect research on four key themes: AI, information ecosystems, and democracy; media, politics, and trust; data governance and democracy; and a cross-cutting focus on mis- and disinformation. The call welcomes submissions from individuals or institutions engaged in any capacity with these critical issues. Explore the details here and share your insights before the deadline on 7 January 2024.
- Access a treasure trove. Meta Content Library is a web-based tool, available to researchers since late November, that allows them to explore and understand data across Facebook and Instagram through a comprehensive, visual, searchable collection of publicly available content. To be eligible for access, researchers must be affiliated with a qualified academic institution or a qualified research institution. Researchers from different disciplinary and professional backgrounds can apply, too. Find more information here!
- Living database to counter disinfo. Say No to Disinfo are building an online living database to aggregate, categorise, curate and extract key information from empirical studies on interventions to counter mis- and disinformation. Academics and campaigners are welcome to share any relevant experiments and field data by sending them to sahil[at]saynotodisinfo.com and/or ari[at]saynotodisinfo.com!
Jobs
- Newtral, a Spanish fact-checking company, is looking for a Madrid-based Media Literacy project manager.
- Logically AI has open positions for a Media Literacy Trainer, a Senior OSINT Analyst, and several other roles.
- European Digital Rights (EDRi) is looking for its new Head of Policy to join its team in Brussels.