Good morning, Disinfo Update readers!

Welcome to this new edition of your bi-weekly newsletter on the fight against disinformation.

We have the pleasure of hosting our first 2023 webinar this Thursday, 9 February (2-3PM CET), with Alicia Wanless, Director of the Partnership for Countering Influence Operations at the Carnegie Endowment for International Peace. Alicia will present her co-authored study on the CERN model for studying the information environment. You’ll come away with a better understanding of how to build a joint approach to preserving it.

Enjoy your read!

Disinfo news & updates

  • Worrying. Harvard announced on 2 February that it will put an end to its Technology and Social Change Project, which studies social media disinformation and has produced research on COVID-19 misinformation and the January 6 Capitol riot. The project is set to wind down by 2024, the year of the U.S. presidential election.
  • API access. On 2 February, Twitter announced that “starting February 9, we will no longer support free access to the Twitter API, both v2 and v1.1. A paid basic tier will be available instead”. While the platform promises more details this week, the changes to Twitter’s API (Application Programming Interface) access could mark the end of an era for academic research and civil society (see the sketch below for what this kind of programmatic access typically looks like). On Sunday, however, Elon Musk already softened the new policy, saying that bots providing “good content” can keep using Twitter’s API for free.
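
For context, here is a minimal sketch (in Python, using the requests library) of the kind of v2 “recent search” call that many researchers and civil society monitors rely on today. The endpoint and authentication shown follow Twitter’s currently documented public API; the query string, token handling, and helper names are illustrative assumptions, and it is not yet clear how such calls will be priced under the announced basic tier.

```python
# Minimal sketch of a Twitter API v2 recent-search query, the kind of
# programmatic access affected by the announced end of the free tier.
# Assumes a bearer token is available in the TWITTER_BEARER_TOKEN
# environment variable; pricing under the new paid tiers is not yet known.
import os
import requests

BEARER_TOKEN = os.environ["TWITTER_BEARER_TOKEN"]


def search_recent_tweets(query: str, max_results: int = 10) -> dict:
    """Fetch recent tweets matching `query` via the v2 recent search endpoint."""
    response = requests.get(
        "https://api.twitter.com/2/tweets/search/recent",
        headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
        params={"query": query, "max_results": max_results},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # Example: pull a small sample of English-language tweets on disinformation.
    data = search_recent_tweets("disinformation lang:en")
    for tweet in data.get("data", []):
        print(tweet["id"], tweet["text"][:80])
```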

EU policy monitor 

  • Are we there yet? Not quite. The European Parliament has not yet fully settled the question of committee competencies for the European Media Freedom Act (EMFA). Several public hearings and events have nonetheless taken place recently: a presentation of the proposal to the Committee on Culture and Education (CULT) on 23 January, a public hearing organised on 31 January by the Committees on Civil Liberties, Justice and Home Affairs (LIBE), Internal Market and Consumer Protection (IMCO) and CULT, and another hearing yesterday. As a reminder, the Commission’s consultation feeding into the ongoing EMFA legislative process closed a couple of weeks ago. You can read a joint policy statement and the contribution from the People vs. BigTech coalition calling for the removal of Article 17 (Content of media service providers on very large online platforms) from the proposal, an article that creates considerable legal uncertainty and risks bringing the media exemption back. On the Council’s side, the Audiovisual Working Party will discuss Article 17 at its next meeting on 8 February. If you share these concerns about Article 17, now is a good time to raise them with your responsible ministry!
  • DSA enforcement and implementation gearing up. Online services have until the end of next week (17 February) to publish their user numbers. Shortly after, the European Commission will designate Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), and the Digital Services Act (DSA) will become applicable to them four months after their designation. While it will take a while before the legislation’s impact is visible, things are slowly advancing, and it will be important to engage actively in key processes such as the delegated acts on auditing and data access, and to follow the designation of national regulators (to be completed by February 2024). This piece by Paddy Leerssen provides a great overview of what to expect from the DSA risk assessments and audits, and raises some important points of caution.

What we’re reading

  • CHD expansion in Europe. The latest EDMO BELUX investigation describes how Children’s Health Defense (CHD), the US anti-vaccine group founded by Robert F. Kennedy Jr. and one of the largest spreaders of anti-vax disinformation on social media platforms, has used the COVID-19 pandemic as an opportunity to expand its activities in Europe. It also highlights the need for all EU actors to reflect on ways to reduce the vulnerabilities that foreign fringe actors exploit to spread disinformation in Europe.
  • “The Era of the Algorithm”. In this piece, Eliot Higgins looks at the roots of disinformation and the necessity of developing critical thinking early on. “If we want a more sustainable and widespread approach to countering disinformation, we need to think in terms of building communities, and empowering individuals to make a difference.”

This week’s recommended read

Raquel Miguel, researcher at EU DisinfoLab, recommends reading this NewsGuard report, which confronts ChatGPT with hoaxes about COVID-19 and Ukraine, with discouraging results: in 80 percent of the cases, the chatbot became a mouthpiece for false claims and even foreign propaganda. The answers also point to some built-in safety measures, however. The article illustrates the challenge that AI-based text generators pose to the counter-disinformation community and the need to develop adequate policies to address the potential disruption. Only by understanding its nature and how it works can AI become part of the solution rather than part of the problem.

The latest from EU DisinfoLab

  • Don’t stop me now! This new blogpost by Ana Romero-Vicente looks at the growing threat of climate change disinformation: over 30 online sites dedicated exclusively to denying the need for climate action! It unpacks denialist discourses and their nuances over recent years.

Events & announcements

  • 9 February. EU DisinfoLab webinar with Alicia Wanless, Director of the Partnership for Countering Influence Operations at the Carnegie Endowment for International Peace, who will present her co-authored study on the CERN model for studying the information environment (2-3PM CET). Register here!
  • 10 February. The vera.ai project will hold a series of online participatory design workshops this spring and is inviting fact-checkers and other disinformation professionals to join. The goal of these workshops is to engage end-users in the development of the vera.ai content verification tools so that they can best support their needs, processes, and workflows. The workshop series will start in the week of 20 February and conclude in early May. Interested in joining? Contact Lalya Gaye by 10 February!
  • 15 February. The DFRLab is now accepting applications to join the 360/Digital Sherlocks Spring 2023 cohort. Apply here.
  • 21-23 February. Join the hybrid UNESCO Conference, “Internet for Trust – Towards Guidelines for Regulating Digital Platforms for Information as a Public Good”.
  • 22 February. Register to this joint event by Media & Learning Association and Lie Detectors: “Diversification of support and funding for effective Digital Media Literacy activities in Europe”. 
  • 22-23 March. You have until 15 February to send your extended abstract to join the “Infox sur Seine” workshop. The goals of this workshop? To identify, gather, and acknowledge the French counter-disinformation community, and to map ongoing research in the field.
  • 5-8 June. Registration for the 12th edition of RightsCon, which will host participants both in Costa Rica and online, is open. 
  • Put yourself on the map! The International Republican Institute’s (IRI) Beacon Project launched its second community mapping survey. Thank you for taking this short survey and contributing to a better understanding of the counter-disinformation community.
  • Mapping mis- and disinformation research. This EU-Dialogues survey is part of a study conducted in the EU and Brazil to map the research groups, projects, centres, and academic institutions working on mis- and disinformation in relation to political processes and issues. Participate here.

Jobs

  • Bellingcat is looking to bring on five new volunteers to contribute to the Global Authentication Project’s work on Civilian Harm in Ukraine, starting from the beginning of February 2023 for three months. Learn more about this volunteering opportunity here.
  • The European Endowment for Democracy is recruiting Armenian-speaking consultants for the revision of grants financial reports.
  • The Global Investigative Journalism Network is looking for an Executive Director.

This good tweet!

Possible lessons for DSA enforcement!