Greetings, dear Disinfo Update readers,

First things first: We’re thrilled to announce that registration for the EU DisinfoLab 2023 Annual Conference is now open! Join renowned professionals from diverse backgrounds on 11–12 October 2023 in Krakow, Poland, to dig into the pressing issues in the disinformation space, with a mix of session formats and exciting networking opportunities. Don’t miss this opportunity to exchange, learn and network IRL – click the link below to register now!

In addition to #Disinfo2023, this newsletter plunges into climate change disinformation, the Guidelines for Public Interest OSINT Investigations, and of course the endlessly fascinating scene of EU policies, featuring the EMFA, DSA, AI Act, and the Defence of Democracy Package. Taylor Swift fans (and those interested in the social dynamics of fan groups) will be delighted to look into the recommended read from our Executive Director Alexandre Alaphilippe.

Enjoy!

Disinfo news & updates

  • AI falsity factories. The news-rating group NewsGuard published a report on the AI-fed content farms it discovered, documenting how chatbots like ChatGPT are churning out pseudo-journalistic content such as breaking news, lifestyle tips, and celebrity news – and generating falsehoods that end up in published pieces.
  • AI again. The German magazine Die Aktuelle published an artificial intelligence-generated “interview” with Michael Schumacher, the Formula 1 champion who has not been seen in public since he suffered severe head injuries in a skiing accident in December 2013. His family now plans legal action against the magazine.
  • More AI. ChatGPT readily produced Chinese-language disinformation even though it refused to do so when given similar prompts in English, according to research from NewsGuard.
  • Online Safety Bill & data access. Twitter’s new, stricter data access policies and TikTok’s allegedly heavily redacted transparency reporting – data access has become an increasing challenge for public interest researchers. Check out this piece by the Center for Countering Digital Hate (CCDH) on an amendment to the UK’s Online Safety Bill that would give the regulator the power to appoint independent researchers with access to platforms’ data.
  • Twitter Blue. Twitter rolled out its new policies for verified accounts, making users pay for the blue checkmark and removing it from previously verified accounts that did not sign up for the new Blue. The change shifted the meaning of the blue checkmark from ‘independently verified’ to ‘paid premium’. The results were, predictably, chaotic: impersonation and false information running rampant.

Brussels corner

  • Things are accelerating at the European Parliament on the European Media Freedom Act (EMFA). Amendments to the Internal Market and Consumer Protection (IMCO) Committee draft opinion came out (available here and here). The Culture and Education (CULT) Committee draft report is also out – and the media exemption is all over the text. It was presented by the lead rapporteur last Wednesday (replay of the presentation and exchange available here), and the deadline for amendments is set for 5 May. The Civil Liberties, Justice and Home Affairs (LIBE) Committee draft opinion was published as well, with an amendments deadline of 8 May. Also read here the ‘background analysis’ drafted by the Parliament’s Policy Department for Structural & Cohesion Policies and commissioned by the CULT Committee. The EMFA will be on the agenda of the Education, Youth, Culture and Sport Council meeting on 16 May, according to Euractiv. The Council issued its progress report on the file last Friday (see the press release), with the Swedish Presidency still hoping for a General Approach to be reached before the end of its term. With a strong push from Germany alongside Poland and Hungary, and support from France, a media exemption there looks inevitable.
  • The European Commission has finally announced the first 19 designated Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) under the Digital Services Act (DSA). This is an important milestone for the counter-disinformation community, since these services will need to perform risk assessments and mitigate the identified risks under the DSA, including on disinformation. The key will be to ensure that these are done properly. As part of the procedure, the Commission is preparing a delegated act on data access and has launched a call for evidence to which stakeholders can contribute. This act will be impactful in shaping researchers’ – including NGOs’ – access to platform data and their ability to investigate how services are complying with their responsibilities to address systemic risks such as disinformation. The deadline to contribute is 23 May. Do make your voices heard and share your expertise by responding to this call for evidence!
  • Last Wednesday, Alexandre Alaphilippe, EU DisinfoLab’s Executive Director, took part in the ING2 Committee meeting and presented Doppelganger, one of our latest investigations. You can replay this meeting here (from 9:35 AM). Make sure not to miss the exchange that followed the presentation, in which Alexandre highlights the challenges the counter-disinformation community is facing.
  • The Commission is on a mission to publish its proposal for the Defence of Democracy Package on 31 May. It will include a directive that many civil society organisations fear will resemble the American foreign agents law, hurting civil society and potentially harming the foreign philanthropy that most of us in the counter-disinformation community rely on to survive in the absence of core funding from the EU. The package will also include a number of non-binding recommendations to Member States to support civil society. We dare you to count the similar recommendations developed at the European level in recent years – and to check their national implementation. Bleak picture, isn’t it?
  • Members of the European Parliament have reached a provisional political agreement on the AI Act. The committee vote is scheduled for 11 May, and the file is expected to go to a plenary vote in mid-June.

What we’re reading

  • Boosting state-controlled media. Digital Forensic Research Lab (DFRLab) detected a rise in the number of followers for multiple Russian, Chinese, and Iranian state media outlets on Twitter. This followed an unannounced change in Twitter’s platform policy that allows it to algorithmically promote state-affiliated media outlets.
  • Social media influence operations. A report by Albert Zhang, Tilla Hoja, and Jasmine Latimore argues that the Chinese Communist Party (CCP) has developed a sophisticated and persistent capability to sustain coordinated networks of personas on social media to spread disinformation, wage public opinion warfare, and support its other levers of state power. This Twitter thread summarises the report.
  • The best of fact-checking. European Digital Media Observatory (EDMO) launched its ‘Best of Fact-checking Map’, an interactive map showcasing the best fact-checking content produced by EDMO and its national and regional Hubs. The map will be updated on a monthly basis – you can find the first edition here.

EU DisinfoLab April trends

In April, Doppelganger surfaced in France and Germany, and disinformation around the evergreen front-runners – the war in Ukraine, COVID-19, and climate change – remained abundant.

Below, you can find some of the April highlights of our monitoring of the French, German and Spanish disinfo landscapes:

  • In France, disinformation was fuelled by the highly polarised political context surrounding the adoption of the pension reform, including crossover narratives such as the claim that members of the Ukrainian national guard were recruited by the French gendarmerie to help repress the protests. Regarding the war in Ukraine, new fake covers of French and international media outlets were discovered by RFI journalists, recalling the Doppelganger case. These forgeries were disseminated on pro-Kremlin Telegram channels. After several narratives related to child abuse circulated, AFP Factuel’s fact-checkers published an article analysing why this subject generates so much disinformation and so many conspiracy theories.
  • In Germany, recent months have seen an increase in hoaxes created with generative AI – for example, this fake photo portraying Julian Assange in prison. Climate change became a key theme in the April disinformation landscape, ahead of the COVID-19 pandemic and surpassed only by the war in Ukraine. Disinformers are increasingly looking for new angles, including stoking fears of social control: a hoax going viral in Germany claims that paying for airline tickets in cash is no longer possible because the authorities want to track the pollution caused by each citizen in a so-called “environmental account”. In April, many hoaxes tried to trick citizens into clicking and leaving their data on scam websites. Some of them used fake domains impersonating media outlets such as Nordbayern or Bild, a strategy exposed by EU DisinfoLab in the Doppelganger report – in this case, for fraudulent purposes.
  • In Spain, disinformation surrounding the war in Ukraine took a conspiratorial tone, claiming that the war is fake – a media stunt. Hoaxes promoting the Russian cause relied on false or unproven claims of Russophobia: alleged threats against a woman for speaking Russian, or the claim that the National Gallery withdrew a work because of its subject’s resemblance to Putin. With the regional elections approaching, we observed how the high political activity around this event is accompanied by innumerable misleading statements. The severe drought in Spain has also brought a lot of disinformation, such as claims that the Government is preventing rain or that clouds are being dispersed to favour dryland cultivation. The most prominent climate change hoaxes were openly voiced by political leaders: statements such as that cars pollute the air but do not harm the environment, or that CO2 is not a polluting gas but rather beneficial for nature. In addition, other hoaxes suggest that forest fires are intentionally set to build wind farms. Several international organisations have also been targeted by conspiracy theories, with hoaxes claiming that the EU will ration the use of water, that the UN is decriminalising paedophilia, that human sacrifices are performed at the European Organization for Nuclear Research, or that the World Economic Forum controls the Internet.

This week’s recommended read

Our own Alexandre invites you to dive into this fascinating Graphika report, which explores pop star Taylor Swift’s fandom. Unlike studies focused on Russian trolls or Chinese ‘wolf warriors’, this report looks at highly engaged groups of online fans and demonstrates the social dynamics at play. You’ll see how passion can lead some accounts to border on promoting conspiracies, and how debunkers of these theories are often targeted with online harassment, even being framed as homophobic.

The latest from EU DisinfoLab

  • #Disinfo2023. Registration for our annual flagship event is now open. Our 2023 edition will take place on 11-12 October in Krakow, Poland. More information and registration link here.
  • OSINT and ObSINT. A collective of like-minded organisations dedicated to conducting public-facing open-source investigative work has launched ObSINT, a platform envisioned as a hub of resources for the OSINT community. The new Guidelines for Public Interest OSINT Investigations developed by this collective were introduced in a webinar that EU DisinfoLab hosted on 20 April. Missed it? You can watch the recording of the webinar here.
  • Disinfo vs Climate change in Belgium. An EU DisinfoLab and EDMO BELUX study analyses the flows of climate disinformation in Belgium. Some experts in the field are pessimistic about the evolution of climate disinformation in Belgium, even suggesting that the worst is yet to come. This study aims to help avert that dark future and alleviate the adverse effects of climate change disinformation in the region.

Events and announcements

  • 4 May: “How can Civil Society Organisations (CSOs) use the Digital Services Act (DSA) for effective work in favour of a democratic online space?” Join this webinar to learn more about Democracy Reporting International’s new report and CSOs’ role in the DSA implementation. Register here.
  • 16 May: The third EDMO BELUX Lunch Lecture will discuss truth(s). The guest speaker Eline Severs, assistant professor at the Department of Political Science at VUB, will present the contribution she prepared for the recent VUB Poincaré book on Truth(s). Register here.
  • 25 May: The EDMO Annual Conference brings together media, policymakers, academics, regulators, journalists and civil society to discuss the challenges of online disinformation. EU DisinfoLab’s Alexandre Alaphilippe will speak on a panel on disinformation and Open Source Intelligence (OSINT). Registration is now open.
  • 29 June: The European Commission and the research projects AI4media, AI4Trust, TITAN and vera AI will organise a one-day conference in Brussels on the future of AI. More info and registration coming up!
  • 11-12 October: Don’t forget to register for #Disinfo2023, our annual conference, which will take place in Krakow, Poland. Join your peers to dig into the pressing issues in the disinformation space, and to exchange, learn and network!
  • European Digital Rights (EDRi) has published a draft summarising the plan for a programme to address power dynamics in the digital rights field in Europe, developed in collaboration with participants from the digital rights field and social, economic and racial justice groups. EDRi is now looking for feedback on the ‘Decolonising the Digital Rights Field in Europe’ outline from the wider community – find the open consultation here.

Job opportunities

Check out this good thread!