This week we’re joined by a new team member, Thomas Grandjouan, who will be leading our advocacy efforts. Welcome aboard, Thomas! We’re thrilled to have you!
Also, one more reminder that our annual conference will be virtual this year, running from September 28 to October 2. If you haven’t registered yet, do it here. Registration closes on September 24.
EU DisinfoLab contributed to ITU/UNESCO Broadband Commission Report
On Friday, the ITU/UNESCO Broadband Commission for Sustainable Development released its global report, “Balancing Act: Countering digital disinformation while respecting freedom of expression.” The research was led by Prof. Kalina Bontcheva (University of Sheffield) and Dr. Julie Posetti (International Center for Journalists), along with international researchers and contributing authors, including former EU DisinfoLab Advocacy Officer Clara Hanot. The work charts the wide range of responses to disinformation across the world, focusing on the effects of these measures on freedom of opinion and expression, as enshrined in Article 19 of the Universal Declaration of Human Rights. Research took place between September 2019 and July 2020. The findings are organised into 11 categories of responses to disinformation, grouped under four umbrella categories:
- identification responses aimed at identifying, debunking, and exposing disinformation (monitoring, fact-checking, investigations)
- responses aimed at producers and distributors of disinformation through altering the information environment (legislative and policy responses, counter-disinformation campaigns, electoral responses)
- responses aimed at production and distribution mechanisms (curation, algorithmic, and demonetisation responses)
- responses aimed at the targeted audiences of disinformation campaigns (ethical, educational, and empowering responses)
The study finds that these responses are often complementary and collaborative, but it also identifies cases of interference (e.g., journalists getting caught in the net of ‘fake news’ laws). The report advocates a multi-faceted approach: one that addresses the socio-economic drivers of disinformation, rebuilds trust in democratic institutions, and strengthens social cohesion amidst polarisation, while also tackling the business models that thrive on paid disinformation. The report introduces a novel 23-step framework for assessing disinformation responses against freedom of expression. It also offers concrete recommendations for different stakeholders that simultaneously strengthen freedom of expression, reflecting the report’s major finding that access to reliable and trustworthy information (for instance, through independent journalism) is essential to countering disinformation. Finally, the study provides a framework for addressing the complete disinformation life cycle — Instigators, Agents, Messages, Intermediaries, and Targets/Interpreters — shortened to the acronym ‘IAMIT’.
New tools for researchers and end-users
Mozilla has released a timely browser extension that encourages end-users to monitor YouTube’s recommendation algorithm. RegretsReporter (a play on the phenomenon of “YouTube Regrets”) lets users send in reports about harmful videos that were automatically recommended to them. Using this crowdsourced end-user data, Mozilla will collaborate with researchers, journalists, policymakers, and YouTube engineers to construct more trustworthy recommendation engines. Meanwhile, Mozilla Fellow Emmi Bevensee, working at the Anti-Defamation League and in collaboration with the Network Contagion Research Institute, Pushshift, Open Collective, and iDramaLab, has just released an open-source tool to help researchers and activists monitor dis- and misinformation. The Social Media Analysis Toolkit (SMAT) uses data visualisations to analyse discussion on Twitter, Reddit, 4chan, and 8chan (with Telegram, Parler, and Gab coming soon). While we await the outcome (and implementation) of the Digital Services Act, this is an important reminder that there are ways to address transparency and research challenges in the immediate term: measures that may go beyond, and enhance, regulatory responses.
In the news
- 29 police officers in North Rhine-Westphalia, Germany, have been temporarily suspended after the discovery that they were sharing far-right and neo-Nazi content, which is illegal under German law, in a WhatsApp group.
- An internal memo from Sophie Zhang, a recently fired Facebook data scientist, reveals how the company failed to act, or was slow to act, on evidence of political manipulation (often involving the use of “inauthentic assets”) in countries around the world.
- A new report by More in Common, “The New Normal”, looks at how people in seven countries perceive the motives of the news media’s coverage of the COVID-19 pandemic and “the impacts of COVID-19 on trust, social cohesion, democracy and expectations for an uncertain future.”
- The Ethical Journalism Network, together with the Evens Foundation and the Fritt Ord Foundation, has released a report on the challenges facing the media in Hungary. The research is drawn from interviews with practitioners, and includes recommendations for media and policy stakeholders.
- Ashley Carman at The Verge looks at the symbiotic relationship between TikTok and OnlyFans, a slightly more risqué version of the crowdfunding platform Patreon.
- Graphika’s Camille Francois writes about the risks of an oversimplified understanding of Russian disinformation strategy.
Events and Announcements
- A new version of BotSlayer (v1.3) is out for those looking to track potentially coordinated inauthentic information operations on Twitter in real time.
- 24 September, 10:00–12:00 CEST – The European Commission’s CONNECT University Autumn School on Digital Health is hosting an online event, “Pandemic vs panic: fighting against disinformation”. More info here.
- 24 September, 09:00 CEST – The Global Forum for Media Development (GFMD) is hosting a webinar on the Australian Competition and Consumer Commission (ACCC)’s recent review of digital platforms. Registration is here.
- 5 October – 4 November – The Knight Center is offering a new MOOC “Digital investigations for journalists: How to follow the digital trail of people and entities”. Learn more here.
- Digital Action is looking for a Policy Adviser (remote) and an intern (UK).
- Privacy International is looking for a Digital Campaigns & Fundraising Officer.
- The Wikimedia Foundation is hiring a Disinformation Research Scientist and a Human Rights Lead.
- Internews is hiring a consultant to work on Open Source Community Management & Documentation.