Consultations, consultations! Last week we submitted our response to the European Commission’s consultation on the Digital Services Act (DSA), and shared our position on how this legislative package can address disinformation. In addition, we joined 28 civil society organisations in a call for universal ad transparency by default. Today we are also submitting our views on the European Democracy Action Plan (EDAP). This consultation closes today so be sure to respond if you haven’t yet!
The Code of Practice (still needs some practice)
Last Thursday, the Commission released a staff working document assessing the Code of Practice on Disinformation (aka the CoP). The document draws on past evaluations of the code and recalls the areas that still need attention. Many relate to elections and political advertising, for example: microtargeting, lack of transparency around political and issue-based ads, the absence of an EU-level spending limit for political ads, and a generally scattered approach among platforms. These gaps point to the importance of the European Democracy Action Plan and the Digital Services Act in addressing disinformation.

The lack of commonly shared definitions was (once again) noted as a shortcoming. So was the platforms’ self-assessment reporting: the information they provided on trustworthy content, warning labels, flagging, and other end-user-focused initiatives was found insufficient for determining effectiveness. In response, the assessment proposes introducing two levels of “meaningful KPIs”. It also calls for stronger and more structured cooperation between platforms and the research community, for instance by empowering researchers with better data access. We couldn’t agree more. It’s also good to see the assessment point out the need for work on ‘manipulative online behaviour’. If you want a longer overview, TechCrunch has it for you. Finally, despite their limitations, it is worth noting that the first monthly reports from the signatories are out now, covering their responses to COVID-19-related disinformation.
Platforms bulk up on Election Integrity
In the lead-up to the 2020 US elections, ‘election integrity’ on social media feels like an infinitely expanding field. Facebook has announced additional measures around “encouraging voting, connecting people to authoritative information, and reducing the risks of post-election confusion”. Twitter has a new policy to remove or label tweets containing false information intended to undermine public confidence in elections and civic processes. They’ll also remove tweets that claim victory before the election results are verified. Google, for its part, is addressing risks in Search, considering that autocomplete is so often used by citizens looking for information related to voting. Incorrect claims about election procedures and results will simply not appear as autocomplete suggestions. For example, a search for “voting by mail illegal” will not yield any suggestions. These policies are important, interesting, and contested. But, as we keep learning (see above), they’re also not enough. And as the Election Integrity Partnership continues to show, research has a critical role to play in checking for cracks and loopholes in platform policies, as well as in proposing policy changes. When elections worldwide are so dependent on this social media infrastructure, we really need all hands on deck.
In the news
- Independent sellers are delivering one-star reviews to competitors on Amazon’s marketplace. In the UK, a probe into this harmful phenomenon of fake reviews is underway.
- Microsoft has found that hackers from Russia, China and Iran launched cyberattacks on US presidential campaigns ahead of elections.
- The 2020 Reuters Institute Digital News Report finds that people would prefer that journalists report false statements from politicians even if it gives them oxygen, and that platforms should block inaccurate political ads, even if this means letting platforms arbitrate the truth.
- A recent article in the Journal of Computational Social Science looks at the inevitability of online echo chambers and the role of unfollowing as a mitigation strategy.
- Sarah Fischer from Axios peeks into TikTok’s recommendation algorithm after platform executives shared details with reporters last week.
- The NYU Stern Center for Business and Human Rights published a position on Section 230 reform. At 22 pages, it’s actually pretty concise for this topic.
Events and Announcements
- In the fallout from the Privacy Shield ruling, Ireland’s privacy regulator has ordered Facebook to stop moving data from the EU to the US.
- The German Marshall Fund has launched the Digital New Deal, a campaign to address domestic and foreign manipulation of democratic debate online.
- Tristan Harris’s new documentary “The Social Dilemma” – released September 9th – will be “an Inconvenient Truth for tech”.
- 14 – 17 September – the European Parliament is hosting a conference on democracy and EU tech policy. View the program and register here.
- 24 September, 19h CET – First Draft News is holding a webinar on online information disorder: building resilience against vicarious trauma. Sign up here.
- 28 Sept – 2 Oct – We told you already, but our EU DisinfoLab annual conference on disinformation is fast approaching! The link to register is here.