The many faces fighting disinformation
Disinformation represents a diffuse and rapidly evolving set of challenges. It requires a broad response and the harmonised efforts of diverse actors. Disinformation is also a transversal threat with which more and more actors find themselves confronted. EU DisinfoLab believes that a thriving, decentralized civil society ecosystem is key to an effective response. A decentralized civil society architecture – in which a network of organisations, initiatives, and individuals operates in an agile, harmonised manner – would also create multiplier effects and build capacity and resilience among those currently at the periphery of the disinformation threat (climate activists, health professionals, etc.). After in-depth discussions with 14 civil society actors – each innovating and responding to different aspects of dis- and misinformation – we have assembled several recommendations to safeguard and support a resilient, decentralized civil society ecosystem. We address these recommendations to three major stakeholder groups: funders, platforms, and civil society itself, ourselves included.
Recommendations for Funders: Increase Flexible Support and Innovate Funding Modalities
Anti-disinformation work is often done by people outside of their professional roles – volunteer fact checkers, or researchers and developers pursuing side projects. 43% of the interviewed initiatives had between 0 and 2 employees. Half were founded in the last three years. These actors are innovative and agile, often merging disciplines and experimenting with novel strategies. They have bottom-up expertise that gives them legitimacy and precision, for example, access to niche audiences. However, this same idiosyncrasy makes them more difficult for funders to understand, and the nature of anti-disinformation work limits their funding options. Existing funding sources like complex grants are less accessible to individuals and newcomers. Even for more established NGOs, flexible long-term financing that is not tied to specific outputs or pre-defined projects is hard to come by. In general, more diverse sources of funding are necessary: many philanthropic organisations are located in the US, as are many major social media platforms. This leads to a disproportionate number of initiatives responding to disinformation in the US context, and sometimes to a US-centric perspective on disinformation.
Actors fighting disinformation have specific needs beyond financial support: independence, credibility/reputation (including strategic publicity), cybersecurity, access to data, and legal clarity in key areas of their work (related to the use of personal data in research, etc.). Recommendations that emerged from these discussions included:
- Long-term core funding. Make available more flexible, long-term funding (ideally three or more years) not tied to specific outputs or numerical KPIs. This will allow actors to be proactive and responsive to evolutions in disinformation.
- Tailoring to micro-organisations. Design funding specifically for micro-organisations and initiatives (0 – 2 employees). The funding criteria and deliverables should be flexible and the process as streamlined as possible.
- Small grants for new entrants. Support individuals and new entrants into the field of disinformation research and response by leveraging small grants. Prizes and competitions should also be considered as a way to support newcomers and those pursuing this work outside of their formal employment. These awards could be scheduled around specific topics, or given based on past achievement.
- Funding beyond elections. While support for anti-disinformation activities in the context of elections is critical, funders should ensure that they support civil society at all times, including in the wake of elections when disinformation narratives continue to circulate and polarize. Election funding should be broad and flexible, for instance, given for a year rather than a period of months.
- Increased communication and coordination. Funders should improve communication and coordination between one another to avoid a glut of support in certain areas and neglect in others (e.g., disinformation during elections versus disinformation in the wake of elections). Funders of anti-disinformation activities could consider pooling their support into a central platform, mechanism, or hub. This kind of forum could also improve awareness of funding sources and distribution for diverse actors. Funders taking part in this mechanism could harmonise their funding modalities to further ease the process for applicants and recipients.
- Support for cybersecurity. The threat of cyberattacks and forms of cyberviolence, from hacking to harassment, is ever present for actors in this space. Funders need to recognize that cybersecurity is both expensive and human resource intensive. Support in this area could come in the form of software and services, training, or legal protections.
Recommendations for Platforms: Rebalance the Relationship with Civil Society
Civil society actors countering disinformation often have strained relationships with the major social media platforms where the majority of disinformation is spread. Among those surveyed, 21% qualified their relationship with platforms as “weak” and 28% as “in opposition”. In general, there is an asymmetry with respect to information and to labour. Civil society has only a partial view of the spread of disinformation online, which reduces its ability to address disinformation and assess the efficacy of its response. On the other hand, civil society performs much of the heavy lifting behind the services platforms claim to provide, from fact-checking to media literacy to the identification of disinformation actors and foreign interference networks. These asymmetries can be rectified through regulation, but also by the platforms’ own initiative, in recognition of the fact that disinformation is a shared threat.
- Improve the toolbox for monitoring and research. Civil society needs more tools for researching disinformation, with more insight into platforms’ curation of content and a clearer vision horizontally across platforms. Many tools are biased towards the English language and towards North American and Western European users; tools should be viable in more languages and able to parse data by country. In many cases civil society will build these tools themselves, but platforms must facilitate this.
- More data sharing with researchers. Increase data sharing dramatically for independent third parties working in the public interest (researchers, investigators, developers). Ideally, platforms should make their APIs fully open and accessible, sharing all data that is public on their platforms in real time through open APIs.
- Archive and review decisions. Platforms should increase user participation in fact checking and labeling. Users should be able to contribute flagged content to an archive for platforms and researchers to review, in order to improve content moderation and design.
- Pool support for civil society. Platforms should consider pooling financial support for civil society actors in this space. Merging platform support in a centralised mechanism could safeguard recipients’ independence (to the extent that bilateral support can come with invisible strings attached). A centralised funding mechanism could also improve distribution across the civil society ecosystem. This type of voluntary opt-in mechanism is distinct in spirit from fines or taxes that platforms might be obliged to pay: the latter is remedial, while the former is proactive and in solidarity.
Recommendations for Civil Society: Harmonise Our Decentralized Efforts and Increase Participation
Critical to the decentralized approach is complementarity and collaboration. The decentralized ecosystem is strengthened by strategic centralization (network nodes) – initiatives and coalitions with particular capacity or expertise. Collaborations can merge different skill sets and perspectives, build partnerships across countries, and generally prevent siloed work.
Disinformation itself is often a product of collaboration, deriving power through amplification by online audiences. This kind of “participatory disinformation” (Starbird, 2019) necessitates participatory responses. While it is difficult to conceptualize a whole-of-society response to disinformation, the actors interviewed here make this concept more concrete. 64% of the actors interviewed use crowdsourcing in some way (to identify disinformation, to provide expertise for debunking, to verify content, to donate personal data for research, etc.). 57% work with volunteers. A participatory model is clearly at the center of fact checking, research, and development. We recommend the following:
- Foster independent, multi-stakeholder coalitions around disinformation inflection points. During the US 2020 elections, the Election Integrity Partnership brought together independent organisations to monitor disinformation specifically related to the voting process and the outcome of the election. Similar collaboration could be envisioned around anti-vaccine disinformation, for instance. It is important that this monitoring agenda be set proactively and not reactively.
- Build strategic cross-sector teams to address disinformation policy areas. Policymakers do not collaborate systematically with the academic community and those invested in heavy research methodologies, which has led to a knowledge gap – a divide between policy needs and research products. This kind of collaboration will be needed to produce the monitoring tools used by regulatory authorities for platform auditing.
- Provide continuous training and capacity building. Training programs in areas from open source investigation to statistics to cybersecurity would help build capacity among civil society actors. Training programs also help set standards in this field and establish shared taxonomies for actors from different disciplines and sectors. Recognised certifications for specific skills could be considered.
- Maximize participation and include wide audiences through crowdsourcing methods. Beyond its role in providing data and services, crowdsourcing should also be understood as a form of media literacy and user empowerment because it models a more transparent, reciprocal relationship between people and digital technology. Civil society should share their experiences with crowdsourcing and various networked or layered participatory models as they iterate and improve these approaches. Knowledge sharing in this area is especially important because crowdsourcing is a complex endeavor and the potential for abuse must also be carefully considered.
- Build resilience outward, working with other cross sections of civil society. Disinformation is a horizontal challenge, and those with expertise in the area of disinformation can reinforce capacity and resilience among those who may be affected by disinformation (LGBTQ and feminist organisations, conservationists and environmental organisations, health professionals, human rights defenders, etc.). Diversity is a critical component of fact checking and disinformation research, ensuring the perspectives and expertise of a wide cross section of society. Just as we say of the platforms, no particular group in civil society should be the ‘arbiter of truth’.
- Raise our voice in the regulatory debate. Despite the powerful role of civil society and the relevance of co-regulatory responses, these challenges require regulation. The meaningful input of international civil society is critical to ensuring the effectiveness of regulatory frameworks and their enforcement.