May 11, 2020

by Lauren Hamm


The Russian Internet Research Agency's interference in the 2016 US elections was a wake-up call for many to the power of false narratives on social media and their potential to shape societal perceptions of political and social issues. While monitoring and debunking foreign interference has since become a priority on decision-makers' agendas, such campaigns have also inspired a multitude of actors to conduct online disinformation campaigns at scale. Fast forward to 2020, and COVID-19 has reoriented the disinformation ecosystem around the pandemic. We are living through an "infodemic", and it is now more pertinent than ever to reflect on what we are seeing.

Disinformation is a multifaceted and complex issue, better understood as a symptom of a much broader information disorder powered by social media, from which many malicious actors benefit for various and sometimes interrelated purposes. Drawing on our continuous monitoring and research into disinformation campaigns in Europe over the past years, we feel it is important to elaborate on a few faces of disinformation for which regulatory responses may differ.

Last year, Camille François, Graphika’s Chief Innovation Officer, established a framework for describing and analysing influence operations. Known as A Disinformation ABC, it describes the state of disinformation through three vectors:

  • Manipulative Actors with clear intention to disrupt democratic processes or the information ecosystem;
  • Deceptive Behaviours, as tactics and techniques used by manipulative Actors;
  • Harmful Content pushed to hurt and/or undermine individuals, organisations or public interest and influence the public debate.

Using this framework allowed us to reflect on the disinformation we observe in our day-to-day activities, but it also drew attention to the need to further reflect on the intentions behind the use of disinformation. Accordingly, we were able to draw out categories based on our own research and that of those working in the disinformation field, including scholars, think-tanks, policymakers, and non-profit organisations, among others:

  • Foreign influence – the use of disinformation by foreign actors to disrupt societies over certain issues and/or to push an agenda (foreign actor(s) to a domestic audience).
  • Political – the use of disinformation to undermine adversaries and/or push an agenda (domestic actor(s) to a domestic audience).
  • Lucrative – the use of disinformation to make a profit.
  • Issue-based – the use of disinformation to serve an ideological, normative, and/or financial goal.  

When talking about disinformation and intent, we mean those who deliberately create and/or spread disinformation to cause harm. Elaborating on this is particularly crucial for nuancing what we mean by issue-based disinformation. Often, the average end-user may passively be involved in spreading disinformation because they genuinely believe the content. In this way, an individual may share disinformation simply because the disinformation confirms their beliefs.

Most campaigns are fluid and can be found at the crossroads of multiple categories. This is partly because there is often symbiosis between these categories in that profit-driven disinformation may also — sometimes unintentionally — serve a political agenda, as an example. Likewise, some actors may directly engage in disinformation for several intentions.

Our investigation into EP Today demonstrated how useful it is to separate content from intent. EP Today syndicated RT and Voice of America content to increase the visibility of its website and Facebook page. The sources of the content, and the narratives embedded within it, mattered less than the fact that both RT and Voice of America's terms of use allowed for easy republication.

As touched on by Gary Machado, our Managing Director, “uncovering malicious behaviour from state actors is only one facet of disinformation”. With that in mind, the majority of disinformation cannot be explained by foreign influence alone. Disinformation can be regarded as a strategy that can serve various objectives. This analysis therefore merely serves to nuance our common understanding of the disinformation phenomenon.

Foreign influence

Foreign influence campaigns, especially those aimed at influencing electoral processes, have been the subject of close political and media attention. Orchestrated by foreign states, sometimes via proxies, these are well-prepared, coordinated operations. Using fake profiles, bots, and advertisements, and by infiltrating groups and pages, these campaigns can be conducted across several platforms with the aim of polarising societies over sensitive issues. In this context, the infamous IRA campaign during the 2016 US election exploited racial inequalities in American society by specifically targeting African Americans with Black Lives Matter content. Other examples of foreign influence include the Twitter and Facebook operation linked to Egypt, the UAE, and Saudi Arabia, and the Chinese influence operation conducted at the height of its trade war with the US. These examples show how foreign influence operations can also be used to push a domestic ideological and/or political agenda internationally.

COVID-19 case study:

Graphika recently released an investigation revealing how Iran’s International Union of Virtual Media had been promoting its agenda through the lens of COVID-19. Such narratives framed the virus as a US bioweapon. This was done by creating English-language content such as memes and cartoons for websites, which was then disseminated by a handful of specially created social media accounts. While it was a relatively small operation whose impact shouldn’t be overstated, it gives us a clear idea of the actors, behaviours, and content behind a foreign influence campaign.

Political disinformation

Disinformation for political motives can be related to electoral campaigning periods, but it can also serve broader, long-term goals of a given political group or administration. In this way, disinformation can serve the objective of agenda-setting, with the goal of promoting political activities, (de)emphasising societal and political issues, and/or undermining adversaries. Politicians and political parties may engage in disinformation by using fake identities and websites, artificially amplifying certain topics (sometimes in political advertisements), and manipulating content, to name a few tactics. One example is ISD and Avaaz's briefing on the 2019 European Parliament elections, which noted how bad-intentioned political parties had used essentially the same playbook as foreign interference actors. At the same time, political actors do not always need to rely on deceptive behaviours to communicate their messages. In some cases, disinformation has become part and parcel of political communication, with Donald Trump's frequent attacks on some American media being a case in point.

COVID-19 case study:

Harvard Kennedy School researchers recently drew attention to how Brazilian President Jair Bolsonaro has been using COVID-19 misinformation as a political weapon. The authors note how Bolsonaro leads the "coronavirus denial movement" by downplaying the seriousness of the pandemic, which serves the purpose of keeping the economy open and shirking responsibility for Brazil's very likely recession, forecast by experts even before the pandemic. Here we can see that COVID-19 dis- and misinformation serves Bolsonaro's broader political agenda.

Issue-based disinformation

Issue-based disinformation can encompass a whole range of actors, including activists, religious bodies, consultancies, extreme supporters, and NGOs. Using fake identities, online profiles, and websites, these actors mobilise around an issue and conduct scaled campaigns targeting specific online/offline groups with content, with the aim of achieving an ideological, normative, and/or financial goal. An investigation from CORRECTIV and Frontal21 revealed how the US Heartland Institute was supporting climate change deniers in Germany with the goal of undermining climate protection measures. In addition, an openDemocracy exposé detailed how a global network of 'crisis pregnancy centres,' backed by US pro-life groups, had targeted vulnerable women with anti-abortion disinformation. These examples illustrate a symbiotic relationship in that these campaigns likely served the broader political goals of a political party or administration. More concretely, the climate change deniers had affiliations with the AfD, the German far-right party, whereas the US pro-life groups reportedly had connections with members of the Trump administration. This ideological proximity between stakeholders can often blur the line when it comes to attributing disinformation to a source.

COVID-19 case studies:

The COVID-19 pandemic has witnessed an influx of disinformation from the anti-vaxxer movement. Promoting discredited conspiracy theories and undermining the efforts of public authorities to contain the virus, this disinformation has thrived in closed Facebook groups and escaped Facebook’s policies.

As another example, while the anti-5G movement has been brewing for some time, the COVID-19 infodemic has provided a fertile ground for anti-5G sentiments to enter the mainstream. Campaigners have been able to capitalise on the pandemic by linking the cause of COVID-19 to 5G, resulting in offline actions such as the harassment of telecoms engineers and the burning of mobile phone masts in the UK.

Lucrative disinformation

Lucrative disinformation concerns disinformation that is profitable. Actors range from PR firms and fringe networks to the average end-user, and their strategies are arguably not so dissimilar to those of social media influencers. Quite often, actors use disinformation as a tool or clickbait to drive traffic from social media to fake websites managed by the very same actor, in order to profit from online advertising. Other classic techniques involve automated reposting and content syndication to maximise the reach of and engagement with content. Content can be emotional, sensationalist, relate to major events and breaking news, or involve scams. As mentioned before, syndicated content can sometimes come from foreign state-owned media, which produce engaging content with loose copyright policies. This can subsequently blur the lines with foreign influence. Lucrative disinformation can also profit from politics, an example being the infamous Macedonian students who spread disinformation during the 2016 US election. After realising that spreading left-wing content didn't make as much money, the students promoted pro-Trump content using a network of fake Facebook pages to drive traffic to fake websites and profit from online advertising.

COVID-19 case study:

Not long ago, we released an investigation on an Africa-based network that had built clickbait health scams and syndicated Russia Today/Sputnik content for profit, and we recently found that this very same network had pivoted towards spreading coronavirus falsehoods. One of their articles claimed that hundreds of Italian migrants were en route to Africa to flee the coronavirus, which was subsequently debunked by Libération’s CheckNews. This evolution demonstrates that the content, although harmful in its ability to shape public debate, is rather secondary for the network. The network is primarily money-oriented, and it has capitalised on the uncertainty caused by the pandemic as well as the way in which social media works.


Based on this analysis, we have outlined four disinformation faces revealed in both our own and the community’s research:

  • Foreign influence – the use of disinformation by foreign actors to disrupt societies over certain issues and/or to push an agenda (foreign actor(s) to a domestic audience).
  • Political – the use of disinformation to undermine adversaries and/or push an agenda (domestic actor(s) to a domestic audience).
  • Lucrative – the use of disinformation to make a profit.
  • Issue-based – the use of disinformation to serve an ideological, normative, and/or financial goal.  

These categories are not static: the evidence shows that they are fluid in the actors, behaviours, and content they encompass, but can also be symbiotic in the way one may reinforce another. As this analysis demonstrates, the COVID-19 pandemic has seen the disinformation ecosystem reorient around the pandemic, with actors capitalising on the infodemic for many different reasons.

It is therefore important to reflect on intent, i.e. the possible goals and motivations behind disinformation. Similar outcomes may be achieved by multiple campaigns from a multitude of actors. Some actors may just use disinformation tactics without care for the societal and/or political ramifications or for whose agenda it serves. By taking this into consideration, we can begin to nuance a common misconception that disinformation is only used with the intention of foreign influence or political change.

Taken collectively, this illustrates how blurred the lines are when it comes to attribution. Looking at intent alone does not give us all the answers, but when accompanied by the use of the ABC framework, we can better grasp the bigger picture.

We invite our community to reach out to us and share their thoughts on this piece.