
Disinfo Update 14/01/2019

2018 was an intense year in the fight against disinformation. We hope 2019 will be the year of new resolutions (and actions) from online platforms and regulators. DisinfoLab is preparing a great present for you, and you can make a wishlist! What are your plans for 28-29 May? Binge-watching Netflix? Post-European-elections depression? Tell us instead who you would like to hear at our annual conference.
Make your wishlist

Whatever dirty tricks it takes?

The US political debate is not yet done with online manipulation. Paul Manafort, a top official in President Trump’s campaign, shared political polling data with a business associate tied to Russian intelligence, according to a court filing unsealed on Tuesday. The document provided the clearest evidence to date that Trump’s campaign may have tried to coordinate with Russians during the 2016 presidential race.

Well, is the grass greener on the Democrats’ side? Not so sure: it has been revealed that Democrats ran fake Facebook pages about alcohol prohibition in the Alabama Senate race. According to an activist who worked on the project, “If you don’t do it, you’re fighting with one hand tied behind your back.” In his view, Republicans are using the same trickery and Democrats cannot unilaterally give it up: “You have a moral imperative to do this — to do whatever it takes.” The New York Times already reported last month on a separate project by the cybersecurity company New Knowledge, which used its own bogus conservative Facebook page and sent Russian-looking Twitter accounts to follow Republican candidate Roy S. Moore, making it appear as if he enjoyed Russian support.

If disinformation becomes a legitimate campaign tool, some candidates might win the race, but we’ll be losing the democratic debate in society.  

Mark’s good resolution, or the Zuck fireside chat

Apparently, Mark Zuckerberg’s 2019 challenge is to get out of his bunker and talk to people, Quartz reports. The Facebook CEO announced his annual challenge: hosting public talks about the future of technology. Zuckerberg, who doesn’t like public appearances, will meet every few weeks with leaders, experts and Facebook community members to talk about “the opportunities, the challenges, the hopes, and the anxieties” related to the topic, he said in a Facebook post on Jan. 8. Damian Collins, chair of the UK DCMS parliamentary committee, reacted to the announcement with a sarcastic tweet inviting Zuckerberg to come to London to answer questions from parliamentarians, after he refused to appear at a hearing. But according to Kurt Wagner in Recode, we should actually listen to him.

Mark, we heard you. Today, we are inviting Facebook VP for Public Policy Richard Allan to speak at our annual conference and meet with the civil society working on disinformation. We’ll keep you posted on his reply.

Twitter breaking the ice

At CES last week, Twitter representatives told Engadget that the company will be launching a new program to let users reshape how conversations on its site look and feel. The first version of the beta will focus on a new design for the way conversation threads work on Twitter, Sara Haider, Twitter’s director of product management, told TechCrunch. This includes a different colour scheme and visual cues to highlight important replies. The idea is for users to try out new organisation and context features with their followers, such as the status updates and “ice breaker” tweets we saw being tested last year, which are designed to encourage people to talk to each other. Twitter is set to start testing the program in the coming weeks, and while anyone will be able to apply to join, only a few thousand users will get in.

Older and wiser?

Older Americans are disproportionately more likely to share fake news on Facebook, according to a new analysis by researchers at New York and Princeton Universities. The study, published in Science Advances, examined user behaviour in the months before and after the 2016 US presidential election. Among the users who agreed to share their Facebook data, 11 percent of those older than 65 shared a hoax, while just 3 percent of users aged 18 to 29 did. These findings suggest that media literacy efforts should also target older users.


What to read, watch and listen to this week:


See all past and upcoming events in our agenda

Looking for fame? Calls for papers and awards

  • EU project call for proposals “Media Literacy for All”: open until 28 February 2019
  • WeVerify, an EU co-funded Horizon 2020 project working on algorithm-supported verification of digital content, holds its first meeting in Sofia on 14 and 15 January, and you can let them know which issues you think are the most pressing in this area. They might include your answer in their research and project work.

Disinfo Update – 7/01/2019


Brace yourselves, regulation is coming

The latest scandals, from Cambridge Analytica to the 2016 election interference, have drawn US legislators’ attention to tech regulation. This summer, California’s state legislature passed a groundbreaking bill that would give residents unprecedented control over their data. According to Wired, 2019 is likely to see US data protection legislation, as tech companies are now pushing for national rules to pre-empt such state-based laws. The Trump administration’s National Telecommunications and Information Administration has released its own point-by-point proposal, describing in unspecific terms a set of “privacy outcomes” the administration would like to see.

Facebook, company of the year

Mark Zuckerberg shares his wishes and reviews the actions taken by Facebook in 2018 to face its new challenges, in particular regarding election interference and the spread of harmful content.

Not so fast: an employee has provided The New York Times with more than 1,400 pages from Facebook’s moderation rulebooks. An examination of the files revealed numerous gaps, biases and outright errors. As Facebook employees grope for the right answers, they have allowed extremist language to flourish in some countries while censoring mainstream speech in others. In the absence of governments or international bodies that can set standards, Facebook is experimenting on its own.

After two years of repeated scandals, is Facebook at a turning point where it finally understands it is both a platform and a publisher? Nicholas Thompson and Fred Vogelstein from Wired look back on those past two years of hell for the company.


Russia’s media regulator is investigating whether the BBC is “compliant with Russian law”. The announcement of the investigation came a day after Ofcom said it was considering imposing sanctions on RT, which is financed by the Russian state, following an investigation into its skewed coverage of the Sergei Skripal poisoning. Meanwhile, a list of BBC reporters in Russia has been leaked online along with their photos. The leak comes after the Sunday Times published the names of journalists working for Moscow-backed Sputnik’s UK bureau.

Play with the Devil

Just before the Christmas break, we welcomed Marije Arentze, from media literacy initiative Drog, for our 3rd webinar. Drog is a multidisciplinary team of academics, journalists and media experts. They conduct research, offer workshops and educational programmes, and create innovative tools that help you build “resistance” to disinformation, using “gamification”. Drog has created the social impact game Bad News, where you take on the role of a propagandist. They believe the best way to cultivate a sixth sense for recognising and exposing disinformation is to create it yourself. Read about their approach and the challenges of media literacy in our latest article.

How much of the internet is fake? 

How much of the internet is fake? Studies generally suggest that, year after year, less than 60 percent of web traffic is human; some years, according to researchers, a healthy majority of it is bot. Even your Facebook friends might be fakes: researchers are now able to copy the “styles” of source faces onto destination faces, creating blends that have copied features but look like entirely new people. Then whom should we trust? Doubt is everywhere, as a Der Spiegel star reporter confessed he made up more than a dozen stories. According to Anna Altman, this story might change the newspaper’s relationship with its readers, who placed trust in a robust outlet known for the emotional writing of its reporters and its reliable fact-checking process.


What to read, watch and listen to this week:

See all past and upcoming events in our agenda

Looking for fame? Calls for papers and awards


Disinfo update – 17/12/2018

Google still has no answers for YouTube’s biggest problem

Now it’s Sundar Pichai’s turn to be grilled by lawmakers. Last week, the Google CEO took the stand before the House Judiciary Committee in Washington, DC. Among the topics discussed, Pichai was confronted about YouTube’s moderation problem, especially regarding conspiracy videos on the platform. Pichai pointed to the company’s policies against hate speech and noted that the platform started adding “authoritative” context to its search results for breaking news stories earlier this year. The issue is all the more worrying given that, according to a Pew Research study, social networks, and YouTube in particular, have become the main source of information for younger Americans.

Washington confidential 

The Facebook and Google hearings have shed light on various channels of disinformation. As the US Senate investigates possible Russian interference in the 2016 elections, a report by Oxford University’s Computational Propaganda Project and Graphika shows that nearly every social media platform has been used by Russian influence operations to support Trump’s election.

The Guardians and the war on truth

For the first time, journalists have been chosen by Time magazine as “Person of the Year”. Jamal Khashoggi, Maria Ressa, Wa Lone and Kyaw Soe Oo, and the newsroom of the Capital Gazette have been killed or targeted for their work. The magazine chose to present them as “The Guardians” in “The War on Truth”, the current climate of defiance against the press and manipulation of information. The choice is a strong statement against the spread of misinformation by leaders who have sought to undermine critical independent journalism. Also among the shortlisted candidates were Donald Trump and Vladimir Putin.

“It became clear that the manipulation and abuse of truth is the common thread of so many of this year’s major stories, from Russia to Riyadh to Silicon Valley,” Edward Felsenthal, Time’s editor in chief and chief executive, said during the announcement on NBC’s “Today” show.

This is the end

Just as journalists make headlines for defending the truth, fact-checking journalists have decided to break their partnership with Facebook. According to them, the platform has not done enough for their work to be effective. They are particularly resentful of the company hiring a PR firm to go after its opponents in a way that fuelled conspiracy theories against George Soros. “Why should we trust Facebook when it’s pushing the same rumors that its own fact checkers are calling fake news?” said a current Facebook fact checker.

In the Library

What to read, watch and listen to this week:


  • December 18 — DisinfoLab Webinar: How to tackle disinformation with education? With Marije Arentze from the Drog project

See all previous and upcoming disinfo events in our agenda

Fear of getting bored during the Christmas holidays? You can prepare a submission for one of those calls for papers:

  • ACM Transactions on Social Computing: Special Issue on Negotiating Truth and Trust in Socio-Technical Systems
  • The Media & Democracy program at the Social Science Research Council invites submission of abstracts for a research workshop organized in collaboration with Cristian Vaccari (Loughborough University), to be held in New York City on June 13–14, 2019.
  • The Centre for Direct Democracy Studies (CDDS) at the Faculty of Law of the University of Białystok, Poland (UwB) announced a call for papers for the upcoming sixth volume in the European Integration and Democracy Series, devoted to challenges for democracy, the rule of law (Rechtsstaat) and the respect for fundamental rights, posed by contemporary disinformation practices and digital media. Deadline is extended to 31 January 2019. Submit here. 

HR corner

Lie Detectors is looking for a Germany Programme Director and Germany Programme Assistant based in Berlin.


Disinfo update – 11/12/2018

EU has a plan

On Wednesday, the European Commission released its long-awaited Action Plan against disinformation. The plan focuses on external disinformation, strengthens the EU communication budget, and significantly increases the resources of East StratCom, the EU counter-disinformation unit, to oppose Russian-financed campaigns. Nevertheless, the danger in focusing on external threats is neglecting endogenous sources of disinformation in Member States.

The Commission and Member States will also set up a Rapid Alert System, whose concrete form has not yet been detailed. Audiovisual media regulators will be tasked with monitoring the implementation of the Code of Practice signed by online platforms. We are looking forward to this monitoring, as it might constitute a first step for audiovisual regulators to step up on this issue.

Finally, it is very positive that teams of multidisciplinary independent fact-checkers and researchers will be supported, as well as media literacy initiatives. EU DisinfoLab supports this recommendation in its report on the automated tackling of disinformation, to be presented on 13 December to the European Parliament’s Science and Technology Options Assessment Panel.

According to Jakub Kalensky from the Atlantic Council and Roland Freudenstein from the Wilfried Martens Centre, “what we need now is a follow-up in the sense of a speedy implementation. The extra resources need to be dedicated to those people who have a track record in countering disinformation, not to units that talk about “strategic communication”, but in fact do not deliver in countering disinformation.”

From bots to economic boom?

In a culture heavily structured around images, it is now clear that deepfakes are disinformation’s next big thing. WITNESS, the human rights organization focused on the power of video and technology for good, held the first-ever expert meeting on sharing recommendations around this issue. Interestingly, investor Ryan Holmes is of the opinion that, in an economy based on information exchange, economic solutions will arise to prove the validity of information, so fake news patrolling might be the next internet boom.

France: did Facebook fuel the anger?

Fuel protests and demonstrations organised in the past weeks have partly turned into violent riots in Paris and other major cities. The movement, which started in small towns and rural areas, was born on Facebook. Interestingly, the recent change in Facebook’s algorithm favouring local news might have contributed to spreading the anger and helping coordinate the actions. The newspaper Le Monde reports that the French General Secretariat for Defence and National Security is investigating fake accounts sharing disinformation on the issue. According to one study, hundreds of accounts could be linked to Russian influence.


What to read, watch and listen to this week:


See all previous and upcoming disinfo events in our agenda


Media literacy initiative Lie Detectors is one of the laureates of the 2018 Digital Skills Award


Disinfo Update 06/12/2018

Facebook: the empty chair

Last Tuesday certainly wasn’t a good day for Richard Allan, Facebook’s VP for Policy. As Zuckerberg refused to appear (his chair was pointedly left empty), Allan answered questions from a “Grand Committee” of MPs from nine countries at Britain’s House of Commons. Among the concerns raised by lawmakers were Facebook’s policies regarding third-party application developers and the collection and use of user data. After seizing internal documents from a lawsuit opposing Facebook and Six4Three, an app development company, MP Damian Collins suggested that the platform was made aware of suspicious Russian behaviour as early as 2014, which according to Facebook was a false alarm. These emails, to which the Wall Street Journal had access, also show that Facebook considered charging companies for continued access to user data in 2012. Meanwhile, following up on the “Definers” case, BuzzFeed News obtained emails showing that Sandberg herself was actively involved in looking into Soros and his possible financial motivations.

Regulation is now?

Elizabeth Denham, the UK Information Commissioner, who was also questioned, advocated for transnational cooperation between regulators. According to her, the era of self-regulation is over. Following the hearing, parliamentarians from across the world signed a declaration on the “Principles of the Law Governing the Internet”. Can this get any worse for Facebook? Mark Scott from Politico explains why this “empty chair” strategy is now backfiring.

Trying to prove its goodwill, Zuckerberg last week published a blueprint for content moderation on the platform. Among the measures, he announced that the company will hold content moderation meetings with outside experts, whose minutes will be published.

Read the full transcript of the hearing here

Local news

Meanwhile, Facebook will have a new area to police on its app, as it launched “Today In”, its local news aggregator, available in 400 small to medium-sized US cities and in Australia. In Europe, Facebook is donating 4.5 million pounds ($5.8 million) to train journalists in Britain, to support communities that have lost local newspapers and reporters as readers switch online.

Defending journalists

First Draft kicked off its journalist training project “CrossCheck” in Nigeria, ahead of the country’s elections in February. Journalists from around the world are uniting against disinformation, but that doesn’t please everyone. During a visit to Paris, Russian foreign minister Lavrov openly declared being “preoccupied” by initiatives such as the Journalism Trust Initiative launched by Reporters Without Borders. For the organisation, such a statement is particularly worrying in a climate of defiance towards a free and rigorous press. Drawing on the case of Filipino journalist Maria Ressa, Peter Pomerantsev advocates for the establishment of clear digital rights.


From Russia with Trolls

Last week, the Guardian revealed the influence of Russian trolls over British media. Members of a Russian “troll army” were quoted more than 80 times across widely read British media outlets before Twitter revealed their identity and banned them. On Russian influence in news media, Eric Schmidt, executive chairman of Google’s parent company Alphabet, has said the search engine is preparing to take action against state-run Russian news agencies, including Russia Today and Sputnik, which US intelligence agencies accuse of spreading propaganda.


What to read, watch and listen to this week:



Disinfo Update 22/11/2018

Dr. Evil

On 18 December, the next DisinfoLab webinar will feature Marije Arentze from Drog, a research and media-literacy initiative that will teach you how to create your very own fake news. So build your army of online trolls, spread conspiracies to influence the public debate, and contemplate the dangerous implications disinformation can have for your daily life and society as a whole!

Facebook Season 3, Episode 6: Thanksgiving

Elliot Schrage, Facebook’s outgoing head of communications and policy, took responsibility for hiring Definers Public Affairs in a blog post on the eve of the US Thanksgiving holiday. He admitted commissioning the firm to investigate George Soros’s financing of the “Freedom from Facebook” campaign. The company’s whole policy towards external consulting firms will be reviewed, he claims.

An international “Grand Committee” of national parliaments (UK, Canada, Australia, Argentina, Ireland, Brazil, Latvia and Singapore) called for Mark Zuckerberg to appear at a hearing in London on November 27. Zuckerberg turned down the request, but Policy VP Richard Allan will go. In reply, the UK Parliament used its legal powers to seize internal Facebook documents alleged to contain significant revelations about Facebook’s decisions on data and privacy controls that led to the Cambridge Analytica scandal.

The call comes just as the LSE Commission on Truth, Trust and Technology, a group made up of British MPs, academics and industry leaders, proposed that the government hand fresh powers to a new oversight body in the UK, rather than to existing regulators such as Ofcom and the Information Commissioner.

Journalists: one for all, all for one

In an age of misinformation, is collaboration the future of journalism? Former Vice CTO Jesse Knight advocates for media outlets to consolidate around a common publishing platform and thus gain independence from Google and social media platforms. First Draft also calls for support of sustainable newsroom collaborations and verification projects. The verification project “Comprova” in Brazil is a concrete example of such cooperation: in the context of the Brazilian elections, partners worked on debunking rumours and suspicious content on social media and on WhatsApp.

Good bots / Bad bots?

Speaking of WhatsApp, Witness, an NGO that teaches people to use video to document human rights abuses, issued a series of recommendations on regulating content on WhatsApp, for example a database of debunked messages that allows reverse image search, or easier ways to block WhatsApp direct marketing services.

Good news: such reverse databases already exist for Twitter. In the lead-up to last month’s election in Brazil, Aos Fatos built a Twitter bot that automatically corrects people who share fake news stories. Called Fátima, the automated account leverages AI to scan Twitter for URLs that match fact checks in Aos Fatos’ database of articles. Then the bot replies to the Twitter user with a link to the fact check. Bots spread a lot of fakery, but they can also debunk it.
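The URL-matching step at the heart of such a bot can be sketched in a few lines. The snippet below is a hypothetical illustration, not Aos Fatos’ actual code: the toy database, function names and normalisation rules are all assumptions made for the example.

```python
# Hypothetical sketch of a Fátima-style correction bot's core logic:
# match URLs shared in a tweet against a database of fact checks.
# A plain dict stands in for the real fact-check archive.
FACT_CHECKS = {
    "example.com/fake-story": "https://example.org/fact-check/fake-story",
}

def normalize(url: str) -> str:
    """Strip scheme, www prefix and trailing slashes so URL variants match."""
    for prefix in ("https://", "http://", "www."):
        if url.startswith(prefix):
            url = url[len(prefix):]
    return url.rstrip("/")

def correction_reply(shared_urls):
    """Return a reply linking to the fact check, or None if nothing matches."""
    for url in shared_urls:
        fact_check = FACT_CHECKS.get(normalize(url))
        if fact_check:
            return f"This story has been fact-checked: {fact_check}"
    return None
```

A production bot would additionally need to listen to the Twitter stream and post replies via the Twitter API, and to handle shortened links and tracking parameters; this sketch covers only the lookup.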


What to read, watch and listen to this week:


HR corner


Disinfo update – 19/11/2018

Oops I did it again!

The New York Times revealed how Facebook, trying to fight through the crisis, hired a public affairs firm to write articles criticising tech competitors, downplay the impact of Russia’s misinformation campaign on the platform, and push the idea that George Soros was behind a growing anti-Facebook movement. Facebook replied to inaccuracies in a blog post, and the president of the Open Society Foundations, Patrick Gaspard, addressed a letter to Mark Zuckerberg. As the scandals accumulate, disinformation researchers, just like citizens, are starting to run out of patience, says Nina Jankowicz.

A Message from Paris

Last week was a busy week in Paris. The city hosted the Internet Governance Forum (IGF), organised by the United Nations. This year, information disorder issues took a significant part in the debate on media and content. We can only welcome the fact that, overall, all debates promoted active multi-stakeholder collaboration in this field. Meanwhile, the Paris Peace Forum, a new annual event that gathers all actors of global governance, was also held. On that occasion, seven heads of state (among them Justin Trudeau and Emmanuel Macron) signed an op-ed committing to support the work of Reporters Without Borders’ International Information and Democracy Commission.

Fake can kill

Disinformation can fuel ethnic violence. After information shared on WhatsApp caused lynchings in India (read the summary of our last webinar about disinformation in India here), incendiary images shared on Facebook have contributed to ethnic violence in Plateau State in Nigeria. Facebook’s third-party fact-checking partners in the country have committed just four full-time fact checkers to review false information, on a platform used by 24 million Nigerians. Read the very well documented story on BBC News. A similar issue was raised at the IGF by journalist Ishara Danasekar in Sri Lanka.


What to read, watch and listen to this week:



Disinfo Update 12/11/2018

Midterm fakes

The midterms’ polarized results mirror a polarized debate between Democrats and Republicans, with rumours of fraud and voter suppression spread by both camps. But that was not the only disinformation: the New York Times called on its readers to share examples of election-related misinformation, and in all more than 4,000 examples were submitted. Some were even spread by candidates themselves, such as the “jobs not mobs” slogan, which started out as a meme and then turned into a political slogan.

More tapas?

Spain and Russia have agreed on a disinformation forum to tackle fake news in Catalonia.

Web Summit: the internet father and the sons of anarchy

Last week, the tech community held its major world conference: the Web Summit. Father of the internet Tim Berners-Lee announced a “contract for the web”, setting ethical standards around privacy and open access to the internet. Speaking of ethics, European Commissioner Vera Jourova suggested a Hippocratic oath for tech designers in order to put an end to “the online anarchy around elections”. Sensing the change coming, big tech seems to have joined the advocacy for regulation, yet Mark Scott from Politico warns politicians not to get fooled…

Metadata crafts

MIT Technology Review presents two startups using algorithms to track when images are edited, from the moment they are taken. Typically, pictures online contain metadata that can be manipulated. Data & Society research affiliate Amelia Acker analyzes how bad actors manipulate metadata to create effective disinformation campaigns, and provides tips for researchers and technology companies trying to spot this “data craft”. Meanwhile, the InVid image fact-checking plug-in is now used by 7,000 people.
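To make concrete why image metadata is such weak evidence: EXIF data lives in an ordinary JPEG segment that any tool can rewrite or strip. The sketch below is an illustration of that structure, not either startup’s method; real provenance tools sign image data at capture time rather than trusting this metadata.

```python
def has_exif(jpeg_bytes: bytes) -> bool:
    """Walk JPEG segment markers, looking for an APP1 segment carrying EXIF.

    Illustrative only: because EXIF sits in a plain, unauthenticated segment,
    its presence (or its claimed camera, date and location) proves nothing.
    """
    if jpeg_bytes[:2] != b"\xff\xd8":        # missing SOI marker: not a JPEG
        return False
    i = 2
    while i + 4 <= len(jpeg_bytes) and jpeg_bytes[i] == 0xFF:
        marker = jpeg_bytes[i + 1]
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        # APP1 (0xE1) segments beginning with "Exif\0\0" hold the EXIF block
        if marker == 0xE1 and jpeg_bytes[i + 4:i + 10] == b"Exif\x00\x00":
            return True
        i += 2 + length                       # jump over marker + payload
    return False
```

Anyone who can edit these bytes can forge or erase the metadata, which is exactly the “data craft” the article describes.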


To read this week:
  • Reporters Without Borders published its “International Declaration on Information and Democracy”, which establishes democratic guarantees for the global information and communication space.
  • “Belief in Fake News is Associated with Delusionality, Dogmatism, Religious Fundamentalism, and Reduced Analytic Thinking” in the Journal of Applied Research in Memory and Cognition.
  • UNDP published the first publicly available study that analyses how social media is used by al-Shabaab, Boko Haram and ISIL to contribute to radicalisation in seven African countries.

Mark your calendar

  • November 12 @ Paris — Hackathon: Relever les défis des troubles informationnels à l’ère numérique (Taking on the challenges of information disorders in the digital age)
  • November 29 @ Pullman Riga Old Town — “Disinformation and Fake news challenge to democracy”, hosted by civil society organisation ManaBalss, part of the EC-supported project “Smart eDemocracy Against Fake News (SMARTeD)”
  • December 3 @ Berlin — News Impact Summit with @GoogleNewsInit
  • December 6 @ Oxford — Book launch: Journalism, ‘Fake News’ & Disinformation

Disinfo update – 29/10/2018

Facebook is paying the bill

Following the Cambridge Analytica data scandal, Facebook has been fined £500,000 by the UK’s data protection watchdog, and the European Parliament is urging a ban on targeted political advertising on Facebook to restrict the diffusion of false political information during the upcoming European elections. Under such scrutiny, Facebook has announced it will be downranking stories with false headlines; the new rating comes after several other recent changes to Facebook’s fact-checking project. In timely fashion, former UK deputy prime minister Nick Clegg has just been appointed head of the global affairs and communications team. Hopefully, this appointment will ease relations between the social media platform and the EU institutions. But on the eve of the US midterm elections, and just as Facebook promised it would disclose who paid for political ads, Vice revealed it had easily managed to buy ads on behalf of senators. This, just as the company said last week it had removed 82 pages, accounts and groups linked to Iran that had targeted UK and US users. Still a long way to go…

Brazil, India and Whatsapp

On 28 October 2018, Jair Bolsonaro was elected president of Brazil. As with most recent elections, false news spread intensively during the campaign. What was different in Brazil was the massive spread of disinformation through WhatsApp. Alarmed, fact-checkers published an opinion piece in the New York Times revealing the scale of the phenomenon. A similar situation is unfolding on WhatsApp in India: viral misinformation has become a core problem on the platform, especially because misleading content is now shared virally through end-to-end encrypted channels.

Join our next webinar on November 8: Govindraj Ethiraj, founder of the fact-checking initiative BOOM, will describe the disinformation landscape in India and how we can deal with fact-checking on WhatsApp.

From journalism to viruses

Did you ever wonder how journalists verify the truthfulness of eyewitness videos? The New York Times provides some insight into its eyewitness raw video verification process: a mix of traditional journalistic diligence and cutting-edge internet skills. Besides journalists, a new actor has entered the fight against disinformation: McAfee, the 30-year-old company traditionally known for its antivirus software, published an analysis of how cybersecurity breaches could help disseminate false information during the US midterm election campaign.


Some reading recommendations for your week