
Disinfo Update 28/01/2019

Rationalisation vs cognitive laziness

In the New York Times, professors Gordon Pennycook and David Rand explain why we believe fake news. In this debate, one group claims that our ability to reason is hijacked by our partisan convictions: that is, we’re prone to rationalisation. The other group claims that the problem is that we often fail to exercise our critical faculties: that is, we’re mentally lazy. However, recent research suggests a silver lining to the dispute: both camps appear to be capturing an aspect of the problem.

These findings might have real implications for public policy, as the research suggests that the solution to politically charged misinformation should involve devoting resources to the spread of accurate information and to training or encouraging people to think more critically.

Chief executive of transparency

In a bold move to restore trust, the CEOs of social platforms have entered transparency mode. In an op-ed titled “The facts about Facebook” in the Wall Street Journal, Mark Zuckerberg explains the company’s advertising strategy. He admits that the company’s data dealings can “feel opaque” and cause distrust, but maintains that users are in control and that Facebook asks for their consent before accessing their data, as required by the GDPR.

In an extensive interview with Rolling Stone, Twitter CEO Jack Dorsey talks about himself and the social media company. Asked about the role the platform plays in amplifying false narratives, he replies: “The question we’re now asking ourselves is, if that is indeed misleading, how do we stop its spread? We can amplify the counter-narrative. We do have a curation team that looks to find balance. A lot of times when our president tweets, a Moment occurs, and we show completely different perspectives. So a lot of times, people don’t just see that tweet.”

Lobbying or regulation, that is the question

European Commission documents obtained through a freedom of information request by Corporate Europe Observatory, a lobbying watchdog, shed light on Facebook’s lobbying strategy towards the European institutions. Laura Kayali describes in Politico how Facebook tried to discourage regulation, and how the company’s views on the subject led to tension with the European Commission.

The misunderstanding likely stems from the fact that the approach to regulation differs between the EU and the US, where the company has its headquarters. Yet even the US regulator is considering fining Facebook for violating a legally binding agreement with the government to protect the privacy of personal data. In this context, US policymakers are considering data privacy legislation.

Some other news:

  • The EU’s most hackable elections? Politico’s Laurens Cerulus points out the cybersecurity and disinformation risks ahead of May’s European elections, looking back at the history of hacks against European governments.
  • NewsGuard, the news-rating verification plug-in partnered with Microsoft’s Edge browser, warns readers of the Daily Mail that “this website generally fails to maintain basic standards of accuracy and accountability”. The company bases its ratings on nine journalistic criteria and issues “Red” or “Green” credibility ratings, accompanied by detailed written “Nutrition Labels” for each website.
  • How to use 4chan to cover conspiracy theories: practical tips by Daniel Funke in Poynter.
  • Fake news in the fight against Ebola: rumours spread over the radio and WhatsApp critically hinder the work of health authorities and humanitarian organisations.
  • How the #10yearchallenge sparked lots of fake and misleading images: the viral game has been an opportunity for false information to pop up.

Agenda and announcements

HR corner

  • The Institute for the Future (IFTF) is hiring a Lab Director to expand its Digital Intelligence Lab into a world-changing center for forward-looking research that examines how new technologies can be used both to benefit and to challenge democracy, and what we need to do to build a healthy and resilient digital society. If you meet the requirements, send a resume, references, and a cover letter explaining why you are the right person for this job to Katie Fuller at kfuller@iftf.org.
  • First Draft is expanding its team in London and New York. A dozen positions are now open.
  • NewsGuard, a new service that fights misinformation by rating news and information websites for credibility and transparency, is preparing to expand to several European markets, including the United Kingdom, Germany, Italy, and France. NewsGuard is looking for trained journalists, experienced editors and fact-checkers interested in joining its European editorial teams. Ideal candidates will be able to write and speak both in English and in at least one language commonly spoken in the countries listed above. Interested applicants should send a resume and brief cover letter to NewsGuard’s Executive Editor, Eric Effron, at eric.effron@newsguardtech.com.
  • BuzzFeed News is hiring two contractors to help produce debunking videos.

Disinfo Update 21/01/2019

Sputnik who?

Last week, Facebook announced it had taken down two large-scale disinformation operations linked to Russian state actors and operating across Eastern and Central Europe. The largest network presented its pages as independent news outlets, though they were in fact linked to the Russian state news agency Sputnik. Facebook said the 364 pages and accounts removed on Thursday had almost 800,000 followers and had spent around $135,000 on ads on the platform between October 2013 and this month. The countries targeted included Romania, Latvia, Estonia, Lithuania, Georgia, and Moldova. The pages frequently promoted anti-NATO sentiment and protest movements, Facebook said. DFR Lab published an extensive analysis of the operation.

Now Sheryl has a plan

In a timely move, Facebook COO Sheryl Sandberg unveiled five new ways the company would address these issues at the annual DLD conference in Munich, staged ahead of the World Economic Forum.

1. Investing in safety and security

2. Protections against election interference

3. Cracking down on fake accounts and misinformation

4. Making sure people can control the data they share about themselves

5. Increasing transparency


WhatsApp’s disinfo problem

Disinformation on WhatsApp is difficult to monitor and debunk because the messaging app is encrypted. Yet the app does have a disinformation problem (see the last Brazilian elections, or the health disinformation spreading on the platform) and is trying to figure out what solutions could be implemented. It has started to ban users who show suspicious behaviour or may in fact be software bots. It has also added a notification to show when a message has been forwarded from another account, and has limited the number of times you can forward a link. Now, together with the Dutch project Drog, it will try to educate its users through an in-app online game, Bad News. The game was presented at our last webinar; read the summary here.

Global disinformation warming

A study from the Yale School of Forestry & Environmental Studies illustrates how a large-scale misinformation campaign has eroded public trust in climate science and stalled efforts to achieve meaningful policy, but also how an emerging field of research is providing new insights into this dynamic. In the paper, the authors identify potential strategies to confront these misinformation campaigns across four related areas: public inoculation, legal strategies, political mechanisms, and financial transparency.


Agenda and announcements

Europuls – Centre for European Expertise, a Romanian NGO gathering experts in EU affairs based in Brussels and Bucharest, together with the Association for Independent Press from the Republic of Moldova, is launching an updated version of the StopFals app for mobile phones. Pinocchio, whose nose grows every time a lie is told, will help users quickly distinguish between partial fakes, manipulative information and serious fakes.

Calls for papers and awards

  • The first AoIR Flashpoint Symposium seeks to investigate platform-driven changes and emergent practices of everyday-life content production occurring “below the radar”. Submissions are due by 20 February 2019.

Disinfo Update 14/01/2019

2018 has been an intense year in the fight against disinformation. We wish 2019 to be the year of new resolutions (and actions) from online platforms and regulators. DisinfoLab is preparing a great present for you and you can make a wishlist!  What are your plans on 28-29 May? Binge-watching Netflix? Depression after European elections? Tell us who you would like to hear at our annual conference instead.
Make your wishlist

Whatever dirty tricks it takes?

The US political debate is not done with online manipulation yet. Paul Manafort, a top official in President Trump’s campaign, shared political polling data with a business associate tied to Russian intelligence, according to a court filing unsealed on Tuesday. The document provides the clearest evidence to date that Trump’s campaign may have tried to coordinate with Russians during the 2016 presidential race.

Well, is the grass greener on the Democrats’ side? Not necessarily: it has been revealed that Democrats ran fake Facebook pages about an alcohol ban in an Alabama race. According to an activist who worked on the project, “If you don’t do it, you’re fighting with one hand tied behind your back.” In his view, Republicans are using such trickery and Democrats cannot unilaterally give it up: “You have a moral imperative to do this — to do whatever it takes.” The New York Times already reported last month on a separate project by the cybersecurity company New Knowledge, which used its own bogus conservative Facebook page and sent Russian-looking Twitter accounts to follow Republican candidate Roy S. Moore, to make it appear as if he enjoyed Russian support.

If disinformation becomes a legitimate campaign tool, some candidates might win the race, but we’ll be losing the democratic debate in society.  

Mark’s good resolution, or Zuck’s fireside chat

Apparently, Mark Zuckerberg’s 2019 challenge is to get out of his bunker and talk to people, Quartz describes. The Facebook CEO announced his annual challenge: hosting public talks about the future of technology. Zuckerberg, who doesn’t like public appearances, will meet every few weeks with leaders, experts and members of Facebook’s community to talk about “the opportunities, the challenges, the hopes, and the anxieties” related to the topic, he said in a Facebook post on Jan. 8. Damian Collins, chair of the UK DCMS parliamentary committee, reacted to the announcement with a sarcastic tweet inviting Zuckerberg to come to London to answer questions from parliamentarians, after he refused to appear at a hearing. But according to Kurt Wagner in Recode, we should actually listen to him.

Mark, we heard you. Today, we are inviting Facebook VP for Public Policy Richard Allan to speak at our annual conference and meet with the civil society working on disinformation. We’ll keep you posted on his reply.

Twitter breaking the ice

At CES last week, Twitter representatives told Engadget that the company will be launching a new program to let users reshape how conversations on its site look and feel. The first version of the beta will focus on a new design for the way conversation threads work, Twitter’s director of product management Sara Haider told TechCrunch. This includes a different colour scheme and visual cues to highlight important replies. The idea is for users to try out new organisation and context features with their followers, such as the status updates and “ice breaker” tweets we saw being tested last year, which are designed to encourage people to talk to each other. Twitter is set to start testing the program in the coming weeks, and while anyone will be able to apply to join, only a few thousand users will get in.

Older and wiser?

Older Americans are disproportionately more likely to share fake news on Facebook, according to a new analysis by researchers at New York and Princeton Universities. The study, published in Science Advances, examined user behaviour in the months before and after the 2016 US presidential election. Among the users who agreed to share their Facebook data, 11 percent of those older than 65 shared a hoax, while just 3 percent of users aged 18 to 29 did. These findings suggest that media literacy efforts should also target older users.


What to read, watch and listen to this week:


See all past and upcoming events in our agenda

Looking for fame? Calls for papers and awards

  • EU project call for proposals “Media Literacy for All”: open until 28 February 2019
  • WeVerify, an EU co-funded Horizon 2020 project dealing with algorithm-supported verification of digital content, held its first meeting in Sofia on 14 and 15 January, and you can let them know what issues you think are the most pressing in this area. They might include your answer in their research and project work.

Disinfo Update 7/01/2019


Brace yourselves, regulation is coming

The latest scandals, from Cambridge Analytica to 2016 election interference, have drawn US legislators’ attention to tech regulation. This summer, California’s state legislature passed a groundbreaking bill that would give residents unprecedented control over their data. According to Wired, 2019 is likely to see US data protection legislation, as tech companies are now pushing for national legislation to avoid such state-based rules. The Trump administration’s National Telecommunications and Information Administration has released its own point-by-point proposal, describing in unspecific terms a set of “privacy outcomes” the administration would like to see.

Facebook, company of the year?

Mark Zuckerberg shares his wishes and reviews the actions taken by Facebook in 2018 to face its new challenges, in particular regarding election interference and the spread of harmful content.

Not so sure: The New York Times has been provided with more than 1,400 pages from Facebook’s moderation rulebooks by an employee. An examination of the files revealed numerous gaps, biases and outright errors. As Facebook employees grope for the right answers, they have allowed extremist language to flourish in some countries while censoring mainstream speech in others. In the absence of governments or international bodies that can set standards, Facebook is experimenting on its own.

After two years of repeated scandals, is Facebook at a turning point where it finally understands it is both a platform and a publisher? Nicholas Thompson and Fred Vogelstein from Wired look back at those two years of hell for the company.


Russia’s media regulator is investigating whether the BBC is “compliant with Russian law”. The announcement of the investigation came a day after Ofcom said it was considering imposing sanctions on RT – which is financed by the Russian state – following an investigation into its skewed coverage of the Sergei Skripal poisoning. Meanwhile, a list of BBC reporters in Russia has been leaked online along with their photos. The leak comes after the Sunday Times published the names of journalists working for Moscow-backed Sputnik’s UK bureau.

Play with the Devil

Just before the Christmas break, we welcomed Marije Arentze, from the media literacy initiative Drog, for our 3rd webinar. Drog is a multidisciplinary team of academics, journalists and media experts. They conduct research, offer workshops and educational programmes, and create innovative tools that help you build “resistance” to disinformation using “gamification”. Drog has created the social impact game Bad News, in which you take on the role of a propagandist. They believe the best way to cultivate a sixth sense for recognising and exposing disinformation is to create it yourself. Read about their approach and the challenges of media literacy in our latest article.

How much of the internet is fake? 

How much of the internet is fake? Studies generally suggest that, year after year, less than 60 percent of web traffic is human; some years, according to researchers, a healthy majority of it is bot. Even your Facebook friends might be fakes: researchers are now able to copy the “styles” of source faces onto destination faces, creating blends that have copied features but look like entirely new people. Then whom should we trust? Doubt is everywhere, as a Der Spiegel star reporter confessed to having made up more than a dozen stories. According to Anna Altman, this story might change the newspaper’s relationship with its readers, who placed their trust in a robust outlet known for its reporters’ emotional writing and its reliable fact-checking process.


What to read, watch and listen to this week:

See all past and upcoming events in our agenda

Looking for fame? Calls for papers and awards