Disinfo graphic banner blue w logo-01

Disinfo Update 05/02/2019


Last week was the first exam session for the signatories of the EU Code of Practice. This week’s newsletter reviews the first reports by the signatories (Google, Twitter, Facebook, Mozilla and trade associations representing the advertising industry), presenting their efforts to comply with their commitments.

Facebook (link to Facebook’s report)

  • Will extend the transparency measures on political advertising already in place in the US, Brazil, and the UK. The person paying for an ad will need to confirm their identity and location. The ads will be archived for seven years.
  • Will strengthen the verification process of content authenticity and detection of fake accounts.
  • Will continue to rely on a network of fact-checkers to detect false stories and will promote trusted content in the News Feed.
  • Will establish a European research advisory commission to award funding for research identified as relevant to the academic community.

Google (link to Google’s report)

  • Has already required advertisers to comply with policies against misrepresentation, complemented by a “valuable inventory policy”. In 2019, the company plans to provide additional metrics specific to EU member states and to upgrade enforcement mechanisms before the European elections.
  • Regarding political ads in the context of EU elections, Google will also introduce an election-ads transparency report and searchable ad library in a downloadable format.
  • The company also underlines its efforts to surface fact-checked content and support projects that surface indicators of credibility.

Mozilla (link to Mozilla’s report)

  • Hired dedicated staff to work on disinformation projects.
  • Will roll out a new version of Firefox with additional privacy settings and tracking protection designed to reduce users’ exposure to disinformation campaigns. A specific version including relevant add-ons should be released in March.
  • The Mozilla Foundation has also recruited a cohort working specifically on disinformation issues.

According to Mozilla, Facebook’s ad archive doesn’t provide enough technical features to allow third parties to use its data efficiently to build apps that would bring more transparency to political targeting. Mozilla is currently developing an add-on to its Firefox browser that would analyse ad targeting on Facebook.

Twitter (link to Twitter’s report)

  • Underlines its ad policy and draws attention to its annual transparency report and the ads transparency centre launched in 2018. Promoted content is also clearly labelled as such.
  • Twitter will continue its efforts to address spam, malicious automation, and fake accounts.
  • In terms of support to the research community, Twitter released datasets of accounts suspected of conducting information operations and provides support to organisations focusing on the European environment (including EU DisinfoLab, which received support from Twitter in 2018).

Advertising trade associations (reports from World Federation of Advertisers, European Association of Communication Agencies and Interactive Advertising Bureau Europe)

The trade associations representing the advertising sector committed to raising awareness of the Code amongst their members. But one can regret the absence of corporate signatories, as brands and advertisers should play a greater role in the efforts to demonetize purveyors of disinformation.

Commission wants more efforts

At a conference in Brussels, European Commissioner for digital affairs Mariya Gabriel welcomed the efforts made by the signatories of the code of practice but stressed remaining weaknesses to be improved ahead of the upcoming European elections in May:

  • Most of the initiatives are deployed only in some member states
  • The platforms mostly commit to transparency around political ads
  • KPIs should be detailed for each member state
  • Support to research should be strengthened
  • Users should be empowered with relevant tools

The Commissioner also announced a week dedicated to media literacy in March.

Such efforts by tech companies should be welcomed and supported. From our perspective, we still regret the lack of transparent access to data from several actors. And as each signatory focuses on political advertising transparency and its own repository of political ads, we regret the lack of harmonisation in this area: a common repository would help analyse cross-platform campaigns.


  • Researchers at Cardiff University will study the rise of online alternative political media and public attitudes towards mainstream media.  The three-year project will investigate the production, content and consumption of both left and right-wing alternative online political media.
  • Facebook restricts campaigners’ ability to check ads for political transparency: Social media network says the change was part of a crackdown on third party plug-ins
  • Could ‘fake news’ be good news for journalism? What is the future of journalism in a world of ‘fake news’, social media and citizen journalism? LSE professor Charlie Beckett takes a look.
  • Disinformation and democracy: The home front in the information war: In this Discussion Paper, Paul Butcher assesses the various efforts that have been made to fight the spread of disinformation and finds that the results are mixed.

Agenda and announcements

ASD is seeking input for a study on institutionalizing hybrid threat response. Take part in the survey

HR corner

  • The World Wide Web Foundation has open positions for a policy manager and a communications officer.

Disinfo Update 28/01/2019

Rationalisation vs cognitive laziness

In the New York Times, professors Gordon Pennycook and David Rand explain why we believe in fake news. In this debate, one group claims that our ability to reason is hijacked by our partisan convictions: that is, we’re prone to rationalization. The other group claims that the problem is that we often fail to exercise our critical faculties: that is, we’re mentally lazy. However, recent research suggests a silver lining to the dispute: Both camps appear to be capturing an aspect of the problem.

These findings might have real implications for public policy, as the research suggests that the solution to politically charged misinformation should involve devoting resources to the spread of accurate information and to training or encouraging people to think more critically.

Chief executive of transparency

In a bold move to restore trust, the CEOs of social platforms have entered transparency mode. In an op-ed titled “The facts about Facebook” in the Wall Street Journal, Mark Zuckerberg explains the company’s advertising strategy. In the article, he admits that the company’s data dealings can “feel opaque” and cause distrust. But he maintains that users are in control and that Facebook asks for users’ consent to access their data, as required by the GDPR.

In an extensive interview with Rolling Stone, Twitter CEO Jack Dorsey talks about himself and the social media company. Asked about the role the company plays in the amplification of false narratives, he replies: “The question we’re now asking ourselves is, if that is indeed misleading, how do we stop its spread? We can amplify the counter-narrative. We do have a curation team that looks to find balance. A lot of times when our president tweets, a Moment occurs, and we show completely different perspectives. So a lot of times, people don’t just see that tweet.”

Lobbying or regulation, that is the question

European Commission documents obtained through a freedom of information request by Corporate Europe Observatory, a lobbying watchdog, shed light on Facebook’s lobbying strategy towards European Institutions. Laura Kayali describes in Politico how Facebook tried to discourage regulation, and how the company’s views on regulation led to tension with the European Commission.

The misunderstanding certainly stems from the fact that the approach toward regulation differs between the EU and the US, where the company has its headquarters. Yet even the US regulator is considering fining Facebook for violating a legally binding agreement with the government to protect the privacy of personal data. In this context, US policymakers are considering data privacy legislation.

Some other news:

  • EU’s most hackable elections? Politico’s Laurens Cerulus points out the cybersecurity and disinformation risks ahead of May’s European elections, reviewing the history of hacks against European governments.
  • News-rating verification plug-in NewsGuard, a partner of Microsoft’s Edge browser, warns readers of the Daily Mail that “this website generally fails to maintain basic standards of accuracy and accountability”. The company conducts its ratings based on nine journalistic criteria and issues “Red” or “Green” credibility ratings, accompanied by detailed written “Nutrition Labels” for each website.
  • How to use 4Chan to cover conspiracy theories: practical tips by Daniel Funke in Poynter
  • Fake news in the fight against Ebola: rumours spread over the radio and WhatsApp critically hinder the work of health authorities and humanitarian organisations.
  • How the #10yearchallenge sparked lots of fake and misleading images: the viral game has been an opportunity for plenty of fake content to pop up.

Agenda and announcements

HR corner

  • IFTF Institute for the future is hiring a Lab Director to expand its Digital Intelligence Lab into a world-changing center for forward-looking research that examines how new technologies can be used both to benefit and challenge democracy and what we need to do to build a healthy and resilient digital society. If you meet these requirements, send resume, references, and a cover letter explaining why you are the right person for this job to Katie Fuller at kfuller@iftf.org
  • First Draft is expanding its team in London and New York. A dozen positions are now open.
  • NewsGuard, a new service that fights misinformation by rating news and information websites for credibility and transparency, is preparing to expand to several European markets, including the United Kingdom, Germany, Italy, and France. NewsGuard is looking for trained journalists, experienced editors and fact-checkers interested in joining its European editorial teams. Ideal candidates will be able to write and speak both in English and in at least one language commonly spoken in the countries listed above. Interested applicants should send a resume and brief cover letter to NewsGuard’s Executive Editor, Eric Effron, at eric.effron@newsguardtech.com.
  • BuzzFeed News is hiring two contractors to help produce debunking videos.

Disinfo Update 21/01/2019

Sputnik who?

Last week, Facebook announced it had taken down two large-scale disinformation operations linked to Russian state actors and operating across Eastern and Central Europe. The largest network posed as independent news outlets but was linked to the Russian state news agency Sputnik. Facebook said the 364 pages and accounts removed on Thursday had almost 800,000 followers and had spent around $135,000 on ads on the platform between October 2013 and this month. The countries targeted included Romania, Latvia, Estonia, Lithuania, Georgia, and Moldova. The pages frequently promoted anti-NATO sentiment and protest movements, Facebook said. DFR Lab published an extensive analysis of the operation.

Now Sheryl has a plan

In a timely move, Facebook COO Sheryl Sandberg unveiled five new ways the company will address these issues at the annual DLD conference in Munich, staged ahead of the World Economic Forum.

1. Investing in safety and security

2. Protections against election interference

3. Cracking down on fake accounts and misinformation

4. Making sure people can control the data they share about themselves

5. Increasing transparency

Some other Facebook news:

Whatsapp disinfo problem

Disinformation on WhatsApp is difficult to monitor and debunk because the messaging app is encrypted. Yet the app does have a disinformation problem (see the last Brazilian elections or the health disinformation spreading on the platform) and is trying to figure out what solutions could be implemented. It has started to ban users who show suspicious behaviour or may actually be software bots. It has also added a notification showing when a message has been forwarded from another account and has limited the number of times you can forward a link. Now, together with the Dutch project Drog, it will try to educate its users through an in-app online game, Bad News. The game was presented at our last webinar; read the summary here.

Global disinformation warming

A study from the Yale School of Forestry & Environmental Studies illustrates how a large-scale misinformation campaign has eroded public trust in climate science and stalled efforts to achieve meaningful policy, but also how an emerging field of research is providing new insights into this dynamic. In the paper, the authors identify potential strategies to confront these misinformation campaigns across four related areas: public inoculation, legal strategies, political mechanisms, and financial transparency.


Agenda and announcements

Europuls – Centre for European Expertise, a Romanian NGO gathering experts in EU affairs based in Brussels and Bucharest, together with the Association for Independent Press from the Republic of Moldova, is launching an updated version of the StopFals app for mobile phones. Pinocchio, whose nose grows every time a lie is told, will help users quickly distinguish between partial fakes, manipulative information and serious fakes.

Calls for papers and awards

  • The first AoIR Flashpoint Symposium seeks to investigate platform-driven changes and emergent practices of everyday-life content production occurring “below the radar”: Submissions are due by 20 February 2019

Disinfo Update 14/01/2019

2018 has been an intense year in the fight against disinformation. We wish 2019 to be the year of new resolutions (and actions) from online platforms and regulators. DisinfoLab is preparing a great present for you and you can make a wishlist!  What are your plans on 28-29 May? Binge-watching Netflix? Depression after European elections? Tell us who you would like to hear at our annual conference instead.
Make your wishlist

Whatever dirty tricks it takes?

The political debate over online manipulation in the US is not over yet. Paul Manafort, a top official in President Trump’s campaign, shared political polling data with a business associate tied to Russian intelligence, according to a court filing unsealed on Tuesday. The document provided the clearest evidence to date that Trump’s campaign may have tried to coordinate with Russians during the 2016 presidential race.

Well, is the grass greener on the Democrats’ side? Not necessarily, as it has been revealed that Democrats ran fake Facebook pages about an alcohol ban in the Alabama Senate race. According to an activist who worked on the project, “If you don’t do it, you’re fighting with one hand tied behind your back.” He argued that Republicans use such trickery and that Democrats cannot unilaterally give it up: “You have a moral imperative to do this — to do whatever it takes.” The New York Times already reported last month on a separate project by the cybersecurity company New Knowledge, which used its own bogus conservative Facebook page and sent Russian-looking Twitter accounts to follow Republican candidate Roy S. Moore to make it appear as if he enjoyed Russian support.

If disinformation becomes a legitimate campaign tool, some candidates might win the race, but we’ll be losing the democratic debate in society.  

Mark’s good resolution, or the Zuck fireside chat

Apparently, Mark Zuckerberg’s 2019 challenge is to get out of his bunker and talk to people, Quartz describes. The Facebook CEO announced his annual challenge: hosting public talks about the future of technology. Zuckerberg, who doesn’t like public appearances, will meet every few weeks with leaders, experts and Facebook community members to talk about “the opportunities, the challenges, the hopes, and the anxieties” related to the topic, he said in a Facebook post on Jan. 8. Damian Collins, chair of the UK DCMS parliamentary committee, reacted to this announcement with a sarcastic tweet inviting Zuckerberg to come to London to answer questions from parliamentarians, after he refused to appear for a hearing. But according to Kurt Wagner in Recode, we should actually listen to him.

Mark, we heard you. Today, we are inviting Facebook VP for Public Policy Richard Allan to speak at our annual conference and meet with the civil society working on disinformation. We’ll keep you posted on his reply.

Twitter breaking the ice

At CES last week, Twitter representatives told Engadget that the company will be launching a new program to let users reshape how conversations on its site look and feel. Sara Haider, Twitter’s director of product management, told TechCrunch that the first version of the beta will focus on a new design for the way conversation threads work. This includes a different colour scheme and visual cues to highlight important replies. The idea is for users to try out new organization and context features with their followers, such as the status updates and “ice breaker” tweets we saw being tested last year, which are designed to encourage people to talk to each other. Twitter is set to start testing the program in the coming weeks, and while anyone will be able to apply to join, only a few thousand users are going to get in.

Older and wiser?

Older Americans are disproportionately more likely to share fake news on Facebook, according to a new analysis by researchers at New York and Princeton Universities. The study, published in Science Advances, examined user behaviour in the months before and after the 2016 US presidential election. Among the users who agreed to share their Facebook data, 11 percent of users older than 65 shared a hoax, while just 3 percent of users aged 18 to 29 did. These findings clearly suggest that media literacy efforts should also target older users.


What to read, watch and listen to this week:


See all past and upcoming events in our agenda

Looking for fame? Calls for papers and awards

  • EU project call for proposals “Media Literacy for All”: open until 28 February 2019
  • WeVerify, an EU co-funded Horizon 2020 project dealing with algorithm-supported verification of digital content, held its first meeting in Sofia on 14 and 15 January, and you can let them know what issues you think are the most pressing in this area. They might include your answer in their research and project work.

Disinfo Update – 7/01/2019


Brace yourselves, regulation is coming

The latest scandals, from Cambridge Analytica to 2016 election interference, have drawn legislators’ attention to tech regulation in the US. This summer, California’s state legislature passed a groundbreaking bill that would give residents unprecedented control over their data. According to Wired, 2019 is very likely to see US data protection legislation, as tech companies are now pushing for national legislation to avoid such state-based rules. The Trump administration’s National Telecommunications and Information Administration has released its own point-by-point proposal, describing in unspecific terms a set of “privacy outcomes” the administration would like to see.

Facebook company of the year

Mark Zuckerberg addresses his wishes and reviews all actions taken by Facebook in 2018 to face its new challenges, in particular regarding election interference and the spread of harmful content.

That is not so certain, as The New York Times has been provided with more than 1,400 pages from Facebook’s moderation rulebooks by an employee. An examination of the files revealed numerous gaps, biases and outright errors. As Facebook employees grope for the right answers, they have allowed extremist language to flourish in some countries while censoring mainstream speech in others. In the absence of governments or international bodies that can set standards, Facebook is experimenting on its own.

After two years of repeated scandals, is Facebook at a turning point where it finally understands it is both a platform and a publisher? Nicholas Thompson and Fred Vogelstein from Wired look back on those past two years of hell for the company.


Russia’s media regulator is investigating whether the BBC is “compliant with Russian law“. The announcement of the investigation came a day after Ofcom said it was considering imposing sanctions on RT – which is financed by the Russian state – following an investigation into its skewed coverage of the Sergei Skripal poisoning. Meanwhile, a list of BBC reporters in Russia has been leaked online along with their photos. The leak comes after the Sunday Times published the names of journalists working for Moscow-backed Sputnik’s UK bureau.

Play with the Devil

Just before the Christmas break, we welcomed Marije Arentze, from the media literacy initiative Drog, for our 3rd webinar. Drog is a multidisciplinary team of academics, journalists and media experts. They conduct research, offer workshops and educational programmes, and create innovative tools that help you build “resistance” to disinformation using “gamification”. Drog has created the social impact game Bad News, where you take on the role of a propagandist. They believe the best way to cultivate a sixth sense for recognizing and exposing disinformation is to create it yourself. Read about their approach and the challenges of media literacy in our latest article.

How much of the internet is fake? 

How much of the internet is fake? Studies generally suggest that, year after year, less than 60 percent of web traffic is human; some years, according to researchers, a healthy majority of it is bot. Even your Facebook friends might be fakes: researchers are now able to copy the “styles” of source faces onto destination faces, creating blends that have copied features but look like entirely new people. Then whom should we trust? Doubt is everywhere, as a Der Spiegel star reporter confessed he made up more than a dozen stories. According to Anna Altman, this story might change the newspaper’s relationship with its readers, who placed trust in a robust outlet known for the emotional writing of its reporters and its reliable fact-checking process.


What to read, watch and listen to this week:

See all past and upcoming events in our agenda

Looking for fame? Calls for papers and awards


Disinfo update – 17/12/2018

Google still has no answers for YouTube’s biggest problem

Now it’s Sundar Pichai’s turn to be grilled by lawmakers. Last week, the Google CEO took the stand before the House Judiciary Committee in Washington, DC. Among the topics discussed, Pichai was confronted over YouTube’s moderation problem, especially regarding conspiracy videos on the platform. Pichai pointed to the company’s policies against hate speech and noted that the platform started adding “authoritative” context to its search results for breaking news stories earlier this year. The issue is all the more worrying given that, according to a Pew Research study, social networks, and YouTube in particular, have become the main source of information for younger Americans.

Washington confidential 

The Facebook and Google hearings have shed light on various channels of disinformation. As the US Senate investigates possible Russian interference in the 2016 elections, a report by Oxford University’s Computational Propaganda Project and Graphika shows that nearly every social media platform was used by Russian influence operations to support Trump’s election.

The Guardians and the war on truth

For the first time, journalists have been chosen by Time magazine as “Person of the Year”. Jamal Khashoggi, Maria Ressa, Wa Lone and Kyaw Soe Oo, and the newsroom of the Capital Gazette have been killed or targeted for their work. The magazine chose to present them as “The Guardians” in “The War on Truth”, the current period of defiance toward the press and manipulation of information. This choice is a strong statement against the spread of misinformation by leaders who have sought to undermine critical, independent journalism. Also among the shortlisted candidates were Donald Trump and Vladimir Putin.

“It became clear that the manipulation and abuse of truth is the common thread of so many of this year’s major stories, from Russia to Riyadh to Silicon Valley,” Edward Felsenthal, Time’s editor in chief and chief executive, said during the announcement on NBC’s “Today” show.

This is the end

Just when journalists are in the headlines for defending the truth, fact-checking journalists decided to break their partnership with Facebook. According to them, the platform has not done enough to make their work effective. They are particularly resentful of the company’s hiring of a PR firm to go after its opponents, in a way that fuelled conspiracy theories against George Soros. “Why should we trust Facebook when it’s pushing the same rumors that its own factcheckers are calling fake news?” said a current Facebook fact-checker.

In the Library

What to read, watch and listen to this week:


  • December 18 – DisinfoLab Webinar: How to tackle disinformation with education? With Marije Arentze from the Drog project

See all previous and upcoming disinfo events in our agenda

Fear of getting bored during the Christmas holidays? You can prepare a submission for one of those calls for papers:

  • ACM Transactions on Social Computing: Special Issue on Negotiating Truth and Trust in Socio-Technical Systems
  • The Media & Democracy program at the Social Science Research Council invites submission of abstracts for a research workshop organized in collaboration with Cristian Vaccari (Loughborough University), to be held in New York City on June 13–14, 2019.
  • The Centre for Direct Democracy Studies (CDDS) at the Faculty of Law of the University of Białystok, Poland (UwB) announced a call for papers for the upcoming sixth volume in the European Integration and Democracy Series, devoted to challenges for democracy, the rule of law (Rechtsstaat) and the respect for fundamental rights, posed by contemporary disinformation practices and digital media. Deadline is extended to 31 January 2019. Submit here. 

HR corner

Lie Detectors is looking for a Germany Programme Director and Germany Programme Assistant based in Berlin.


Disinfo update – 11/12/2018

EU has a plan

On Wednesday, the European Commission released its awaited Action Plan against disinformation. The plan focuses on external disinformation, strengthens the EU communication budget, and significantly increases the resources of East StratCom, the EU’s counter-disinformation unit, to oppose Russian-financed campaigns. Nevertheless, the danger in focusing on external threats is neglecting endogenous sources of disinformation in Member States.

Hence, the Commission and Member States will set up a Rapid Alert System, whose concrete form has not yet been detailed. Audiovisual media regulators will be tasked with monitoring the implementation of the Code of Practice signed by online platforms. We are looking forward to this monitoring, as it might constitute a first step for audiovisual regulators to step up on this issue.

Finally, it is very positive that teams of multi-disciplinary independent fact-checkers and researchers will be supported, as well as media literacy initiatives. EU DisinfoLab supports this recommendation in its report on automated tackling of disinformation to be presented on 13 December to the European Parliament Science and Technology Options Assessment Panel.

According to Jakub Kalenski from the Atlantic Council and Roland Freudenstein from the Wilfried Martens Centre, “what we need now is a follow-up in the sense of a speedy implementation. The extra resources need to be dedicated to those people who have a track-record in countering disinformation, not to units that talk about “strategic communication”, but in fact do not deliver in countering disinformation.”

From bots to economic boom?

In a culture heavily structured around images, it is now clear that deepfakes are disinformation’s next big thing. WITNESS, the human rights organization focused on the power of video and technology for good, held the first-ever expert meeting to share recommendations around this issue. Interestingly, investor Ryan Holmes is of the opinion that in an economy based on information exchange, economic solutions will arise to prove the validity of that information; fake news patrolling might thus be the next internet boom.

France: did Facebook fuel the anger?

Fuel protests and demonstrations organized in the past weeks have partly turned into violent riots in Paris and other major cities. The movement, which started in small towns and rural areas, was born on Facebook. Interestingly, the recent change in Facebook’s algorithm favoring local news might have contributed to spreading the anger and helped coordinate the actions. The newspaper Le Monde reports that the French General Secretariat for Defence is investigating fake accounts sharing disinformation on the issue. According to a study, hundreds of accounts could be linked to Russian influence.


What to read, watch and listen to this week:


See all previous and upcoming disinfo events in our agenda


Media literacy initiative Lie Detectors is one of the laureates of the 2018 Digital Skills Award


Disinfo Update 06/12/2018

Facebook: the empty chair

Last Tuesday certainly wasn’t a good day for Richard Allan, Facebook VP for Policy. As Zuckerberg refused to appear (his chair was pointedly left empty), Allan answered questions from a “Grand Committee” of MPs from nine countries at Britain’s House of Commons. Among the concerns raised by lawmakers were Facebook’s policies regarding third-party application developers and the use and collection of user data. After seizing internal documents from a lawsuit between Facebook and Six4Three, an app development company, MP Damian Collins suggested that the platform was made aware of suspicious Russian behaviour as early as 2014, which according to Facebook was a false alarm. These emails, to which the Wall Street Journal had access, also show that Facebook considered charging companies for continued access to user data in 2012. In the meantime, following up on the “Definers” case, BuzzFeed News accessed emails proving that Sandberg herself was actively involved in looking into Soros and his possible financial motivations.

Regulation is now?

Also questioned, UK Information Commissioner Elizabeth Denham advocated for transnational cooperation between regulators. According to her, the era of self-regulation is over. Following the hearing, parliamentarians from across the world signed a declaration on the “Principles of the Law Governing the Internet”. Can this get any worse for Facebook? Mark Scott from Politico explains why this “empty chair” strategy is now backfiring.

Trying to prove his goodwill, Zuckerberg last week published a blueprint for content moderation on the platform. Among the measures, he announced that the company will hold content moderation meetings with outside experts, the minutes of which will be published.

Read the full transcript of the hearing here

Local news

Meanwhile, Facebook will have a new feature to police on its app: it has launched “Today In”, its local news aggregator, now available in 400 small and medium-sized US cities and in Australia. In Europe, Facebook is donating 4.5 million pounds ($5.8 million) to train journalists in Britain, supporting communities that have lost local newspapers and reporters as readers switch to online news.

Defending journalists

First Draft kicked off its journalist training project “CrossCheck” in Nigeria, ahead of the country’s elections in February. Journalists from around the world are uniting against disinformation, but that doesn’t please everyone. During a visit to Paris, Russian foreign minister Lavrov openly declared being “preoccupied” by initiatives such as the Journalism Trust Initiative launched by Reporters Without Borders. For the organisation, such a statement is particularly worrying in a climate of distrust towards a free and rigorous press. Drawing on the case of Filipino journalist Maria Ressa, Peter Pomerantsev advocates for the establishment of clear digital rights.


From Russia with Trolls

Last week, the Guardian revealed the influence of Russian trolls over British media. Members of a Russian “troll army” were quoted more than 80 times across widely-read British media outlets before Twitter revealed their identities and banned them. Regarding Russian influence on news media, Eric Schmidt of Google’s parent company Alphabet has said the search engine is preparing to take action against state-run Russian news agencies, including Russia Today and Sputnik, which US intelligence agencies accuse of spreading propaganda.


What to read, watch and listen to this week:



Disinfo Update 22/11/2018

Dr. Evil

On 18 December, the next DisinfoLab will feature Mieje Arentze from Drog, a research and media-literacy initiative that will teach you how to create your very own fake news. So build your army of online trolls, spread conspiracies to influence the public debate, and contemplate the dangerous implications disinformation can have on your daily life and on society as a whole!

Facebook Season 3, Episode 6: Thanksgiving

Elliot Schrage, Facebook’s outgoing head of communications and policy, took responsibility for hiring Definers Public Affairs in a blog post on the eve of the US Thanksgiving holiday. He admitted commissioning the firm to investigate whether George Soros was financing the “Freedom from Facebook” campaign. The company’s whole policy towards external consulting firms will be reviewed, he claims.

An international “Grand Committee” of national parliaments (UK, Canada, Australia, Argentina, Ireland, Brazil, Latvia and Singapore) called for Mark Zuckerberg to appear at a hearing in London on November 27. Zuckerberg turned down the request, but Policy VP Richard Allan will go. In reply, the Parliament has used its legal powers to seize internal Facebook documents alleged to contain significant revelations about Facebook decisions on data and privacy controls that led to the Cambridge Analytica scandal.

This call comes just as the LSE Commission on Truth, Trust and Technology, a group made up of British MPs, academics and industry leaders, proposed that the Government hand fresh powers to a new oversight body in the UK, rather than to existing regulators such as Ofcom and the Information Commissioner.

Journalists: one for all, all for one

In an age of misinformation, is collaboration the future of journalism? Former Vice CTO Jesse Knight advocates for media outlets to consolidate around a common publishing platform and thus gain independence from Google and social media platforms. First Draft also calls for support of sustainable newsroom collaborations and verification projects. The verification project “Comprova” in Brazil is a concrete example of such cooperation: in the context of the Brazilian elections, partners worked on debunking rumours and suspicious content on social media and on WhatsApp.

Good bots / Bad bots?

Speaking of WhatsApp: Witness, an NGO that teaches people to use video to document human rights abuses, issued a series of recommendations on regulating content on WhatsApp. Examples include a database of debunked messages that allows reverse image search, and easier ways to block WhatsApp direct marketing services.

Good news: such databases already exist for Twitter. In the lead-up to last month’s election in Brazil, Aos Fatos built a Twitter bot that automatically corrects people who share fake news stories. Called Fátima, the automated account leverages AI to scan Twitter for URLs that match fact checks in Aos Fatos’ database of articles; the bot then replies to the Twitter user with a link to the fact check. Bots spread a lot of fakery, but they can also debunk it.


What to read, watch and listen to this week:


HR corner


Disinfo Update 19/11/2018

Oops I did it again!

The New York Times revealed how Facebook, while trying to fight through the crisis, hired a public affairs firm to write articles criticizing tech competitors, downplay the impact of Russia’s misinformation campaign on the platform, and push the idea that George Soros was behind a growing anti-Facebook movement. Facebook replied to the inaccuracies in a blog post, and Patrick Gaspard, President of the Open Society Foundations, addressed a letter to Mark Zuckerberg. As scandals accumulate at Facebook, disinformation researchers, just like citizens, are starting to run out of patience, says Nina Jankowicz.

A Message from Paris

Last week was a busy week in Paris. The city hosted the Internet Governance Forum (IGF), organised by the United Nations. This year, information disorder issues took a significant place in the debates on media and content, and we can only welcome the fact that, overall, the debates promoted active multi-stakeholder collaboration in this field. In the meantime, the city also held the Paris Peace Forum, a new annual event gathering actors of global governance. On that occasion, 7 heads of state (among them Justin Trudeau and Emmanuel Macron) signed an opinion piece committing to support the work of Reporters Without Borders’ International Information and Democracy Commission.

Fake can kill

Disinformation can fuel ethnic violence. After rumours shared on WhatsApp caused lynchings in India (read the summary of our last webinar about disinformation in India here), incendiary images shared on Facebook have contributed to ethnic violence in Plateau State, Nigeria. Facebook’s third-party fact-checking partners in the country have committed just four full-time fact-checkers to review false information, on a platform used by 24 million Nigerians. Read the well-documented story on BBC News. A similar issue was raised at the IGF by journalist Ishara Danasekar in Sri Lanka.


What to read, watch and listen to this week: