by Vernise Tantuco /Rappler.com
Even with its 3rd party fact checking program, misinformation continues to spread on Facebook – through its chat app, Facebook Messenger.
At a glance:
- Misinformation often spreads unchecked in messaging apps. Because these are considered private spaces, trending content often evades detection and fact checking. Some users exploit these gaps to get around Facebook’s misinformation policies.
- WhatsApp is the world’s most popular chat app, followed by Messenger. Both are owned by Facebook. In the Philippines, Facebook Messenger is the chat app of choice, with WhatsApp placing only 4th in the country by number of active users.
- Activity on messaging apps increased when the COVID-19 pandemic began. In March 2020, Facebook reported that total messaging in “countries hit hardest by the virus” increased by more than 50% that month.
- Despite these risks, Facebook’s fact checking efforts are focused mainly on publicly available posts, allowing misinformation to spread unchecked and potentially go viral on its messaging apps, especially Messenger.
- Though Messenger has slowly been rolling out features to curb the spread of misinformation, these rollouts are staggered, and the features are available in only 8 countries, excluding the Philippines and 68 others.
MANILA, Philippines – Two weeks into 2020, a rumor spread through messaging services claiming that a case of Severe Acute Respiratory Syndrome (SARS) was confirmed in the Shangri-La Plaza mall in Mandaluyong City.
It was cause for concern: Weeks earlier, health authorities in Wuhan, China, had confirmed 44 cases of what was then a mystery disease, now known as COVID-19. However, the SARS rumor turned out to be false.
The claim, which was submitted to Rappler for verification, became the first of thousands of claims about COVID-19 fact checked by members of the Coronavirus Facts Alliance, a project of the International Fact-Checking Network (IFCN) at Poynter.
As of January 26, 2021, alliance members, which include fact checking organizations from more than 70 countries, have fact-checked 10,742 COVID-19-related claims. Of these, 1,352, or 13%, came from messaging apps or text messages. Many more rumors may have gone undetected because they are passed along privately from one chat conversation to another.
As it is, misinformation on apps like WhatsApp, Messenger, and Viber – 3 of the most popular in the Philippines – has been a challenge for years now, for both the companies that own them and independent fact checkers.
Exponential growth
The spread of misleading information, hoaxes, and disinformation on these apps is a concern because their use has been growing exponentially.
Based on data from Hootsuite and We Are Social’s annual digital reports, the number of users of the world’s top 2 messaging apps – WhatsApp and Messenger – has quadrupled globally since 2014. (See the graph below)
Use grew further during the pandemic. In March 2020, Facebook reported that total messaging in “countries hit hardest by the virus” increased by more than 50% that month.
WhatsApp and Messenger are both owned by Facebook. Unfortunately, while the social media giant has initiated numerous efforts to curb misinformation on its social media products, much remains to be done in relation to its messaging apps.
In 2014, Facebook separated the chat function from its main app, redirecting users to download a separate application, Facebook Messenger. Since then, use of the Messenger app globally has tripled.
In the Philippines, Messenger has been the chat app of choice for the past 5 years and its use in the country has also grown exponentially. The chart below shows growth in the use of specific chat apps among internet users in the Philippines since 2015. Messenger use has quadrupled since 2015.
Comparatively, use of other messaging apps among Philippine internet users has not grown significantly.1
Given the growth in use, and the way some users are subverting Facebook’s misinformation policies while sharing content over messaging, the impact of current fact-checking efforts will continue to be significantly undermined if no urgent action is taken.
Slow action, staggered rollouts of solutions allow real-world harm
Rappler has built an interactive timeline of when the 3 chat apps – WhatsApp, Messenger, and Viber – began putting in place features to minimize the spread of misinformation on their services.
Action against misinformation has been more aggressive on WhatsApp since 2018, when the company began experimenting in India with restricting the number of times a message can be forwarded. The change followed a series of lynchings in the country earlier that year.
A few months later, before the Brazilian presidential election, there was a truck drivers’ strike that was organized entirely through WhatsApp. “That was the first moment we saw disinformation on WhatsApp,” says IFCN Associate Director Cristina Tardáguila, describing the chaos that ensued. “And that was the first time I saw WhatsApp as very dangerous too.”
These events pushed the world’s most popular chat app to introduce initiatives meant to curb misinformation on its platform.
Also in 2018, WhatsApp introduced labels to indicate when a message has been forwarded and limited the number of times users can forward a message to multiple chats at once.
In August 2019, following their initial experiments in India, they introduced a label to show when a message has been forwarded through a chain of 5 or more people.
In April 2020, amid the COVID-19 pandemic, WhatsApp announced that these kinds of messages could be forwarded to only one chat at a time, a change the company says has resulted in a 70% reduction in the number of highly forwarded messages on WhatsApp.
However, policies for Messenger rolled out at a snail’s pace and only in select countries. This is problematic because in countries like the Philippines, WhatsApp places 4th in popularity, far behind Messenger, and yet more action was taken to address misinformation on WhatsApp.
Messenger began to label forwarded messages in April 2019, almost a year after WhatsApp first began to act. However, these labels only say that the message has been forwarded; they do not identify messages that have been forwarded through a chain of 5 or more people.
“We’re committed to ensuring everyone has access to accurate information, and removing harmful content related to COVID-19 across our family of apps,” a Facebook representative said in an email to Rappler.
“We recently introduced a forwarding limit on Messenger, so messages can only be forwarded to five people or groups at a time. Limiting forwarding is an effective way to slow the spread of viral misinformation and harmful content that has the potential to cause real world harm,” the representative said.
Rappler had discussed this lack of limits with Facebook’s Communications team between September and October 2020. Forward limits were implemented on Messenger in the Philippines in December, 3 months after their initial announcement. With the limit, users are able to forward a message to 5 people at once before a warning is flashed. After they acknowledge the limit, they can continue forwarding the same message. The label on a forwarded message indicates that a friend has “forwarded a video” or link, but not that it has been frequently forwarded, as is the case on WhatsApp.
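To make the difference in strictness concrete, here is a minimal sketch in Python of the two policies as described above. This is not Facebook’s actual code; the function names, and the assumption that both thresholds sit at exactly 5, are ours, based only on the behavior reported in this story.

```python
# Conceptual sketch of the two forwarding policies described in this story.
# Not Facebook's actual implementation; names and thresholds are assumptions.

MESSENGER_BATCH_LIMIT = 5  # recipients before Messenger flashes a warning
WHATSAPP_CHAIN_LIMIT = 5   # people in a forwarding chain before WhatsApp
                           # treats a message as "highly forwarded"

def messenger_can_forward(recipients_so_far: int, acknowledged_warning: bool) -> bool:
    """Soft limit: after 5 recipients a warning appears, but the user can
    keep forwarding the same message once they acknowledge it."""
    if recipients_so_far < MESSENGER_BATCH_LIMIT:
        return True
    return acknowledged_warning  # acknowledging the warning lifts the block

def whatsapp_max_chats_at_once(chain_length: int) -> int:
    """Hard limit: a message forwarded through a chain of 5 or more people
    can be sent on to only one chat at a time."""
    return 1 if chain_length >= WHATSAPP_CHAIN_LIMIT else 5

# A message that has already passed through 6 people:
assert whatsapp_max_chats_at_once(6) == 1                   # WhatsApp: one chat at a time
assert messenger_can_forward(6, acknowledged_warning=True)  # Messenger: keeps going
```

In this sketch, Messenger’s warning only adds friction that a user can click through, while WhatsApp’s rule removes the ability to mass-forward a highly forwarded message altogether.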
When Facebook announced the release of forward limits on September 3, 2020, it did not mention that the feature was rolled out in only 8 countries. A Facebook representative specified in an email to Rappler that the forward limit had been made available only in the US, New Zealand, Croatia, Sri Lanka, Chile, Tunisia, Australia, and Myanmar since August 27, 2020. The majority of these countries had elections in 2020.
Slower action and staggered rollouts can allow harm in countries where Messenger is the preferred chat app. Hootsuite and We Are Social’s data show that Facebook Messenger is preferred in 74 countries.2
Among the countries that prefer Messenger over WhatsApp is the US, where Facebook says it had been cracking down on misinformation in the run-up to the presidential election in November 2020.
Myanmar, where Facebook was used to incite hate speech against Rohingya Muslims in 2018, is another country that prefers Messenger. Myanmar held a general election on November 8, 2020. (In early February 2021, the country experienced a coup d’état that ousted the elected civilian government, and the new military government blocked access to Facebook.)
Not a new problem
Rappler had been spotting false claims on messaging apps long before the COVID-19 pandemic.
For example, amid measles outbreaks in 2018, three different people emailed Rappler about a forwarded message warning others not to receive tetanus vaccines at health centers. The hoax claimed that members of the Islamic State (ISIS) were spreading AIDS and killing people through the vaccination shots.
Rappler fact checked the claim on December 21, 2018, or 10 days after Congress extended martial law in Mindanao, which was originally declared after ISIS-affiliated extremists clashed with government troops in October 2017.
Amid the COVID-19 pandemic, Rappler fact checked a number of forwarded messages that had caused panic. In January, at least 5 rumors circulated, saying there were COVID-19 positive patients in hospitals or an office building. In March, a list of hotels and malls to avoid circulated – these were supposedly places that 11 COVID-19 positive patients frequented.
Also circulated through Messenger were supposed cures for the virus, which ranged from ingesting aspirin dissolved in lemon juice boiled with honey to eating boiled garlic and drinking the water it was boiled in.
In June 2020, a conspiracy theory about COVID-19 spread on Messenger, telling recipients not to get vaccinated against the disease once a vaccine is available in the Philippines. The vaccine, the message said, would be a way to forcibly inject microchips into citizens.
As in the case of the Dengvaxia scare, this affects how people will respond to vaccination when the COVID-19 vaccines are finally available.
All of these illustrate the need for Facebook to act immediately to prevent real-world harm, and they raise the question of why the company did not act earlier – and globally.
Fact-checking WhatsApp: ‘You speak in a void’
What makes fact checking in messaging apps so difficult?
Unlike fact-checking publicly available videos or posts, which can be found using social media monitoring tools, these messages need to be sent to fact checkers directly for verification. On platforms like Viber and WhatsApp, they’re also protected by end-to-end encryption, which allows users to share text and media without fear of surveillance.3
For IFCN Associate Director Cristina Tardáguila, the lack of data available to fact-checkers is a challenge, both in terms of surfacing content to fact check and knowing whether stories are reaching their intended target. Tardáguila is also the founder of the Brazilian fact checking platform Agência Lupa and was one of the authors of a study on misinformation on WhatsApp in relation to the 2018 Brazilian presidential elections.
“We have no idea what’s trending on WhatsApp. So it’s really, really hard to decide what you should fact check because you have no idea [about] the viralization of a certain topic,” she told Rappler during a call on October 7.
Fact checkers avoid writing about claims that aren’t viral, so as not to inadvertently amplify them and give them more oxygen.
On public channels, fact checkers can observe how many people like, share, or comment on their stories, which gives them an idea of the number of people they are reaching, Tardáguila explained. “In WhatsApp, you speak in a void, right? You’re talking to, maybe just one person? You have no idea how many people will actually read, like, dislike, or comment [on it]. It’s really a weird feeling.”
At the moment, the only way fact checkers spot misinformation in messaging apps is through tiplines. Claims that Rappler eventually fact checked were typically screenshots either emailed by concerned netizens for verification or reposted on the main Facebook platform.
Credibility amplification: ‘a friend of a friend’
Another challenge that public health communicators, journalists, and fact checkers face when it comes to misinformation in forwarded messages is friends and family, whom, according to studies, people tend to trust more than public sources.
According to Kantar Media, participants in a 2018 study considered WhatsApp “safer” and regarded the friend sharing a piece of news as an endorser. This makes information shared through the chat app feel more trustworthy than something on the more “public” Facebook newsfeed.4
Marketers are also interested in the sender behind a message. For instance, a fitness brand would choose an ambassador who is already fit and leads a healthy lifestyle, because he or she would be more credible as someone endorsing the product.
“I think the source [of the misinformation] also matters… because that’s how we do it with brands also; it’s why influencer marketing works. If the source sounds even remotely credible, it will spread like wildfire,” Andrea5, who works for an international tech and marketing firm, told Rappler in a phone interview last August.
The messages that Rappler has fact checked often cite friends of friends or family members. For example, the supposed SARS case in January 2020 was “from a friend of KT.” A rumor that spread in April about raiding hospitals for personal protective equipment was supposedly verified by “a friend who is the child of the CEO of [The Medical City],” a hospital in Pasig. Another chain message from January informed co-workers that the sender’s manager “confirmed cases of COVID-19” at PBCom Tower in Makati.
The World Health Organization (WHO) has COVID-19 information centers on Messenger, WhatsApp, and Viber. The Philippines’ Department of Health (DOH) has one on Viber. But newsgroups, fact checkers, and even these public sources such as the DOH and the WHO must compete with content shared by “friends,” whether these are real friends or, in the case of forwarded messages, complete strangers.
Establishing “authority becomes a competition,” Jed Dalangin, senior manager for Product and Experience Development at Certified Digital Marketer (CDM), told Rappler over the phone in August. CDM gives digital marketing training to businesses and individuals.
Original source: unknown
While messages are packaged to make it appear that the “source” of a message is somebody known to the sender, in reality, many messages have been forwarded so many times that tracing the original source is difficult.
As of early November 2020, as evidenced by the video below, users could still forward content on Messenger more than 5 times, despite earlier pronouncements by Facebook. The forwarded video below contains a claim that was shared with Rappler through Messenger.
Rappler debunked the claim, as did other 3rd party fact checkers, including Spain’s Maldita.es and the UK-based Full Fact. The actual video clip was taken in Ecuador and posted by journalist Carlos Vera on Twitter.
This debunking video by Rappler also shows how messages look when they are forwarded on Facebook Messenger. Notice that despite numerous forwards, the app only labels the message as forwarded; it does not show how many times it has been passed along.
Further, while the content itself has already been fact checked by several fact-checking organizations, the messaging app does not label it as potentially misleading.
Rappler’s video also shows a message being sent to one-on-one chats, but Messenger users can message up to 150 people at once. From October 2018 to August 2019, members of the same Facebook group could even chat with up to 250 people at once.
The limits of fact checking
Facebook has 3rd party fact checking partners in most countries where Messenger is popular. These include Canada, Australia, New Zealand, Iraq, Libya, Egypt, Norway, and Greece. But in other countries like Afghanistan and South Sudan, they do not have partners.
Even though information spreads differently on public social media channels than on chat apps, Facebook regards its efforts against misinformation on its main platform – like removing content that may lead to real-world harm and 3rd party fact checking – as part of its efforts on Messenger. This policy gap carries real-world risk: fact checking on the Facebook platform does not guarantee that misleading and potentially harmful messages will not still spread over Messenger.
In an email to Rappler on September 25, a Facebook spokesperson explained that the Messenger and WhatsApp platforms have different user bases and features, making it challenging to standardize approaches to misinformation across the board.
Facebook had run separate tests on forward limits – the number of conversations a message can be forwarded to at once – to assess its impact for Messenger, the Facebook spokesperson said.
Facebook’s fact checking program labels posts as false or misleading, reduces the distribution of those posts, and sanctions pages that are repeat offenders by limiting their ability to monetize.
But while Facebook takes action on the screenshots of false messages shared publicly on the platform, it stops there. The very same message that was fact-checked already on the Facebook platform can continue to spread unchecked in Messenger conversations. In addition, users are still free to take screenshots of photos, download videos, or forward existing content to their friends and family.
by Vernise Tantuco, with Gemma B. Mendoza /Rappler.com
1. Based on Hootsuite and We Are Social’s annual digital reports, which are only available up to 2020 as of publishing. Data from SimilarWeb as of February 2, 2021, shows that Messenger is the number one app in the Philippines on the Google Play Store based on Usage Rank (which, according to SimilarWeb, is calculated by an algorithm based on current installs and active users).
2. Based on the average daily Android app users in each country in December 2019.
3. Facebook Messenger allows users to opt for end-to-end encrypted person-to-person chats through “secret conversations.” Secret conversations do not support group messages and voice or video calls. Doffman, Z. (2020, July 25). Why You Should Stop Using Facebook Messenger. Retrieved February 01, 2021, from Forbes.
4. Studies presented at the online fact checking conference Global Fact 7 in July 2020 said that people tend to believe their friends when verifying information on social media, and that some messages on WhatsApp are influential because they come from friends and family.
5. Name has been changed on the request of the interviewee to protect her clients and company.
Editor’s note: This story previously stated that users in the Philippines could forward content on Messenger more than 5 times at once as of February 2021. This has been corrected to reflect that Facebook Messenger started imposing limits in December 2020. The video in this story has also been corrected to show that there is a label that indicates a friend has “forwarded a video.”