Merry Christmas and a Happy New Year to you all! We’ll be resting up over the break before our return in early January. We look forward to collaborating with you in 2020 🙂
Is Twitter going back to its roots?
Last week, Twitter CEO Jack Dorsey announced Twitter’s plan to build a decentralised standard for social media. Dorsey said that this standard would enable individuals to use a variety of services to access the same network, just as people choose different email providers to read the same messages. But why is Twitter pursuing this? It comes down to policing speech online. Dorsey notes that Twitter’s global policy for addressing hate speech and mis- and disinformation is unlikely to become more effective without placing an extra burden on its users.
Dorsey was inspired by an article by Mike Masnick, who argues for focusing on protocols rather than platforms to promote free speech. With protocols, “there could be widespread competition in which anyone could design their own interfaces, filters, and additional services, allowing whichever ones work best to succeed, without having to resort to outright censorship for certain voices,” wrote Masnick. Conversely, The Verge’s Casey Newton highlights that this direction could allow Twitter to avoid responsibility for some of the platform’s unintended consequences and could enable the formation of hate networks.
Fighting against the rise of misinfodemics
Amidst the growing phenomenon of health-related misinformation, a coalition of 50 LGBTQ, HIV/AIDS, and public health organisations wrote to Facebook last Monday to demand the removal of ads promoting dangerous and false information about HIV prevention. The Guardian reported that one of the ads claimed a form of HIV medication caused side effects such as bone and kidney conditions. The ad was paid for by a law firm hinting at the possibility of financial compensation. In related news, the Digital Health Lab released a report on health equity through health fact-checking, which outlines a standard of care for health fact-checking, including special attention paid to vulnerable communities and stigmatised health topics.
In the news
- The UK witnessed an unprecedented amount of deception in its general election, to the point where “disinformation has now become normalised,” according to the Institute for Strategic Dialogue. Staying with the election, the BBC has revealed that AI wrote its election result stories. Pretty scary, huh?
- Last Thursday, Facebook announced a delay in the launch of its Oversight Board, which was initially scheduled to be in operation by the end of 2019. In the same blog post, Facebook also shared a human rights review of the Board, which it claims will “serve as a resource for board members”.
- One Country, One Censor: The Committee to Protect Journalists’ new report details the various methods and tactics China uses to influence the media in Taiwan and Hong Kong, including outright violence against journalists and online disinformation campaigns.
- The American Press Institute has released strategies for journalists to tell the truth in a time of misinformation and polarisation. Among them is the strategy of the “truth sandwich,” which involves stating a true fact, then the falsehood, then the true fact again.
- Today, we released the final report on our investigation into a pro-Indian influence network that managed 265 fake media outlets worldwide. It coincides with the BBC’s coverage of our investigation, which looks deeper into those behind the network.
- A recent study has found that users created their own misinformation even when given accurate information. More specifically, people given accurate statistics on controversial issues such as migration tended to misremember those numbers to fit commonly held beliefs. There’s a concise write-up here.
Events and Announcements
- Call for contributions to the international MISDOOM Symposium on misinformation in online media. The deadline is February 1st.
- Oxford Institute has released an online resource guide for civil society groups looking to better deal with the problem of disinformation.
- Amnesty International has updated its Citizen Evidence Lab for navigating open-source information.
- Think SHEEP before you share: First Draft has created a catchy practice to help online users avoid sharing and being fooled by mis- and disinformation.