Tracking Exposed is a non-profit, free software project that aims to analyze evidence of algorithmic personalization. It was founded in 2016 by Claudio Agosti, a self-taught hacker and developer who is currently a researcher at the University of Amsterdam. In 2018, the University of Amsterdam’s Department of Media Studies expanded on Claudio’s work, creating the project ALEX – Algorithms Exposed, Investigating Automated Personalization and Filtering for Research and Activism. Through his work, Claudio is empowering end users to understand how aggressively the algorithm is mediating information for them. The project has been applied to four platforms so far. “For Facebook and YouTube, it’s about information quality. For Amazon and Pornhub, it’s about explaining how algorithms exist in other places and can still have an impact on you.”
“The power of the algorithm is to personalize so we cannot expect a group of experts to fully understand. That’s why we try to make technology that is reusable as much as possible.”
Claudio mostly works with researchers in the Netherlands and in Italy, though he has also carried out work related to Argentina. The academic label is important for the legitimacy of the project, and also for finding collaborators and growing the project. The work on Pornhub was carried out by a researcher he met while teaching at the Digital Methods Summer School, who wanted to expose heteronormativity on the platform.
Tracking Exposed gathers data through crowdsourcing, but getting access to wider audiences and groups is challenging. To the extent that outreach depends on marketing and visibility, it can exceed the capacity of a researcher. For a small programming project like Claudio’s, recruitment requires compromises and calculations: investment in onboarding, and exposure to potential damage and security risks.
“The tool is helpful for the skilled user but not yet accessible for the end user; that’s one of the biggest problems,” Claudio explains.
Claudio is now working to make the tool simpler, with more visual results. He would like users to be able to play with and control their own algorithms. He’s also interested in pursuing strategic litigation, using his findings as evidence of platform rights violations. He has done election-related monitoring in the past, and he has an upcoming project with a polling company. Elections are increasingly emerging as a business case, he observes; offering this high-expertise service could be a form of sustainability for Tracking Exposed. However, advertising makes up only a small portion of our informative experience, so we should instead remain focused on the role of our personalized algorithms, Claudio notes: “We should have control of our algorithm because that is the tool responsible for all the content selected.”