On February 7, 2019, Internet Policy Review published an op-ed by Stefania Milan and Claudio Agosti, in which we reflect on personalisation algorithms and elections, and share some ideas about algorithmic sovereignty and literacy. Thanks to Frédéric Dubois for the invitation.
“Personalisation algorithms allow platforms to carefully target web content to the tastes and interests of their users. They are at the core of social media platforms, dating apps, shopping and news sites. They make us see the world as we want to see it. By forging a specific reality for each user, they silently and subtly shape customised “information diets”, including around our voting preferences. We still remember Facebook’s CEO Mark Zuckerberg testifying before the US Congress (in April 2018) about the many vulnerabilities of his platform during election campaigns. With the elections for the European Parliament scheduled for May 2019, it is about time to look at our information diets and take seriously the role of platforms in shaping our worldviews. But how? Personalisation algorithms are kept a closely guarded secret by social media platform companies. The few experiments auditing these algorithms rely on data provided by the platform companies themselves. Researchers are sometimes subject to legal challenges by social media companies, which accuse them of violating the terms of service of their platforms. As we speak, technological fencing-off is emerging as the newest challenge to third-party accountability. Moreover, algorithm audits generally fail to involve ordinary users, missing out on a crucial opportunity for awareness raising and behavioural change.
The Algorithms Exposed (ALEX) project1, funded by a Proof of Concept grant from the European Research Council, intervenes in this space by promoting an approach to algorithm auditing that empowers and educates users. ALEX stabilises and expands the functionalities of a browser extension – fbtrex – originally conceived by lead developer Claudio Agosti. By analysing the output of Facebook’s news feed algorithm, our software enables users to monitor their own social media consumption and to volunteer their data for scientific or advocacy projects of their choosing. It also empowers advanced users, including researchers and journalists, to produce sophisticated investigations of algorithmic biases. Taking Facebook and the forthcoming EU elections as a test case, ALEX unmasks the functioning of personalisation algorithms on social media platforms.”
Continue reading on the Internet Policy Review website
Welcome to the website of the Algorithms Exposed project (AKA ALEX).
ALEX is meant to provide researchers, advocates, policymakers and journalists with reliable and accessible algorithmic auditing methods and data. In our view, it empowers users to independently monitor, compare, and reflect upon their information diets. It is innovative because:
- It puts the user in the driver’s seat by giving individuals full control over data extraction. In other words, users can decide at any time whether and which data to volunteer, and to which research project. They can also withdraw their participation whenever they are so inclined.
- It supports data protection and user privacy: it is fully GDPR-compliant, does not interfere with the user’s privacy settings, and observes only publicly available content appearing in the Facebook news feed, not individual profiles or pages.
- Users retain data sovereignty. Users decide which portions of their social media experience they want to share; they own their data and can see at any time how the data are used in aggregate analyses.
- It is open-source! You can download and customise it. Although we use Facebook as a test case, our methodology can be applied to any other social media platform.
- By enabling self-awareness, it promotes a healthy information diet. By unpacking the functioning of personalisation algorithms, it fosters a responsible use of social media.
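The opt-in and data-minimisation principles listed above can be sketched as a small filtering step. The following is an illustrative assumption about how an fbtrex-like content script might select what to volunteer – the function name, field names, and consent object are hypothetical, not the actual fbtrex implementation:

```javascript
// Hypothetical sketch: select only the data a user has agreed to share.
// Mirrors the principles above: opt-in consent, public content only,
// and only the fields needed for aggregate algorithm auditing.
function selectVolunteeredData(feedItems, userConsent) {
  // Nothing is collected unless the user has explicitly opted in.
  if (!userConsent.enabled) return [];

  return feedItems
    // Observe only publicly available content, never private posts.
    .filter((item) => item.privacy === "public")
    // Keep only what aggregate analyses need; drop everything else.
    .map((item) => ({
      source: item.source,     // the page or publisher name
      postType: item.postType, // e.g. "photo", "link", "text"
      seenAt: item.seenAt,     // when the post appeared in the feed
      position: item.position, // rank in the news feed
    }));
}

// Example: a feed containing one public and one friends-only post.
const feed = [
  { privacy: "public", source: "Example News", postType: "link",
    seenAt: "2019-02-07T10:00:00Z", position: 1, authorProfile: "x" },
  { privacy: "friends", source: "A Friend", postType: "text",
    seenAt: "2019-02-07T10:00:05Z", position: 2 },
];

// Only the public post survives, stripped of extraneous fields.
console.log(selectVolunteeredData(feed, { enabled: true }));
```

Withdrawing participation is then simply a matter of flipping the consent flag, after which the function returns nothing at all.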