The State of Data-Driven Reasoning - Part I
This is part of a series in which I explore the current narrative around data-driven reasoning in social applications.
Governance, especially in a state of emergency, has improved immensely since numbers, indicators, categorisations, and comparisons were first utilised during the second cholera pandemic in the first half of the nineteenth century [1]. Quantification gives a basis on which change can be understood, risk calculated, and actions defined. Since then, the reliance on data-driven reasoning has intensified through two important developments. Firstly, the neoliberal policies of the 1980s completed the dismantling of public and private monopolies over telecommunications networks and digital technologies [2], permitting corporate interests to control the narrative about data-driven reasoning. Secondly, the 1990s brought breakthroughs in machine intelligence, exemplified by Deep Blue's victory over Garry Kasparov. These events have transformed data into the new oil of the digital economy, sold through the impression of a "speedy, accessible, revealing, panoramic, prophetic, smart, specific, side-effects-free" algorithm ([3], [4]). Today, the PwC 2020 report shows that data-driven technology organisations form the largest sector in global market capitalisation. Like no other occupation, data scientists define "the pathway to the future" and thus "they must take a leading part in [ensuring] that the new world […] is a good sort of a place in which to live", as was written in the MIT Technology Review of January 1932.
While recent innovations have indeed led to some societal benefits, they have also demonstrated their potential to be abused or misused in ways their designers could not have imagined [5]. To understand how this was made possible, it is essential to understand the environment in which data-driven solutions are designed. The organisational culture in which they were developed is strikingly homogeneous in social and political positions, marked by a lack of diversity and an unwillingness to embrace pluralism ([6], [7], [8]). This culture has been found to navigate, rather than resolve, the ethical minefield that the business model of informational capitalism creates [9]. As there are no guardrails, it falls upon the companies themselves to determine what is good or right. This is exemplified by an account from "the trenches of an International Organisation or NGO" [10]. The anonymous author describes how donors expect them to pivot "as much of our work as possible to the Covid-19 response[,] responsible data practices are as theoretical as ever[,] without guardrails", and concludes that the "only way we will ever get better at this, is if somebody makes us." Until data-driven institutions stop being the owners of ethics, they will continue to define and bend them, creating side effects that will enter the public discourse through the media, just as the scandals around Facebook and Google's DeepMind did ([11], [12], [13], [14]).
References
[1] Schweber, Libby (2006). Disciplining Statistics: Demography and Vital Statistics in France and England, 1830–1885. Duke University Press.
[2] Treguer, Felix (2019). Data Politics: Worlds, Subjects, Rights. Chapter: Seeing Like Big Tech: Security Assemblages, Technology, and the Future of State Bureaucracy. Routledge.
[3] Valverde, Mariana and Mopas, Michael (2004). Global governmentality: Governing international spaces. Chapter: Insecurity and the dream of targeted governance. Routledge.
[4] Beer, David (2018). The Data Gaze: Capitalism, power and perception. Sage.
[5] O’Neil, Cathy (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Penguin Books.
[6] Moss, Emanuel and Metcalf, Jacob (2020). Ethics Owners: A New Model of Organizational Responsibility in Data-Driven Technology Companies. Data & Society Research Institute.
[7] Crawford, Kate et al. (2019). AI Now 2019 Report.
[8] Cook, Katy (2020). The Psychology of Silicon Valley: Ethical Threats and Emotional Unintelligence in the Tech Industry. Palgrave Macmillan.
[9] Zuboff, Shoshana (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Profile Books.
[10] Anonymous (2020). Data Justice and Covid-19: Global Perspectives. Chapter: Reining in Humanitarian Technology. Meatspace Press.
[11] Powles, Julia and Hodson, Hal H. (2017). Google DeepMind and healthcare in an age of algorithms. Health and Technology.
[12] Wagner, Ben (2018). Being Profiled: Cogitas Ergo Sum. Chapter: Ethics as an Escape from Regulation: From "Ethics-Washing" to Ethics-Shopping? Amsterdam University Press.
[13] Taylor, Linnet and Dencik, Lina (2020). Constructing Commercial Data Ethics. Technology and Regulation.
[14] Doctorow, Cory (2020). Facebook Is Going After Its Critics in the Name of Privacy. Wired. Accessed: 21.11.2020.