social environment – like a social media feed, or like IAQOS’ multicultural neighbourhood.

      The culture in the code allows machine learning algorithms to deal with the complexity of our social realities as if they truly understood meaning, or were somehow socialized. Learning machines can make a difference in the social world, and recursively adapt to its variations. As Salvatore Iaconesi and Oriana Persico noted in one of our interviews: ‘IAQOS exists, and this existence allows other people to modify themselves, as well as modify IAQOS.’ The code is in the culture too, and confounds it through techno-social interactions and algorithmic distinctions – between the relevant and the irrelevant, the similar and the different, the likely and the unlikely, the visible and the invisible. Hence, together with humans, machines actively contribute to the reproduction of the social order – that is, to the incessant drawing and redrawing of the social and symbolic boundaries that objectively and intersubjectively divide society into different, unequally powerful portions.
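      To make this recursive dynamic concrete, the toy sketch below (not drawn from Airoldi's text; the content labels, click counts and probabilities are invented purely for illustration) simulates a ranking routine that learns from past clicks, decides what becomes visible, and then re-learns from the clicks its own ranking produced.

```python
import random

# A minimal, hypothetical sketch of the recursive loop described above:
# a ranking model learns from past engagement, decides what becomes visible,
# and its own choices then shape the data it will learn from next.
# All names and numbers are illustrative assumptions, not Airoldi's model.

random.seed(42)

# Two kinds of content circulating in a feed; the initial "culture in the code"
# is historical engagement data that already favours one kind.
click_history = {"mainstream": 60, "minority": 40}

def rank(items):
    """Order items by learned popularity: the algorithmic distinction
    between the visible and the invisible."""
    return sorted(items, key=lambda item: click_history[item], reverse=True)

for step in range(5):
    feed = rank(["mainstream", "minority"])
    shown = feed[0]                       # only the top item becomes visible
    if random.random() < 0.9:             # users mostly click what they are shown
        click_history[shown] += 1         # ...and the model re-learns from that click
    print(step, dict(click_history))
```

      Even this caricature reproduces the pattern sketched in the paragraph above: a small initial asymmetry in the "culture" fed to the code is amplified by the code's own distinctions between the visible and the invisible, and fed back into the social environment it learned from.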

      Machines as sociological objects

      'Why only now?', one may legitimately ask. In fact, the distinction between humans and machines has been a widely debated subject in the social sciences for decades (see Cerulo 2009; Fields 1987). Strands of sociological research such as Science and Technology Studies (STS) and Actor-Network Theory (ANT) have strongly questioned mainstream sociology’s lack of attention to the technological and material aspects of social life.

      In 1985, Steve Woolgar’s article ‘Why Not a Sociology of Machines?’ appeared in the British journal Sociology. Its main thesis was that, just as a ‘sociology of science’ had appeared problematic before Kuhn’s theory of scientific paradigms but was later turned into an established field of research, intelligent machines should finally become ‘legitimate sociological objects’ (Woolgar 1985: 558). More than thirty-five years later, this is still a largely unaccomplished goal. When Woolgar’s article was published, research on AI systems was heading for a period of stagnation commonly known as the ‘AI winter’, which lasted up until the recent and ongoing hype around big-data-powered AI (Floridi 2020). According to Woolgar, the main goal of a sociology of machines was to examine the practical day-to-day activities and discourses of AI researchers. Several STS scholars have subsequently followed this direction (e.g. Seaver 2017; Neyland 2019). However, Woolgar also envisioned an alternative sociology of machines with ‘intelligent machines as the subjects of study’, adding that ‘this project will only strike us as bizarre to the extent that we are unwilling to grant human intelligence to intelligent machines’ (1985: 567). This latter option may not sound particularly bizarre today, given that a large variety of tasks requiring human intelligence are now routinely accomplished by algorithmic systems, and that computer scientists propose to study the social behaviour of autonomous machines ethologically, as if they were animals in the wild (Rahwan et al. 2019).

      With the recent emergence of a multidisciplinary scholarship on the biases and discriminations of algorithmic systems, the interplay between ‘the social’ and ‘the technical’ has become more visible than in the past. One example is the recent book by the information science scholar Safiya Umoja Noble, Algorithms of Oppression (2018), which illustrates how Google Search results tend to reproduce racial and gender stereotypes. Far from being ‘merely technical’ and, therefore, allegedly neutral, the unstable socio-technical arrangement of algorithmic systems, web content, content providers and crowds of googling users on the platform contributes to the discriminatory social representations of African Americans. According to Noble, more than neutrally mirroring the unequal culture of the United States as a historically divided country, the (socio-)technical arrangement of Google Search amplifies and reifies the commodification of black women’s bodies.
