
      The point is: people use proxies in order to deal with the terror of total Dasein, or an economy of presence based on the technologically amplified scarcity of human attention and physical presence.

      Even strike-organizer Djordjevic started pursuing a form of proxy politics after the failed art strike. He stopped making art under his own name. Years later he reemerged as a technical assistant for a certain Walter Benjamin’s lecture tours, and has kind of represented him ever since. Whether Benjamin himself is on strike is not known.

       4

       Proxy Politics: Signal and Noise

      A while ago I met an extremely interesting developer. He was working on smartphone camera technology. Photography is traditionally thought to represent what is out there by means of technology, ideally via an indexical link. But is this really true anymore? The developer explained to me that the technology for contemporary phone cameras is quite different from traditional cameras: the lenses are tiny and basically rubbish, which means that about half of the data being captured by the camera sensor is actually noise. The trick, then, is to write the algorithm to clean the noise, or rather to discern the picture from inside the noise.

      But how can the camera know how to do this? Very simple: It scans all other pictures stored on the phone or on your social media networks and sifts through your contacts. It analyzes the pictures you already took, or those that are associated with you, and it tries to match faces and shapes to link them back to you. By comparing what you and your network already photographed, the algorithm guesses what you might have wanted to photograph now. It creates the present picture based on earlier pictures, on your/its memory. This new paradigm is being called computational photography.1
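
      A minimal sketch of the logic the developer described might look like the following, assuming the simplest possible case: the "present picture" is a weighted blend of the noisy sensor data with a prior assembled from pictures taken earlier. Everything here (the function name, the variance values, the averaging itself) is an illustrative assumption; real computational-photography pipelines, with burst merging and learned priors, are far more elaborate.

```python
import numpy as np

def denoise_with_prior(noisy, prior_images, noise_var, prior_var):
    """Toy sketch: blend a noisy capture with a prior built from earlier
    photos, weighting each by its assumed reliability (an inverse-variance,
    MAP-style average). All parameter values are illustrative assumptions."""
    # The "memory": what the algorithm expects the scene to look like,
    # here simply the pixelwise mean of pictures already taken.
    prior = np.mean(np.stack(prior_images), axis=0)

    # Inverse-variance weighting: the noisier the sensor data, the more the
    # result is pulled toward the prior, i.e. toward what was photographed before.
    w_noisy = 1.0 / noise_var
    w_prior = 1.0 / prior_var
    return (w_noisy * noisy + w_prior * prior) / (w_noisy + w_prior)

# Usage: a flat grey scene captured with heavy sensor noise, "corrected"
# toward a memory of brighter earlier pictures.
rng = np.random.default_rng(0)
scene = np.full((64, 64), 0.5)
capture = scene + rng.normal(0.0, 0.15, scene.shape)   # heavy noise from a "rubbish" lens
memory = [np.full((64, 64), 0.7) for _ in range(5)]    # pictures already taken
result = denoise_with_prior(capture, memory, noise_var=0.15**2, prior_var=0.05)
print(round(result.mean(), 2))   # ~0.56: the picture leans toward its past
```

      The bet on inertia is visible in the arithmetic: the weaker the signal, the more the output reverts to what the archive already contains.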

      The result might be a picture of something that never ever existed, but that the algorithm thinks you might like to see. This type of photography is speculative and relational. It is a gamble with probabilities that bets on inertia. It makes seeing unforeseen things more difficult. It will increase the amount of noise just as it will increase the amount of random interpretation.

      And that’s not even to mention external interference into what your phone is recording. All sorts of systems are able to remotely turn your camera on or off: companies, governments, the military. It could be disabled in certain places—one could for instance block its recording function close to protests or conversely broadcast whatever it sees. Similarly, a device might be programmed to autopixelate, erase, or block secret, copyrighted, or sexual content. It might be fitted with a so-called dick algorithm to screen out NSFW (Not Suitable/Safe For Work) content, automodify pubic hair, stretch or omit bodies, exchange or collage context, or insert location-targeted advertising, pop-up windows, or live feeds. It might report you or someone from your network to police, PR agencies, or spammers. It might flag your debt, play your games, broadcast your heartbeat. Computational photography has expanded to cover all this.

      It links control robotics, object recognition, and machine learning technologies. So if you take a picture on a smartphone, the results are not as premeditated as they are premediated. The picture might show something unexpected, because it might have cross-referenced many different databases: traffic control, medical databases, frenemy photo galleries on Facebook, credit card data, maps, and whatever else it wants.

      Relational Photography

      Computational photography is therefore inherently political—not in content but form. It is not only relational but also truly social, with countless systems and people potentially interfering with pictures before they even emerge as visible.2 And of course this network is not neutral. It has rules and norms hardwired into its platforms, and they represent a mix of juridical, moral, aesthetic, technological, commercial, and bluntly hidden parameters and effects. You could end up airbrushed, wanted, redirected, taxed, deleted, remodeled, or replaced in your own picture. The camera turns into a social projector rather than a recorder. It shows a superposition of what it thinks you might want to look like plus what others think you should buy or be. But technology rarely does things on its own. Technology is programmed with conflicting goals and by many entities, and politics is a matter of defining how to separate its noise from its information.3

      So what are the policies already in place that define the separation of noise from information, or that even define noise and information as such in the first place? Who or what decides what the camera will “see”? How is it being done? By whom or what? And why is this even important?

      The Penis Problem

      Let’s have a look at one example: drawing a line between face and butt, or between “acceptable” and “unacceptable” body parts. It is no coincidence that Facebook is called Facebook and not Buttbook, because you can’t have any butts on Facebook. But then how does it weed out the butts? A list leaked by an angry freelancer reveals the precise instructions for building and maintaining Facebook’s face. It confirms what is well known: nudity and sexual content are strictly off limits, with exceptions for art nudity and male nipples. But it also shows that the policies on violence are far more lax, with even decapitations and large amounts of blood deemed acceptable.4 “Crushed heads, limbs etc are OK as long as no insides are showing,” reads one guideline. “Deep flesh wounds are ok to show; excessive blood is ok to show.” Those rules are still policed by humans, or more precisely by a global subcontracted workforce from Turkey, the Philippines, Morocco, Mexico, and India, working from home and earning around $4 per hour. These workers are hired to distinguish between acceptable body parts (faces) and unacceptable ones (butts). In principle, there is nothing wrong with having rules for publicly available imagery. Some sort of filtering process has to be implemented on online platforms: no one wants to be spammed with revenge porn or atrocities, regardless of there being markets for such imagery. The question concerns where and how to draw the line, as well as who draws it, and on whose behalf. Who decides on signal vs. noise?

      Let’s go back to the elimination of sexual content. Is there an algorithm for this, like for face recognition? This question first arose publicly in the so-called Chatroulette conundrum. Chatroulette was a Russian online video service that allowed people to meet on the web. It quickly became famous for its “next” button, for which the term “unlike button” would be much too polite. The site’s audience exploded, reaching 1.6 million users per month by 2010. But then a so-called “penis problem” emerged, referring to the many people who used the service to meet other people naked.5 The winner of a web contest convened to “solve” the issue ingeniously suggested running a quick facial recognition or eye-tracking scan on the video feeds—if no face was discernible, the system would deduce that it must be a dick.6
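
      As code, the contest heuristic amounts to little more than the following sketch, written here in Python with OpenCV’s stock frontal-face detector. The actual entry was never published, so the detector choice, the file name, and the parameters are all assumptions; only the logic (no face found, therefore flag the feed) comes from the description above.

```python
import cv2

# OpenCV's bundled frontal-face Haar cascade (an illustrative stand-in;
# the actual Chatroulette fix is not public).
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def probably_nsfw(frame_bgr):
    """Crude version of the heuristic: if no face is discernible in the
    frame, assume the camera is pointed at something other than a face."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces) == 0          # no face detected -> flag the feed

# Usage on a single saved webcam still ("webcam_still.jpg" is hypothetical).
frame = cv2.imread("webcam_still.jpg")
if frame is not None and probably_nsfw(frame):
    print("no face detected: hit the next button")
```

      The weakness is built in: anything that is neither a face nor a penis, from a cat to an empty dark room, gets flagged just the same.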

      This exact workflow was also used by the British signals intelligence agency GCHQ when it secretly bulk-extracted user webcam stills in its spy program Optic Nerve. Video feeds of 1.8 million Yahoo users were intercepted in order to develop face and iris recognition technologies. But—maybe unsurprisingly—it turned out that around 7 percent of the content did not show faces at all. So—as suggested for Chatroulette—they ran face recognition scans on everything and tried to exclude the dicks for not being faces. It didn’t work so well. In a leaked document, GCHQ admits defeat: “There is no perfect ability to censor material which may be offensive.”7

      Subsequent solutions became a bit more sophisticated. Probabilistic porn detection calculates the proportion of skin-toned pixels in certain regions of the picture, producing complicated taxonomic formulas, such as this one:

      a. If the percentage of skin pixels relative to the image size is less than 15 percent, the image is not nude. Otherwise, go to the next step.

      b. If the number of skin pixels in the largest skin region is less than 35 percent of the total skin count, the number of skin pixels in the second largest region is less than 30 percent of the total skin count, and the number of skin pixels in the third largest region is less than 30 percent of the total skin count, the image is not nude.

      c. If the number of skin pixels in the largest skin region is less than 45 percent of the total skin count, the image is not nude.
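
      A rough Python rendering of rules (a) through (c) might look like this. The percentage thresholds come from the quoted formula; how “skin pixels” are identified is not specified there, so the crude chroma threshold below is an assumption, and the final catch-all simply stands in for whatever further conditions the full formula imposes.

```python
import cv2

def is_nude_candidate(image_bgr):
    """Sketch of the quoted skin-region rules (a)-(c). The skin detector
    (a YCrCb chroma threshold) is an assumption; the rules only say which
    percentages of skin pixels to compare."""
    # Crude skin mask: threshold the chroma channels in YCrCb space.
    ycrcb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2YCrCb)
    skin = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))

    total_pixels = image_bgr.shape[0] * image_bgr.shape[1]
    skin_count = cv2.countNonZero(skin)

    # Rule (a): less than 15 percent skin overall -> not nude.
    if skin_count / total_pixels < 0.15:
        return False

    # Sizes of connected skin regions, largest first (label 0 is background).
    _, _, stats, _ = cv2.connectedComponentsWithStats(skin)
    sizes = sorted(stats[1:, cv2.CC_STAT_AREA], reverse=True) + [0, 0, 0]
    r1, r2, r3 = sizes[0], sizes[1], sizes[2]

    # Rule (b): the three largest regions each hold too small a share -> not nude.
    if r1 < 0.35 * skin_count and r2 < 0.30 * skin_count and r3 < 0.30 * skin_count:
        return False

    # Rule (c): largest region under 45 percent of the skin count -> not nude.
    if r1 < 0.45 * skin_count:
        return False

    # Whatever is not ruled out above is flagged as a candidate in this sketch.
    return True
```

      Even at a glance the brittleness is obvious: a close-up portrait or a sandy beach can clear these thresholds as easily as a nude.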