Trevor Paglen: How Images Mean Things

“This time has been many, many things, but in terms of making art, it’s been a moment at which there’s been a huge change in the way that images mean things. Some small examples of that might be: suddenly an aeroplane in the sky takes on a different meaning; suddenly a handle of a gas pump takes on a different meaning; suddenly a runner speeding past you means something different. There’s a massive change in your relationship to the world around you, in terms of what kinds of associations you have, and in the meanings that you ascribe to different things and different kinds of images.” En Liang Khong, “Trevor Paglen: How COVID-19 Changed the Way We See the World”, in ArtReview, 10 September 2020

Trevor Paglen: AI and Power

“Ideology’s ultimate trick has always been to present itself as objective truth, to present historical conditions as eternal, and to present political formations as natural. Because image operations function on an invisible plane and are not dependent on a human seeing-subject (and are therefore not as obviously ideological as giant paintings of Napoleon) they are harder to recognize for what they are: immensely powerful levers of social regulation that serve specific race and class interests while presenting themselves as objective.

The invisible world of images isn’t simply an alternative taxonomy of visuality. It is an active, cunning exercise of power, one ideally suited to molecular police and market operations – one designed to insert its tendrils into ever-smaller slices of everyday life. […] Machine-machine systems are extraordinarily intimate instruments of power that operate through an aesthetics and ideology of objectivity, but the categories they employ are designed to reify the forms of power that those systems are set up to serve.” Trevor Paglen, “Invisible Images (Your Pictures Are Looking at You)”, in The New Inquiry, December 8, 2016

Trevor Paglen: Apes

“Neural networks cannot invent their own classes; they’re only able to relate images they ingest to images that they’ve been trained on. And their training sets reveal the historical, geographical, racial, and socio-economic positions of their trainers. […] engineers at Google decided to deactivate the “gorilla” class after it became clear that its algorithms, trained on predominantly white faces, tended to classify African Americans as apes.” Trevor Paglen, “Invisible Images (Your Pictures Are Looking at You)”, in The New Inquiry, December 8, 2016

Trevor Paglen: When You Put an Image on Facebook

“[…] something completely different happens when you share a picture on Facebook than when you bore your neighbors with projected slide shows. When you put an image on Facebook or other social media, you’re feeding an array of immensely powerful artificial intelligence systems information about how to identify people and how to recognize places and objects, habits and preferences, race, class, and gender identifications, economic statuses, and much more.

Regardless of whether a human subject actually sees any of the 2 billion photographs uploaded daily to Facebook-controlled platforms, the photographs on social media are scrutinized by neural networks with a degree of attention that would make even the most steadfast art historian blush.” Trevor Paglen, “Invisible Images (Your Pictures Are Looking at You)”, in The New Inquiry, December 8, 2016

Trevor Paglen: Digital Images Don’t Need Human Eyes

“What’s truly revolutionary about the advent of digital images is the fact that they are fundamentally machine-readable: they can only be seen by humans in special circumstances and for short periods of time. A photograph shot on a phone creates a machine-readable file that does not reflect light in such a way as to be perceptible to a human eye. A secondary application, like a software-based photo viewer paired with a liquid crystal display and backlight, may create something that a human can look at, but the image only appears to human eyes temporarily before reverting to its immaterial machine form when the phone is put away or the display is turned off. However, the image doesn’t need to be turned into human-readable form in order for a machine to do something with it. […] The fact that digital images are fundamentally machine-readable regardless of a human subject has enormous implications. It allows for the automation of vision on an enormous scale and, along with it, the exercise of power on dramatically larger and smaller scales than have ever been possible.” Trevor Paglen, “Invisible Images (Your Pictures Are Looking at You)”, in The New Inquiry, December 8, 2016

Trevor Paglen: Invisible Images

“[O]ver the last decade or so, something dramatic has happened. Visual culture has changed form. It has become detached from human eyes and has largely become invisible. Human visual culture has become a special case of vision, an exception to the rule. The overwhelming majority of images are now made by machines for other machines, with humans rarely in the loop. The advent of machine-to-machine seeing has been barely noticed at large, and poorly understood by those of us who’ve begun to notice the tectonic shift invisibly taking place before our very eyes.” Trevor Paglen, “Invisible Images (Your Pictures Are Looking at You)”, in The New Inquiry, December 8, 2016