“The role that I’m playing is more of a curator than anything else. The first thing I do is curate the data that the network is going to study, and once the network is trained I also curate the output data, deciding which outputs I am going to keep and whether I want to tweak the training sets so that the network learns something different. So all of the work that I do is really through curation.” Seth Thompson, “The Artist, The Curator, and the Neural Net: A Conversation with Robbie Barrat”, in Paprika!, November 8, 2018.
“Neural networks cannot invent their own classes; they’re only able to relate images they ingest to images that they’ve been trained on. And their training sets reveal the historical, geographical, racial, and socio-economic positions of their trainers. […] engineers at Google decided to deactivate the “gorilla” class after it became clear that its algorithms, trained on predominantly white faces, tended to classify African Americans as apes.” Trevor Paglen, “Invisible Images (Your Pictures Are Looking at You)”, in The New Inquiry, December 8, 2016.
“[…] something completely different happens when you share a picture on Facebook than when you bore your neighbors with projected slide shows. When you put an image on Facebook or other social media, you’re feeding an array of immensely powerful artificial intelligence systems information about how to identify people and how to recognize places and objects, habits and preferences, race, class, and gender identifications, economic statuses, and much more.
Regardless of whether a human subject actually sees any of the 2 billion photographs uploaded daily to Facebook-controlled platforms, the photographs on social media are scrutinized by neural networks with a degree of attention that would make even the most steadfast art historian blush.” Trevor Paglen, “Invisible Images (Your Pictures Are Looking at You)”, in The New Inquiry, December 8, 2016