Trevor Paglen, CLOUD #135, Hough Lines, 2019. Courtesy of the artist


Trevor Paglen on Technological Points of View

Multidisciplinary artist Trevor Paglen discusses the critical role of artists in exposing technology's inherent biases with Peter Bauman (Monk Antony). Paglen emphasizes the need to challenge corporate optimism and reimagine technology beyond capitalist frameworks.

Peter Bauman: Do we need artists to lean on the more critical side of AI to balance the corporate optimism society is otherwise inundated with? How does this align with your aim to reveal the invisible side of technology?

Trevor Paglen: I definitely think yes, for a lot of reasons. Firstly, a lot of AI has to do with images, whether that’s facial recognition or generative images and video, etc. Artists have thousands of years of collective thinking about images, which gives us huge insights into the often terrible assumptions built into computer vision and AI systems. Moreover, the ethos of working as an artist often involves pulling things apart, seeing what’s in there, and learning to see what the world looks like through the “eyes” of various AI systems. This is a much-needed approach towards a fuller understanding of the technologies we’re building.

Peter Bauman: Much of your own work involves this “pulling things apart,” like ImageNet Roulette (2019), which was early to expose the systemic biases in AI algorithms. How would you judge the progress made in the five years since?

Trevor Paglen: I think works like the ones I did with Kate Crawford helped show a lot of the most obvious examples of bias, racism, misogyny and other horrible politics built into so many technical systems. There’s been a pretty big effort in the tech industry to stop the most glaring and obvious forms of labeling, but in our Excavating AI paper, we were trying to make a more subtle point that I think hasn’t been understood, which is that the concept of “bias” implies that the concept of “unbiased” exists. But “unbiased” doesn’t exist once you get outside the world of mathematics.

Every person and, by extension, every technology has a “point of view”—a series of assumptions built into it about how the world works and how it should work.


That being the case, it’s impossible to create an “unbiased” technology; it’s only possible to build technologies that more or less effectively try to hide those biases.

Peter Bauman: What happens if these biases go unrecognized? In an interview with Louisiana Channel, you asked, “How do the things that we build change us? In other words, how do we change the landscapes around us and in doing so, how do we change who we are?” How has AI already begun reshaping our societal structures and individual identities?

Trevor Paglen: Absolutely. There are a million examples. Think of very basic examples of cell phones or Google Maps, which have dramatically changed how we navigate our everyday lives. The fact that you don’t have to find a telephone in a fixed location to call someone else, who also has to be at a fixed location—and you have to know where they are—is just by itself huge. That’s one tiny yet massive example. If we fast forward to the cultural stuff, think about someone like JD Vance, whose entire public personality is, in my estimation, a product of deep fakes and algorithmic recommendation engine k-holes. He’d be utterly unintelligible and marginalized in a pre-recommendation-algorithm media landscape.

Trevor Paglen, Image Operations. Op.10, 2018. Courtesy of the artist



Peter Bauman: That intersection of politics, technology and art is a rich area your work explores. You’ve said before that “seeing is never neutral. What kinds of politics and values are built into the computer systems that see the world for us?” There’s an assumption with technology that over time, with updates and improvements, these problems will naturally be resolved. What’s the problem with that thinking?

Trevor Paglen: The things we perceive are highly informed by what we’ve been primed to see. Our histories, cultures, memories and experiences profoundly shape literally how we see and the meanings we attribute to the things we observe. There’s no way around that.

If we look at technical systems, we should be asking, “What is this particular technology optimized to see?” and “Who does that benefit?”


If you think about something like facial recognition systems built into police body cams, you can think about the fact that the “way of seeing” built into that system is a way of seeing that amplifies the power of the police. If you think about a driving assistant system built into your car, that’s a system designed to extract value from you by sending information about your driving habits to insurance companies. When we’re talking about the world of actually existing computer vision products, we are talking about systems that have been designed to see the world in particular ways in order to maximize the power of the people creating and deploying those systems, whether those are police, military or capitalist interests.

Trevor Paglen, The Other Night Sky, Dead Satellite with Nuclear Reactor, Eastern Arizona (COSMOS 469), 2011. Courtesy of the artist



Peter Bauman: Capitalist interests are clear in generative media. Your work examines how it's personalized to shape our worldview and make us more susceptible to manipulation, fueling polarization, inequality, and disinformation. Do you think AI could also help counter these trends, maybe through fact-checking or de-radicalization?

Trevor Paglen: To answer this question, we should recognize that “AI” doesn’t exist as a thing-in-itself, independent of the companies and interests who are making it. So we’d have to ask the question slightly differently, for example: “Can Google Gemini be used to counteract these trends if it were deployed on Twitter?” The answer to that in the abstract is “maybe.” But think about the economy of Twitter, which is an engagement and attention economy whose business model is dependent on agitating people. You’re not going to see businesses willingly build tools that counteract their business model unless they’re under an extreme threat of regulation, as we recently saw with Instagram’s new controls on young people using it.

Peter Bauman: This goes back to when you mentioned systems that maximize the power of their creators. That’s AI and generative media in the hands of giant corporations. Is there any alternative, more attractive model?

Trevor Paglen: Sure, there are tons of different ways we could imagine technology functioning in society, but in order to do that, we’d have to theorize what non-capitalist technologies might look like. In other words, we’d have to imagine technologies optimized for things other than maximizing profits and we’d have to imagine what incentives and mechanisms would have to exist in order to make those technologies viable.

Peter Bauman: Where is this all heading in your opinion then? You have even suggested that AI is not compatible with democracy. If that’s the case, what is our political future in a world where AI “atomizes” society?

Trevor Paglen: This is a really good question and a difficult one to answer, and I’m having a hard time imagining a way through it. On one hand, I’m not sure we can or want to go back to the era of “manufacturing consent” through a few centralized media outlets that create and promulgate conventional wisdom. On the other, I’m not sure you can have a functioning democracy, much less a society, if you have millions of entirely incompatible worldviews that are supposed to be making collective decisions.

Trevor Paglen, Preludes #17, 2023. Courtesy of the artist, Art Blocks and Pace Verso



Peter Bauman: You’ve talked before about machines as “engines of capitalism,” saying “What they're trying to do is extend that logic of capital to as many places as possible and to make it as optimized and efficient as possible. And that comes at the expense of places that we've had, whether that's in our brains or in our collective experience, that were relatively insulated from those quite brutal market logics.” Would you say the rise of NFTs in art fits into this framework of capitalist exploitation—or challenges it?

Trevor Paglen: The rise of NFTs is very interesting to me, especially now, which is a time when I think they’re maturing into something genuinely new. I don’t think they’re either propping up or subverting capitalism—that’s kind of silly—but they do allow for new relationships between artists and collectors/patrons on the one hand, and on the other, they allow artists to show and share their work in different ways. I’m doing a project with Fellowship at the moment where we’re going back to an early project I did with a generative AI system I built in my studio around 2015. I had to build the training sets, models, image generators, etc. I ended up with millions of images, a massive archive, dozens of models and training sets, etc. When I first showed the work in 2017 at Metro Pictures gallery, I had to reduce that entire project’s output down to less than 20 prints on a wall. So an audience could only really see the tiniest tip of the project iceberg. With the Fellowship project, I’m able to share a much more in-depth glimpse into the actual larger project, show the architectures I created, and give people a much richer understanding of what I was doing and why I was doing it.

Peter Bauman: While your views on technology are often critical, you think art can help us imagine a better society. Beyond raising awareness of these pressing issues, what role do you see for art in creating tangible change?

Trevor Paglen: Art can’t singlehandedly change society, but it can help us examine the things we take for granted and can allow us to create visions of how society can be different. For example, when I “see” feminism in my head, I see the work of folks like Jenny Holzer or Martha Rosler. For queer liberation, I “see” the work of Gran Fury or Felix Gonzalez-Torres. Art can help us visualize different ways of being, which allows us to imagine ourselves in those different ways of being.

Peter Bauman: Will technology always be flawed because its makers are flawed? Is technology making us more human or more alienated?

Trevor Paglen: It’s partially flawed because its makers are flawed. More importantly, it’s flawed because the economic incentives under which it has to exist are flawed.



---



Trevor Paglen is an artist whose work spans image-making, sculpture, investigative journalism, writing, engineering, and numerous other disciplines. Paglen has had one-person exhibitions at the Smithsonian American Art Museum, Washington, D.C.; Carnegie Museum of Art, Pittsburgh; Fondazione Prada, Milan; the Barbican Centre, London; Vienna Secession, Vienna; and Protocinema Istanbul. He has participated in group exhibitions at the Metropolitan Museum of Art, the San Francisco Museum of Modern Art, Tate Modern and numerous other venues.

Peter Bauman (Monk Antony) is Le Random's Editor-in-Chief.

Suggested Reading
Simon Denny on Society, Technology and Art
Peter Bauman
October 7, 2024