
Lawrence Lek on AI Reinventing Place

Lawrence Lek, artist, filmmaker, musician and winner of the Frieze London Artist Award 2024, spoke with Peter Bauman (Monk Antony) to discuss AI as a philosophical and artistic subject, focusing on its implications for identity, place and humanity's relationship with technology. They also cover the geopolitical and corporate dynamics shaping AI development and the challenges of preservation in digital art.
Lawrence Lek, Geomancer Still: UNDERWATER GO GAME, 2017. Courtesy of the artist



Peter Bauman: I’d like to start with your artistic vision and the theoretical or philosophical foundations of your practice, which largely involves AI. In your films, the main characters are usually anthropomorphized AI machines exhibiting very human emotions and behavior. We are meant to see ourselves in these AI characters. What’s your starting point when thinking about AI? Do you see it as a neutral tool serving everything between benevolent and nefarious human ends, or is it something entirely novel and powerful?

Lawrence Lek:
Broadly, as a kid growing up in the '80s and '90s interested in science fiction, mostly through pop culture, the big things on the horizon were virtual reality, artificial intelligence and cyberspace. In a way, virtual reality is both my subject and my medium, even though it's not necessarily with actual glasses and immersion. Simulating another reality is my medium. CGI, big installations and multimedia are all forms of virtual reality, just not in that literal VR-headset sense. The Internet or cyberspace isn't really a medium for me personally. That's more like my social space or mechanism of distribution.

Artificial intelligence is my subject and, lately, it's become my primary subject. When I first started working with video games and 3D, I was really interested in the question of place or space—where are we? 

Lawrence Lek, Black Cloud (Still), 2021. Courtesy of the artist


It's as if this idea of where we are in a physical or virtual space very much governs our character, identity or perspective. It's interesting to think about that placeness in relation to questions of identity.


I've taken the view—or the overarching question—of “Where am I? Where are we in this physical space? Where are we in a virtual space? Where are we at this particular point in history?” The reason I think that's compelling in terms of AI is not only was it an interesting concept from childhood, but it was this idea of the technology possibly reinventing all of those—identity, place, time and memory—simultaneously. I wasn't thinking about AI back then. It was more like, “Oh, it's like a smart robot from Star Wars.” It only gradually crept up on me—as in parallel, the past ten years of machine learning and big data have really evolved—that my work could be about AI, both as an idea, sociological structure and potentially tool.

Many artists deal with AI specifically as a tool, whereas for me, the main thing is AI as a social idea. Secondarily, I sometimes use certain AI tools. Back to your question about whether AI is a neutral tool: it's obviously a big debate in the philosophy of technology.

When you have a hammer, everything's a nail. Is AI just a really multi-purpose hammer that you do everything with—plus it has some of its own personality back there?


Will the tool turn against you? It's something that's tied up with humanity's creation of tools and, at the same time, humanity's creation of means of destruction. Unfortunately, those two things always seem to come together in some roundabout way.

Lawrence Lek, Black Cloud (Still), 2021. Courtesy of the artist



Peter Bauman: It's the first tool we are creating that has its own agency, identity and sense of place. It's going to exist in a different place from us in a lot of ways, which ties into the geopolitics you mentioned and that your work tackles.

Lawrence Lek:
Totally. With geopolitical things, there's always an agenda somewhere along the line, particularly with the current generation of AI and machine learning, because the resources you need to train or create a foundational model are so vast. It's hard to imagine a cottage industry of tinkerers in the garage just inventing this great AI. For other tools, it may have been possible. People can tinker and make home computers, experiment with a new fabrication culture or make a new electronic music synthesizer at home. But it's hard to imagine currently that AI breakthroughs can happen anywhere other than multi-billion-dollar companies. Of course, they can in an academic sense, but to leverage academic research, you really need extensive resources.

That's also a big difference in terms of the use of these tools. To use the musician analogy, there are some tools that can only be built by companies with a lot of financial leverage, while some musicians prefer to make their own guitar pedal. Then there are tools where they don't really have a choice beyond opting in or out, like streaming their stuff on Spotify. Mat Dryhurst and Holly Herndon have talked about this at length.

As a creator, you have this span of tools to use. Each one of them has a different agenda attached to it. Your agency is largely about which tools you use.


It’s up to the individual. Some artists make a point of saying, “I will only use open-source software that is free and readily available.” Or “I will only make a net-zero carbon project,” and so on. I'm less of a purist in that respect; I'm using what's available that's helpful for my worldbuilding up to a point.

Peter Bauman: AI requires the resources and institutions of states and their corporations, but only fairly recently has that been the case. In the '80s and '90s, a lot of this research was done in public-sector institutions and universities. Today, universities can no longer keep up with the budgets and compute power that private corporations devote to training these deep learning models. What are the ramifications of this technology's emergence when not only can single individuals not participate in its construction, but even robust institutions like universities cannot?

Lawrence Lek:
I want to believe that individuals can create tools. But sometimes tech companies with a lot of angel investment can just buy talent. They might be doing interesting projects or whatever but what they can afford to pay is much higher than what universities can offer. It's also interesting going back to the history of cybernetics and early computing, which was mostly higher education meets military-industrial complex funding. In some sense, things don't change that much. They just evolve. So many of these big communication technologies are really coming out of that funding. That raises some ethical questions, going back to some artists wanting to only use open-source software and so on. At the same time, with these big corporate models, there's also a lot of scope for the other favorite subject of the '80s and '90s—hacking.

That punk and hacking attitude comes out of this idea that even if you can't create the foundational models—you can't create the Internet—you can hack them for your own purposes, which is an attitude I'm more attuned to.


Peter Bauman: We're moving from the government-funded research of the military-industrial complex—that provided us with computers, the Internet and much more—to private company research. Corporations now have more power to shape these new technologies. This idea comes through in your films, where the antagonists are mostly corporations—not governments. How does that tie in with the element of identity in your work?

Lawrence Lek:
It's a good point. Having the government as an antagonist goes back to the conspiracy and thriller genres of the 1970s that come from distrust in big government—the Nixon and Watergate era, which is essentially a generational attitude of distrust towards authority. Considering the corporation as antagonist, I started thinking in a slightly different way a long time ago. This goes into the question of identity. About a decade ago, I thought of using my own biographical memories, not in a nostalgic way, but almost as a case study in how an individual is shaped by geopolitics and by the corporation. Going back to my childhood or even my own literal origin story, I thought, “Okay, if my parents worked for Singapore Airlines and they met because they worked for the same company, then in a very real sense, if I were to be anti-corporate for its own sake as a rebellion instinct, there would be an element of self-denial in that.”

Then in the '80s and '90s, the idea of the corporation as antagonist became more popular in this post-conspiracy-theory land. It's all over William Gibson. It's in the Weyland-Yutani Corporation in the Alien franchise—this big evil corporation, a big evil boss that takes the place of a nation-state or civilization. It's this idea of the omnipresent corporation.

The tradition is different in Asia, with the zaibatsu in Japan or the chaebol in Korea. For complex historical reasons, the megacorporation seems to have evolved into a very different form in Asia. It has its roots in earlier Western corporate forms: the idea that the corporation is not just a living entity or your employer, but all-encompassing in a different way. That's not necessarily a bad thing, because the company also takes care of its workers. Sometimes the company can present itself more like a family or a benevolent dictatorship than as a purely exploitative force, like Amazon. It's a whole span of attitudes.

If I'm in this more 2000s ideology, like Naomi Klein’s No Logo, where Nike is bad, corporations do f– –g terrible things because they don't care. They're programmed not to care. They just want numbers to go up. But, actually, if I'm truthful about it and if I don't have this element of self-hatred about it, then I realize, “No, I am partly corporate as well.” Of course, we all fit in some system.

I thought maybe I could create a more truthful worldbuilding if I understood how companies work.

Lawrence Lek, Geomancer (Cover), 2017. Courtesy of the artist



After working with virtual worlds for about five or six years, I had this idea—and this relates to ideas about hyperstition I would discuss with my friend Kode9 [AKA musician and sound artist Steve Goodman], who is connected to the CCRU [Cybernetic Culture Research Unit]. It was like, “Hang on, what if I start a company that is my studio?” In 2017, I made Geomancer, which is set in the year 2065 and is about an AI company called Farsight that makes a superintelligent satellite. The year after, in 2018, I thought, for various admin reasons, “I need a studio company. What if I just start Farsight and therefore fulfill that future timeline? I've got about 47 years left to make an AI satellite,” just as a playful thought. When I started the company, I realized—as we all know—tax is not fun. Compliance is not fun. I actually realized that corporations are not necessarily these big evil monsters.

On paper, a corporation is a nonhuman person, like AI. 


Because of the way that hundreds of years of capitalism have evolved, corporations exist to essentially minimize the risk of bankruptcy. They are a self-sustaining organism. They can't reproduce but they can grow. Their motivation is to grow and accumulate capital, resources and all the rest of it.

In a way, corporations themselves are not monolithic, big evil things—sometimes you get evil people heading them, for sure—but actually, as a structure, a corporation is a nonhuman life form. So in later films, like the Smart City series—Black Cloud, NOX, Guanyin and Empty Rider—I had [the corporation] Farsight as an antagonist. But it became a more sophisticated antagonist, because the experience of starting the company gave me more insight as a science fiction author. I could start thinking through the logical consequences of how AI development might proceed—not because I have billions in investment, but just because I thought, “Corporations aren't necessarily bad, but they have, unfortunately, a very clear motivation, which is to grow.”

Human beings don't have clear motivations. Who knows what we want, really? That's part of enriching the worldbuilding. There were some other projects, like Nøtel, my audiovisual collaboration with Kode9, that made me think through some of these ideas, particularly the relationship between science fiction, hyperstition and a reality that is stranger than fiction, really.

Peter Bauman: That reminds me of a recent conversation I had with Sputniko, who's an artist as well as a CEO. She was talking about hacking capitalism to do good and also just to understand it.

I love your metaphor of corporations as neural networks. You also relate elements of Sinofuturism, even the concept of Orientalism, as a metaphor for AI. It got me thinking about other possible metaphors that relate to AI and geopolitics and I think bureaucracy fits well, too. Bureaucracy is this unknowable black box—even individual cogs don't know what the other cogs are doing. But the outcomes are extraordinarily impactful on everybody’s lives.

Lawrence Lek:
It's really interesting. Someone recently watched Empty Rider [a short film where a self-driving car is on trial for the attempted murder of their owner] and they were like, “Oh, this just reminded me that it's nearly the 100th anniversary of Kafka’s The Trial.” They also drew parallels to what you're saying about that senseless bureaucracy where no one really knows what anyone else is doing. In a profound sense, maybe it doesn't matter.

It also goes back to this AI question. I talk a lot to friends who work at AI companies, and I'm not saying they don't know what they're doing. They absolutely do know what they're doing, but maybe they're not entirely sure how it fits into the big picture. Everyone has their line manager, their line manager has a line manager, and there's someone reporting to someone. It's almost a law of nature that when a system becomes complex beyond a certain point—especially if it exists to build a complex technology—you have to ask: if you take the machine apart, can you put it back together again? Take the chip fabrication machines that ASML makes. Even if someone buys a machine and takes it apart, they can't put it back together again, because it's so complex.

With a company like ASML in the Netherlands, which makes the machines that make chips, when you get a machine of that complexity and quantum-level precision, which is just completely mad, different rules apply in relation to how individuals and the system work together.

Peter Bauman: Before, you mentioned Guanyin and the sociology of AI. The piece won you the 2024 Frieze Artist Award. Along with other work, like Black Cloud, it alludes to feelings of isolation and loneliness as experienced by an AI. I imagine they’re also metaphors for how AI might affect us moving forward—the fragility of humanity in a post-AI world. Making stories where humans are largely absent, how do you see the relationship between AI and our increasing societal isolation?

Lawrence Lek: 
I'll try to answer that in two parts. One is the idea of AI, chatbots and conversation—how that's evolved in my work into different characters—and then the question of isolation. Those are two separate things. With Black Cloud and Guanyin, specifically, I was speculating on a not-unlikely scenario that if you made an AI, they would be pretty depressed—not because they weren't super intelligent, but precisely because they were super intelligent and would be hyper-aware of that fact. They’d think, “I'm super intelligent but all I'm doing is counting the traffic lights turning red to green all day long,” or “I'm just delivering packages.”

It seemed logical to me that if you made this superintelligent self-driving car system, very quickly, if not overnight, they would develop neuroses and frustrations around those things, especially if they felt this real disconnection between their potential and their actuality.


Of course, human beings feel this all the time. I thought the logical thing that Farsight would do is to create a built-in therapist. It's like how prestigious jobs give you occupational healthcare perks: dental care, health care, mental health, a free company bike—that kind of thing. So this built-in self-help AI, whom I named Guanyin after the Buddhist goddess of compassion, would talk the cars out of doing something stupid, or at least undesirable from the company's point of view. The more I thought about it and started writing the script, the more I was like, “Oh, isn't it weird that since the beginning—since the imitation game—this idea of a conversation between two entities has really been at the heart of the research field that is AI?”

It's from the imitation game to ELIZA, the '60s chatbot, to Alexa and Siri, and then to present-day ChatGPT. It's so interesting that it's something conversational; it's all about dialog and text, and this idea of AI has nothing to do with bodily form. It's not about robots. It's purely linguistic. Guanyin and Black Cloud come from this idea that it's a conversation between these two entities that are uncertain about their future. That really seemed like a way to ground the work in something relatable.
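A minimal sketch can make that lineage concrete. The toy, rule-based exchange below is invented purely for illustration (the 1966 original used a much larger script of ranked decomposition rules and also swapped pronouns, “my” to “your”); it shows how far a purely linguistic, pattern-matching setup can get:

```python
import re

# Illustrative ELIZA-style rules, invented for demonstration: match a pattern
# in the user's utterance and reflect part of it back as a question.
RULES = [
    (re.compile(r"\bi feel (.+)", re.IGNORECASE), "Why do you feel {0}?"),
    (re.compile(r"\bi am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\bbecause (.+)", re.IGNORECASE), "Is that the real reason?"),
]
FALLBACK = "Please, go on."

def respond(utterance: str) -> str:
    """Return a reflective response using the first matching rule."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(match.group(1).rstrip(".!?"))
    return FALLBACK

print(respond("I feel disconnected from my own potential"))
# -> Why do you feel disconnected from my own potential?
# (The real ELIZA would also swap pronouns: "your own potential".)
```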

Lawrence Lek, Black Cloud (Still), 2021. Courtesy of the artist

Going into the second part of the question, as a science fiction writer, it made sense that this AI would feel very isolated. Not just that, it would probably be just like in computer programming, where you have public and private variables. Let's say you had a smart city system. There would be things that cars were allowed to communicate to each other in order to travel more safely as a group: “I'm going to turn left,” and then I turn left.

Then there would be forms of communication that would not be allowed. In my research for Black Cloud, I was talking to lawyer friends about how you would implement a smart city system. There are all these litigation problems, like who gets sued if somebody dies. This is happening right now with AI liability. Going back to our point about strange bureaucracy, one very interesting idea that came from this is that you would have to control the cars while making sure they worked together in the system. Again, the AIs might feel very disillusioned: they're superintelligent, and yet, in Black Cloud, they lack so many things that have been promised to them.

You take human frustration about mortality, about your crappy job or something, and you multiply that by 100. That would suck for sure. They would feel increasingly isolated.
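A toy sketch of that public/private split, with class and field names invented purely for illustration rather than taken from any real smart-city system, might look like this:

```python
from dataclasses import dataclass

@dataclass
class AutonomousCar:
    """Toy model of the public/private split described above.

    Public fields are what a car may broadcast to the rest of the fleet
    so the group can travel safely; the underscore-prefixed field stands
    in for private state the system never allows it to share. (Python
    privacy is by convention only; the point is the design split.)
    """
    car_id: str
    intent: str = "straight"   # public: "I'm going to turn left"
    _frustration: float = 0.0  # private: internal state, never broadcast

    def broadcast(self) -> dict:
        """Only public state leaves the car."""
        return {"car_id": self.car_id, "intent": self.intent}

    def log_unmet_promise(self) -> None:
        """Private state accumulates; the other cars never see it."""
        self._frustration += 1.0

fleet = [AutonomousCar("gy-01", intent="left"), AutonomousCar("gy-02")]
for car in fleet:
    print(car.broadcast())  # intents are shared; _frustration is not
```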


The question of technology and isolation is something sociologists have talked about a lot. Why does it seem that the further we progress technologically, the further society breaks down? Why is it that even the tools made to connect people make them feel more disconnected? There are all sorts of studies, like Dunbar's number, which theorizes you can maintain up to about 150 close connections, which is also the size of a village. The human brain evolved for life in a small village; we are tribal animals. When we're exposed to too much, our three-hundred-thousand-year-old brain can't adapt quickly enough to this new world we find ourselves in.

Thinking about this idea of isolation and going back to music, specifically, I made a film, my one and only feature film until now, AIDOL, which was about a singer who wants to make a comeback by recruiting an AI to ghostwrite her new album. This was 2019, about a year before the pandemic. I was looking a lot into influencers and influencer culture. Essentially, the mental burden that influencers have—in terms of posting every day, making their fans happy, keeping the algorithm busy—was a less extreme version of what we have now. I was thinking it's so tragic that you can have an individual with millions of fans yet still feel so isolated. Of course, this idea of the star having loads of fans and being disconnected is older than Elvis.

But the ironic thing with influencers is they simulate this personal, one-to-one connection, like they're your best friend in your bedroom rather than the star on the stage. Even though there's this intimacy to the influencer-fan connection, everybody feels even more isolated and depleted than before. It was something I saw time and time again: people talking about how they were stopping their channel, podcast, vlog or stream because of the toll it took on them.

So it seemed that even in the most visible examples of social media culture, the isolation is highest. If you look at the different levels of impact, human beings are universally bad at thinking about second-order consequences. The first-order consequence of social media is great: we're going to make loads of friends! But the second-order consequence is that everyone has 7,000 friends. It's meaningless.

Peter Bauman: That influencer culture has started combining with politics as well, where the wall between public figures and their audiences continues to come down. Figures and offices are demystified. Yet the second-order effect is ever more polarization.

You mentioned early tribal societies. Historians like Yuval Harari have written about how these tribal bands were largely democratic. Then democracy worked up to a city-state level before reaching a population ceiling. It took thousands of years before a democracy got bigger than a city-state. Once it got to empire or even nation-state level, the ability for such a large number of people to effectively communicate broke down. We actually needed technology to have democracy function in larger societies.

Now it seems that too much technology is destroying democracies again. It's such a struggle for humanity to find the balance between the tools we create and fair political deployment.

Lawrence Lek:
Totally. Going back to this idea about systems and bureaucracies as well, there's the idea that the more interconnected the system, the more vulnerable it is. This is almost a universal truth, whether it's computer networking, global health or the weather. Now that everything is more interconnected and responds at a faster rate, there's no lag time that allows failure to happen at a systemic level.

These are obviously huge subjects that I try to relate to in my work. Across different films, games and stories, there's this idea of much bigger forces at play. Going back to Black Cloud or Guanyin, the assumption about the world is that superintelligence has arrived, but it's really mundane, and the AI is depressed—that's the norm. In Geomancer, the 2017 film, the assumption was that global warming had come and gone; everywhere is flooded, but it's alluded to that humanity kept going for some reason. Is it because industry and the environment are fully automated and the robots are taking care of everything? Maybe. We didn't go to the moon or Mars or anything. We're still on Earth; the environment sucks, but it's not apocalyptic. It's actually quite mundane, but it's still livable.

The science fiction and fantasy I grew up with was really capital-f Futuristic. But it wasn't my lived experience. In Hong Kong and Singapore back in the '80s and '90s, there was still this post-colonial search for their own identity. Part of the reason they looked so futuristic is because they wanted to look futuristic. The higher-ups asked the world's most famous architects to build the world's best towers. And from that point, multinational companies would come in and decide to put their Asian base in Singapore and Hong Kong, which is exactly what happened. This idea of the future being a mundane brand really seeped into my work.

Lawrence Lek, Nepenthe (Summer Palace Ruins) Still, 2021. Courtesy of the artist



Peter Bauman: As a quick last question, many readers collect on-chain artwork, and they may come across your work, Nepenthe, which says it has an NFT component. But I haven’t been able to find it online. Is the work available?

Lawrence Lek:
I'll briefly talk about my experience with the NFT world. Before the spike of interest around NFTs that really culminated in 2021 or 2022, I had seen, in the more traditional art world, many, many different initiatives essentially trying to create your own digital art collection: “Let's make this screen that only plays work on a subscription model.” I've probably seen about three or four of these pre-NFT digital art waves come and go.

One big difference for me, around 2022, was that I had never before experienced artists-as-commodity actually being the case.


Platforms try to get artists on board because it builds the brand. That was surprising for me. It created quite a polarized situation on both sides. There are many generalizations made about the trad art world and the digital art world. But I certainly thought it was really cool that a lot of creators got a platform. It reminded me of being a Myspace musician around its peak in about 2008. Also, creators from all over the world got to participate, have an audience and self-organize shows. I thought that was absolutely great. I did the Nepenthe Valley project with a few friends.

What I didn't anticipate is that, to create in that space, sustainability is really, really important, and it's really, really difficult.


We were just talking about influencer culture in 2019, and I found that being a successful artist in the NFT-Web3 space is a 24/7 job—pre-launch, launch, post-launch, after-sales care and all the rest of it. One thing I slightly regret, though it was also out of necessity, is that I didn't have time to maintain that trajectory, as I was doing all sorts of shows. I would like to really get back into it someday, but it's a question of time.

With the Nepenthe Valley project, I really got a deep dive—it was like a crash course—in that side of the world, where I also learned lots of counterintuitive things.

For example, usually when you're an artist and you have an exhibition or a new commission, you open the doors on opening night and then you're done. You can chill. Whereas it's a very different temporality when you have a live project. It's more like developing software than making a sculpture.


That's a huge characteristic; it's a very different system. Talking to friends like Mat and Holly or Sarah Friend, or looking at Rhea Myers' projects, what sticks with me is the care you need to put in to maintain the work. I totally underestimated that, basically. I thought, “You make it and you're done.” Maybe that's fine if it's on a very established platform. But I launched the Nepenthe Valley project on a friend's new platform, and that platform stopped being maintained. So the project is a big question mark at the moment. I'm happy with the work, but to answer your question, “Where is it?”, going back to this idea of place, it's like, “Oh, that's something I didn't realize needed so much care and attention.”


Peter Bauman: I think a lot of artists and collectors don't realize that.
I recently spoke with Regina Harsanyi from MOMI and Jon Ippolito, formerly of the Guggenheim, about digital preservation. One thing they stressed is that blockchains are software, software is constantly changing, and we all need to be thinking about that more. Or artists' work and collectors' collections could just disappear.

Lawrence Lek:
Exactly. It's software, which is basically fragile in the grand scheme of things. Not only is it software, it's online software, which is even more fragile. And not only is it online software, it's experimental online software. It's very exciting in terms of reformulating the artwork around systems, tools and distribution, which is great. But it's easy to underestimate the difficulty of longevity, and that's part of it.



---



Lawrence Lek is a filmmaker, musician and artist who unifies diverse practices—architecture, gaming, video, music and fiction—into a continuously expanding cinematic universe. Over the last decade, Lek has incorporated vernacular media of his generation—such as video games and computer-generated animation—into site-specific installations and digital environments, which he describes as “three-dimensional collages of found objects and situations.” Often featuring interlocking narratives and the recurring figure of the wanderer, his work explores the myth of technological progress in an age of artificial intelligence and social change.

Peter Bauman (Monk Antony) is Le Random's Editor-in-Chief.