
Man and Machine
Ex Machina director Alex Garland on Artificial Intelligence and Morality

Did you draw on other films about A.I. when developing the concept for Ex Machina?

I didn’t exactly draw on them, but I knew them. In particular 2001: A Space Odyssey. It’s a film I know very well, and from my point of view it has the best representation of an A.I. in film: kind of hyper-aware. I was able to use [other A.I. films] to an extent. For example, people who have seen Blade Runner might suspect that some of the characters who look like humans are in fact machines, and I could use that as a kind of misdirection sometimes, to lead people to suspect that there are twists of a sort that don’t actually exist. But I wasn’t exactly drawing on them; it’s more just being very familiar with those films, having seen them many times, liking them, and being influenced by them.

Ex Machina seems right on the cusp of the next big technological breakthrough in society. Do you feel we are close to the sort of machine consciousness seen in the film?

Well, I strongly doubt we are on the cusp of developing self-aware machines. It’s not going to be anything like a short time frame, or if it were, I would be extremely surprised. I think you’re looking at decades rather than a couple of years, or a century, or even longer. The truth is it’s hard to tell. There isn’t a road map for where A.I. is going. There are really big mysteries that haven’t yet been figured out.

The film seems to be based on a sort of fear of what A.I. can become. How much of this fear is based on your real anxiety about the future of A.I.?

From my point of view, I have to say the film is not anxious about A.I. The film is allied with the machine. It’s the machine that acts in what I would term reasonable ways, and the humans who act in unreasonable ways in many respects. So I don’t personally feel that anxiety. I understand where it comes from, and I think there are actually some very good reasons for it. But in the end, I think that A.I. will probably be a good thing for humans, not a bad thing.

And it seems like you take a lot of steps to actually align the viewer with the A.I. in Ex Machina.

I tried to, but it’s really up to the audience in the end, and some will just refuse to be allied with the A.I. I found that quite interesting.

You use Ava’s sexuality as a key point in the film. Was that an attempt to align the viewer with Ava as a sympathetic character, or what was the reasoning behind that?

From a simple starting point, issues of machine consciousness and human consciousness also end up being about what humans are, as well as what machines might be, and how humans interact with other humans. That begins to involve things like sexuality and gender, and all the complex aspects of our interactions can be referenced in some way. In the end, some of the gender issues are really not related to A.I. at all; they’re about us, not about machines. But I was interested in presenting the machine as having a gender while at the same time making it reasonably clear that, by definition, it doesn’t have one, because it’s not actually organic; its gender is in effect irrelevant to it. So it was more to do with that, and in a way less to do with A.I.

Do you draw a line between human rights and A.I. rights, or is there no line at all?

No question. That is to say, yes, there would be lines you could draw between a sentient machine and a sentient human, just as there are lines you can draw between different kinds of animals with different types of consciousness. A whale and a dog don’t have the same consciousness, but they both have consciousness. And so in broad terms, you’d say they’re the same because they’re both self-aware. But in more specific ways, you might find differences. In key terms, and what I would say are the most important terms, if you had a machine that could think something like the way we can think, then you would have to start respecting it at the same level that we respect each other. The rights that we attribute to each other are based on our sentience. That’s why we worry about killing a person but not about cutting down a tree. Both of them are alive, but only one can think.

With devices like the Apple Watch and Google Glass, technology has in a way become an extension of ourselves. How could the introduction of conscious A.I. continue this trend, and were you thinking about this while making Ex Machina?

I was thinking about that. If you had self-aware machines, I think the issues that would flow from that would ultimately be about which sentience on this earth is the most important, because machines that could think would have a very significant advantage over us. They wouldn’t have to grow old, they wouldn’t get cancer, they wouldn’t get ill. And I think what would happen is that quite quickly some very fundamental paradigm shifts would start to happen, and some of them would be pretty problematic for us. Then the question is how one chooses to see the machines. One way of seeing them is that they’re separate from us in crucial ways, in which case I think we would feel rivalrous and have the sense that we were about to be made extinct, that kind of thing. The other way is that they’d be an extension of us, maybe in the way that we see our children. We acknowledge with our children the idea that we’re going to die before they do, and we feel okay with this. So is that the correct way we should look at A.I.? I think that’s kind of the question it poses.

One very interesting thing about this movie for me, and what made it more unique within the A.I. subgenre, was the discussion of surveillance and invasion of privacy. It tapped into very topical subjects like the NSA and Edward Snowden. How do you think this theme fits into the A.I. subgenre?

I think it fits in in broad terms, which is just technology and our relationship with technology, and the fact that technology companies seem to hold a lot of power and a lot of understanding of us as individuals. What we want to buy, what we want to see, what we want to read, that kind of thing. So they understand a lot about us, and we don’t seem to understand very much about them. We don’t understand how our laptops work, and we also don’t understand how the companies behind these things work. So I think all of that plays out as a kind of techno-fear or techno-anxiety, and that’s probably the level at which it plays into this kind of story. I also think that the traditional definition of paranoia says you’re scared of things that you shouldn’t be scared of; it’s an unreasonable fear. And in terms of the surveillance we’re talking about, it’s not actually paranoia. It’s a reasonable fear. It’s exactly what people should be alarmed about, and I think that’s partly why I included it in the film.

When you were scouting locations for the set, what sort of environment were you looking for, and how did you want your set to play into the themes of the movie?

We were looking for exactly the kind of environment we got. We found the ideal place really, which was a big, wild, uncontrolled natural setting. And within it, a very kind of overcontrolled and human setting, one massive and expansive, and the other claustrophobic and contained. It ended up being in Norway, where there are beautiful bits of architecture in these very remote settings, and that suited us perfectly.

The main building itself mixed human and robot aspects together. Is that what you were going for there, a juxtaposition of man and machine?

Yes, but particularly it’s sort of juxtaposing uncontrolled nature, and controlled man. And then what you do is you gradually subvert that—the manmade environment gets increasingly out of control.

The score in this movie was unique, and very disorienting. How closely did you personally work with the composers, and how do you feel the music contributed to the film’s message and the audience’s viewing experience?

We worked very closely together. Everybody on the film was part of one very intense collaboration. The thing about [composers] Geoff (Barrow, of the band Portishead) and Ben (Salisbury) is that they don’t come from a background in film. Geoff in particular comes from a rock background. Coming from outside the industry means that he approaches things in a quite innocent way, a sort of left-field way. For a film like this, it felt perfect to have a sensibility that was learning as it went along and coming up with surprising solutions. Whereas someone more steeped in film school might reach for rather more obvious cues, Geoff and Ben always seemed to be more surprising, and it was a real pleasure working with them.

Pushing forward into your career, and thinking about growing technology and the developing human consciousness, how do you think future works of yours might expand on these themes?

I don’t know actually. The next film I’m working on really has nothing to do with any of this stuff. So maybe never again, or maybe again. I don’t know. I tend to just think on a single project-by-project basis. The next film really has nothing to do with technology, it actually has to do with cell structures and stuff like that.
