Sunday, July 12, 2015

Rethinking the theme of Terminator 2

Not so much thumbs up
I went back and rewatched the previous Terminator movies after the recent release of Terminator Genisys, and while I still very much enjoyed the first one, I found myself not enjoying Terminator 2 as much as I had when I was younger. There is still a lot of great stuff in this movie, don't get me wrong, but this time around the theme annoyed me. I'm hoping this post will get across what it was that was bugging me about it.

Main Theme


One of the key messages of the movie is the idea that "the future is not set. There is no fate but what we make for ourselves". This is fine, and not what bugged me. What bugged me was the larger theme of learning the value of human life. My issue isn't with the theme itself, but with how poorly the movie actually handles it. There are two main plot points that support this theme:
  1. The Terminator learning the value of human life
  2. Sarah Connor not killing Miles Dyson and instead choosing a non-lethal approach to taking down Cyberdyne
Neither of these really ends up doing much service to the theme, so let's go over them and I'll try to explain why.

I Know Now Why You Cry


Let's start at the very end. The final line of the movie is:

Because if a machine, a Terminator, can learn the value of human life, maybe we can too.

This is a nice feel-good line to end the movie on, and it sounds meaningful. We're meant to leave the theater thinking, "yeah, if a lousy machine can figure out the value of human life, why can't we damn humans do it?"

The problem, of course, is that no machine has actually learned the value of human life. A machine in a movie learned it because that's what the script said. But you can't take a profound fictional event and use it as a premise for an argument about the real world. You don't base your view of human beings on what a fictional robot did in some movie! It's a clever line that feels profound, but it really says nothing.

Now, going through the movie, we see the Terminator go from not knowing why he can't kill humans to apparently valuing human life. We even have the funny exchange where John Connor keeps telling him that he can't kill humans, and he just keeps asking, "why?". The best John Connor can come up with is, "because you just can't. Trust me on this."

John Connor is just a child in the movie, so we shouldn't expect any kind of profound philosophical argument from him. But if the Terminator's arc is going to end with him valuing human life, then things need to happen in the movie to justify that. And as far as I can see, they don't. As the movie progresses, the Terminator avoids killing people, as far as we can tell, because John Connor ordered him not to. But then suddenly, at the end, he apparently understands why people cry, and values human life.

The movie did no work to get to that point. It just asserts it because it needs it for story closure. There is no good reason for the Terminator to now value human life, other than that he somehow 'figured it out' out of nowhere. And if a movie is all about learning the value of human life but can't actually articulate in any real way why human life is valuable, then it more or less fails.

I Almost Did It


The previous issue was a bit nitpicky, but this one I think is the main problem. This is the part where Sarah Connor goes to kill Miles Dyson, but in the end they decide to take a non-lethal approach to stopping Cyberdyne. We get the scene with her sobbing on the floor as John Connor and the Terminator run in, and she says, "I almost did it". First things first, though: the real reason Miles Dyson is still alive at this point is not that Sarah Connor couldn't pull the trigger. It's luck, and Sarah Connor being a fucking terrible shot.

Sarah Connor absolutely pulls the trigger to kill Miles Dyson. But he moves at the last moment when a radio-controlled car hits his foot. This is pure luck, and by all rights Dyson should be straight up dead at this point.

She then goes on to pour 60 rounds of assault rifle ammunition into the desk he's hiding behind, but the desk must be stacked full of bricks or something, because none of them penetrate and kill him.

This still isn't enough. She then heads into the room he's in and fires at him with a pistol, eventually hitting him once in the shoulder. It's only after all of these lousy shots that she finally has to confront him and his family, and then, at last, she's unable to shoot him in the face at point blank range while his wife and child look on.

Hardly 'pat yourself on the back' moral integrity here, and Miles Dyson really should be dead, and the movie basically over, at this point. Ironically, Sarah Connor would have shown more moral integrity if she had actually followed through and shot him, because as it stands she had no problem killing him with an assault rifle from 50 meters away, just not while looking at his face.

But at this point, we're meant to think that she 'came to her senses' and realized what she was doing was wrong. The alternative, not-so-feel-good reading, though, is that she was right in the first place, but lost the nerve to make a difficult moral choice once she could no longer stay emotionally distant from it.

Sacrifice


So the issue here is whether or not killing Miles Dyson in order to stop Skynet from being created would be a morally acceptable thing to do. The movie tries to argue that it wouldn't be. Now, obviously, if you can solve a problem without taking lives, that's always going to be preferable. In this case, we're talking about trading one person's life for the lives of billions of people. It's possible you may be able to do it non-lethally, and in the movie they find a way, but at a much greater risk of failure, particularly with a T-1000 chasing them.

It's a question of whether preserving that one life is worth all the extra risk of trying to find a non-lethal approach. One thing that might seem significant here is the fact that we're talking about killing a person who hasn't actually done anything yet, which makes it seem like some kind of thought crime. But that's where this moral problem differs from anything we ever actually encounter in real life, so we need to be careful. The movie has the unique situation of having precise knowledge from the future of what is going to happen. And that makes it less like killing a person for what they might do, and closer to killing them for what they have done. This never happens in the real world, so we're not built to think well about this kind of situation.

The fact that Miles Dyson isn't intentionally trying to wipe out the human race makes this feel different than if he were doing it deliberately, but it doesn't change the need to stop him. It means we shouldn't kill him because he 'deserves it', but it doesn't change the fact that killing him in order to stop him might not be unreasonable. Given how large the stakes are, it might be the unfortunate but pragmatic best move under the circumstances.

Consider that, right now, the US has an ongoing drone bombing campaign across several countries in the Middle East, where thousands of people have been killed. Thousands of innocent civilians are knowingly killed as part of this, and not much effort is made to avoid that. Drone attacks aren't aborted if there's any chance that an innocent person might die. Far from it.

And people seem to be mostly okay with that. I mean, people kind of don't like it, but no one seems to be kicking up a fuss or petitioning their representatives. The news media barely talks about it. Most people are happy to not be reminded of it, and to not think about it.

But really, if you're not losing sleep over thousands of innocent people being killed on an ongoing basis in some dubious, poorly defined war on terror, on what grounds can you really object to killing one single person who is guaranteed to cause the deaths of several billion people? By what we currently accept morally, a drone could bomb Miles Dyson's house, kill him and his entire family, and we would just call it collateral damage.

Now, since I don't think the US drone bombing is morally acceptable I can't exactly use that as an argument for killing Miles Dyson. However, my point is more that we accept deaths for the greater good all the time, and when it saves lives we usually consider it a good thing. If one death saved billions of lives, would we really be morally torn about it?

The problem with Terminator 2 is that it avoided the genuinely difficult moral choice of doing one bad thing in order to avoid a much worse thing. Making a tough choice, and then living with the guilt of it, can actually be a noble act. Deciding that you can never take a life under any circumstances, and then muddling through a plan that will hopefully stop the deaths of billions of people if you get lucky, is not so noble after all.




