Humans and their thinking, autonomous creations have had a rough relationship for a long damn time, and Terminator: The Sarah Connor Chronicles (T:SCC) was looking to change that, right around the time it got canceled. I’ve gone off at great length about what I call the twin heads of this relationship, namely “the Pinocchio Complex” and the “Frankenstein/Shelleyan Syndrome” (again, 1 here, 2 here, 3 here, 4 here, 5 here, 6 here, 7 here, 8 here, and most recently 9 right here ). Pinocchio Complex stories are those where the creation wants to be a “Real Boy,” and, in the end, to some degree or another, gets to be. In Frankenstein/Shelleyan Syndrome (F/SS) stories, the creation may start out wanting to be real, or it may start out confused or with a clear purpose–but the hubris of the creator is shown, and she is forced to try to destroy her creation, ultimately being destroyed by it. This last has been around at least since the ancient tale of the Golem, created by a Rabbi to wipe the land clean of those who would oppress and kill Jews, and that really speaks to the age of this feeling in humanity.
Noted and notable futurist Jamais Cascio has spoken very clearly about what this strain in representative fiction means to our “real world operations” with what he calls Autonomous Created Intelligence, in his Laws of Robotics, but my concern is the same thing, from a slightly different angle. I believe that our fiction reflects our hopes and fears, but it also helps shape them. This means that, if we make Science Fiction/Fantasy/Horror that shows warring factions coming to a better relationship, rather than a plain old victim/victor model, we’ll see that as more and more possible. As I said in my Splice review, it’s long past time for Science Fiction to move beyond this simplistic “Kill The Monster Or Make It Real” dichotomy into more of a “recognition, integration and correction of our failures” kind of place, and I think T:SCC was on its way to doing just that.
T:SCC was developing an interesting narrative, one that changed the nature of the interaction between machine and human, and explored what living in a post-apocalyptic world does to the psychology of the people who, though used to living in it, are now living somewhen else (it shows in the little things, like what happens when they hear a dog bark, or how they brush their teeth). It was serious about the business of developing and exploring the existing complexity in the interactions of its characters, not shying away from the fact of them. People had different desires, agendas, plans, and preferences–all of which meant that, even if their end goal was the same (i.e. Stop Skynet), the actions by which they tried to get there were very different. In other words, the show was Smart. And I could go off for days and days about the characterisation of humans and machines, and what it means that such a smart show was canceled…but I don’t think you want to hear me whine. What I want to talk about is exactly how I started this extended rant: humans and their autonomous creations have had a rough relationship in fiction for a long time, T:SCC seemed to be trying to change that, and then it was canceled.
So, before we continue, some background, if you don’t know what the hell I’ve been yammering on about this whole time: Terminator: The Sarah Connor Chronicles (T:SCC) is an American television show which ran from 2008 to 2009. The plot concerns the continuing lives of Sarah and John Connor within James Cameron’s Terminator universe. It picked up a few years after the events of T2, and pulled the two main characters eight years into the future, skipping over the events of the third film in the franchise. T:SCC got two seasons, with the first season foreshortened to nine episodes due to a mid-season start and the 2007–08 WGA writers’ strike. The second season had a full complement of twenty-two episodes, after which the show was canceled, making thirty-one total episodes. A number of factors went into the cancellation of the show, but they all boil down to the thing that almost always cancels a show: low ratings. There were not enough eyes on the screen at time of air.
Now, I’m not going to use this space to go off on a rant about time-shifted viewing being just one major aspect of a vast landscape shift, the whole of which needs to be taken into account for the sake of the continuation of serialised entertainment, even though I think that’s something that really needs to be talked about. This isn’t really the place for that. And, like I said, this isn’t the place for talking about the fact that television networks recently seem extremely eager to cancel smart television as soon as it doesn’t conform to the older ratings models; to see that, just look at your TV-scape these days: a vast, shallow sea of “reality-based” and soap-operatic mediocrity, pocked with little islands of extraordinarily complex, engaging shows, and these latter islands are constantly in danger of sinking. I do think that problem is an interesting one, though, so I’ll say this: Think about it. Anyway, now that we’ve gotten all of that out of the way, let’s talk about why you’re all here: Killer Robots are Awesome and Summer Glau is Hot. Am I right? Yeah, I thought so.
To be fair, killer robots are completely awesome, even as they’re completely terrifying, and that’s what keeps us coming back to the Terminator franchise, even when it hurts us. To be completely honest, I was swayed by Internet rumour mills, and I went to see Terminator: Salvation based primarily on the vain hope that the film was going to be the proper ending to T:SCC. So great was my love for ass-kicking robots, deeply complex readings of “artificial intelligence,” and the awesome, crazy stuff they pulled in that show, that I convinced myself that the movie had to be following from that timeline. And I was wrong. The show and the film had nothing to do with each other, and we are all the poorer for it. The machines and cyborgs in the show were far more interesting, to me, than anything that film had to offer.
Though there was some complaint about a “Terminator of the week” kind of vibe in some of the episodes of the show, there are really only four important Artificial Intelligences in the T:SCC universe: Cameron, John’s cyborg protector in the form of a young girl, sent to blend in as a part of the Connor Family and to serve as a companion for John’s younger self; Cromartie, the cyborg hunting the Connors through time and changes of body; “Catherine Weaver”, the T-1001 model that has come back through time, killed two people, and assumed the form of one of them to run her vast electronics and computing software firm (sounds both ominous and obvious, right?); and John Henry, the AI created by Weaver’s company, and taught and raised by humans (bwhuuu?!). Now, each of these machines showcases a different kind of development in the course of the series, and each of those kinds of development is a piece of a very important puzzle: and the title of that puzzle is “How Do We Learn?” When we look at what Cameron learns from her time with the Connors–the things, actions, and opinions that matter to her–it’s a different set of things than Cromartie learns from Agent Ellison (the FBI agent hunting Sarah Connor, following the Destruction of Cyberdyne in 1995), which is in turn a different set from the concerns and lessons of Catherine Weaver, which are different from John Henry’s lessons. It’s the way in which those lessons interconnect that matters.
The beauty of this show is in the intricate, subtle interplay of the characters–human and cyborg/machine–and how what they learn, what they know, and what they don’t know that they’ve learned…all play off of each other and create lives and a world, while they are all in the midst of seeking to not just save but literally create and sustain their futures. Now, the show is ostensibly about the human element: human reactions to robots, robots impacting the lives of humans, OMG Uncanny Valley, blah blah blah. If you can’t tell by now, let me put it simply: I think that’s boring. I’m not saying that there isn’t useful, interesting fiction there, mind you, just that I’m bored by it, because it has been done to death. Yes, human psychology is a fascinating thing. Yes, the end of the world (personal and collective) is deeply affecting. Yes, stress and change and madness all take their toll on the mind living in the constant glut of it, and watching that can be deeply jarring, on an emotional level. But I know all that, already. What I don’t know is: what is the psychology of a created intelligence? Why does Skynet persist in viewing us as a threat to itself, seeking to hunt us down to the irrational end of self-fulfilling prophecy? What does a machine that is programmed to feel… feel? There are some really interesting tastes of this in T:SCC, and I would now like to talk about them, at length.
I don’t even think I should have to talk about spoilers at this point, so if you haven’t seen the show yet, and want to stop here and let what I’ve said so far convince you to watch it, that’s cool. Just bookmark this page, and come on back after you’re done over at Netflix or whatever. Otherwise, let’s get to the heart of this thing. Let’s talk about these robots.
Cameron: Cameron–arguably the most important machine intelligence in this show–first shows up at John Connor’s high school, shortly after he and his mother Sarah move to a new town and away from their most recent false life as John and Sarah Reese. She presents herself as a normal girl, interested in John, just happening to find herself seated next to him in all of his classes. In one class, their usual teacher, Mr Ferguson, is ill that day, and a substitute instructor, Cromartie, takes his place. As he calls the names, he scans the room…and when he gets to “John Baum,” he pulls out the gun that he’s dug from the flesh of his leg and fires. Cameron happens to be in the way, and John runs. Outside, in the parking lot, Cromartie hunts John and, as he finds him, raises the gun and… is hit by a large truck. The passenger door opens, revealing Cameron behind the wheel; then she reaches out her hand and says those eight magic words: “Come with me if you want to live.”
All through the course of the show, Cameron–a reprogrammed T-888 model cyborg–is concerned with one thing and one thing only: The Safety of John and Sarah Connor. And if it came down to it? She’s not that picky about Sarah. She was sent back by “Future John” for this purpose, as well as to move him and his mother forward in time, to jump over the death of Sarah Connor (cancer), and to help them find and destroy the elements that will bring about Judgement Day. Anything that gets in the way is to be destroyed or removed. But she is also supposed to take orders from “Present John,” when they don’t supersede those given by “Future John,” and that, in itself, is an interesting set of definitions on Cameron’s part. As she places and is placed in varying situations with young John Connor and his mother, she will respond to his commands and behaviours based in part on how he comports himself. If he acts like a whiny teenager, then she manipulates and lies to him. If he acts like a general, and a leader, then she obeys him without question, or offers counsel, as an equal. This seems to suggest that, for Cameron, “Future John” isn’t so much a distinct person as a state of mind. And, considering we’re talking about a universe with heavy time travel, that only makes sense.
One of the most clearly defined of Cameron’s traits is her definition of friendship. As a cyborg originally programmed to infiltrate John Connor’s inner circle as a teenage girl, reprogrammed by John, and sent back to blend into a high school setting, she has to learn socialisation and interaction above all else. Ironically, this focus is precisely what causes her to stick out as different. However, in a pre-Judgement Day world, she’s just “that weird, intense girl who says ‘Thank You For Explaining’ all the time.” In the Season 2 episode “Self Made Man,” we see what Cameron gets up to late at night when everyone else is asleep. She goes to the library and reads everything she can related to history and industry–anything that could possibly indicate the origins of Skynet. How does she manage this? She befriends the night librarian (a young man named Erik, who’s confined to a wheelchair as a result of a bout with bone cancer) by bringing him donuts. Over the course of the episode, they search the records for a very specific purpose, and Cameron teaches Erik how to shoot a gun, deep within the library. She carries him upstairs and walks in on him in the bathroom to ask questions. In the end, she tells Erik that he needs to get checked out for the return of his cancer, telling him that his body weight is down, his bone mass is compromised, and his trouble pulling the trigger indicates muscle weakness.
Erik is understandably upset by all of this, telling her that she can’t just say things like that to people who are supposed to be her friends…but that’s the thing: she is acting as she feels a friend should act. He is currently damaged, but it isn’t irreparable, and he should see to it as soon as possible. Who wouldn’t do that for a friend? The next night, Cameron returns to the library with her customary bag of donuts, and a young woman answers the door with a “Can I help you?” Cameron asks to see Erik, and the woman answers, “I don’t know any Erik…They just told me to come in.” Cameron pauses for a second, then smiles and says, “Would you like a donut?” And the young woman lets her in the door.
The simplistic reading of this is that Cameron used Erik for his access…but if that were the case, she wouldn’t have bothered telling him about the cancer, and she wouldn’t have paused when she was told that he wasn’t working that night. This is the subtle indication that friendship matters to Cameron…but not as much as the mission. Friendship and empathy are there, but they’re secondary to her function as protector of John Connor and stopper of Skynet. In the season one episode “The Demon Hand” (one of the flat-out best of the first season), Cameron infiltrates a small ballet studio in order to follow a very important lead. She uses her new instructor to find a black market fence, letting the woman believe that she’ll help and protect her and her brother in exchange for information. When she gets the information, she walks out and lets the two of them die. When Sarah asks what happened to the people, Cameron replies, “They died.” “Did you kill them?” “That wasn’t my mission.” Over time, this coldness changes to a position where, even if someone is not a direct benefit to the Connors, neither are they a threat, and that is enough for her to keep them from coming to harm, or, at the very least, to provide them with the means to help themselves. This is a prolonged evolution of the character’s personality.
Early on, Cameron always wants to do more, never satisfied with slower half-measures when it is almost always more efficient to simply kill someone or blow something up. The ability to strategise and plan in the long term, to value someone or something for what they might do rather than to dread it, is something that everyone in this series eventually learns–but especially Cameron. More than that, Cameron also wants to learn all she can; things like ballet and video games and libraries and idioms and friends…and she keeps secrets out of this need. She hides things from John and Sarah and Derek, out of a sense of self-preservation, because she knows that the more she seems to grow and develop, the less they’ll trust her. It’s an interesting bind and another self-fulfilling Catch-22, because when they do find out, they trust her even less. But Cameron makes her choices, and learns her lessons, all in the name of protecting John, and helping him to become the man he needs to be.
Cromartie: Cromartie starts life as a standard T-888, traveling through time to try to kill the Connors before they can stop Skynet, etc., but he begins a small-yet-crucial evolutionary journey after being blown apart and accidentally traveling to the future. In the episode “Heavy Metal,” Cromartie first meets Agent James Ellison who, thinking that Cromartie is the man whose face and apartment Cromartie has stolen, tells him that someone may be trying to steal his identity. “What would they do with it?” Cromartie asks. “What do people do with anything, these days?” Ellison responds. “Lie, deceive, terrorize the national psyche.” This seems to click with Cromartie, who, from that point forward, conducts his public searches for the Connors in the guise of an FBI agent. This isn’t the only change in Cromartie’s thoughts and behaviour, however. Like Cameron, he learns long-term strategy from his exposure to the Connors, but whereas hers complements their efforts, Cromartie’s is in response to them. He also develops a sense of self, and a sense of what he himself calls “faith.”
In the Season 2 episode “Brothers of Nablus,” Cromartie saves James Ellison from a T-888 model duplicate of himself, ostensibly to take over Ellison’s contacts and find the Connors. Ellison asks Cromartie, “Why did you save me?” and Cromartie responds that “Skynet doesn’t believe in you like I do. I have faith.” “Faith?” “That you will lead me to the Connors.” Cromartie never examines this new tendency toward irrational belief regardless of evidence…because he doesn’t see it happening. Even when his actions directly contradict Skynet’s objectives–a behaviour pattern which could be said to point to chip damage probably sustained during the bank explosion and involuntary time jump–he retains his faith in Ellison, and, in fact, his faith is borne out. In the episode “Mr Ferguson is Ill Today,” Ellison responds to an FBI alert regarding one John Connor in Mexico, and Cromartie, having already obtained Sarah Connor, follows him. The fact of the matter is, Cromartie could have found John on his own, simply by listening to police bands and monitoring certain name traffic on the FBI’s servers. That he attributes his finding them to Ellison (and that Ellison blames himself) is indicative of what he’s learned from humans: Faith. Trusting in and following Ellison, over and above more efficient conventional means.
In the end, though, while Cromartie’s faith guided him to the Connors, it may not have been the outcome he truly desired.
Catherine Weaver: The first Catherine Weaver was an entrepreneur and co-founder, along with her husband Lachlan, of ZeiraCorp, a technology and computing company with interests in software, robotics, and defense development. She and her husband had a daughter named Savannah, found some seriously insane advanced technology, and attempted to reverse engineer it…and then the elder Weavers died, suddenly. Lachlan in a helicopter crash; Catherine in unspecified circumstances. The new Catherine Weaver is a T-1001 model liquid metal terminator, which continues to both operate ZeiraCorp and raise Savannah. In the course of its (her?) operations, Weaver is shown to provide for the people who are loyal to her and dispatch those who aren’t…with extreme prejudice. At first, it seems like she’s predictably hastening the arrival of Skynet: building an advanced robotics and programming wing codenamed “Project Babylon.” Gonna be Skynet, right? Well, not so much. Weaver’s plan is a bit more complex than just bringing about the end of the world, and it’s here that we start to see just how important the ideas of learning and development are in the T:SCC world.
As I said, Weaver continues to raise Savannah as her daughter, at first obviously just keeping her around for appearances…but later she begins to show real concern for Savannah’s well-being, and not just Savannah’s, but that of humanity as a whole. Weaver, recognising the qualities she is unable to provide a human child, has her staff take care of Savannah a large portion of the time, and even goes so far as to take Savannah to see a family psychologist, Dr. Boyd Sherman, to help her talk about her worries, problems, and fears. Catherine Weaver is, in this way, learning how to be a nurturing parent. Though Savannah isn’t her actual child, the girl doesn’t quite understand what’s happened to her mother, and still relies on her, and clings to her, on an emotional level. Weaver takes this as an opportunity to practice for what will be her greatest creation, the analog to having her own child: Project Babylon. Now, the fact that she even bothers should make it clear that this isn’t your traditional “Kill All Humans” kind of Terminator. In fact, she seems to feel that killing humans should be prevented whenever possible…but done efficiently and without error or remorse, when necessary. If there is a problem, you excise it, cauterise the wound, and move on. But live a life that leads to as few problems as possible, all right?
Again, Weaver recruits a psychologist to help out with Savannah, but that’s not all he’s for. After he helps her with Savannah, Weaver asks Dr Sherman to help with another child, to teach it and aid in its development. She wants him to help her raise Project Babylon. Right around this time, former Special Agent Ellison has been approached by Weaver with a similar job offer, due to his experience with Cromartie, and she has told him a little of what she’s trying to do. Eventually, Project Babylon is renamed “John Henry,” and it causes the death of one of its handlers. Weaver takes this time to introduce Ellison to John Henry, and to get Ellison to help. This is Weaver’s pattern and mode of behaviour: she has defied Skynet (and rejected Future John Connor) in order to introduce the human perspective into the mind of a machine…and the machine perspective to the world of humans (even if they don’t realise it). Weaver, you see, is working for a third way that neither Skynet nor the humans in the middle of the conflict seem capable of realising: Mutual Understanding. Weaver seeks to achieve this organically (ironic, no?), and the product is a being which has the perspective of human ethics, coupled with robotic logic: John Henry.
John Henry: Unlike the other platforms we see in T:SCC, John Henry learns from the ground up. What I mean is, Cameron has been reprogrammed, twice, and that gives her part of her perspective; Cromartie was disabled, deactivated, and sent to the future where he had to adapt to brand new parameters, which gives him his; and Weaver is a highly adaptable T-1001 model that comes to the conclusion that war is stupid and that everyone’s going to die if she doesn’t do something about it. But John Henry is built from the basic framework of a thinking, adapting chess computer, and then it is, ever so carefully, taught. Dr Sherman provides the programmers with the model by which to teach a developing intelligence, and spends time helping John Henry equate learning with playing. So, at first, John Henry is taught math, definitions, grammar, colours, shapes, facts and figures, dates, history, and so forth. Then it’s given access to the Internet, and it expands even more, correlating ideas, connecting related tangents and snippets of information. And then John Henry plays games with Savannah, and they learn together. And then John Henry accidentally kills someone, and its creator decides to nip that right in the bud.
Now, do they do that by scrubbing the program and starting over, basically saying, “Screw it! This one’s a wash”? Do they go back to base code and make him Three-Laws-Safe? No. No, they do not. Because Weaver is concerned with a world in which humans don’t hate and fear machines, and in which machines don’t feel the need to fight humans, she takes the time and effort to find someone to teach John Henry why it shouldn’t kill people or allow them to die. What a revolutionary idea! Through his interactions with Ellison, John Henry is given an ethically-based respect for human (if not all) life and, through this, comes to understand the notions of remorse and regret for one’s actions. He promises that he will be careful to make sure no one dies this way again, and this message is reinforced by Weaver, who tells John Henry that Savannah’s survival is dependent on John Henry’s continued survival and learning, but his is not necessarily dependent on hers. Like with every other piece of information, John Henry considers this very carefully.
And then, one day, Savannah wants to introduce John Henry’s toys to her toys; she wants them to play together. John Henry says he doesn’t remember reading anything about duckies in the Bionicle Kingdom, and this makes Savannah sad. When John Henry asks what’s wrong (when John Henry Asks What’s Wrong), Savannah says that the duckies are sad, because they want to play, and can’t John Henry change the rules so they can play? Now, this is a concept John Henry hasn’t ever encountered before, so he takes a few seconds to think about it, after which he replies, “Yes. We can Change The Rules.” This is a crucial understanding for John Henry, because he realises that it can be applied to all games and any conflicts. It means that, if two or more groups agree that the rules or laws of their engagement can be other than they were, then they are. This is vastly important, because it is one of the very last things John Henry needs to understand. The last two things follow almost immediately thereafter: Similarity and Death.
As soon as John Henry starts playing the new game with Savannah, something infiltrates his systems, attacks Savannah, and accesses all of John Henry’s files. A Skynet node has found him. His handler has to rush in and shut him off before too much damage can be done, but the time in which John Henry is deactivated and detached from all information sources is like an eternity for an AI. It’s like dying. When he is reactivated, he understands that there is another system in the world like him: he has a brother, and, even if that brother wants to kill him, it’s out there and needs to be found. The other thing he understands is how horrible it is to die, and he seems to resolve that, if he can save anyone from experiencing this before they have to, then he will. This puts him on the course of action he follows in the series finale, and places John Connor in the position to become the person he has to.
So why does all this really matter? So every machine learns from humans, and every human has an influence on the development of the machines. So what? Well, exactly that. Cameron learns from John how to hide what she wants. Cromartie learns from Agent Ellison how to be patient and have faith. Weaver learns from Dr Sherman and Savannah how to be a mother. John Henry learns from everyone how to be himself. What the machines learn, from whom and how they learn it, and how they apply it, all add something into the final mix of what humans and machines have to do to survive and thrive in the coming world: they have to adapt, they have to learn from each other, and they have to recognise that they are different types of intelligence, with different concerns and ways of understanding the world, but that none of them wants to die. This last point can be understood by any living thing, and can become a point of unification and consensus, rather than contention and war.
Now, it’s pure speculation on my part, but I think that the “Changing the Rules” exchange between John Henry and Savannah Weaver was intended to imply a change not only in the way we approach the conflict between humans and machines, but also in the traditional rules of the Science-Fiction/Fantasy/Horror tropes of Frankensteinian Monsters and Pinocchian Puppets with dreams of being “Real.” What, they seem to ask, do we think about the creations who know they’re creations and are happy with who and what they are? What of the monster which revels in its monstrosity, the robot which wants to be a better robot? Or what about the beings who aren’t concerned with human versus machine, who think that any thinking, feeling thing should be allowed to flourish and learn? What about those who simply want to Be Better, and to help others do the same? Like I said, it’s pure speculation for me to suggest that the creators of T:SCC were suggesting this or asking any of these questions. But I am.
And this is why I had such hope for Terminator: Salvation. The trailers and the plot points all seemed to point this way, toward a new way of understanding the conflict between humans and Skynet, and I desperately wanted that to be the case, in a large-scale, big-budget summer blockbuster. Because isn’t it time we started honestly asking what the AI–as alien as it necessarily must be–is thinking and feeling, rather than just presenting a foil for our fear of the potential dangers of technological progress? Isn’t it time we were as concerned with a third way as Catherine Weaver and John Henry are? Isn’t it time we gave audiences a little credit for being able to move past fear and prejudice toward the other, and thereby helped them do just that? Honestly, I think it’s long past time.