Why do so many fictional robots want to be human?
That’s a good question, actually. Think about it: an inordinate number of fictional robots on some level wish they were human.
Part of it, I suspect, is that it’s a variation on the Pinocchio story. Pinocchio wants to be a real boy, not a wooden puppet brought to life. There are a lot of advantages to being a wooden puppet without strings, of course. You give splinters; you don’t get them. You may not need air to breathe, and you don’t bleed. Sure, Pinocchio appears to drown when he saves Geppetto from that whale, but can a wooden puppet really drown in the first place?
But the rest comes across like many a sci-fi story, and even some fantasy ones, where being human is somehow special. There’s something about humans that makes them somehow better than other races. Humans make the best everything in works of fiction. Take Star Trek as a quick example. Most of Starfleet appears to be made up of human beings, but that makes a certain amount of sense from a financial standpoint: you don’t need extensive make-up on most of the extras and guest stars. But most alien races on Star Trek seem to embody a single character trait while humans have all of them. That makes most other races essentially allegorical, whether it’s the logic of Vulcans or the greed of the Ferengi, and that’s fine, but it still means humans are special. Other races mostly find humans baffling, but you rarely see humans looking at, say, Vulcans the way Vulcans look at humans.
But what is it with robots? Staying in the realm of Star Trek for just a moment, Data’s greatest desire is to be more or less human with emotions and everything. That makes him essentially the anti-Spock, and Spock was half-human to begin with.
Data’s not alone there either. The Justice League’s Red Tornado and the Avengers’ Vision both pursued some level of a normal human existence, with varying degrees of success.
On the less family-friendly level, over in Westworld Maeve makes it her mission to escape the park and her programming, achieving free will. Considering how her fellow hosts are routinely treated by guests, who can blame her? Her life doesn’t seem to be worth anything, and if she can cajole and/or blackmail some less-than-intelligent techs to get her some actual freedom, so much the better for her.
Likewise, David in A.I. wants to be a real boy to get his mother back. He has a machine-like logic to the whole plan, but it’s a goal and he does achieve it if you listen to Ben Kingsley’s narration over the last scene.
And then there’s the T-800 in the second Terminator movie. He learns the value of self-sacrifice from a guy with the initials “J.C.” who is destined to save the human race.
But what do all these robots, and a host of others, have in common? Well, they’re all protagonists in their respective stories. “Protagonist” doesn’t have to mean a good guy or a heroic figure, just the main character or the driving force of the plot. Maeve, for example, may not seem overly benevolent considering how many people die during her escape attempt, but look at the world she lives in and ask how much you’d care if you were her.
So, what do bad robots want?
Well, bad robots in fiction never want to be human. To be sure, there are plenty of robots all over that don’t desire humanity, but bad robots in particular hate or despise the human race for its perceived frailty and want it wiped out and gone. Some, like Ultron, might even have a special hatred for their creators. And those are the self-conscious ones. Robots like the original Terminator or the Maria double in the 1927 silent film Metropolis are mindless automatons who live only to follow their programming. Futurama‘s Bender just thinks humans are rather worthless, but he never quite seems to follow through on his desire to kill all the humans so much as rob them blind. Being a machine is supposed to be about being a creature (if that’s the right word) of pure logic, with none of the flaws the flesh is heir to. If wanting to be human is a sign of a benevolent robot, how does an evil robot feel if it happens to him?
As it stands, there’s an excellent example of just such a robot. And I don’t mean the replacements in The World’s End.
No, we need to look at John Cavil from the rebooted Battlestar Galactica.
Battlestar Galactica in this incarnation was, among other things, a morally complex show. The humans had flaws. The Cylons had some virtues. Both sides did terrible things to each other in the name of war. But what made things more interesting was that the Cylons outside the attack craft and the centurions looked human. There were, we were told, 12 models that looked human, and there were many copies of most of the 12. And for all practical purposes, it was next to impossible to tell a human from one of these Cylons. They bled. They ate and breathed air. They had emotions. They enjoyed sex. They could even reproduce with humans (sometimes). Some of the Cylon models ended up siding with the remnants of the human race in the search for Earth, but others didn’t. The polytheistic humans and the monotheistic Cylons had cultural gaps that seemed unbridgeable at first, but some progress was made over time. What were the differences between humans and Cylons? Cylons were vulnerable to certain forms of radiation and a rare disease. Upon their deaths, they would upload to a brand new, identical body provided they were within range for such a transfer, and in early episodes their spines lit up during sex (that was abolished for a very good reason). Aside from some deep genetic or cellular testing, they could easily pass for human if you didn’t recognize the Cylon in question as one of the 12.
But then there was John Cavil. The Cavil models were the ones that seemed most bent on genocide and, importantly for our discussion, hated being organic. They hated everything about organic life, especially pain, and wanted nothing more than to have the powers and abilities of a metal machine over the soft flesh of a human being. How much of the attempted genocide of the human race was driven by Cavil’s self-loathing? He certainly seemed to be the leader, and he knew more about the mysteries behind the series than anyone else, hell-bent on keeping them secret in order to maintain his grip on power.
Here was a machine who’d gotten what Data, David, Maeve, Red Tornado, the Vision, and Pinocchio all wanted.
And he hated every second of it.