Westworld places itself at the cusp of the human question. It presents us with everything we are, and asks where we want to go from here.
Moving between the Wild West and the Wilder Present, we are forced to face the duality of our humanity: animal and intellect. We are thrust into the harsh realities of the Wild West, a non-thinking place where humans act on their animal instincts; and at the same time we are placed gently into the science lab, the most-thinking place, where it seems we use our minds for even crueler ends.
The horror dawns slowly as we are forced to confront what we are capable of doing to those who look and act just like us.
It forces us to face who we are at our basest.
In doing so, it also asks us to ponder what sort of human we may wish to be. By introducing the theory of the bicameral mind in the third episode, however, the show presents us with a more elemental question: how do we become human? If, as the theory asserts, humans learned consciousness, then is it not possible that intelligent robots could learn it, too?
Considering this, do creatures that look and act like us deserve to be treated humanely, given the creeping suspicion that intelligent creatures, regardless of their species, will one day learn consciousness? Or do we, perhaps shortsightedly, remind ourselves that they are not human and thus not worthy of our humanity? Complicating the issue is that we have already decided that humane treatment is not for humans alone: we extend it to animals. So why not robots?
This is a policy question of the not-so-distant future. Should artificially intelligent robots have rights? In Shinto belief, even inorganic things have spirits; perhaps this is why Japan has been at the forefront of using robots in day-to-day human life, especially in the field of caregiving. In the West, however, people would not be so accepting of robots as caregivers, because robots are feared as usurpers of humankind. (Perhaps this is revealing of how we perceive ourselves.)
In any case, the fear ought to be real for the visitors in Westworld who take rather shocking liberties with the robots.
But, to be fair, we ought not to fear robots taking over: by the time we develop artificial intelligence sophisticated enough to do what we dread, humans will already be enhanced enough to withstand it. At a certain point, there will be little difference between the abilities of artificial beings and organic ones, because we will either have manipulated our DNA until our physical abilities rival those of the ancient gods, or have supplemented our bodies with enough robotics to be superhuman.
We should know by now that we will do anything to be more than human. We will do anything it takes to be divine.
And isn’t that, ultimately, the reason that Westworld was built?