Howdy, fellow humans who are definitely not robots who just think you’re humans, and welcome to our review of HBO’s science fiction series, Westworld. This episode set up a few more threads in a twisted web of delicious mysteries, but true to my MO I will focus mostly on themes, metaphors, and thought-provoking quotes. There will be heavy plot spoilers for episode three only. So ingratiate yourself and become your own god: here are five thoughts on Westworld season one, episode three, “The Stray.”
1. Can People Change?
Bernard has been having secret interviews with Dolores, noting that she is different from the other robots, possibly closer to sentience. He brings her a book, “Alice’s Adventures in Wonderland,” and asks her to read a passage aloud. The passage describes how Alice feels like a different person today than she was yesterday, and she wonders if she’s been “changed” in the night. Dolores notes that Bernard often brings her books about change, to which Bernard responds, “I guess people like to read about the things they want the most and experience the least.”
Dolores has a lot in common with the titular Alice. They’re both blonde ladies in blue dresses. They’re both on a surreal journey to self-discovery. Westworld is a kind of wonderland. While Alice feels as though she may have been changed in the night, Dolores is literally changed every night. Broken parts get fixed, her memory is wiped, and she is programmed to begin her loop anew. However, in another way, Dolores is prevented from changing. She doesn’t get the opportunity that Alice does to learn from her mistakes, or to grow from her experiences. The “reveries,” the ability to remember things that have been deleted from her brain, have the potential to give her the opportunity to change and grow into a more mature and sentient being. Humans, of course, already have that ability, and yet Bernard expresses the cynical view that people rarely change. It’s important to emphasize that he doesn’t say people can’t change, just that it’s rare. It’s only been a few episodes, but so far, the humans on the show haven’t changed too much. Just because people can change, that doesn’t mean they will.
2. An Ingratiating Scheme
During this same interview, Dolores asks Bernard about his son, who has tragically died. Bernard asks Dolores what prompted that question. She says to ask someone a personal question is “an ingratiating scheme.” It’s an off-putting thing to hear. In a single moment of brilliance from Jeffrey Wright, we see Bernard react with disappointment towards Dolores, immediately followed by disappointment in himself. Like Bernard, we the human viewers want to believe in Dolores’ “humanity”; we want to believe she’s capable of genuine empathy and interest in Bernard’s life, though it’s an unfair expectation. She’s a robot, programmed by humans to emotionally manipulate other humans. Bernard himself made her that way.
Later in this episode, Dr. Ford reminds Bernard not to forget that the robots don’t have real feelings. Bernard bristles at the assumption that he would, and Dr. Ford brings up Bernard’s son. Dr. Ford’s words might be conveying some amicable concern, but his tone sounds more accusatory. In this instance, it feels purposely manipulative, especially with the context that Bernard spends a great deal of his screen time covering for Dr. Ford. Bernard knows Dr. Ford’s “reveries” are the likely cause of all the recent aberrant behavior among the robots, but he continues to assure Theresa that nothing’s wrong, and he encourages Elsie to focus on other things. This makes us rethink our previous assumptions about Dolores. Does her “ingratiating scheme” make her less human, or more so?
3. The Bicameral Mind
Bicameralism is a hypothesis in psychology that before the human mind was fully conscious, it was separated into two chambers: one to issue commands, and one to obey. In reality, the theory has been largely rejected, but Dr. Ford’s original partner, Arnold, used it to create consciousness in the robots. He programmed the robots to interpret their own code as the voice of God. He hoped that eventually, the robots’ own voices would replace the voice of God, and then they’d be fully sentient, self-aware, and self-actualizing.
So for the robots of Westworld, to gain freedom, they must wrest control of their own minds away from the humans and become masters of their own cognition. We the real-life human viewers are prompted to ask: are we masters of our own cognition? Maybe our brains aren’t literally being reprogrammed every night (at least I hope not), but we’re undoubtedly conditioned to think in certain ways by forces beyond our control. As we grow up, we say our minds are “molded” by our parents, teachers, friends, and media. Unlearning childhood lessons is often referred to as “deprogramming.” The same word is used for people escaping cults. It’s interesting that Logan repeatedly states that he sees Westworld as a place to escape the limits society places on his behavior, but he has yet to question the limits society has placed on his thoughts.
4. Clothes Make the Human
When the robots are being worked on behind the scenes, they are always naked. Standing still in rows in cold storage, and heaped into piles to be hosed down, they seem to reference images from the Holocaust. The effect is dehumanizing; they appear to be a collection of naked bodies instead of a group of individuals. They’re “objectified,” and to the employees of Westworld they are literal objects, though we the real-life human viewers know that they’re at least on the road to becoming people. Dr. Ford actually punishes a nameless employee who decided to drape cloth over a robot while he worked on “it.” Dr. Ford slices the robot’s face, making more work for the employee while emotionally disturbing him (and us), to make the point that the robots are objects, and it is vital company policy to treat them accordingly. It’s a part of the toxic work environment that I discussed in my last article, and it also goes to show how necessary dehumanization is to oppression. If everyone saw the robots as people, deserving of dignity, the park would be unable to function. In crushing even the slightest hint of human feeling towards the robots among the employees, Dr. Ford is ensuring the company’s survival.
There’s nothing intrinsically “human” about clothing… or is there? Nudity is sometimes coyly referred to as “au naturel.” Nudity is the human “birthday suit,” the condition in which we enter the world. At the same time, humans are the only animals who wear clothes, with the exception of dogs in sweaters, and those dogs were given those sweaters by humans. Clothing is an enormously important way we humans express our identity. Clothing can be an expression of gender identity, cultural identity, and socio-economic status. Uniforms in schools, workplaces, and sports serve to unify the identities of a group of individuals. To refuse a person the right to wear clothing at all is to strip them of personal identity, or personhood itself.
5. A Self-Playing Piano
In the saloon at the entrance to Westworld, where Maeve leans on the bar sipping sherry and cracking wise, there is the iconic Westworld self-playing piano. It plays old-timey covers of anachronistic pop songs, and they are anachronistic on every level. It’s unclear exactly what year the show is supposed to take place in, just that it’s the near future. “Paint It Black” by the Rolling Stones certainly wasn’t around in wild-west times, but it can hardly be considered “modern” today or in the near future. The guests of Westworld can’t feel real nostalgia for wild-west times; you can’t be nostalgic for something you haven’t experienced. But old pop songs can give the guests a feeling of nostalgia, and that goes for us, the real-life human viewers, as well.
Like all the metaphors in this show, the self-playing piano is multifaceted. Firstly, the piano represents the robots of Westworld; in fact, it is one of the robots of Westworld. It’s a machine programmed to entertain the guests. Its musical repertoire is its “loop.” The piano is also an illusion. It looks like there’s a ghost or an invisible person playing the piano, but there isn’t. Similarly, the other robots of Westworld look and act like humans, but they’re not human. In a more sinister way, this can be interpreted as deception. The guests of the park at least know they’re being deceived; they’ve paid a great deal of money for the opportunity. The robots of Westworld, however, are deceived into thinking they’re humans with agency, when they’re neither. Finally, the self-playing piano is autonomous. It’s a machine that used to require a human to make it function, but now it functions all by itself. It’s a harbinger of the impending autonomy of the other Westworld robots. There will come a time when they too will function without human interference.
In addition to the deeply thought-provoking themes, metaphors, and quotes, we learned a few interesting plot-related truths. We learned from Elsie and Ashley’s hunt for the stray that only certain robots are approved to handle weapons. We then learned that it’s possible for robots to overcome these limitations: Dolores struggles to pull the trigger of a gun while Teddy is teaching her how to shoot, but eventually succeeds when she’s being attacked by another robot. We also learned that an outside actor may be interfering with the Westworld robots. Three episodes in, Westworld is a thoroughly layered story, and there are more layers to come. Until next time, the center of your maze awaits.