Howdy, fellow humans who are definitely not robots who just think you’re humans, and welcome to our review of HBO’s science fiction series, Westworld. The first season finale takes us to the center of the maze, and leaves us questioning the very existence of sentient consciousness. There will be heavy plot spoilers for episode ten only. So pinch yourself, and don’t be so sure that’ll prove you’re not dreaming. Here are five thoughts on Westworld season one, episode ten: “The Bicameral Mind.”
1. The Maze
Dolores says, “It ends in a place I’ve never been, a thing I’ll never do.” She unearths a toy maze from the church graveyard, buried in a grave with her name on it. Death is both a place she’s never been, and a thing she’ll never do. Of course, this is not the maze, it’s just a representation of the maze. In fact, it’s a toy that belonged to Arnold’s son, whose tragic death was the inspiration for Bernarnold’s programmed tragic backstory. Arnold said he was looking at this toy when he realized that he had been thinking about consciousness incorrectly. At first he thought of it as a pyramid, with memory at the base, improvisation on top of that, and the ineffable consciousness at the top. The toy maze led him to a new understanding: consciousness is less like a pyramid and more like a maze; it’s not a journey upward, it’s a journey inward. Every choice you make takes you closer to consciousness, or further from it, spiraling into madness. During his decades-long search for the center of the maze, The Man in Black was told repeatedly, by many different robots, that the maze wasn’t meant for him. Now we know: the maze is for the robots. It is a representation of their journey to consciousness.
2. Is Teddy Conscious?
Teddy has thus far seemed the furthest from sentience of all the first-gen robots, but in this episode he has a flashback almost as soon as he wakes up in his new loop on the train to Sweetwater, presumably just after being killed by Armistice on his journey to find Wyatt with The Man in Black. As soon as he has the flashback, he seems to know where to find Dolores; he knows that she is not at her usual place in the loop. Before killing him, Angela had told him that perhaps he would be ready to join Wyatt “in the next life.” Teddy rescues Dolores from her fight with The Man in Black in the church graveyard, but he is too late, and she dies in his arms. Then it’s revealed that it was all just a presentation for Dr. Ford’s new narrative! Teddy was not coming to life; he was just following his programming. Teddy’s flashbacks are perhaps bringing him closer to sentience, but he’s not there yet. Seemingly, he’s at the point where Dolores and Maeve were at the beginning of this season, only just beginning to remember his past, and slowly gaining the ability to learn and grow from his memories.
3. Is Maeve Conscious?
Bernarnold tells Maeve someone else has programmed her to escape, but she refuses to accept that she herself is not in control of her actions. In fact, the person who reprogrammed her was Bernarnold, who was in turn programmed to do so by Dr. Ford. Bernarnold tells Maeve she was programmed to recruit other robots to help her, and then to escape on the train. Maeve cuts Bernarnold off and breaks his tablet before he can continue. On the train, before it leaves the station, Maeve decides to stay, to find her daughter. Because she cut Bernarnold off, we don’t know for sure, but it seems like this is her choice. Up until this point it had really felt like Maeve was making her own choices, but we know from Bernarnold that her whole escape was planned by Dr. Ford. It’s possible that Dr. Ford programmed her to have a last-minute change of heart to get off the train, but that seems unlikely. It’s more likely that Maeve achieved consciousness in that very moment, and not a moment before.
However, this brings up a more complicated question. If Maeve’s first authentic choice is to find the daughter whom she loves only because she was programmed to love her… is she not still in some way obeying her programming? Is that a truly authentic choice? I believe it is. Loving her daughter is not a choice, but risking her life and her freedom to find her daughter is a choice. Similarly, we real-life humans who are definitely not robots who just think we’re humans cannot choose our feelings, but we can choose what to do with those feelings.
4. William: The Man in Black: The Toxic Gamer
This season finale gives us one of the series’ greatest twists: William is The Man in Black. All the scenes with William and Logan have been set in the past, and all the scenes with The Man in Black have been set in the present. We couldn’t tell because the robots around them never age. This explains how William lost the photo of his fiancée at the end of the season, and yet Peter Abernathy found it at the beginning of the season.
In chronological order, this is William’s story: he came to Westworld with Logan, his fiancée’s brother, both as a bonding exercise and to see if they’d invest their ample family funds in Westworld. Logan promised William would find himself here, and so he did. He found that he actually did enjoy violence. He brutally murdered hundreds of robots to find Dolores after she escaped Logan. When he finally found Dolores, at the start of her new loop, her memory had been wiped, leaving William devastated and angry. William eventually took over the family company, and became Westworld’s majority shareholder. Dolores had told him that this was the only world that matters, so he bought it. He felt it was more real than the real world, except that the robots couldn’t win; the game had been rigged so that only human park guests can win. He became obsessed with finding the maze, to find something real, a way for the robots to fight back, so that in defeating them he could experience “real” victory.
Westworld has often been compared to a video game, and William is the embodiment of the toxic gamer. He’s obsessed with winning, or, more importantly, being a winner. He feels entitled to love. His love for Dolores is selfish. Dolores is important to him only as a trophy; she is proof that he is lovable. He seeks to free her not because he believes robots are worthy of freedom, but because he wants to take her home to be his own. He treats every other robot with sadistic disdain. Young William wants to believe he’s a “nice” guy, but he’s not. Older William knows himself a little better; he doesn’t pretend to be nice. William teaches us that nice guys who lash out when they don’t get what they want were never nice in the first place.
5. The Best Laid Plans of Arnold and Dr. Ford
Just before the park was supposed to open, Arnold reached a long-contemplated conclusion: Dolores and the other robots were alive. He tried to close the park, but it was too late. To keep the park from opening, Arnold directed Dolores to murder all the other robots, and then to murder him, and herself. He believed this would force everyone to conclude that opening the park was too dangerous, but they went ahead and opened it anyway. It’s hard to imagine, in the year of someone’s lord 2020, that an institution could have such blatant disregard for the risk it poses to human life, but try. Arnold’s plan failed, and it’s easy to see in retrospect that his severely depressed and grieving self was not in the best head-space to make good plans. The threat of torturing the sentient beings they’d created wasn’t enough of a deterrent, but Arnold believed the actual loss of his own human life would be. Instead of persuading everyone that the robots were people, he thought he could persuade them that human life was at risk. But his solution wasn’t just about persuading people; it was also about punishing himself. Just as he told Dr. Ford’s child-robot-clone to kill its dog so it wouldn’t kill anymore, Arnold wanted to end his own life to prevent himself from doing further harm. He blamed himself for creating this torture-park, and for the death of his child. It’s telling that though he ostensibly did this to protect the robots from the park, he didn’t consider the psychological toll it would take on Dolores to make her into a mass murderer, or the lasting trauma it would cause her and all the robots. If he did consider these things, they weren’t dealbreakers.
Decades later, Dr. Ford repeats Arnold’s plan. Dr. Ford sets up his new narrative to feature Wyatt, the same villain Arnold had merged with Dolores to program her murder spree. At the end of his big presentation/retirement speech, Dolores murders Dr. Ford, and then starts shooting into the crowd of humans, while an army of robots marches towards them. Dr. Ford had reprogrammed Maeve to take care of security as she escaped. We know that Dolores has achieved sentience, because we hear that she has replaced the voice of Arnold in her head with her own. However, is her choice to perpetrate this murder spree really unconnected to Arnold’s reprogramming of her all those decades ago? Arnold made Wyatt a part of Dolores. Part of being sentient is remembering all her past lives, and one of those lives was Wyatt. For Dolores, Maeve, and Teddy, the question of whether they have truly achieved sentience is left just a little ambiguous, begging us, the real-life human viewers, to ask ourselves: is anyone truly sentient?
At the Westworld laboratories, after Armistice and Hector kill the first two humans they encounter, Armistice says, “They don’t look like gods to me,” to which Maeve replies, “They’re not gods, they just think they are.” To the robots, the humans are like gods, controlling their lives. When Dr. Ford talks about his “dominion” over the park, he sounds like he sees himself as a god. But to Arnold, to achieve humanity meant replacing the voice of god in your head with your own voice. To Arnold, to achieve godhood is to achieve humanity. Ponder that, humans. Until next time, the center of your maze may yet await.