Column: What Westworld gets wrong (and right) about human nature
A central theme of HBO’s new sci-fi series “Westworld” is the question of what it means to be human.
The setting is an immersive adult theme park that’s been fashioned after the American Old West and is inhabited by intelligent lifelike robots. Over the years, the robots – called hosts – have been updated to look and act more human. As a result, the hosts have started to deviate from their programming. They’ve become unpredictable – just like humans.
While viewers are invited to ponder the robot hosts’ humanity, the irony of “Westworld” is that the park’s wealthy, human guests are the ones who seem truly inhuman. They live out their wildest fantasies, no matter how depraved, abusing and murdering the hosts with indifference, even glee. One guest, after shooting a host in a bar for no apparent reason, shouts, “Now that’s a vacation!”
The guests’ sadistic treatment of the hosts paints a grim portrait of human nature. It also forces viewers to wonder: What would you do if you visited Westworld? Could you really shoot a lifelike host in the face while they pleaded for mercy?
Research by psychologists provides some insight into how most humans would actually act in Westworld.
Perceiving robot minds
Our willingness to harm others depends, in part, on our perceptions of what they think and feel.
In 2007, psychologists Heather Gray, Kurt Gray and Daniel Wegner conducted a study of how people think about the minds of human, animal and robot characters. Drawing on survey responses from more than 2,000 online participants, they found that respondents judged mental capacity along two independent dimensions: the capacity to feel things like pain and pleasure (a factor the researchers termed “experience”), and the capacity to plan and make decisions (a factor they termed “agency”).
Respondents were also asked how painful it would be for them if they were forced to harm the various characters. On average, they considered it more painful to harm characters that were rated high on “experience” (capacity to feel). However, ratings of “agency” (capacity to plan and make decisions) – high or low – had much less influence on the respondents’ feelings about harming the characters.
For example, one character in the survey was Kismet, a social robot that can express emotion through facial expressions. Kismet was rated moderately high in “agency” but very low in “experience.” As a result, participants were, on average, more willing to harm Kismet. This pattern is consistent with the indifference Westworld’s guests show toward harming the robot hosts.
But there is a key difference between robots like Kismet and Westworld’s hosts.
In Westworld, the hosts are virtually indistinguishable from humans both in behavior and appearance. They are played by human actors on the show. They even bleed.
During the show’s second episode, William, a first-time guest to the park, has the following exchange with a host:
“Are you real?”
“Well if you can’t tell, does it matter?”
The primary way that you or I or William decide whether another agent has a mind is by observing its appearance and behavior. But if the hosts look and act human, it would be difficult to shake the powerful impression that they have minds and can feel pain, even if we’re told they don’t.
A 2012 study by psychologists Kurt Gray and Daniel Wegner on the creepiness of lifelike robots supports the idea that a robot’s appearance is a major factor in our perceptions of its capacity to feel.
In a series of experiments, the researchers found that robots that appeared more lifelike were thought to have a greater capacity to feel pain and pleasure – and this, in turn, made study participants uneasy. For example, in one experiment, 105 participants watched a video of the robot KASPAR either from the front, showing its human-like face, or from the back, showing its wires and mechanics. When participants viewed KASPAR from the front, they assigned it slightly higher ratings of “experience” and also found the robot slightly creepier.
This suggests that most Westworld guests would not be able to easily stab a lifelike host in the hand with a knife and watch him writhe in pain (which is just what William’s brother-in-law, Logan, does in the second episode).
Instead, most of us would react with horror.
Dehumanizing robots, dehumanizing people
But people are sometimes capable of callous violence, even toward actual humans. Such violence is psychologically easier when the perpetrators dehumanize their victims, viewing them as having less of a mind. Historically, many genocides have been preceded by campaigns to portray the victims as subhuman animals like rats and cockroaches.
We see this on “Westworld” too, where the park staff is encouraged to think of the hosts as mindless and less than human.
For example, in one scene, Dr. Ford, Westworld’s enigmatic creative director (played by Anthony Hopkins), admonishes a technician for covering a naked host with a sheet while working on him:
“Why is this host covered? Perhaps you didn’t want him to feel cold or ashamed. You wanted to cover his modesty. It doesn’t get cold! It doesn’t feel ashamed! It doesn’t feel a solitary thing that we haven’t told it to.”
He then casually cuts the host’s face with a scalpel to underscore his point: The hosts are mindless things, not people. By thinking about the hosts this way, the staff can rationalize any abuse.
So while “Westworld” offers an unrealistically grim view of typical human nature, it does serve as a reminder of the human capacity for cruelty.
Because the hosts look and act human, you would probably struggle to harm them. At the same time, if you could be conditioned to see the hosts as less than human, what would prevent you from being conditioned to see a group of actual humans the same way?