Last week, Alton Sterling and Philando Castile, both black men, were shot and killed by police officers. Protesters have taken to the streets and to social media to draw attention to their deaths, which many see as yet more instances of unjustified, racially motivated police violence.
How does racial bias play a role in police shootings? Two decades of scientific investigations suggest a nuanced answer.
Dozens of studies have shown that racial cues are perceived within a fraction of a second (whether or not the perceiver intends to do so), and that black racial cues rapidly activate negative information in memory.
The Implicit Association Test (IAT) is one simple way to demonstrate this. Black and white faces flash on the screen along with “good” words (such as “happy”) and “bad” words (such as “dangerous”). During some trials, you must sort the black faces and “bad” words to one side of the screen and the white faces and “good” words to the other side. In other trials, the rule is reversed: Put white faces and “bad” words together, and black faces and “good” words together. (A demonstration of the IAT is available online.)
White Americans make faster decisions and fewer mistakes when they must sort white faces with “good” words and black faces with “bad” words, indicating a positive bias toward whites and a negative bias toward blacks.
That’s surprising, but not as disturbing as the results with black subjects. About 50 percent of black Americans show an automatic white preference as well, while the other half show an automatic black preference. Researchers conclude that the IAT scores reflect a merger of an automatic inclination toward one’s own race, tempered by what one learns is “good” in the larger culture.
Police officers are not immune to these effects. For example, in two studies involving more than 250 police officers, words associated with crime — “shoot,” “capture” and “arrest” — shifted the officers’ attention toward black male faces, particularly faces judged to be physically stereotypical of the group.
But the more important question is this: Does automatic implicit racial bias influence the decision to shoot?
For an answer, researchers employ “shoot/don’t-shoot” simulations. Subjects view a series of video clips featuring people who may or may not be involved in crime. Their job is to press a button labeled “shoot” if they see an armed target, and to press “don’t shoot” in response to an unarmed target. (Demonstrations of these simulations are available online.)
Results repeatedly show that civilians of all races are faster and more accurate when shooting an armed black man than when shooting an armed white man, and faster and more accurate when responding “don’t shoot” to an unarmed white man than to an unarmed black man.
But what about trained police officers? One 2007 study examined how police officers perform in shoot/don’t-shoot simulations when the targets are black or white men who are either armed with a gun or holding a harmless object (a black cell phone, a silver cell phone, a black wallet or a silver Coke can). The participants included trained black, white and Latino police officers as well as black, white and Latino community members. Community members were more likely than police officers to shoot unarmed black targets. In other words, even though both groups showed implicit racial bias, police officers inhibited that bias when making the final decision about whether to shoot.
A new study supports this conclusion. The study, led by Roland G. Fryer Jr., a professor of economics at Harvard, examined 1,332 police shootings between 2000 and 2015 in Austin, Houston and Dallas, Texas; Los Angeles; and Orlando, Jacksonville and four other Florida counties. The surprising result: although black men and women are more likely than whites to be touched, handcuffed, pushed to the ground or pepper-sprayed by a police officer, they are no more likely to be shot by police. A closer examination of the reports from Houston showed that in tense situations, officers were about 20 percent less likely to shoot when the suspect was black.
Other research suggests a caveat in interpreting these results: the kind of training officers receive matters. For example, when civilians read newspaper articles featuring black criminals, they showed strong racial bias on a shoot/don’t-shoot simulation task; reading articles about white criminals erased that bias. Trained police officers, by contrast, were unaffected by this manipulation. Instead, officers showed racial bias only under two conditions: when black targets were more frequently paired with weapons in the simulation, and when the participants were special-unit officers who routinely deal with minority gang members. In both cases, training had reinforced the association between blacks and danger, enhancing racial bias.
The picture that emerges from these studies is that when an officer’s training reinforces the association between blacks and danger, the officer is more likely to show racial bias in the decision to shoot. When training instead undermines negative racial stereotypes, split-second shooting decisions are less likely to be driven by implicit racial bias.