Philando Castile, an African-American man from St. Paul, MN, died in a hospital on July 6, 2016, from a gun wound inflicted during a traffic stop by Jeronimo Yanez, a St. Anthony (MN) police officer.
This past Friday, June 23, 2017, a jury acquitted Yanez of second-degree manslaughter and two counts of dangerous discharge of a firearm.
Video of the immediate aftermath of the shooting was live-streamed on social media by Castile’s girlfriend, Diamond Reynolds. The video, which showed a bloodied and dying Castile, went viral and further inflamed tensions over a rash of highly publicized police shootings of African-American men.
Officially, Yanez pulled Castile over for a broken taillight. However, in a call to an officer in another squad car, Yanez noted a physical similarity between the driver (Castile) and a suspect in an earlier armed robbery: a “wide-set nose.” For many observers, this detail raised the specter of racial profiling, or at the very least stereotyping. If Yanez acted on a stereotype, that stereotype proved deadly for Castile, an elementary school cafeteria worker who had his four-year-old daughter in the car and who had nothing to do with an armed robbery.
A year ago, this incident raised public consciousness about the consequences of systemic prejudice in policing. And it spurred calls, once again, for systemic change.
The day after Castile was killed, Minnesota Governor Mark Dayton asked, “Would this have happened if the driver were white, if the passengers were white?” His own answer was: “I don’t think it would have.”
Dayton was pointing out the problem of prejudice. We’re all prejudiced. But when that prejudice affects something as serious as police work and when lives are at stake—when prejudice is holding a gun—the results can be tragic.
In Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts, Carol Tavris and Elliot Aronson point out that prejudices are natural results of human information processing. The mind sorts data and experiences into categories. “Categories,” they say, is just a “nicer” word for stereotypes.
Stereotypes are “energy saving devices” which
Allow us to make efficient decisions on the basis of past experiences; they help us quickly process new information, retrieve memories, identify real differences between groups, and predict, often with considerable accuracy, how others will behave or think. We wisely rely on stereotypes and the quick information they give us to avoid danger, approach possible new friends, choose one school or job over another, or decide that that person across this crowded room will be the love of our lives. (75)
Stereotyping is natural and inevitable; we’d have a difficult time living in a complex social world without stereotypes. And we’re hardwired to split the social world into categories, positioning ourselves (myself and my group) in relation to others. Tavris and Aronson assert that “Us is the most fundamental social category in the brain’s organizing system…” (76).
The downside of categorizing is that stereotypes don’t always work, and they can be downright false. False stereotypes are perpetuated through socialization and through the false narratives we tell each other, even in the face of contradictory evidence.
Recent research from the Yale Child Study Center showed that preschool teachers are biased to expect bad behavior from black children more than from white children. Researcher Walter Gilliam explained the findings to NPR:
“What we found was exactly what we expected based on the rates at which children are expelled from preschool programs,” Gilliam says. “Teachers looked more at the black children than the white children, and they looked specifically more at the African-American boy.”
The study reveals the pervasiveness, the potency, and the troubling consequences of implicit bias. In this case, the expectation of bad behavior in black boys led teachers to “find” that bad behavior precisely where they expected it, even though the boys’ actual behavior did not differ from that of the other children in the study. No wonder black children are 3.6 times more likely to be expelled than white children. The expectation, the bias, creates the reality, which leads to a vicious cycle of confirmation, with unintended but unfortunate outcomes.
Those who unreflectively rely on stereotypes live in an “us versus them” (or “us versus not-us”) world, accepting uncritically the categories that have been created and handed down. Their prejudices emerge from and serve to confirm the way the social world has been categorized and narrated.
When enough people with similar prejudices and overlapping false narratives about “us” and “them” have designed the architecture of an “us versus them” world, genuine differences are flattened out. Those who have been designated “not-us” often find themselves swimming upstream, on the outs, or looking down the barrel of a gun that should never have been pointed at them.
To stop and think reflectively about inherited stereotypes takes time and energy. Such reflection raises the prospect of “cognitive dissonance”: the tension that arises when our natural assumptions, beliefs, and prejudices bump up against alternative evidence and counter-examples. That tension, created by the input of new (and often better) information, can be resolved only by ignoring the new information and retreating to the old stereotype, or by changing one’s perspective.
To be fair to the situations police officers sometimes find themselves in, they don’t have the time and energy, in the heat of a potentially volatile moment, to do that critical work. There is rarely room, in that moment, for working through cognitive dissonance.
That work must be done before that moment. Dealing with prejudices, biases, “self-delusions,” and the like takes time, intentionality, and effort for an individual. Imagine the kind of work it will take to change a flawed system.
But that’s exactly what Tavris and Aronson suggest. They point, as an example, to law professor Andrew McClurg, who proposes a plan for training law-enforcement rookies that draws on the research on cognitive dissonance and that calls on,
Their own self-concept as good guys fighting crime and violence. He proposes a program of integrity training in dealing with ethical dilemmas, in which cadets would be instilled with the values of telling the truth and doing the right thing as a central part of their emerging professional identity (199).
If an officer can be motivated to act on the basis of his or her “own self-concept” as a good person, then any action that would run counter to that (thereby raising cognitive dissonance), such as shooting an innocent person for no good reason, should be avoided at all costs.
As part of preparing to be a good officer, one should also seek to bring false biases to light and to correct them with better information and more adequate narratives.
But, on its own, that proposal seems naïve and perhaps shifts too much of the burden to the police officer. If implicit bias runs that deep into the subconscious, can we really expect that anti-racism and other bias-uncovering training can solve the systemic problem? Such efforts need to be accompanied by other more drastic and material measures, including stricter gun control legislation, a de-militarized police force, and more systematic incorporation of mental health care and social health care workers in situations of crisis and potential conflict.
The system needs change. It needs to reckon with the innate human ability to create stereotypes and our propensity to act on the basis of those (false but unchecked) stereotypes. And it needs to account for the tragic consequences that too often occur when prejudice is holding a gun.
Source: “Bias Isn’t Just a Police Problem, It’s a Preschool Problem,” NPR, September 28, 2016 (http://www.npr.org/sections/ed/2016/09/28/495488716/bias-isnt-just-a-police-problem-its-a-preschool-problem)