When Reform Hasn’t Worked: Part II

This analysis was featured in Critical State, a weekly newsletter from Inkstick Media and The World.

Last week on Deep Dive, we looked at how efforts at police reform can fizzle when policymakers believe that the only way to provide services to people routinely targeted by police is to deputize armed police as the service providers. This week, we’ll get into another reason why police reform can feel like a mirage, leaving defunding and abolition as the only remaining paths to justice for targeted communities.

A common subplot in the debate about police reform is that we don’t really know which specific reforms work to reduce police violence overall, or disproportionate police violence toward Black people. Does de-escalation training for police reduce violence? A survey of 64 evaluations of de-escalation training programs conducted over 40 years says… we have no idea. What about police body cameras, an in-vogue intervention that has driven millions of dollars in public spending on cameras and data storage? The best program evaluation minds in the business say there’s no evidence they do much of anything to reduce police shootings.

One response to that lack of knowledge, particularly popular among social scientists, is to do more social science. If we do more experiments, the argument goes, we’ll be able to find the best technical solutions to the problem of police violence and implement them broadly. With the right research design, maybe the 65th de-escalation training program evaluation will tell us something persuasive about whether those programs work.

A new paper from Samantha Goerger, Jonathan Mummolo, and Sean Westwood throws some cold water on the feasibility of that idea. Like so many of their colleagues, Goerger et al. wanted to study how different interventions in policing affect how law enforcement actually functions, and to do that they needed cooperation from police departments. It’s hard to study police practices surreptitiously — police keep a lot of detailed numerical data on crime and policing, and without accessing that data or speaking to officers, tracking the effect of police reform would be tough. So Goerger et al. did what social scientists tend to do in this situation: They asked cops to form research partnerships with them.

What Goerger et al. did differently, though, is that they asked a ton of cops, and they didn’t ask them all in the same way. Instead, they ran a kind of pre-experiment experiment. They randomly selected roughly 3,000 police and sheriff’s departments across the US to reach out to, drawing from the FBI’s list of law enforcement agencies. Then they divided most of their sample into pairs, matching each department with another department that operated in the same state and served a jurisdiction of roughly the same population.

Within each pair, they randomly assigned one department to receive an introduction letter requesting a research partnership that also highlighted how the department’s clearance rate for violent crimes, or its mean use of force per officer, compared to other departments. The other department received an introduction letter requesting a research partnership with no mention of rankings.
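The outreach design described above is a matched-pair randomization: departments are paired on state and jurisdiction size, then one member of each pair is randomly assigned the ranking-highlighting letter. A minimal sketch of that assignment logic, assuming toy data (the department names, fields, and pairing rule here are illustrative, not drawn from the paper):

```python
import random

# Hypothetical department records: (name, state, jurisdiction_population)
departments = [
    ("Dept A", "OH", 52_000),
    ("Dept B", "OH", 50_000),
    ("Dept C", "TX", 210_000),
    ("Dept D", "TX", 198_000),
]

def match_pairs(depts):
    """Pair departments within the same state by similar population."""
    by_state = {}
    for d in depts:
        by_state.setdefault(d[1], []).append(d)
    pairs = []
    for state_depts in by_state.values():
        # Sorting by population puts similarly sized departments adjacent.
        state_depts.sort(key=lambda d: d[2])
        for i in range(0, len(state_depts) - 1, 2):
            pairs.append((state_depts[i], state_depts[i + 1]))
    return pairs

def assign_letters(pairs, rng):
    """Within each pair, randomly pick which department gets the ranking letter."""
    assignment = {}
    for a, b in pairs:
        treated, control = (a, b) if rng.random() < 0.5 else (b, a)
        assignment[treated[0]] = "ranking letter"
        assignment[control[0]] = "plain letter"
    return assignment

pairs = match_pairs(departments)
letters = assign_letters(pairs, random.Random(0))
print(letters)
```

Because randomization happens within matched pairs, any difference in participation rates between the two letter types can be attributed to the letter itself rather than to state or department size.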

The idea was to see whether the implication that their performance would be evaluated as part of the research partnership made departments less likely to participate. Unsurprisingly, the answer was yes: departments that had their clearance rates or use-of-force rankings highlighted were significantly less likely to participate. More surprisingly, that was true even of departments that ranked highly on those measures. The resistance, it seems, was not to being potentially embarrassed by the rankings, but to being evaluated at all.

Some 300 departments took Goerger et al. up on their offer, and further research with those departments is in the works. But the paper raises troubling questions for those who believe more research is the answer: If police departments control how much access would-be reform researchers receive, and are unwilling to grant that access when it means real evaluation of those reforms, then what is the real prospect that the 65th de-escalation study will tell us something meaningful?

And, more importantly, how long must people who see the issue of police violence against Black people as stemming from structural racism wait for social scientists to be convinced that technical solutions are inadequate? Will it be another 40 years of inconclusive program evaluations before mainstream social scientists offer a stronger recommendation on confronting the scourge of racist police violence than calling for more studies?