
Study Finds Predictive Policing No More Racist Than Regular Policing

Police agencies are increasingly using advanced technologies to fight crime, including biometrics, auditory detection, and even virtual reality. One of the most controversial tools is “predictive policing,” which has long been accused of reinforcing racial biases. But a team of researchers from Indiana University, UCLA, and Louisiana State University found that the practice, and its effects on bias, are more complicated than that.

Predictive policing is a “smart policing” tool that trains an algorithm to predict where crime will happen. For this study, the LAPD was given maps of “hot spot” areas to patrol. On “treatment” days, the hot spots were selected by an algorithm; on “control” days, by a human analyst. The researchers then compared arrest rates on treatment and control days to see whether minorities were arrested more or less frequently under either condition.
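To make that comparison concrete, here is a minimal sketch in Python of the kind of test such a design supports. The counts below are invented for illustration, and the paper’s actual statistical methodology is more involved than this.

```python
# Sketch: comparing arrestee racial composition on algorithm-selected
# ("treatment") days vs. analyst-selected ("control") days.
# All counts below are invented for illustration only.
from scipy.stats import chi2_contingency

# Rows: treatment days, control days.
# Columns: arrest counts by arrestee group (hypothetical).
observed = [
    [52, 61, 34],  # treatment: black, Latino, other
    [49, 63, 31],  # control:   black, Latino, other
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.3f}, p = {p_value:.3f}")

# A large p-value means we cannot reject the hypothesis that racial
# proportions are the same under both conditions -- the pattern the
# study reports.
```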

Why is this controversial? Because arrest data lacks nuance. A person could commit the same crime in two different places but would only be arrested if an officer is there (or arrives in response to a 911 call). As advocates argue, areas with more police presence will always have more arrests: if officers are more suspicious of a neighborhood, more of them are sent there, and arrests rise accordingly. This dynamic isn’t reflected in the arrest data used to train algorithms; the data simply shows where arrests happen and how often. Law enforcement agencies, however, say that officers are sent into areas based on crime levels, not racial suspicion.

But what the researchers found neither proved nor disproved either assertion. The paper, “Does Predictive Policing Lead to Biased Arrests? Results From a Randomized Controlled Trial,” was published in the latest edition of Statistics and Public Policy.

Researchers say minorities were not arrested at higher rates on algorithm-determined days than on analyst-determined days. The racial proportions of arrestees were the same on “treatment” and “control” days, and overall arrest rates for all races were likewise the same under both conditions.

Interestingly, when broken down geographically, officers did arrest more people in the “hot spot” areas chosen by the algorithm than in those chosen by an analyst. But the researchers assert that this is expected: arrests increase in proportion to crime.
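As a rough illustration of that proportionality argument, with invented numbers: if the algorithm’s hot spots see twice the crime, twice the arrests is the expected baseline, so the quantity to compare is arrests per recorded crime.

```python
# Illustration of the proportionality argument, using invented counts.
# More arrests in algorithm-selected hot spots is expected if those
# spots simply have more crime; the telling ratio is arrests per crime.
hot_spots = {
    # name: (recorded crimes, arrests) -- hypothetical counts
    "analyst_spot":   (40, 10),
    "algorithm_spot": (80, 20),  # twice the crime, twice the arrests
}

for name, (crimes, arrests) in hot_spots.items():
    print(f"{name}: {arrests / crimes:.2f} arrests per crime")
# Both lines print 0.25: arrests rose only because crime did.
```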

“The higher crime rate, and proportionally higher arrest rate, is what you would expect since the algorithm is designed to identify areas with high crime rates,” George Mohler, one of the study’s authors, told Phys.org.

Ultimately, the researchers found that predictive policing doesn’t appear to increase bias. That does not mean policing, predictive or not, is free of racial bias; only that in this case, the algorithms weren’t found to have caused any racial imbalances in arrests. From the results section:

The analyses do not provide any guidance on whether arrests are themselves systemically biased. Such could be the case, for example, if black and Latino individuals experienced arrest at a rate disproportionate to their share of offending. [...] The current study is only able to ascertain that arrest rates for black and Latino individuals were not impacted, positively or negatively, by using predictive policing. Future research could seek to test whether the situational conditions surrounding arrests and final dispositions differ in the presence of predictive policing.

Predictive policing simply augments existing policing patterns. If there are biases, algorithms augment them as well, but they don’t originate them. The researchers point out that the root causes of crime and racial bias are a separate subject, though that leaves an obvious question unasked: why augment policing while pervasive biases remain unaddressed?

That remains a debate for community leaders and law enforcement agencies. For now, Mohler hopes the study serves as a “framework” for auditing the racial impact of this practice.

“Every time you do one of these predictive policing deployments, departments should monitor the ethnic impact of these algorithms to check whether there is racial bias,” Mohler said. “I think the statistical methods we provide in this paper provide a framework to monitor that.”
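A deployment-by-deployment audit of the kind Mohler describes might look something like the sketch below, which tests whether one group’s share of arrests shifts on algorithm-selected days. The helper function, counts, and threshold are our assumptions for illustration, not the paper’s actual framework.

```python
# Sketch of a per-deployment audit: test whether a group's share of
# arrests differs between algorithm-selected and analyst-selected days.
# Hypothetical helper; the paper's actual framework is more detailed.
from statsmodels.stats.proportion import proportions_ztest

def audit_deployment(group_arrests_treat, total_arrests_treat,
                     group_arrests_ctrl, total_arrests_ctrl,
                     alpha=0.05):
    """Two-proportion z-test on one group's share of arrests."""
    stat, p_value = proportions_ztest(
        count=[group_arrests_treat, group_arrests_ctrl],
        nobs=[total_arrests_treat, total_arrests_ctrl],
    )
    return p_value, p_value < alpha  # flag if the shares differ

# Invented counts: 60 of 150 arrests on treatment days vs. 58 of 148
# on control days involved the audited group.
p, flagged = audit_deployment(60, 150, 58, 148)
print(f"p = {p:.3f}, flag for review: {flagged}")
```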

[Phys.org]