Study: No evidence of racial bias in predictive policing

In this photo taken Friday, June 29, 2012, Jeff Brantingham, anthropology professor at the University of California, Los Angeles, displays computer-generated “predictive policing” zones at the Los Angeles Police Department Unified Command Post (UCP) in Los Angeles.

AP Photo/Damian Dovarganes

By Police1 Staff

INDIANAPOLIS — A recent study conducted by an Indiana university found that predictive policing didn’t result in racially biased arrests.

A researcher from the Indiana University–Purdue University Indianapolis (IUPUI) School of Science conducted the first study to examine real-time field data on the question and found that predictive policing did not result in biased arrests, according to an IUPUI press release. Concerns had been raised that predictive policing algorithms might steer officers toward minority communities and lead to discriminatory arrests.

“Predictive policing still is a fairly new field. There have been several field trials of predictive policing where the crime rate reduction was measured; but there have been no empirical field trials to date looking at whether these algorithms, when deployed, target certain racial groups more than others and lead to biased stops or arrests,” said George Mohler, associate professor of computer and information science at the School of Science at IUPUI.

Mohler and researchers from UCLA and Louisiana State University worked with the LAPD to conduct the experimental study. Each day, a human analyst produced one set of predictions of where officers would patrol, and an algorithm produced another.

The algorithm then randomly selected which set officers used each day. Researchers compared arrest rates by ethnic group between patrols guided by the predictive policing algorithm and patrols guided by hotspot maps the LAPD had created before the experiment.
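In effect, the experiment is a randomized field trial: a coin flip, not an officer's preference, determines which set of predictions is deployed on a given day. The Python sketch below illustrates that assignment logic; the function name, the fifty-fifty split, and the zone labels are illustrative assumptions, not details drawn from the study.

```python
import random

def choose_daily_predictions(analyst_zones, algorithm_zones):
    """Randomly pick which set of predicted patrol zones is deployed today.

    Illustrative sketch only: the fifty-fifty split and the return format
    are assumptions, not details taken from the study.
    """
    if random.random() < 0.5:
        return "algorithm", algorithm_zones
    return "analyst", analyst_zones

# Logging the condition each day allows arrest rates to be compared later
# between algorithm-driven and analyst-driven patrols.
condition, zones = choose_daily_predictions(["Zone A", "Zone B"],
                                            ["Zone C", "Zone D"])
print(condition, zones)
```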

“When we looked at the data, the differences in arrest rates by ethnic group between predictive policing and standard patrol practices were not statistically significant,” Mohler said.

Researchers also examined the data both at the district level and within LAPD officers’ patrol areas, and found no statistically significant difference in arrest rates by ethnic group at either geographic level.
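The article does not specify which statistical test the paper used, but one standard way to check whether arrest counts by ethnic group differ between two conditions is a chi-square test of independence on a contingency table. A sketch with invented numbers:

```python
from scipy.stats import chi2_contingency

# Rows: patrol condition; columns: arrest counts by ethnic group.
# All numbers below are invented for illustration only.
arrest_counts = [
    [120, 95, 40],   # algorithm-selected areas
    [115, 100, 38],  # analyst/hotspot areas
]

chi2, p_value, dof, expected = chi2_contingency(arrest_counts)
print(f"chi2={chi2:.2f}, p={p_value:.3f}")
if p_value >= 0.05:
    print("No statistically significant difference in arrest composition.")
```

A non-significant p-value here would mirror the study’s finding that the two patrol strategies produced comparable arrest compositions.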

The study also found higher overall arrest rates in the algorithmically selected areas. When adjusted for the higher crime rates in those areas, however, arrest rates were lower or unchanged.
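Put differently, the algorithm tended to send officers to higher-crime areas, so raw arrest counts there ran higher, while arrests per crime did not. A toy calculation with invented figures shows how normalizing by crime volume can erase a raw-count gap:

```python
# Invented figures for illustration: the algorithm's areas see more crime
# and therefore more raw arrests, but arrests per reported crime match.
areas = {
    "algorithm": {"arrests": 150, "crimes": 1000},
    "analyst":   {"arrests": 105, "crimes": 700},
}

for name, d in areas.items():
    rate = d["arrests"] / d["crimes"]  # arrests per reported crime
    print(f"{name}: raw arrests={d['arrests']}, arrests per crime={rate:.3f}")
# Both conditions come out at 0.150 arrests per crime despite the raw gap.
```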

Mohler added that lessons continue to be learned from each study of predictive policing. He hopes this study will serve as a starting point for measuring predictive policing bias in future work.

“Every time you do one of these predictive policing deployments, departments should monitor the ethnic impact of these algorithms to check whether there is racial bias,” Mohler said. “I think the statistical methods we provide in this paper provide a framework to monitor that.”