Wednesday 28 December 2016

Minority Retort: Why Oakland Police Turned Down Predictive Policing

Tim Birch was six months into his new job as head of research and planning for the Oakland Police Department when he walked into his office and found a piece of easel pad paper tacked onto his wall. Scribbled across the page were the words, “I told you so!”

Paul Figueroa, then the assistant chief of Oakland Police, who sat next door to Birch, was the culprit.

A few months before, in the fall of 2014, Birch had attended a national conference for police chiefs where he was introduced to PredPol, predictive policing software that several major US cities have started to use. It forecasts when and where crimes may occur based on prior crime reports, but evidence of its impact on crime reduction has been mixed.

Birch, a former police officer in Daly City, thought it could help Oakland’s understaffed and underfunded police force. During the January 2015 budget planning process he convinced Mayor Libby Schaaf to earmark $150,000 in the city’s budget to fund the software over two years.

But Figueroa was skeptical of the technology. An Oakland native and 25-year veteran of the force, he worried it could have unintended consequences, such as disproportionately scrutinizing certain neighborhoods, and erode community trust. After the January budget proposal, he and Birch had spirited discussions about why the technology wouldn’t work in a city with a sordid history of police-community relations, including several misconduct scandals.

"f we have a way to use mathematics to find out where we need to be in order to prevent crime, let’s use it."

Birch finally came around to Figueroa’s thinking in April 2015 after further research and a newfound understanding of Oakland. He realized the city didn’t need to give its people another reason to be suspicious. It was too easy for the public to interpret predictive policing as another form of racial profiling.

He decided to rescind his funding request from Schaaf, telling her the OPD would not be using the software. That’s when Figueroa put the note on his wall.

“Maybe we could reduce crime more by using predictive policing, but the unintended consequences [are] even more damaging… and it’s just not worth it,” Birch said. He said it could lead to even more disproportionate stops of African Americans, Hispanics and other minorities.

The Oakland police’s decision runs counter to a broader nationwide trend. Departments in cities like New York, Los Angeles, Atlanta and Chicago are turning to predictive policing software like PredPol as a way to reduce crime by deploying officers and resources more effectively.

A 2013 PredPol pilot in Atlanta was one of the first key tests of the software.

According to a 2014 national survey conducted by the Police Executive Research Forum, a Washington-based think tank made up of police executives, 70 percent of police department representatives surveyed said they expected to implement the technology in the next two to five years. Thirty-eight percent said they were already using it at the time.

But Bay Area departments are raising questions about the effectiveness and dangers of relying on data to prevent crime. San Francisco currently has no plans to use predictive policing technology; neither does Berkeley. Just north of Oakland, the Richmond Police Department canceled its contract with PredPol earlier this year, and to the south, the Milpitas Police Department cut ties with the software maker back in 2014.

These authorities say the software may be able to predict crime but not necessarily prevent it, because knowing when a crime may occur doesn’t solve the problem of stopping it. Critics also argue the software perpetuates racial bias inherent in crime data and the justice system, which could lead to more disproportionate stops of people of color. Police departments that support PredPol counter that police presence in predicted crime zones can deter crime.

PredPol began as a research project within UCLA’s Institute for Pure and Applied Mathematics, which applies math to scientific and technological challenges across fields. The research team wanted to find out whether they could predict crime the way scientists predict earthquake aftershocks, but they needed data to crunch. That’s when they formed an informal partnership with Los Angeles Police Department Captain Sean Malinowski.

In theory, the algorithm is not too different from the heat maps law enforcement has used for years to plot the locations of past crimes. PredPol feeds data from crime reports, such as the location and time of property crimes and thefts, into an algorithm that identifies areas at high risk of future crime. During routine patrols, officers glance at a laptop inside their car to view “the box,” a small red square highlighting a 500-by-500-foot region on their patrol area map. These boxes indicate where and when a crime is most likely to occur during an officer’s shift.
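
The aftershock analogy points to a class of models known as self-exciting point processes, which PredPol’s founders described in their published research: each crime temporarily raises the estimated risk of similar crimes nearby, and that boost decays over time. The production algorithm is proprietary, so the sketch below is only a minimal illustration of that idea; the 500-foot grid comes from the article, while the decay and background parameters are invented for demonstration.

```python
# A minimal sketch of the self-exciting point-process idea behind
# PredPol's published research. The real model is proprietary; the
# decay and background parameters here are illustrative assumptions.
import math
from collections import defaultdict

CELL_FT = 500            # 500 x 500 ft cells, as described in the article
DECAY_PER_DAY = 0.1      # assumed exponential decay of a crime's influence
BACKGROUND = 0.01        # assumed constant background rate per cell

def cell_of(x_ft, y_ft):
    """Map a coordinate (in feet) to its grid cell."""
    return (int(x_ft // CELL_FT), int(y_ft // CELL_FT))

def risk_scores(events, now_day):
    """Score each cell: background rate plus decayed 'aftershock'
    boosts from past crimes. events = [(x_ft, y_ft, day), ...]"""
    scores = defaultdict(lambda: BACKGROUND)
    for x, y, day in events:
        age = now_day - day
        if age >= 0:
            # Recent crimes raise near-term risk, fading over time.
            scores[cell_of(x, y)] += math.exp(-DECAY_PER_DAY * age)
    return scores

def top_boxes(events, now_day, k=3):
    """Return the k highest-risk cells: the 'boxes' shown to officers."""
    scores = risk_scores(events, now_day)
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Example: burglaries reported at (x, y, day); ask for today's boxes.
reports = [(120, 480, 1), (150, 400, 3), (2600, 2700, 2), (180, 430, 6)]
print(top_boxes(reports, now_day=7))
```

In PredPol’s case, scores like these are recomputed for each shift, and the highest-risk cells become the red boxes officers see on their patrol maps.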

Malinowski said a few years into the research project the team realized the software’s potential as a practical tool that could help police prevent crime. Jeff Brantingham, an anthropology professor at UCLA who helped develop PredPol, became its chief of research and development.

“We put some experiments into the field to see if it was possible to predict crime, and do so better than the existing practices the [LAPD] was using,” Brantingham said. “It proved to be a substantial improvement.”

A PredPol map showing predicted hotspots in the Atlanta trial. Image: Christian Science Monitor/Getty

The company draws a distinction between predicting when and where a crime is likely to occur and predicting who will commit it, a line it adopted after civil rights advocates, who have spoken with PredPol and Malinowski, expressed concerns about targeted policing. The algorithm excludes arrest data, which can carry biases inherent in the criminal justice system, as well as information about suspects, because the company foresaw those inputs contributing to disproportionate stops of people of color. It also omits drug-related offenses and violent crimes, because that data may add a layer of complexity the algorithm can’t yet untangle.

Malinowski said PredPol omits individual arrest data to prevent a "self-fulfilling prophecy" in the predictions.

In 2011, the team launched a prototype of the software to be tested by the Foothill division of the LAPD, where Malinowski was stationed at the time.

“I said if we have a way to use mathematics to find out where we need to be in order to prevent crime, let’s use it,” said Malinowski, now LAPD chief of staff. “The only training you need to give the officers is what the map means and how to use the map.”

According to PredPol, crime in Foothill went down 13 percent in the four months following the algorithm’s implementation. However, the LAPD’s crime statistics digests for 2010 and 2011 show that other divisions not using the software saw crime reductions as high as 16 percent.

The Santa Cruz Police Department followed shortly after, becoming the first law enforcement agency in the nation to pilot the finished software. More than 60 police departments across the country now use it.

Malinowski said he noticed a shift in officers’ behavior when they started using PredPol. They became less focused on arrests and more focused on preventing victimization, a shift he welcomed because the public might notice it and become more willing to trust the police.

“They had to change their way of thinking, because for years they were rewarded on their productivity, which meant citations and arrests, and I had to convince them that’s not what I’m rewarding you for,” he said. “I’m interested in rewarding for fewer victims.”

But Bay Area police departments have backed away from the technology, partly because there are few independent evaluations, by researchers outside police departments, of PredPol’s direct effect on crime reduction.

In 2016, the Richmond Police Department terminated its contract halfway into a three-year program because it found no measurable impact on crime reduction. RPD spokesperson Lieutenant Felix Tan said it was difficult to quantify the software’s impact on crime.

“On one hand I can say it was successful because when officers were there nothing really happened,” he said, explaining how arrest and crime statistics don’t accurately measure the software’s success. “Or you can say it wasn’t all that successful because a few times when the officers were there the crime still happened.”

Tan said the department trusts the strength of its officers’ experience and their community relationships over the technology.

“We have very competent police officers very much in tune with our community, and with that trust you gain intelligence. I don’t think you can get anything like that out of software,” he said.

Tan said old-fashioned crime data analysis remains a crucial part of fighting crime because history often dictates future patterns. He said most departments, including Richmond’s, have an in-house crime analysis unit.

Milpitas, a small city about 35 miles south of Oakland, received a discounted rate in 2013 after it referred the software to another agency. But in 2014, just one year into the three-year contract, it severed the $37,500 agreement with PredPol.

Milpitas police did not return calls seeking comment about their decision to drop it.

"If you have systemic bias in this collection of the data your prediction is going to reflect that systematic bias"

Brantingham of PredPol said he still doesn’t understand why these departments decided to back out. The software simply provides predictions about where and when the risk of crime is greatest; it’s then up to officers to decide what to do with that information, he said.

“We are not police officers. We have no business telling them what tactical activities they should be engaged in,” said Brantingham.

But the RAND Corporation, a research think tank that in 2014 published one of the few formal studies of predictive policing, using a model similar to PredPol’s, found that it did not reduce crime.

“The promise of it is amazing, so I can see why everyone wants to believe in it,” said Jessica Saunders, who co-authored the study, which was based on a pilot program in Shreveport, Louisiana, in 2012. Her team compared how officers using the software in block-sized areas fared against officers using the status quo policing approach.

“We need a lot more work in this area before we’ll see a large crime reduction as a result of these sort of initiatives,” she said, adding that crime prediction is not the same as crime prevention. “No one is looking at this second part which is, ‘How do you actually prevent it?’”

The complex history of race relations between the police and communities may also be hindering the software’s public perception.

A study by the Human Rights Data Analysis Group, published in October, examined the OPD’s recorded drug crimes from 2010 and concluded that predictive policing algorithms like PredPol have the potential to amplify racial bias because of biases in the underlying datasets.

The study found that if the OPD had used PredPol to predict future drug crimes, officers would have been deployed mostly to the lower-income minority neighborhoods where previous drug crimes were recorded. The resulting feedback loop may not accurately reflect where drug crime is occurring throughout the city: officers concentrate on certain neighborhoods looking for crime while ignoring other areas where the same crimes may be occurring.

“If you have systemic bias in this collection of the data your prediction is going to reflect that systematic bias,” said William Isaac, a co-author of the study.
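
The feedback loop is easy to see in a toy simulation in the spirit of the HRDAG analysis, though the sketch below is purely illustrative: the two neighborhoods, the starting records, and the winner-take-all deployment rule are invented assumptions, not the study’s actual model.

```python
# A toy simulation of the feedback loop the HRDAG study describes: two
# neighborhoods with the SAME true crime rate, but neighborhood A starts
# with more recorded crime. Each day the patrol goes to the area with
# the most recorded crime, and only patrolled crime gets recorded. All
# numbers are invented for illustration, not taken from the study.
TRUE_RATE = 10                 # identical true crimes per day in both areas
records = {"A": 30, "B": 10}   # biased historical records

for day in range(100):
    target = max(records, key=records.get)  # deploy to the top 'prediction'
    records[target] += TRUE_RATE            # only patrolled crime is recorded

print(records)  # {'A': 1030, 'B': 10}: the initial bias compounds
```

Even though both neighborhoods generate identical true crime, the one with more historical records receives all the patrols, and therefore all the new records, so the initial bias compounds rather than corrects.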

Matthew Odom looks over statistics posted for PredPol at the Tech Innovation Center at the Atlanta Police Foundation in 2015. Image: Ann Hermes/The Christian Science Monitor

Other researchers have argued that predictive models will continue to direct police back to areas that are already disproportionately scrutinized, even when drug-crime or arrest data isn’t used.

“We don’t really know what the actual crime rate is,” said Solon Barocas, a postdoctoral researcher at Microsoft Research who focuses on the ethics of machine learning and automated decision-making.

Barocas is part of a growing body of data scientists looking at ways to improve crime data, and in turn improve prediction models. He pointed out that one of the primary barriers is the amount of crime that goes unobserved and unreported.

“The nature of crime will often explain whether or not it’s likely to be reported, and because of the fraught relation with police there are many underreported incidents of crime,” he said, adding that police departments need to stop and consider what the goal of predictive policing really is before purchasing the pricey software. “Is it to reduce crime by means of actually physically putting police throughout the neighborhood or address the root of crime?” he said.

Community policing is another growing trend among police departments restructuring their approach to crime reduction. The term is loosely defined and means different things to different departments, but the general goal is to engage with at-risk communities, build relationships between police and the people they serve, and address the root causes of crime.

Malinowski said the LAPD is looking at ways to apply PredPol to this methodology.

One of the department’s community relations sergeants recently approached Malinowski about using predicted crime zones for community building and intervention in a high-risk Los Angeles neighborhood. The sergeant had noticed that one apartment building kept falling inside a predicted hotspot, so LAPD officers went door-to-door inside the complex handing out flyers.

“We’d show them the map and say this is an area that’s at a heightened risk for burglary, you got to look out for your neighbors,” Malinowski said. “When you talk to those people it puts us in direct contact with the community, and that’s sort of my definition where the rubber meets the road.”

Since distancing itself from the software, the OPD has been working on community policing strategies of its own to improve public perception and rebuild the trust that has been broken over many decades. Birch said the department is building a new handcuffing policy that will encourage officers to think twice before handcuffing someone.

“Think about what it looks like when you’re handcuffing an African-American male in East Oakland, what are the optics of that?” he said. “The community all around that is watching you, what are they thinking about you when they see you do that as an officer?”

“Even if it’s not only appropriate, but necessary to do so to keep people safe, there are impacts and unintended consequences of something that simple,” Birch said. “That’s where we are, that’s where we want to be.”



