Photo of NYPD officers via Marc.Sundstrom/Flickr (CC-BY)

Predictive policing is everywhere.

In Chicago, a “heat list” supposedly identifies the citizens most likely to be involved in a shooting. In Los Angeles, Atlanta, Seattle, and a number of other cities, private company PredPol is supposedly helping police to identify where property crimes and robberies might occur. As those cities’ predictive programs have gotten more and more attention, police chiefs have done their best to get in on the action. 

Predictive policing programs are being rolled out and lauded all over the country—in South Jersey, Albuquerque, Berkeley, and Manchester, New Hampshire, to name a few. New companies are also angling for market share. One company says it can pick out potential shoplifters in retail stores. The New York Police Department is so deep into predictive tools that it created “Operation Cutting Edge” to specifically pinpoint where slashings might occur—and who might carry out those crimes.

But does predictive policing actually work? One study—touted seemingly every time a police department decides to contract with PredPol—shows that PredPol’s software can lead to a 7.4 percent reduction in “crime volume.” But that study was carried out by two of PredPol’s founders, P. Jeffrey Brantingham and George Mohler, who sit on the company’s board of directors. What’s more, a French researcher replicated that study and found that police could get the same results if they sent officers to crime hotspots instead of paying to use predictive software.

It’s with similar skepticism that Andrew Ferguson, a law professor at the University of the District of Columbia, decided to take a hard look at predictive policing. In a paper titled “Policing Predictive Policing,” forthcoming in the Washington University Law Review, Ferguson provides perhaps the most comprehensive critique of predictive policing yet published anywhere. His goal: encouraging lawmakers to scrutinize technology that “has far outpaced any legal or political accountability and has largely escaped academic scrutiny.”

After delving into the history of predictive policing and the literature that’s looked into it, Ferguson lays out an exhaustive list of “vulnerabilities” that any official should consider before signing on to a so-called predictive program. Those vulnerabilities include questions about how the Big Data feeding these programs are gathered; how those data are used; who’s accountable for how the predictive tools influence policing; how those tools are implemented; and more. To cite his most basic example: If crimes are reported in a specific geographic location, that doesn’t necessarily mean more crimes are happening there — or that crimes aren’t happening elsewhere. It could mean that police tend to be stationed nearby, or that police specifically target that area, or that people there tend to call in crimes more frequently. But predictive software has no way of knowing that, and can thus lead to a self-fulfilling crime prophecy. Ferguson quotes former Electronic Frontier Foundation attorney Hanni Fakhoury: “[I]f the data is biased to begin with and based on human judgment, then the results the algorithm is going to spit out will reflect those biases.”
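That feedback loop is easy to see in a toy simulation (my illustration, not anything from Ferguson’s paper or PredPol’s actual software): imagine two neighborhoods with identical underlying crime rates, where patrols are allocated in proportion to recorded crime and crimes are only recorded where officers are present. Even a one-report head start compounds over time.

```python
import random

random.seed(0)

# Hypothetical toy model: neighborhoods A and B have the SAME true
# crime rate. Police allocate patrols in proportion to *recorded*
# crime, and a crime is only recorded if an officer happens to be
# there -- the self-fulfilling prophecy the article describes.
TRUE_CRIMES_PER_DAY = 10          # identical in both neighborhoods
recorded = {"A": 11, "B": 10}     # A starts with one extra report

for day in range(200):
    total = recorded["A"] + recorded["B"]
    for hood in ("A", "B"):
        patrol_share = recorded[hood] / total    # patrols follow the data
        for _ in range(TRUE_CRIMES_PER_DAY):
            if random.random() < patrol_share:   # recorded only if patrolled
                recorded[hood] += 1

print(recorded)
```

Run it and neighborhood A’s recorded “crime problem” tends to pull ahead of B’s, even though nothing about the underlying crime rates differs — the algorithm is only ever confirming where the data sent the officers.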

Though predictive policing is spreading to police departments all over the world and “there is no real hope of going back,” he writes, officials need to understand what they’re signing up for before they contract with PredPol or HunchLab or any other company selling a Minority Report-style policing solution.

There’s still not a ton of literature about the effectiveness of predictive policing. (There’s only really been one independent study; it was conducted by RAND researchers and its results were inconclusive.) But as Ferguson pointed out to me in an email last week, officials need someone other than police chiefs and company representatives to tell them about whether predictive policing works. “My hope was [this article will be] handed to mayors and city council members when someone proposes buying a predictive policing system,” he said.

We’ll see if that’s the case.

OFFICIAL POLICE BUSINESS is a weekly column and newsletter by reporter Matt Stroud about new developments in police technology, and the way that technology is changing law enforcement. Think body cameras, cell-site simulators, surveillance systems, and electroshock weapons. Sign up to receive OPB in your email every Wednesday at officialpolicebusiness.com, or check for it here at the Daily Dot. Stroud is available at matt@officialpolicebusiness.com or on Twitter @MattStroud.
