Police Can't Predict the Future. Fortunately, They Don't Have to.


Photo: Prosecutor and mayor with community member. (Photo Credit: Nisha Stephen)

Can computers, fancy mathematics, and big data predict crime, even predict who will commit murder? The New York Times says yes: in a story revolving around Kansas City ("Police Program Aims to Pinpoint Those Most Likely to Commit Crimes," September 24, 2015), it highlighted the growing use of "predictive policing": "complex computer algorithms to try to pinpoint the people most likely to be involved in future violent crimes," part of a "larger trend by governments and corporations that are increasingly turning to predictive analytics" for forecasting. The Kansas City No Violence Alliance (KC NoVA) was one example, the Times said, along with others in, for example, the Manhattan DA's office. Attention to factors like "previous arrests; unemployment; an unstable home life; friends and relatives who have been killed, are in prison or have gang ties; and problems with drugs or alcohol," processed through sophisticated software, allows police to target those at highest risk.

Civil libertarians predictably take a dim view of such "Minority Report" policing. Get it wrong and "you could be reducing civil liberties and Fourth Amendment protections for certain people on bad information and bad data," law professor Andrew Guthrie Ferguson told Fox News. To many, it sounds like familiar old profiling decanted into high-tech new bottles. "Our concern is guilt by association," said the American Civil Liberties Union's Ezekiel Edwards. "Because you live in a certain neighborhood or hang out with certain people, we are now going to be suspicious of you and treat you differently, not because you have committed a crime or because we have information that allows us to arrest you, but because our predictive tool shows us you might commit a crime at some point in the future."

It seems reasonable to take such concerns seriously. The nation's attention is properly focused on mass incarceration; on the overreach of the criminal justice system and the incredible harm caused not just by prison but by arrests, fines, warrants, and the like; and on the risk that even routine police contacts with the public can go horribly wrong. At such a moment, handing control over policing to the same kind of software that causes creepy ads to pop up on your laptop seems at least as troubling. Moneyball statistics are one thing when they give your home team an edge, quite another when they put your homeboy in prison. When baseball gets it wrong, you lose a game. When law enforcement gets it wrong, you lose your freedom and even your life.

Which may be why Kansas City -- and the Manhattan DA's office, and a bunch of others being tarred with this brush -- are in fact doing nothing of the kind. They are not forecasting who will do violence. They're not using fancy computer algorithms, or sifting through "big data," to forecast criminality. There is such a thing as "predictive policing," with its own merits and demerits, but they're not doing it. What they're doing is fundamentally different; it is for the most part not new but tried and tested; and -- the Times' breathlessness notwithstanding -- it is rooted in real seriousness about both what's fair and what works.

Most fundamentally, what Kansas City and others are doing is based not on prediction but on observation. It is now very well understood that both violent offending and violent victimization are extraordinarily concentrated among a remarkably small number of very active people. In particular, members of violent street groups -- gangs, drug crews, and the like -- are at hugely elevated risk of violent victimization: by a factor of 600 or more. No computer algorithm is necessary to identify members of these groups (nor, so far, can one); their own behavior is perfectly good enough: they hang out together, commit crimes together, and are victimized together. I conducted the first "group audit" in Boston 20 years ago with front-line officers, a paper map, and a magic marker; the Manhattan DA began its work by calling around to NYPD precinct commanders and asking for lists of their most serious offenders. These days, computer software has of course moved in, but the focus is still on behavior, not prediction -- looking, for example, at who has been arrested together, or stopped by police together -- and computer results have to be filtered through front-line insight and common sense (link analysis may show that a gang shooter's mother was in his car when he was stopped by police, but that doesn't make her a gang member herself).
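To make the behavior-not-prediction point concrete, here is a minimal sketch of the kind of link analysis described above: counting who shows up together in arrest and stop records, and flagging only repeated associations for human review. This is an illustration only, not any agency's actual system; the event data, person IDs, and threshold are all invented for the example.

```python
# A minimal sketch (hypothetical data, not any agency's real system) of
# link analysis over co-arrest and co-stop records: observed behavior,
# not prediction.
from collections import defaultdict
from itertools import combinations

# Each event lists the people recorded together (invented IDs).
events = [
    {"type": "arrest", "people": ["A", "B", "C"]},
    {"type": "stop",   "people": ["A", "B"]},
    {"type": "stop",   "people": ["B", "C"]},
    {"type": "stop",   "people": ["D", "D_mother"]},  # a single stop
]

# Count how often each pair of people co-occurs across events.
pair_counts = defaultdict(int)
for event in events:
    for a, b in combinations(sorted(set(event["people"])), 2):
        pair_counts[(a, b)] += 1

# Flag only repeated co-occurrence. A one-off -- like a mother riding
# in her son's car during a stop -- never clears the threshold, and
# even flagged ties are meant to be vetted by front-line officers.
MIN_SHARED_EVENTS = 2  # invented threshold
candidate_ties = {pair: n for pair, n in pair_counts.items()
                  if n >= MIN_SHARED_EVENTS}

print(candidate_ties)  # {('A', 'B'): 2, ('B', 'C'): 2}
```

Even in this toy version, the output is a list of observed associations to be checked against front-line knowledge, not a forecast of anyone's future behavior.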

Think of it like traffic accidents. We could try to use fancy algorithms to predict which drivers are high-risk, but we don't need to: that young people learning to drive are accident-prone is an observable fact. And just as most young drivers will never have a serious accident, most group members and other high-risk people will never kill anybody. Predicting that they will would be wrong; locking them up on the basis of that risk would be more wrong still.

In fact, the "call-in" meetings NoVA holds -- a key part of many of these approaches -- bring together community members, service providers, and law enforcement to communicate directly with group members. At these meetings, community members voice a moral message against violence, tell group members they care about them and want them to succeed, and offer community outreach; service providers offer help -- such as job training, education, and housing assistance -- to those who want it; and law enforcement gives prior notice of legal risks. Family members are often mobilized and also offered support. Any law enforcement action that follows is taken on the basis of actual criminal behavior -- not on any profile or prediction.

The strategy is designed as a corrective to the kind of broad, intrusive police action that harms neighborhoods and damages police-community relations. It's aimed at strengthening communities to set their own public safety standards, keeping the most vulnerable population alive and free, and narrowing the net of law enforcement. This is neither the traditional, dragnet approach to policing nor the experimental, predictive one. It's an evidence-based strategy for reducing violent crime in a way that allows police to be more strategic and less draconian, and builds community trust instead of harming it.

National civil rights and community-based organizations -- such as the PICO National Network -- actively endorse and support this work. It finds support among law enforcement, vulnerable communities, and criminal justice reformers as well. It's an approach that can continue to reduce the most serious violence in cities across the country while approaching the most vulnerable with both fairness and compassion. And it is all algorithm-free.

David M. Kennedy is professor of criminal justice and director of the National Network for Safe Communities at John Jay College of Criminal Justice. He is working with Kansas City on the KC No Violence Alliance.
