
Chicago PD Uses Computer to Predict Crimes — The Only Thing I Predict is ABUSE

We’ve talked a lot over the years about attempts to get “ahead of crime” by using computer programs and algorithms to try to predict who might commit one.

Predictive computing can then target either specific areas or specific people who might be in need of some extra law enforcement attention. But as we’ve noted repeatedly, these programs are only as valuable as the data they use.

Garbage in, garbage out; and in this case there’s a human being on the other end of the equation whose life can be dramatically impacted by law enforcement holding what it believes is “proof” that they’ll soon be up to no good.
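To make “garbage in, garbage out” concrete, here’s a minimal sketch of what a naive risk score might look like. Every field name, weight, and record below is made up for illustration; nobody is claiming this is CPD’s actual system. One clerical error and the “proof” points at the wrong person:

```python
# Toy illustration of "garbage in, garbage out" in a risk-scoring system.
# Every field name, weight, and record here is hypothetical.

def risk_score(record):
    """Naive additive score; it never questions whether its inputs are true."""
    score = 0
    score += 10 * record["prior_arrests"]            # arrests, not convictions
    score += 25 if record["gang_affiliated"] else 0  # often hearsay-sourced
    score += 5 * record["associates_arrested"]       # guilt by association
    return score

# The same resident, before and after a clerical error (say, a transposed ID)
# copies someone else's arrest history onto his record.
clean = {"name": "A. Buttle", "prior_arrests": 0,
         "gang_affiliated": False, "associates_arrested": 1}
garbled = {"name": "A. Buttle", "prior_arrests": 4,
           "gang_affiliated": True, "associates_arrested": 1}

print(risk_score(clean))    # 5  -> nobody cares
print(risk_score(garbled))  # 70 -> a knock on the wrong door
```

The score never asks whether its inputs are true; it just adds them up.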


With that in mind, there are growing concerns about efforts in Chicago to use predictive analytics to generate a “heat list”: the 400 or so individuals deemed most likely to be involved in violent crime.

The Chicago efforts are based on a Yale sociologist’s studies and use an algorithm created by an engineer at the Illinois Institute of Technology. People who find themselves on the list get personal visits from law enforcement warning them that they’d better be nice.
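The CPD hasn’t published its formula, but the academic work it draws on centers on social network analysis: your risk goes up the closer you sit, in a web of co-arrests, to people involved in shootings. As a purely hypothetical sketch of that general idea (the graph, weights, and cutoff below are assumptions, not CPD’s actual code), it might look something like this:

```python
# Hypothetical sketch of a social-network "contagion" risk score.
# The co-arrest graph, weights, and cutoff are invented for illustration;
# the CPD's actual algorithm has not been made public.

# Co-arrest graph: person -> the set of people arrested alongside them.
co_arrest = {
    "p1": {"p2", "p3"},
    "p2": {"p1"},
    "p3": {"p1", "p4"},
    "p4": {"p3"},
}
shooting_involved = {"p2"}  # known victims or offenders in shootings

def heat_score(person):
    """Risk rises with network proximity to shooting-involved people:
    full weight for direct co-arrestees, half weight two hops out."""
    direct = co_arrest.get(person, set())
    two_hop = set().union(*(co_arrest.get(n, set()) for n in direct)) - {person}
    return (1.0 * len(direct & shooting_involved)
            + 0.5 * len((two_hop - direct) & shooting_involved))

# The "heat list": everyone above an arbitrary cutoff.
heat_list = [p for p in co_arrest if heat_score(p) >= 0.5]
print(heat_list)  # ['p1', 'p3'] -- one co-arrest with p2, one two hops away
```

Notice that nothing in this toy score depends on anything the person himself did; it’s driven entirely by who he knows, which is exactly the “guilt by association” worry raised below.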

The result is a collision between law enforcement, which believes in the righteousness of these efforts, and those who worry that such programs could, as an EFF rep puts it, create “an environment where police can show up at anyone’s door at any time for any reason.”

Law enforcement and the code’s creators, as you’d expect, argue that it’s only the bad guys who need to worry about a system like this:

A press liaison for the NIJ explains in an email: “These are persons who the model has determined are those most likely to be involved in a shooting or homicide, with probabilities that are hundreds of times that of an ordinary citizen.”

Commander Steven Caluris, who also works on the CPD’s predictive policing program, put it a different way:

“If you end up on that list, there’s a reason you’re there.”

Unless law enforcement makes a mistake, the data is wrong (which it often will be), or the program gets expanded significantly, right?

With this kind of logic, don’t you feel safer already?

Another concern bubbling up in Chicago is that these programs effectively engage in racial profiling, targeting already-troubled areas where poverty naturally drives crime higher, without anybody bothering to perform a deeper analysis of why those areas might be having problems (aka targeting symptoms, not the disease):

“…how are we deciding who gets on the list, and who decides who gets on the list?” asks EFF staff attorney Hanni Fakhoury. “Are people ending up on this list simply because they live in a crappy part of town and know people who have been troublemakers? We are living in a time when information is easily shareable and easily accessible,” Fakhoury says.

“So, let’s say we know that someone is connected to another person who was arrested. Or, let’s say we know that someone’s been arrested in the past. Is it fair to take advantage of that information? Are we just perpetuating the problem?” He continues:

“How many people of color are on this heat list? Is the list all black kids? Is this list all kids from Chicago’s South Side? If so, are we just closing ourselves off to this small subset of people?”

Chicago PD denies that any “racial, neighborhood, or other such information” is used in its heat list calculations, but a FOIA request to actually confirm that was denied, on the pretext that releasing such information could “endanger the life or physical safety of law enforcement personnel or any other person.” So yeah, there’s great transparency at work here as well.

Predictive computing is excellent for a good many things, from easing traffic congestion to designing sewer networks, but calculating the future movements of highly complicated and emotional human beings is a bridge too far.

It’s not particularly difficult to imagine a future where law enforcement (not always known for nuanced thinking or honest crime-stat record keeping) starts using its belief in the infallibility of mathematics as the underpinning for bad behavior, with the horrible experiences of the falsely accused dismissed as anecdotes (“well shucks, most of the time the system is right, so its existence is justified”).

It might just be time for a re-watch of Terry Gilliam’s Brazil with an eye toward reminding ourselves what a simple clerical error can do to the Archibald Buttles of the world.

How long will it take, I wonder, before pre-crime arrests start taking place around the world? Ten years? Twenty? The last time I checked, a person is innocent until PROVEN guilty in a court of law; hence the only thing I predict is even more police abuse.

By Karl Bode, Techdirt | Additions by Alexander Light