Police wearing riot gear try to disperse a crowd, Aug. 11, 2014, in Ferguson, Mo. Jeff Roberson/AP

Will Predictive Policing Make Militarized Police More Dangerous?

The China-fication of the American police force is a cause for concern. By Patrick Tucker

As images of Ferguson, Missouri’s AR-15-toting police force made their way across the Internet this week, an ever-concerned public began to wonder: who decided to give cops in an American city the sort of guns and gear that we provide to soldiers in the most dangerous places in the world? We quickly discovered that the United States government did, under the so-called 1033 program, which allows the Defense Department to transfer military equipment to law enforcement (much to the delight, surely, of the companies that make that equipment).

Of course, to call the Ferguson police force “militarized” is a misnomer. As Adam Weinstein points out at Gawker, gear alone does not a military member make—to wit: “Despite their expensive costuming, the police in Ferguson are putting on an unsophisticated, unscripted performance, a copy without an original. If these cops were to take a page out of the Army's book on crowd control, it would be an improvement. But they seem to be making up tactics to go with the gear they've acquired.”

Among the large nations that have, in fact, militarized their police forces, the model we now seem to be following is China’s, a country that has long been blurring the line between military and police. NATO and the Defense Department continuously point out that Chinese military funding and “public safety” funding overlap to a large extent. A quick glance at Chinese spending on “internal” security versus formal “defense” reveals a country that views its citizens as a larger threat than any external foe.

In 2011, not long after the “Jasmine Revolution” swept Tunisia, Beijing—sensing the winds of popular uprising—upped spending for police, jails and other pieces of internal security by more than 13 percent, to 624.4 billion yuan ($95 billion). Money for the People’s Liberation Army, by contrast, rose 12.7 percent that year, to 601.1 billion yuan ($91.5 billion).

"This would be the first time that the openly announced domestic security budget has surpassed military spending," Tongji University political scientist Xie Yue told Reuters. Xie said the figure provides a good sense of China's "stability protection" spending.

The trend slowed slightly but continued through 2013. Chinese spending on its own police and internal security rose to 769.1 billion yuan last year, compared with 740.6 billion yuan for the People’s Liberation Army.

This year, China withheld figures for what it would be spending on “stability maintenance,” but by the end of 2014, China will become the world’s number one market for surveillance equipment and technology, surpassing the U.S., according to a report from the Homeland Security Research Corporation.

The U.S. has forbidden the sale of military equipment to China since the Foreign Relations Authorization Act for FY1990-FY1991 (P.L. 101-246). But U.S. contractors from IBM China to GE Security Asia to Honeywell sell into the Chinese security market.

If China is a country at war with its own citizens, what does that mean for the weapons it might deploy in the future?

Here’s where China may borrow from us. One of the most important new weapons that police forces around the country are experimenting with is so-called predictive policing: the use of data and statistics to predict the location, and possibly even the perpetrators, of crime. It’s a trend that’s sweeping police departments across America. Reporters at San Francisco Weekly have shown that many of today’s predictive policing vendors are peddling products that don’t live up to their marketing. But the thinking behind the concept is sound, and there are key cases where predictive policing has proven to be a force multiplier.

One such example is New York.

Predictive Policing, Past, Present and Future

In 1994, newly appointed New York City Police Commissioner William Bratton took it upon himself to “strategically re-engineer” New York’s police department. Jack Maple, who was working with the New York transit authority at the time, convinced Bratton that up-to-the-minute data, city-wide crime statistics and crime mapping would enable the department to pre-emptively deploy officers into areas where crime was about to go up.

This first experiment in what is now commonly called “predictive policing” decreased the crime rate in New York City by 37 percent in three years. But Bratton’s re-engineering went beyond Moneyballing crime. He also put into place zero-tolerance and stop-and-frisk policies that have since been deemed unconstitutional in federal court.

Predictive policing will make its way into the operations of more and more police departments around the country.

As a tactic, predictive policing has been used to preempt peaceful civil demonstrations, like the 2003 Free Trade Area of the Americas protests in Miami, Florida. Today, police around the country routinely employ espionage tactics to predict and preempt spontaneous punk and dance shows (under the expansive and poorly written 2003 RAVE Act, sponsored by Joe Biden, which can be used to arrest concert promoters for the behavior of their patrons). If you’re a police chief or mayor, preempting a protest is less risky than trying to disrupt one in progress, especially in an age when the kids you will be pepper-spraying carry TV studios in their pockets. Police armed with military equipment are using big data analytics not just to break up street demonstrations but to keep them from ever happening, and that trend is growing invisibly.

Why It Won’t Fix the Problem

Neither big data nor AR-15s will fix any of the long-term issues that plague our criminal justice system or change the way many cops interact with residents in poor neighborhoods. Zero-tolerance policies—of which predictive policing programs often serve as a component—are really effective at putting people behind bars. In a country with the highest prison population rate on the planet, that’s like taking a machine that produces a terrible product—say, exploding strollers—and “improving” it not by changing the design of the strollers, but by enabling it to produce many more exploding strollers far faster and more cheaply. Even in places where every criminal is truly a threat to public health (which is no place), pumped-up arrests will exacerbate prison overcrowding, recidivism, and so forth.

There’s another example of predictive policing in action that shows the tactic can be used wisely: the city of Memphis.

Throughout the 2000s, Memphis was following the same path—straight down—as many formerly prosperous U.S. metro regions. Property values and college graduation rates were abysmal. Poverty was high. At the beginning of the last decade, Memphis was consistently ranked among the top five worst U.S. cities for violent crime.

Between 2006 and 2010, in spite of all of the above, crime dropped 31 percent.

The demographics of Memphis didn’t change in that time. The approximately 2,300 men and women on the police force at that time were the same sort you’d find in any town where there’s too much to do and too few to do it. Here’s what changed: the department began handling its information differently thanks to Dr. Richard Janikowski, an associate professor in the Department of Criminology and Criminal Justice at the University of Memphis.

Janikowski convinced local police head Larry Godwin to allow him to study the department’s arrest records. But Janikowski wasn’t looking for biographical sketches of the perpetrators; he was looking for marginalia, the circumstances behind each arrest—the where and when of crime.

As originally covered in The Futurist magazine, the biggest single finding, and by far the most controversial, was that the rising crime rate was closely connected to Section 8 housing, federally subsidized housing for qualified individuals below a certain income level. When Janikowski and his wife, housing expert Phyllis Betts, took a crime hot-spot map and layered it on top of the map for Section 8 housing, the pattern was unmistakable. Hanna Rosin, in her 2008 Atlantic article on Janikowski, described it thusly: “On the merged map, dense violent-crime areas are shaded dark blue, and Section 8 addresses are represented by little red dots. All of the dark-blue areas are covered in little red dots, like bursts of gunfire. The rest of the city has almost no dots.”

Today Janikowski points out that blue-area, red-dot analysis omitted some important data. “You know, the stuff didn’t overlap perfectly. There were high levels of correlation with it. Section 8 housing was part of what you see there. But it was also just heavy levels—a big concentration of poverty. And that’s a complex relationship that was occurring.”

We now know with more certainty that the connection between Section 8 housing and rising crime is correlative, not causative. People who live in this housing are not more likely to commit crimes so much as they are more likely to move to low-rent neighborhoods where the probability of a crime rise is already high.

This relationship between the likelihood of being a crime victim, being a crime assailant, and living in Section 8 was particularly complicated in Memphis, says Janikowski, where many traditional Section 8 units were in terrible shape and others were being torn down. “You’ve got lots of demolition that was occurring in what was the traditional inner city for various reasons. So you had movement there. You had a lot of movement of at-risk populations. And they all tended to cluster because, again, the price of housing.”

Operation Blue CRUSH (Crime Reduction Utilizing Statistical History) was born in 2006. The system used IBM’s SPSS software and mapping software from Esri to better capture and disseminate crime data. Analysts modeled weather patterns, seasonality and area demographics, as well as lighting conditions, with a particular focus on garages and alleys. They looked at when big local employers issued paychecks, by week, month and year, and what times of day people went to and left work.
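Blue CRUSH’s internals aren’t public, but the core move behind this kind of analysis—binning incident records into map cells and ranking the busiest cells as candidates for pre-emptive patrols—can be sketched in a few lines of Python. Everything below (the coordinates, the grid size, the `hot_spots` helper) is invented for illustration; the real system’s SPSS models and Esri layers are far richer:

```python
from collections import Counter

# Hypothetical incident records: (x, y) map coordinates plus hour of day.
incidents = [
    (3.1, 4.2, 22), (3.4, 4.8, 23), (3.2, 4.5, 1),
    (7.9, 1.1, 14), (3.0, 4.1, 21), (8.2, 1.3, 15),
]

CELL = 1.0  # grid cell size, in arbitrary map units


def to_cell(x, y, cell=CELL):
    """Snap a coordinate onto its grid cell."""
    return (int(x // cell), int(y // cell))


def hot_spots(records, top=1):
    """Count incidents per grid cell and return the busiest cells.

    In a real deployment these counts would be weighted by contextual
    variables (payday cycles, lighting, seasonality) before ranking.
    """
    counts = Counter(to_cell(x, y) for x, y, _ in records)
    return counts.most_common(top)


print(hot_spots(incidents))  # → [((3, 4), 4)]: four of six incidents cluster in one cell
```

The same counting scheme extends naturally: add a time-of-day dimension to the cell key and the ranking distinguishes, say, late-night clusters from payday-afternoon ones.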

The same location optimization techniques that companies such as Esri provide to retail chains to find the best neighborhood to place a new store are also useful in mapping relationships between crime, economics and physical space.

“We can not only just manage what is this dot on the map that we call ‘burglary’ or ‘robbery,’ but how does that dot on the map interact with the demographics of the area, home values or population trends,” said Mike King, Esri’s national law enforcement manager. “If you’re in a predominantly blue-collar neighborhood that works at factories, what happens every other Friday when it’s payroll time? Do we see increases in alcohol-related events? Do we see increases in domestic violence?”

Here’s why the way these models work matters to the future: as we develop the capacity to monitor more of these signals and incorporate more variables, the statistical tools required to make use of them will become simpler and more transparent. As transparency increases, governmental decision makers will have an easier time accepting and supporting predictive policing programs. As more departments begin to use such programs, and share information about which variables and tools are most useful, these programs could get a lot better very quickly.

The predictive policing program in Memphis, which has been in place longer than any other program of its type in the nation except New York’s, has touched the lives of virtually everyone in the city. But it’s attracted far fewer complaints and legal challenges—and essentially none of the controversy that has attached itself to other programs. Janikowski credits legwork for this. He and the deputy chiefs went to more than two hundred community meetings over the course of two years; they went through neighborhoods block by block to knock on doors, tell people what they were doing and listen to concerns. Of these meetings, he says, “Whether there was five people there or five hundred, we did the same thing. We explained what we were doing, why we were doing it, what results we were hoping for. By going out there and telling folks, ‘This is what’s going on and why,’ we never got the kinds of push back that I’ve heard from other cities.”

A cynic might suggest that Janikowski surrendered a strategic advantage in doing so: he gave up too much information to would-be perpetrators. Janikowski says that being more generous with information, and proactively reaching out to the public rather than just announcing the program’s existence in a press release, is the reason it continues to operate with the public’s blessing. Public support is necessary if these programs are ever to reach their fullest potential, and without that kind of transparency it would be impossible to obtain.

Andrew Ferguson of the University of the District of Columbia has written extensively on the Fourth Amendment implications of predictive policing. He says that when the police come to your door to politely ask for information or inform you of a new police program—and they’re armed like they were just air-dropped into Fallujah—the effect amounts to coercion.

“What happens when predictive policing meets a militarized police force? Instead of sending a few officers to a place where your algorithm has determined that there might be a crime, you send a tank and officers in military outfits,” he said. “I would imagine that would be an effective tactic, but it would change the way we view policing in America in a negative way.”

The Chinese policy of “stability maintenance” may soon receive a big boost from predictive policing methods, too. The U.S. has an opportunity to make a different choice: we can use data to make neighborhoods safer and improve police-community relations, or we can continue to arm police as if we’re sending them to war. Attempting to do both at once won’t work.

Eds. Note: This article was adapted, in part, from The Naked Future: What Happens In a World That Anticipates Your Every Move by Patrick Tucker (Current, 2014). It was reposted with the permission of the publisher.