It doesn’t make the police or public any safer. But figuring that out exposed the dearth of usable data on law-enforcement practices.
In the summer of 2014, unarmed protesters in Ferguson, Missouri, were met with a startling and aggressive police response, and a national debate over the proper role of law enforcement in American communities—a dialogue we’ve initiated many times in our history, but never adequately resolved—reignited. For days, cable news networks saturated broadcasts with images of police in armored vehicles designed to withstand improvised explosive devices in Iraq, clad in protective gear fit for a theater of war, taking aim at civilians with high-powered rifles.
I wanted to understand why police had this equipment, why they used it, and what costs and benefits so-called “militarized policing” delivered. As a doctoral student in political science, I knew where to start—locating reliable data—but I didn’t know the data would take me four years to assemble and analyze. This week, I published my findings: Militarized police units are deployed more often in black neighborhoods, even after controlling for local crime rates. And while militarized policing does not, on average, make either the public or police any safer, it may tarnish the reputation of police.
The process of assembling the data showed me just how long and challenging the path to meaningful police reform will be. Though both Democrats and Republicans have expressed a desire for criminal-justice reform, state and local governments have not committed to taking the first step: accurately recording police behavior nationwide. Not only do we lack knowledge of which police reforms will work in the future, we often have no clear picture of what police are doing in the present. By and large, the data necessary to understand the effects of militarized policing—and many other police activities—are not available in any usable form.
In a federalist system with more than 15,000 state and local law enforcement agencies and virtually no standardized reporting requirements, reliable and comprehensive data on police behavior, including the presence and usage of militarized units, have eluded scholars for decades. And even those with the time and resources to gather and assemble the records themselves face systemic barriers.
I sent hundreds of open-records requests to local agencies in search of data on militarization, but learned that many agencies purge records that are more than five or 10 years old, or don’t track the activity of their militarized units at all. When they do, they often track it poorly, refuse to share it, or release heavily redacted records that are of little use, citing exceptions to open-records laws that, quite unhelpfully, vary by state. In other cases, agencies demand exorbitant fees (sometimes tens of thousands of dollars) as payment for the person-hours they claim it would take to compile police records.
More readily available data on police agencies often require substantial preprocessing to be usable. A federal government survey intended to reach all police agencies in the country every four years, for example, included a question about whether agencies provide Special Weapons and Tactics (SWAT) services in their jurisdictions in 2000, 2004 and 2008. But the data contain no consistent agency IDs, making comparisons between these surveys a painstaking task.
In the wake of Ferguson, the federal government released data from its 1033 Program, an initiative that has sent billions of dollars in surplus military equipment to thousands of local agencies since the 1990s. For empirical social scientists like me, the data set appeared at first glance to be a treasure trove. But I found that records of older equipment shipments were incomplete, making the data ill-suited for estimating the causal effects of militarization, as opposed to merely identifying its correlates.
I eventually located usable data from Maryland, which passed a law in 2009 that required every police agency in the state to record SWAT activity in a uniform manner. The data represent the most complete and fine-grained portrait of SWAT activity I know of, and include the date and zip code of more than 8,000 deployments, as well as a host of outcomes including whether police fired shots, damaged property, or made arrests.
These data were not generated to satisfy some benevolent desire for transparency among Maryland officials. Rather, they were collected in response to intense political pressure following a botched SWAT raid in which a local sheriff’s department forced its way into a suburban Maryland mayor’s house, shot and killed his two dogs, and interrogated him and his wife on unfounded suspicion of drug trafficking. In response to public outrage, Maryland officials passed legislation mandating the uniform collection of SWAT data statewide, but only until the political storm blew over—the legislation included a sunset clause that ended the reporting requirement after five years.
Though limited to a single state, the data revealed some striking patterns. SWAT teams were originally conceived to handle violent emergencies, but roughly 90 percent of SWAT deployments in Maryland over five years were conducted to execute a search warrant. After merging the deployment data with U.S. census figures, I found that every 10 percent increase in the share of African Americans in a zip code was associated with a roughly equivalent percent increase in SWAT deployments. I also conducted survey experiments, which showed that seeing militarized police in a news report—relative to traditionally equipped police—lowers public support for both the funding of police agencies and the presence of police patrols, and may even inflate perceptions of crime.
The data collection effort, though frustrating and circuitous, paid off. And to be clear, I am willing to expend shoe leather to better understand important social issues. I am lucky enough to get paid to do so. The situation is also improving, with a growing number of policing scholars engaging in heroic efforts to assemble and disseminate high-quality data.
But in a nation led by politicians who claim to value transparency and a commitment to improving the quality of policing, it should not be this difficult to learn what police do.
If America is serious about improving police behavior tomorrow, policy analysts need to know what police are doing today. We need the data, not just for some cities, but for all cities; not just for some years, but for all years. Republicans and Democrats should work together to require that police agencies adopt a standard set of practices for recording data on the behavior of their officers and deposit those data regularly in a centralized public archive. These data should include the date, time, and location of police-citizen interactions, as well as enforcement outcomes and information on the use of force. The data should also be maintained indefinitely so analysts can measure changes over time. An effort of this scale would likely require federal mandates and funding, but if better policing is the goal, open data are the first step.
In many American communities, the relationship between police and citizens has for decades been cloaked in a rayless shadow of mistrust. All the good intentions in the world won’t do a thing to fix that unless governments first commit to bringing police behavior into the light.