Suicide is the 10th leading cause of death in the United States, and it has been a particularly acute problem for the U.S. military since soldiers first mobilized overseas in response to the Sept. 11 terrorist attacks.
Suicide rates among active duty U.S. Army soldiers, already historically high, have more than doubled since 2001, with 259 soldiers taking their own lives last year, according to a Pentagon report released in January.
“This is a big problem for the United States Army, it’s a big problem for the United States and it’s a big problem for the world,” Roy Wallace, assistant deputy chief of staff with the Army G-1, said Tuesday at the Government Analytics Forum in Washington, D.C.
Wallace said he believes analytics technology will succeed where various suicide prevention efforts over the years have failed: in saving soldiers’ lives.
The Army is in the midst of a five-year, $65 million effort called Army STARRS, which stands for Army Study to Assess Risk and Resilience in Service members. The program aims to identify factors that protect or put at risk a soldier’s mental health.
With partners that include the National Institute of Mental Health, the University of Michigan and other educational institutions, Wallace said the Army is in a “huge big data operation,” analyzing some 1.1 billion data records from 39 Army and Defense Department databases, looking for insights that could suggest a soldier is at elevated risk for suicide.
“What we’re trying to do is get down to predict who might commit suicide,” Wallace said.
Health, personnel and criminal records along with less obvious sources, such as genetic records and blood samples, are now crunchable through analytics, he said.
Other data sources used in the study to ascertain risk include tens of thousands of neuro-cognitive assessments, 43,000 blood samples, more than 100,000 surveys, hospital records, previous risk studies and a slew of other data points associated with 1.6 million soldiers who have served on active duty. Job history, family history and combat logs are also accessible to researchers and analysts.
STARRS takes those data sets and applies a risk model to relevant soldier populations, identifying subgroups with the highest predicted risk of suicide who can then be targeted for intervention. So far, STARRS has identified about 4,200 people in the Army “that are high risk for suicide,” Wallace said.
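The flagging step the article describes, in which a risk model scores a population and surfaces the highest-risk subgroup, can be sketched in a few lines. This is a hypothetical illustration with invented field names and a toy random scoring function; the actual STARRS model is not described in the article.

```python
import random

random.seed(0)

# Toy population: each soldier gets a predicted annual suicide-risk score.
# Raising a uniform draw to the 4th power skews scores low, mimicking the
# way real risk is concentrated in a small subgroup.
population = [{"id": i, "risk": random.random() ** 4} for i in range(100_000)]

# Rank by predicted risk and flag the top slice for intervention,
# analogous to STARRS surfacing roughly 4,200 of 1.6 million soldiers.
flag_count = 300
flagged = sorted(population, key=lambda s: s["risk"], reverse=True)[:flag_count]

print(len(flagged), "soldiers flagged for intervention")
```

The key design point is that the model only ranks; deciding where to draw the cutoff, and what intervention follows, remains a human policy choice.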
Identifying at-risk soldiers is only half of the equation. If suicide rates held steady, the military could expect approximately 250 suicides in the coming year, roughly 6 percent of the 4,200 soldiers deemed at risk.
“So there are thousands of false positives, which presents us with a moral dilemma in dealing with these individuals,” Wallace said. “How do I treat the other 3,800 out there while I’m trying to intervene? It is a very intensive, leader-peer-supervisor responsibility. This is about knowing your people inside the U.S. Army.”
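The false-positive dilemma Wallace describes is a straightforward base-rate calculation, sketched here with the figures quoted in the article (4,200 flagged, roughly 250 expected suicides at current rates). The variable names are my own; the subtraction yields about 3,950 false positives, slightly above Wallace's round figure of 3,800.

```python
# Back-of-envelope arithmetic from the article's figures.
flagged = 4_200          # soldiers identified as high risk by STARRS
expected_events = 250    # suicides expected among them if rates held steady

# Precision of the flag: the share of flagged soldiers who would actually
# die by suicide under the status-quo rate.
precision = expected_events / flagged          # about 0.06, i.e. roughly 6%

# Everyone else flagged is, for the purpose of targeted intervention,
# a false positive.
false_positives = flagged - expected_events    # 3,950 soldiers

print(f"precision: {precision:.1%}")
print(f"false positives: {false_positives:,}")
```

This is the standard difficulty of predicting rare events: even a model that concentrates risk well will flag far more people than will ever experience the outcome, which is why Wallace frames intervention as a leadership responsibility rather than a purely statistical one.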
The privacy implications are vast with data sets as encompassing as these, Wallace said. That’s why the military is developing training for decision-makers “so they do not use this data for wrong,” he added. It’s easy to imagine a scenario in which two candidates are up for the same leadership position but one is penalized for having been flagged as a suicide risk.
Wallace said the Army may face a “moral dilemma” in using the data without attaching a stigma to soldiers.
“How do I morally use this data that I’ve got out there—how do I not make what we’d call immoral decisions with that data,” he said.
STARRS will run through June 2015, though it is likely the project’s ramifications will continue to be felt across the military for years to come. Certainly, policies that govern the use of such sensitive data will mature, as will the uses and analysis of the data sets themselves. As this occurs over the next few years, the military will learn the extent to which analysis and analytics can save soldiers from suicide.