Could a Brain Scan Protect U.S. Troops from Insider Attacks?
U.S. forces in Iraq and Afghanistan often don’t know who to trust. Brain scans to the rescue? By Patrick Tucker
A Pentagon report, revealed by The New York Times over the weekend, showed that American troops working alongside Iraqi forces were at risk of harm from Sunni extremists who had infiltrated the Iraqi Army (and, perhaps, from the pro-Iranian Shiite militias that effectively are the Army). On Monday, Rear Adm. John Kirby told reporters that “it would be imprudent, irresponsible not to think about the insider threat.” The threat is real in Afghanistan as well, where so-called “green-on-blue” insider attacks have killed several U.S. troops in recent years.
So, if you’re a U.S. soldier in Iraq or Afghanistan today, how do you determine whether or not the Iraqi or Afghan soldier next to you is going to give up your location to the enemy at the first chance? One solution, developed by a former Army counterintelligence agent, is scanning the brains of Iraqi troops for signs of potential betrayal.
Veritas Scientific, based in Virginia, markets a truth-detection system called HandShake that lets soldiers assess the trustworthiness of people they may have to work with. The technology was developed by Derrell Small, who served as a U.S. Army counterintelligence agent in 2003 and 2004.
Here’s how the HandShake system works: A U.S. soldier would take, say, an Iraqi officer and outfit the subject with a special helmet that can both record electroencephalography (EEG) signals and perform functional near-infrared spectroscopy (fNIRS), which images blood-flow changes in the brain. The soldier would put the subject through a battery of tests, including image recognition. Most of the pictures in the tests would be benign, but a few would contain scenes that a potential insider threat would remember: faces, locations, or even bomb parts. The key is to select these images very, very carefully to cut down on potential false positives.
For instance, as company founder Eric Elbot explained to The Futurist magazine’s Keturah Hetrick, just being familiar with IED parts does not explicitly signal terrorist inclination, “But if I flash you a picture of a diagram that shows you how to build an IED, that would be a pretty strong indicator that you might be a foe…You wouldn’t be studying how to make an IED if you were a friend.”
When you recognize a picture that’s of emotional significance to you, your brain produces a measurable hiccup: a spike in electrical activity roughly 300 to 500 milliseconds after the stimulus, detectable via EEG. The reaction, referred to as the P300 response, happens too fast for the test subject to consciously control, so the subject can’t game the system.
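Veritas has not published its signal-processing details, but the general P300 screening idea described above can be sketched in a few lines. The sketch below is purely illustrative: it averages stimulus-locked EEG readings in the post-stimulus window and compares responses to “probe” images (scenes only an insider would recognize) against neutral ones. The function names, sample rate, and window bounds are assumptions, not the company’s implementation.

```python
# Illustrative sketch of P300-style screening: compare average EEG
# amplitude in the 300-500 ms post-stimulus window for probe images
# versus neutral images. A markedly larger probe response suggests
# the subject recognizes the material.

def mean_window_amplitude(epoch, sample_rate_hz, start_ms, end_ms):
    """Average amplitude of one stimulus-locked EEG epoch in [start_ms, end_ms)."""
    start = int(start_ms / 1000 * sample_rate_hz)
    end = int(end_ms / 1000 * sample_rate_hz)
    window = epoch[start:end]
    return sum(window) / len(window)

def p300_score(probe_epochs, neutral_epochs, sample_rate_hz=250):
    """Difference between mean probe and neutral responses in the P300 window."""
    probe = [mean_window_amplitude(e, sample_rate_hz, 300, 500)
             for e in probe_epochs]
    neutral = [mean_window_amplitude(e, sample_rate_hz, 300, 500)
               for e in neutral_epochs]
    return sum(probe) / len(probe) - sum(neutral) / len(neutral)
```

In practice, real systems average many trials per image category to pull the P300 out of background noise; a single epoch is far too noisy to classify anyone.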
The fNIRS readings back up the EEG data. Together, they speak not only to whether a subject is a traitor but to how likely an individual is to act on potentially criminal or treasonous impulses. The system then runs all the data through what Veritas calls a Friend or Foe Algorithm. The output: the ability to pinpoint an insider’s threat potential with 80 to 90 percent accuracy, according to the company.
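Veritas has not disclosed how the Friend or Foe Algorithm weighs its inputs. A minimal sketch of the general idea — fusing normalized EEG and fNIRS recognition scores into a single thresholded decision — might look like this, with the weights and threshold invented for illustration:

```python
# Hypothetical two-sensor fusion: both scores are assumed to be
# normalized recognition strengths in [0, 1]. Weights and threshold
# are illustrative, not Veritas's actual parameters.

def threat_score(eeg_score, fnirs_score, eeg_weight=0.6, fnirs_weight=0.4):
    """Weighted combination of the two sensor scores."""
    return eeg_weight * eeg_score + fnirs_weight * fnirs_score

def classify(eeg_score, fnirs_score, threshold=0.7):
    """Label a subject 'foe' if the combined score clears the threshold."""
    return "foe" if threat_score(eeg_score, fnirs_score) >= threshold else "friend"
```

The claimed 80-to-90-percent accuracy would hinge entirely on how such weights and thresholds were calibrated — and, as the experts quoted below note, on how much individual variability the calibration can absorb.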
“It would take an immediate $1.2 million for a one to two month full-court press operation here to convert [the system] to the complex Iraq game dynamics. It would take another $800,000 expenses and $800,000 hard and software to deploy to Iraq, run tests and set up an operation at scale,” company founder Eric Elbot told Defense One in an email. “That operation, depending on the scale, could run full-time at $500,000 a month per site.” He added, “These are guestimates.”
“The US is engaged in a dangerous multi-fronted, contradiction-loaded, counter-intuitive game [in Iraq]. Our technology if it were fully tested would be up to this sort of advanced challenge. I can assure you that we would do the best to work under these time and battlefield conditions and that we can make a major difference in managing complex inputs and outcomes,” said Elbot.
The connection between the brain and criminality has been presumed — though not well understood — for more than 7,000 years, ever since humankind’s first experiments in treating anti-social behavior with brain surgery. Skulls unearthed in what is today France show that ancient people treated violent mental disorders through crude medical procedures aimed at cutting evil out of the brain, as did the Mayans and later the Greeks. Some of the most innovative research in diagnosing criminality today involves scanning the brains of incarcerated populations in New Mexico. More and more research suggests that a lot of dangerous behavior can be detected using techniques like EEG and fNIRS.
We may be entering the golden age of brain imaging, but that doesn’t mean neuroscans are always accurate in determining potential criminality or insider threats.
“There’s great variability between individuals” under brain screening, Adam Lamparello, assistant professor of law at Indiana Tech Law School, told Defense One. “But there is more reliability within the individual itself. Age, mental disorders, gender will affect the reliability” of brain-scanning techniques, he said.
Indeed, some research has shown that age, stress, alcohol and nicotine use, as well as loud noises, can influence the P300 response.
“Measures of perceptions don’t exactly measure truth. And that’s why the law has an evidentiary problem with this,” Lamparello said. He admits that while brain analysis isn’t a perfect indicator of potential insider danger, “it is suggestive.”
The best way to cut down on the variability in the results of different brain scans, according to Lamparello, is to do more tests on a single subject. In the field, that could mean bringing Iraqi forces into a controlled environment and subjecting them to tests fairly routinely. Unfortunately, that’s not a luxury available to the relatively small number of U.S. military personnel operating in Iraq right now. And it may not help build the trust needed to work with the Afghan security forces in the future.
This story has been updated.