Staff Sgt. Dakota Karlsen, 455th Expeditionary Security Forces Squadron, echo sector controller, looks at a monitor of the area surrounding the base, Bagram Airfield, Afghanistan, Aug. 4, 2016. U.S. Air Force photo by Senior Airman Justyn M. Freeman

AI-Powered Tools for Commanders Are a Top Priority for ‘Connect-Everything’ Effort

DoD’s artificial-intelligence efforts are moving beyond just helping analysts spot things in video.

Artificial intelligence-powered decision tools to help commanders make better calls on the battlefield will be among the first things the Pentagon’s data office invests in as part of the military’s effort to link the services across the domains of air, land, sea, space, and cyberspace.

These tools can be built only after much work to gather and standardize data and create “appropriate models for descriptive behaviors of what’s going on,” Clark Cully, acting deputy chief data officer at the Defense Department, said at an AFCEA webinar on Tuesday.

But they’re a good example of the coming usefulness of the Joint All-Domain Command and Control, or JADC2, networking effort, and of how AI in the Defense Department is moving beyond tools that help overburdened analysts to ones that assist commanders with crucial decisions.

Cully said that the decision support tools will have to include “executive analytics” similar to the predictive analytics that Fortune 500 companies use to understand supply, demand, sales, and logistics. The Defense Department is already experimenting with machine learning to understand accruing costs and flag projects that might run over schedule, as well as for things like predictive maintenance. Cully said: “that same methodology also works on the operational side and I’m confident that we’re going to take a measured approach where we gain experience with what the boundaries are [as well as] with what the fragile elements in some of these predictive algorithms that manifest themselves under what conditions they’re well-validated.”
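
Cully didn’t describe the underlying models, but the kind of predictive analytics he references typically looks something like the sketch below: a supervised classifier trained to flag equipment likely to need maintenance soon. Everything here is illustrative; the data is synthetic and the feature names (operating hours, vibration, oil temperature) are invented, not drawn from any DoD system.

```python
# Illustrative sketch only: a generic predictive-maintenance classifier trained
# on synthetic data. Feature names and failure logic are invented, not DoD data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 5000

# Hypothetical sensor readings for a fleet of vehicles or aircraft.
hours = rng.uniform(0, 2000, n)          # operating hours since last overhaul
vibration = rng.normal(1.0, 0.3, n) + hours / 4000
oil_temp = rng.normal(90, 10, n)         # degrees C

# Synthetic ground truth: failure risk rises with hours and vibration.
p_fail = 1 / (1 + np.exp(-(hours / 500 + 2 * vibration - 6)))
needs_maintenance = rng.random(n) < p_fail

X = np.column_stack([hours, vibration, oil_temp])
X_train, X_test, y_train, y_test = train_test_split(
    X, needs_maintenance, test_size=0.2, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```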

But there’s a lot of work for the Pentagon’s data office to do first. The list includes identifying and combining key data sources across departments, ensuring that data is standardized and machine readable, and connecting the various analytic software tools used by different parts of the Department and the military. 
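
That standardization step is the unglamorous core of the effort. As a rough illustration of what “standardized and machine readable” means in practice, the sketch below normalizes records from two hypothetical source systems, each with its own field names and timestamp formats, into one common schema. The schema and field names are invented for the example.

```python
# Illustrative sketch: normalizing records from two hypothetical source systems
# into one machine-readable schema. Field names and formats are invented.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class TrackRecord:
    source: str
    track_id: str
    timestamp: str        # ISO 8601, UTC
    latitude: float
    longitude: float

def from_system_a(raw: dict) -> TrackRecord:
    # Hypothetical System A reports epoch seconds and a combined "lat,lon" string.
    lat, lon = (float(x) for x in raw["position"].split(","))
    ts = datetime.fromtimestamp(raw["epoch"], tz=timezone.utc).isoformat()
    return TrackRecord("system_a", str(raw["id"]), ts, lat, lon)

def from_system_b(raw: dict) -> TrackRecord:
    # Hypothetical System B already uses ISO timestamps but different key names.
    return TrackRecord("system_b", raw["trackNumber"], raw["time"],
                       raw["lat"], raw["lon"])

records = [
    from_system_a({"id": 42, "epoch": 1614556800, "position": "34.95,69.27"}),
    from_system_b({"trackNumber": "B-7", "time": "2021-03-01T00:00:00+00:00",
                   "lat": 34.96, "lon": 69.25}),
]
print(json.dumps([asdict(r) for r in records], indent=2))
```

Once every source emits the same schema, downstream analytic tools can consume any of them without custom adapters, which is the point of the “glue code” Cully describes later.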

Finally, all of that has to come together into “a data fabric that will allow us to sense, make sense and act according to the JADC2 strategy,” Cully said. That will require working across the services to show that AI can help with specific missions and then “prioritiz[ing] the gaps between these platforms that have been developed by specific organizations.”

He said that the role of the chief data officer “is to help adjudicate whether we need some glue code or whether we need hardware, tailored black boxes [as in more advanced but less explainable AI forms like neural nets], different processes or maybe training and education to synthesize and connect these platforms.”

He said that will take up much of this year and the next.

As part of their development, decision aids will have to be tested in scenarios as realistic as possible, including by getting them into the hands of operators in controlled environments. One “great thing” about algorithms and AI, he said, is that “you can barrage them with synthetic training data and really robustly map out the perimeters in which they perform in known and constructive ways and develop [tactics, techniques, and procedures] rules of use for these capabilities that respect those boundaries.”
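
Cully’s description of barraging algorithms with synthetic data to map their performance boundaries can be illustrated with a simple stress test: perturb a model’s inputs with increasing amounts of noise and record where accuracy falls below an acceptable floor. The sketch below does that on a toy classifier; the model, dataset, and 0.80 accuracy threshold are all hypothetical.

```python
# Illustrative sketch: probing a trained model with synthetic perturbations to
# map where its accuracy degrades. The model, data, and threshold are invented.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X[:1500], y[:1500])
X_test, y_test = X[1500:], y[1500:]

rng = np.random.default_rng(0)
ACCEPTABLE = 0.80  # hypothetical minimum accuracy for fielded use

for noise in np.linspace(0.0, 3.0, 7):
    perturbed = X_test + rng.normal(0.0, noise, X_test.shape)
    acc = accuracy_score(y_test, model.predict(perturbed))
    status = "OK" if acc >= ACCEPTABLE else "OUTSIDE ENVELOPE"
    print(f"noise sigma={noise:.1f}  accuracy={acc:.2f}  {status}")
```

The table such a sweep produces is what the “rules of use” Cully mentions would be written against: operators would know, roughly, under what input conditions the tool can be trusted.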

For the military, those AI testing challenges are a lot bigger than the sorts that tech companies encounter dealing with customers or users, Jane Pinelis, the head of AI testing and evaluation at the Pentagon’s Joint Artificial Intelligence Center, or JAIC, said during Tuesday’s event.

Pinelis said that the JAIC last year spent a lot of time figuring out just what it wanted to test for.

“We now have at least written down all of the facets that we care about evaluating on a particular system, starting from the AI tools to the model and algorithm by itself, system integration, human machine teaming, through operational test and then post-deployment run time modeling and, of course, robustness and security checks and checks for adherence to DOD [artificial intelligence] ethics,” said Pinelis. Even though the Defense Department’s testing concerns are bigger and more complex than, well, anybody else’s, partnering with innovators in the private sector is a must, she said. “DOD has to be a customer in some cases of what is already being developed and innovated in academia and industry.”
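
Pinelis’s list of facets reads like an ordered pipeline. Purely as an illustration of how such a checklist might be tracked, the sketch below encodes the stages she names (paraphrased) and reports which have been completed; the structure and results are invented.

```python
# Illustrative sketch: the evaluation facets Pinelis lists, represented as an
# ordered checklist. Stage names paraphrase the quote; the results are dummies.
EVALUATION_STAGES = [
    "model and algorithm evaluation",
    "system integration",
    "human-machine teaming",
    "operational test",
    "post-deployment runtime monitoring",
    "robustness and security checks",
    "DoD AI ethics adherence",
]

def report(results: dict) -> None:
    # Print each stage and whether it has been completed.
    for stage in EVALUATION_STAGES:
        mark = "done" if results.get(stage) else "pending"
        print(f"[{mark:7}] {stage}")

report({"model and algorithm evaluation": True, "system integration": True})
```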

In coming weeks, JAIC will begin seeking partners in the private sector to help with testing and evaluation, she said.