Marine Corps poolees await medical screening at The Citadel, a public military college in Charleston, S.C., where poolees are undergoing a 14-day observation period amid the coronavirus pandemic, May 4, 2020. Marine Corps Staff Sgt. Rebecca L. Floto

Pentagon Task Force Turns to Data to Shape COVID-19 Response

The group aims to use models and information to judge risks to the Defense Department and its missions.

The Defense Department’s COVID-19 Task Force, or CVTF, is using data analytics and advanced modeling to heighten military readiness, taper uncertainty, and enable a more informed response throughout and beyond the coronavirus pandemic.

Capt. Kimberly Elenberg leads modeling and analytical support for CVTF and recently offered a glimpse into the group’s evolving, tech-driven efforts. She was among several agency insiders, academics and industry analysts who detailed ongoing work during a federal health virtual forum, hosted by Booz Allen Hamilton, that focused on the impact of artificial intelligence in the fight against COVID-19.

“Our work is really looking at not just the academics of ‘how do we get the best information,’ but ‘how do we actually apply that information,’” Elenberg said.

Early on, the CVTF launched an interagency modeling and simulation community of interest so that people from a range of relevant backgrounds could exchange insights and align on best practices, first for building a shared understanding of the pandemic and then for responding to it. “This was really a bridge between the academic side and the epidemiologists, and those folks who have to make sure that the military can meet its mission every single day,” Elenberg explained.

Among the questions officials sought to answer through the integrated data and simulations: how and where the disease is spreading, and what risks it poses to individuals and to the Pentagon’s mission. With a wealth of data streams flowing into the agency, Elenberg also highlighted the significant need to focus on the authority and quality of the data that’s used. “That's important for our leadership, so that they can trust the outcomes of products we’re producing,” she noted.

CVTF’s Data Analysis line of effort ultimately aims to ensure military readiness in the near and far term “as the full effects of the COVID-19 pandemic are realized over time,” Elenberg said. Upon establishing a digital, common operating platform to host models and provide data management support for top Pentagon officials, the team also turned to history to help prime its pursuit. 

“So we did a lot of looking through the literature to help inform some of our efforts today, and our initial coordinated and integrated disease modeling insights looked at impacts on people, facilities and supplies by locality,” Elenberg said. “This includes the continental U.S. and outside the continental U.S. because we are all over the world.”

They identified three types of infectious disease models to tap into during the pandemic’s initial stages. One is the “classic” SEIR model, which tracks the flow of people through four compartments: Susceptible, Exposed, Infectious and Recovered. Elenberg noted that this model is not “super dynamic” and that the team “knew there were benefits to this as well as constraints.” For example, she said the model did not account for transmission risk from patients in the latent period, which made it difficult to predict the pandemic’s course. The model’s parameters also had to be updated every couple of weeks, and the team lacked sufficient raw data, since they were dealing with a literally “novel” coronavirus. Officials added a Bayesian model to those efforts, which inherently helps address uncertainty.
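The SEIR compartmental structure Elenberg describes can be sketched as a simple discrete-time simulation. The parameter values below are illustrative placeholders, not figures from the task force’s models:

```python
def seir_step(s, e, i, r, beta, sigma, gamma, dt=1.0):
    """Advance an SEIR model one time step (simple Euler update).

    beta  : transmission rate (contacts per day * infection probability)
    sigma : 1 / mean latent period  (E -> I)
    gamma : 1 / mean infectious period (I -> R)
    """
    n = s + e + i + r
    new_exposed = beta * s * i / n * dt     # S -> E
    new_infectious = sigma * e * dt         # E -> I
    new_recovered = gamma * i * dt          # I -> R
    return (s - new_exposed,
            e + new_exposed - new_infectious,
            i + new_infectious - new_recovered,
            r + new_recovered)


def simulate_seir(days, s0, e0, i0, r0=0.0,
                  beta=0.4, sigma=1 / 5.2, gamma=1 / 10.0):
    """Run the model for `days` steps and return the daily trajectory."""
    state = (float(s0), float(e0), float(i0), float(r0))
    trajectory = [state]
    for _ in range(days):
        state = seir_step(*state, beta, sigma, gamma)
        trajectory.append(state)
    return trajectory
```

Because `beta`, `sigma` and `gamma` are fixed constants, the model cannot capture changing behavior or policy on its own, which illustrates why, as Elenberg notes, the parameters had to be re-estimated every couple of weeks.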

“That was helpful to make sure that parameters, random variables, and probability distributions will help us estimate the posterior distribution [or what can be understood about uncertain quantities], and we can really get a better understanding of ‘OK, we thought this was what was going to happen, we have the actual data of what was going to happen, we implemented policy over time. What was the effect of that policy, what was the effect of behavior? What was the control program efficacy—and how can we use this to help inform policy,’” she explained.
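The Bayesian loop she outlines, starting from a prior, folding in observed data, and reading off a posterior, can be illustrated with a conjugate Beta-Binomial update for a single uncertain parameter, say the probability that an exposure leads to transmission. The contact-tracing numbers here are hypothetical:

```python
def update_beta(alpha, beta, transmissions, exposures):
    """Conjugate Beta-Binomial update: given a Beta(alpha, beta) prior on a
    transmission probability and `transmissions` observed infections out of
    `exposures` traced contacts, return the posterior Beta parameters."""
    return alpha + transmissions, beta + (exposures - transmissions)


def beta_mean(alpha, beta):
    """Posterior mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)


# Start from an uninformative Beta(1, 1) prior, then fold in two weeks of
# (hypothetical) contact-tracing data; each update sharpens the estimate.
a, b = 1.0, 1.0
for transmissions, exposures in [(12, 80), (18, 95)]:
    a, b = update_beta(a, b, transmissions, exposures)
```

As each batch of data arrives, the posterior narrows. That is the mechanism that lets analysts compare what they expected against what actually happened and attribute the gap to policy, behavior, or control-program efficacy.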

The team also used agent-based modeling, which helps to simulate interactions between individuals, groups and the environment to trace the impacts of the virus. Across their efforts, CVTF uses a wide variety of data and analytics supplied by academia, the defense enterprise, other agencies including the Health and Human Services Department and the Federal Emergency Management Agency, and certain open sources.
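Agent-based models work at the level of individuals rather than compartments: each agent carries its own state, and infection spreads through simulated contacts. A toy version with random mixing and made-up parameters (not the task force’s) looks like this:

```python
import random

SUSCEPTIBLE, INFECTED, RECOVERED = "S", "I", "R"


class Agent:
    def __init__(self, state=SUSCEPTIBLE):
        self.state = state
        self.days_infected = 0


def step(agents, rng, contacts_per_day=4, p_transmit=0.06, recovery_days=10):
    """Advance the population one day: each currently infected agent meets a
    few random others and may infect them, then moves toward recovery."""
    infected_today = [a for a in agents if a.state == INFECTED]
    for agent in infected_today:
        for _ in range(contacts_per_day):
            contact = rng.choice(agents)
            if contact.state == SUSCEPTIBLE and rng.random() < p_transmit:
                contact.state = INFECTED
        agent.days_infected += 1
        if agent.days_infected >= recovery_days:
            agent.state = RECOVERED


def run(population=500, seed_cases=5, days=60, seed=42):
    rng = random.Random(seed)
    agents = [Agent(INFECTED) for _ in range(seed_cases)]
    agents += [Agent() for _ in range(population - seed_cases)]
    for _ in range(days):
        step(agents, rng)
    return agents
```

Unlike a compartmental SEIR model, this structure makes it straightforward to give agents attributes such as locations or unit assignments and to trace who infected whom, which is what makes the approach useful for assessing impacts on people, facilities and supplies by locality.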

Elenberg emphasized that gauging how quickly the disease is spreading across the world is, and will remain, a challenge because there just hasn’t been a great deal of testing—which is an obstacle in itself. “So some of the things that we're pondering as we continue to look at modeling for the future are what are unclassified options to increase that understanding,” she said.

CVTF will continue to support the ongoing data analysis and modeling over time, but the team is also shifting to consider how all these efforts and learnings can inform the whole-of-government response to incidents beyond the current global health emergency. In that light, it is evaluating connections, building data dictionaries and cataloguing data. Further, CVTF is working to ensure the users of those assets fully grasp the risks that accompany trusting the models’ outputs.

“We need to tell them, ‘look, we have low confidence in the model output, or we have medium confidence in the model output,’” she said. “And we try to explain it to them in language that’s kind of like forecasting a hurricane.”

Going forward, Elenberg said the team will work to address “second order effects” of the disease’s spread and additional waves of COVID-19, and it also has its eye on the forthcoming higher-than-average storm season.

“The other thing we're really looking at is a confluence of events occurring at the same time. So we're starting to think about, you know, June 1, we start hurricane season, and we know in the fall we're going to have influenza-like illnesses and influenza itself, plus COVID,” she said. “How do we model that in totality and begin to forecast what the impact is going to be on our ability to maintain our mission?”

Booz Allen Hamilton also hosted a variety of other speakers during the web-based event, including Dr. Kim Pruitt, who detailed the National Library of Medicine’s digital management and standards-focused efforts to support pandemic-driven research, and Dr. Susan Gregurick, who shed light on the National Institutes of Health’s AI-focused work across pharmaceuticals and beyond, all in response to COVID-19.