
Inside the Pentagon's Plan to Make Computers ‘Collaborative Partners’

DARPA’s latest artificial intelligence project aims to bridge the gap between a machine that learns from data and one that adds new insights.

Artificial intelligence has become so common in recent years that most people don’t realize when they’re using it. You could get a band recommendation from Spotify, concert details from Siri and maps to and from the venue from Google, and you probably wouldn’t think twice about machines organizing your entire night.

But as computers get smarter in so many respects, it’s easy to forget how dumb they are in others. An algorithm could tell you exactly what breed of dog you passed on the sidewalk, but it couldn’t say whether that dog needs food and water to survive.

Bridging the gap between a machine that learns from data and one that adds new insights requires re-evaluating the foundations of machine intelligence, and that’s exactly what the Pentagon intends to do with its latest research program.

The Defense Advanced Research Projects Agency in July launched Artificial Intelligence Exploration, a broad effort to develop so-called “third wave” AI tools that bring reasoning and contextual awareness to a technology that might not understand the world it tries to describe.

“We’re seeing huge advances [in AI] that seem kind of amazing, but they tend to overstate the capabilities of the system,” said John Everett, deputy director of DARPA’s Information Innovation Office. “Our systems don’t have a sense of the common-sense knowledge we take for granted in the physical world. This is an area that we’re very interested in exploring.”

AIE will work as an umbrella program that enables DARPA to “accelerate innovation” by testing numerous novel AI systems through smaller projects, said Valerie Browning, director of DARPA’s Defense Sciences Office. Each opportunity is eligible for up to $1 million in funding over 18 months, and the most promising projects could potentially spin off into their own full-fledged DARPA programs, she said.

The agency on Aug. 24 announced the first research opportunity under AIE, called Automating Scientific Knowledge Extraction, which aims to build AI that can generate, test and refine its own hypotheses. If successful, the project would essentially create a scientist from computer code.
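DARPA has not published any code for ASKE, but the “generate, test and refine” loop it describes can be sketched in miniature. The toy example below is purely illustrative, with invented data and a trivially simple “hypothesis” (a single coefficient in a suspected law y = a·x); it shows the shape of the loop, not the project’s actual design.

```python
# A hypothetical, toy version of a generate-test-refine loop.
# The data and the suspected law y = a * x are invented for illustration.

observations = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 7.8)]  # observed (x, y) pairs

def error(a):
    """Test step: how poorly does the hypothesis y = a*x explain the data?"""
    return sum((y - a * x) ** 2 for x, y in observations)

hypothesis = 1.0          # generate: an initial guess for the coefficient
step = 0.5
for _ in range(100):      # refine: nudge the guess toward lower error
    for candidate in (hypothesis - step, hypothesis + step):
        if error(candidate) < error(hypothesis):
            hypothesis = candidate
    step *= 0.9           # narrow the search as the fit improves

print(f"Refined hypothesis: y ≈ {hypothesis:.2f} * x")
```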

Systems like ASKE wouldn’t entirely replace their human counterparts but rather act as a technology that complements and enhances their work, Browning told Nextgov. Such systems would be able to reason and add something to the conversation instead of just spitting out information they previously crunched.

“Right now with machines as tools, they’re only going to do what we tell them. It’s not reciprocal in terms of suggesting courses of action … there’s not really a discourse,” Browning said. “A part of the grand vision really is machines transitioning from tools to collaborative partners.”

Houston, We Have Lots of Problems

To appreciate the impact third wave AI would have on society, it’s important to understand the shortcomings of the technology’s first two iterations. First wave AI involves systems that can follow sets of rules, like tax preparation software. The second wave encompasses the myriad machine-learning tools most people know today, like facial recognition programs.
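The distinction is easiest to see in miniature. The sketch below is purely illustrative and tied to no DARPA system: it contrasts a first-wave program whose behavior comes from a hand-written rule with a second-wave model whose behavior is learned from labeled examples (scikit-learn is assumed to be installed, and the incomes and labels are made up).

```python
# First wave: the programmer encodes the rule explicitly, as tax software does.
def first_wave_tax_bracket(income):
    return "low" if income < 40_000 else "high"

# Second wave: the behavior is learned from labeled examples instead of rules.
from sklearn.tree import DecisionTreeClassifier

examples = [[20_000], [35_000], [60_000], [90_000]]   # invented incomes
labels   = ["low", "low", "high", "high"]             # human-provided answers

second_wave = DecisionTreeClassifier().fit(examples, labels)

print(first_wave_tax_bracket(30_000))        # the rule decides: "low"
print(second_wave.predict([[30_000]])[0])    # the learned model decides: "low"
```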

Both types of AI have collectively advanced the technology over the last 60 years, but Browning and Everett said a number of technical roadblocks remain that AIE and other DARPA programs are working to overcome.

For one, AI tools are only as good as the data they ingest, and there are always limits on the information that’s available, said Everett. With abundant datasets and digital images, companies have built AI tools that are great at classifying pictures and making predictions. But common sense and intuition don’t exist as machine-readable data, so DARPA must explore different methods for instilling systems with that information, Everett said.

He added there are instances where “practice runs way ahead of theory” and groups end up building AI tools whose inner workings they don’t fully understand. Allowing machines to make decisions in a black box not only leads to a slew of ethical problems but also makes correcting those issues extremely difficult, he said.

Understanding how systems make decisions is critical if humans want to rely on artificial intelligence to drive cars, assist law enforcement and make other consequential decisions, said Everett.

Today’s technology also doesn’t adapt well to new information, according to Browning. Once a tool is trained a certain way, its processes are essentially frozen. After that, if you want the tool to do something different, you need to go through the time-consuming process of retraining the whole system, she said.
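In practice, that limitation shows up as soon as a trained model meets a category it has never seen. The hypothetical sketch below (scikit-learn assumed, features and labels invented) trains a classifier to tell cats from dogs; shown something rabbit-like, it can only answer with a label it already knows, and the usual fix is to rebuild the dataset and retrain the whole model.

```python
# Hypothetical illustration of the "frozen after training" problem.
from sklearn.neighbors import KNeighborsClassifier

# Invented two-number features, e.g. (ear length, weight).
features = [[3, 4], [4, 5], [9, 30], [10, 28]]
labels   = ["cat", "cat", "dog", "dog"]

model = KNeighborsClassifier(n_neighbors=1).fit(features, labels)
print(model.predict([[12, 2]])[0])   # a rabbit-like input still comes back "cat"

# Adapting means expanding the data and retraining from scratch.
features += [[12, 2], [13, 2.5]]
labels   += ["rabbit", "rabbit"]
model = KNeighborsClassifier(n_neighbors=1).fit(features, labels)
print(model.predict([[12, 2]])[0])   # now "rabbit"
```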

“Right now we think we’re in the age of woefully inefficient machine learning,” Everett said. “What we’re seeing right now is a very rapid progression in certain aspects of [AI], but some of the technology won’t progress nearly as quickly as our current experience might indicate.”

But Then What?

Browning said there’s no concrete timeline for AIE, but DARPA will keep investing in projects as long as it deems necessary. New research opportunities will roll out regularly, and because agency personnel can propose their own ideas, “there won’t be any shortage of topics,” she said.

Everett noted technological progress ebbs and flows, so it’s likely the rapid AI advancements of recent years will slow down. But if there is a breakthrough, it’s likely to come from this type of moonshot research, he said.

“The commercial world is doing an enormous amount of essential engineering to take what we know about AI right now … and apply [it] in a way that’s commercially viable, safe for consumers, and affordable,” said Everett. “We see DARPA’s role as looking at what comes next, where people aren’t looking.”

In the aftermath of the Sputnik launch, DARPA made it its mission to “prevent technological surprise.” As such, the agency’s AI efforts are focused squarely on pushing the technology over the next horizon instead of adding another decimal place in accuracy to existing systems, he said.

Years down the line, third wave artificial intelligence systems could significantly transform the way the military plans missions, scientists conduct research and people interact with machines, said Everett. Humans and computers each have their respective strengths and weaknesses, and closer collaboration between the two could lead to better decision-making and more creative thinking, he said.

“It’s taken us 60 years to get to where we are,” Browning added. “The challenges that we see moving to third wave … are even grander than what we had to overcome to get to where we are today.”