DOD Science Board Recommends “Immediate Action” to Counter Enemy AI



Pentagon scientists worry that the U.S. could be on the losing side of an AI arms race.

The Defense Science Board’s much-anticipated “Autonomy” study sees promise and peril in the years ahead. The good news: autonomy, artificial intelligence, and machine learning could revolutionize the way the military spies on enemies, defends its troops, or speeds its supplies to the front lines. The bad news: AI in commercial and academic settings is moving faster than the military can keep pace. Among the most startling recommendations in the study: the United States should take “immediate action” to figure out how to defeat new AI-enabled operations.

In issuing this warning, the study harks back to military missteps in cyber and electronic warfare. While the Pentagon was busy developing offensive weapons, techniques, plans, and tricks to use against enemies, it ignored U.S. equipment’s own vulnerabilities.

“For years, it has been clear that certain countries could, and most likely would, develop the technology and expertise to use cyber and electronic warfare against U.S. forces,” the study’s authors wrote. “Yet most of the U.S. effort focused on developing offensive cyber capabilities without commensurate attention to hardening U.S. systems against attacks from others. Unfortunately, in both domains, that neglect has resulted in DoD spending large sums of money today to ‘patch’ systems against potential attacks.”

That cycle could repeat itself in the field of AI, says the study.

To counter the threat, the study says, the undersecretary of defense for intelligence should “raise the priority of collection and analysis of foreign autonomous systems.” Take that to mean figuring out what China, Russia, and others can do and will soon be able to do with artificial intelligence.

Meanwhile, the Pentagon’s office of acquisition technology and logistics should gather together a community of researchers to run tests and scenarios to discover “counter-autonomy technologies, surrogates, and solutions” — in other words, practice fighting enemy AI systems. This community should have wide discretion in conducting research into commercial drones, software, and machine learning.

“Such a community would not only explore new uses for autonomy, counter-autonomy, and countering potential adversary autonomy, but also more realistically inform what the tactical advantages and vulnerabilities would be to both the U.S. and adversaries in adopting or adapting commercially available technology,” the study says.

Just as over-reliance on information technology has led to new weaknesses, so autonomy, too, is not a silver bullet. The study names a handful of “opportunities to limit or defeat the use of autonomy against U.S. forces.”

They include “using deception to confound rules-based logic” or simply overwhelming the AI’s sensor inputs. In most settings, the human brain can differentiate signal from noise far more capably than any human-written program.

The study reiterates the importance of human decision-making, but offers that the greatest potential for autonomy is in software that learns or adapts on its own, with little to no human guidance. When, if ever, is it safe to put an autonomous learning system like that in charge of a howitzer? The study says that the Defense Department doesn’t yet have the means to even ask the question.

“Current testing methods and processes are inadequate for testing software that learns and adapts,” it reads. Better testing procedures, particularly in virtual environments, will be key to getting the most out of next-generation artificial intelligence.

The United States faces a special ethical burden in how it develops and uses autonomy. The military faces pressure – both internally and from outside groups – to limit the use of autonomy in weapons. That’s less true in China and Russia; Russia boasts that it has tested lethal autonomous ground robots as guards for missile sites and is developing a crewless version of the Armata T-14 tank.

“While many policy and political issues surround U.S. use of autonomy, it is certainly likely that many potential adversaries will have less restrictive policies and [concepts of operation] governing their own use of autonomy, particularly in the employment of lethal autonomy. Thus, expecting a mirror image of U.S. employment of autonomy will not fully capture the adversary potential,” notes the study.
