An X-47B Unmanned Combat Air System (UCAS) demonstrator taxies on the flight deck of the aircraft carrier USS George H.W. Bush (CVN 77). U.S. Navy photo by Mass Communication Specialist 2nd Class Timothy Walter

Keeping Killer Robots on a Tight Leash

As militaries contemplate autonomous weapons technology, they must anticipate and plan for its consequences.

This week, delegates to the United Nations Convention on Certain Conventional Weapons will discuss autonomous weapon systems, or what activists call “killer robots.” Colorful language aside, the incorporation of increasing autonomy into weapons raises important legal, policy, and ethical issues. These include potential motivations for developing autonomous weapons, how they might proliferate, implications for crisis stability, and what their possible development means for the military profession.

No government has publicly stated that it is building autonomous weapons, but there are several reasons why they might start. The need for speed has already led at least 30 countries to deploy defensive systems with human-supervised autonomous modes, such as Aegis and Patriot, to protect ships, bases, and civilian populations from swift swarms of aircraft and missiles. Such systems are only likely to become more important as precision-guided missiles proliferate. Autonomous weapons could also be useful in situations where radio links work badly or not at all. In a conflict, militaries will seek to jam or disrupt each other’s communications. Moreover, some environments, such as undersea, are intrinsically challenging for communications. Finally, some governments could desire autonomous weapons, in part, simply because they believe potential adversaries might obtain them.

Given the military and political attractiveness of autonomous weapons, it behooves us to explore some of the potential problems they present. Even if they performed better than humans most of the time, they would still fail in some circumstances — and in different and perhaps unexpected ways. Autonomous systems do well when the environment is predictable and there is an objectively correct action, like landing a plane safely on an aircraft carrier. In other situations, they can be “brittle.” What if self-driving cars could reduce auto fatalities by 90%, but the remaining 10% of deaths could have been easily prevented by a human driver? How should we think about those types of situations?

Failures can stem from simple programming errors, human operators using autonomous systems incorrectly, or the system’s interaction with uncertain and unpredictable environments. This last problem is particularly acute when multiple autonomous systems interact at high speeds. On May 6, 2010, an automated stock trade interacted with high-frequency trading algorithms to produce a “flash crash” in which the Dow Jones lost nearly 10% of its value in a matter of minutes.

Similarly, autonomous weapons could perform perfectly 99.99% of the time but, in the few instances where they did fail, fail quite badly. In 2003, the U.S. Patriot air defense system shot down two friendly aircraft, killing the pilots. U.S. operators had physical access to the system and could disable it to prevent additional fratricides. Had they lacked that physical access, stopping the malfunctioning weapon would have been far harder. Without a human “in the loop,” an autonomous system that was malfunctioning – or hacked by an enemy – could keep engaging targets until it ran out of ammunition or was destroyed. Software “kill switches” could help maintain human control over such systems, but only if communications links remained functional and the system still responded to software commands.
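
To make this dependence on communications concrete, the sketch below shows one common fail-safe supervision pattern (the names, timeout, and logic are illustrative assumptions, not drawn from any fielded system): the weapon treats a lost or silent operator link as a revocation of permission to engage, rather than continuing to act until told to stop.

```python
import time

HEARTBEAT_TIMEOUT_S = 5.0  # hypothetical: max operator silence before standing down

class KillSwitchMonitor:
    """Fail-safe supervision: engagement is permitted only while a recent,
    valid operator heartbeat exists and no halt order is pending."""

    def __init__(self):
        self.last_heartbeat = time.monotonic()
        self.halt_ordered = False

    def on_heartbeat(self):
        # Called each time an authenticated operator heartbeat arrives.
        self.last_heartbeat = time.monotonic()

    def on_halt_command(self):
        # The software "kill switch": a halt order latches until humans reset it.
        self.halt_ordered = True

    def engagement_permitted(self) -> bool:
        link_alive = (time.monotonic() - self.last_heartbeat) < HEARTBEAT_TIMEOUT_S
        # Fail safe, not fail deadly: a jammed or severed link revokes
        # permission instead of leaving the system free to keep engaging.
        return link_alive and not self.halt_ordered
```

The crucial design choice is the last line: the same logic inverted – keep operating when the link dies – would produce exactly the failure mode described above.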

In such a scenario, a system failure – caused by a malfunction, an unanticipated interaction with the environment, or a cyber attack – raises the frightening prospect of mass fratricide on the battlefield. As in the 2010 flash crash, a host of malfunctioning autonomous systems could, in theory, rapidly spiral out of control. In the worst case, the result could be fratricide, civilian casualties, or unintended escalation in a crisis – potentially even a “flash war.”

The risks of such an outcome make cyber security even more important than it is today. While virtually any modern weapon system is theoretically vulnerable to cyber attacks, the consequences of hacking into an autonomous system could be far greater, since an enemy could actually take control of the system. While a cyber vulnerability could ground a modern fighter aircraft, an adversary would be hard-pressed to take control of the aircraft with a pilot in the cockpit. In contrast, an adversary could theoretically take control of an unmanned vehicle. With today’s remotely piloted systems, an adversary would have to replicate the controls in order to operate it. As a system’s autonomy increases, however, an adversary would only need to alter higher-level command guidance and let the vehicle – or autonomous weapon – carry out the actions on its own.

The possibility of failures from spoofing, hacking, malfunctions, and unintended interaction with the environment can be reduced with better cyber security and pre-deployment testing. But testers can only harden a system against known risks. There will always be unanticipated situations that an autonomous system will encounter, particularly when an enemy is trying to hack, spoof, jam, or otherwise deceive a system. Some failures will always occur. The greater challenge is ensuring that when systems fail, they fail safe.

The financial market’s response to the 2010 “flash crash” points the way toward a potential solution. After the incident, regulators imposed “circuit breakers” to halt trading if stocks plunged too rapidly. Similarly, “human circuit breakers” to halt an autonomous system’s actions if it begins to fail, and “human firewalls” to guard against hacking and spoofing attacks, could help ensure that when systems fail, they fail safely.
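
In software terms, this is close to the classic circuit-breaker pattern. A minimal sketch (the class name, thresholds, and window are hypothetical, for illustration only): the breaker trips after too many anomalous actions within a short window, and only a deliberate human decision can reset it.

```python
import time

class HumanResetCircuitBreaker:
    """Trips when too many anomalous actions occur within a short window,
    halting further autonomous action until a human resets it."""

    def __init__(self, max_anomalies: int = 3, window_s: float = 60.0):
        self.max_anomalies = max_anomalies  # hypothetical threshold
        self.window_s = window_s            # hypothetical sliding window
        self.anomaly_times: list[float] = []
        self.tripped = False

    def record_anomaly(self) -> None:
        now = time.monotonic()
        # Keep only anomalies inside the sliding window, then add this one.
        self.anomaly_times = [t for t in self.anomaly_times if now - t < self.window_s]
        self.anomaly_times.append(now)
        if len(self.anomaly_times) >= self.max_anomalies:
            self.tripped = True  # halt all autonomous action

    def reset(self) -> None:
        # The "human circuit breaker": resuming operation requires a deliberate
        # human decision, never an automatic timeout.
        self.tripped = False
        self.anomaly_times.clear()
```

Unlike the market’s circuit breakers, which pause trading for a fixed period and then resume, the point here is that only a human can close the loop again.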

Even if these problems could be adequately addressed, autonomous weapon systems raise challenging issues for the military profession. In an autonomous weapon, the decision about which specific targets to engage is no longer made by the military professional, but by the system itself, albeit in accordance with programming written by people. No longer can a warfighter be said to be responsible for each target engaged. Rather, the warfighter is responsible for placing the autonomous weapon into operation, but it is the engineers and programmers who designed the system who are responsible for target selection. 

If an autonomous weapon did something unexpected, human operators could justifiably claim, in some cases, that it wasn’t consistent with their intentions and it wasn’t their fault. Advocates for a ban on autonomous weapons worry about an “accountability gap,” but the problem is greater than simply holding someone accountable after an incident. It is at least theoretically possible to design regimes to assign accountability after the fact. The challenge is that accountability might lie with the engineer or programmer, not the warfighter, which cuts to the core of what the military profession is – expertise in decisions about the use of force.

Many innovations have changed how combatants fight on the battlefield, from the horse to the crossbow to firearms and missiles, but none of them changed the essential fact that it was still combatants deciding when and how to use force. A warfighter would still decide whether to deploy an autonomous weapon, but that is a qualitatively different decision from authorizing specific targets. Rather than being like the driver of a high-end automobile today, who benefits from autonomous driving aids such as intelligent cruise control and automatic lane keeping but remains in control of the vehicle, warfighters operating autonomous weapons would be more like passengers in Google’s steering-wheel-less self-driving car: in charge of the decision whether to get in the car but, once aboard, along for the ride.

Fortunately, autonomy is not an either/or proposition. Militaries don’t face the choice of either building fully autonomous weapons or keeping humans fully in control (and modern sensing technologies already mean warfighters rely heavily on machines for many tasks). Instead, militaries should take a page from the field of advanced chess, or “centaur chess,” where human-machine hybrid teams that harness the best advantages of each are more likely to win than humans or machines alone. In decisions about the use of force, some mix of human and machine decision-making is likely optimal. Humans are far from perfect, and autonomous systems can help increase effectiveness and accuracy, reduce accidents, and even prevent some deliberate war crimes. At the same time, human judgment provides resiliency against unanticipated situations that fall outside the bounds of an autonomous system’s programming. Given current technological developments, a blended approach that uses autonomy for some tasks and keeps humans “in the loop” for others is likely to be the best approach on the battlefield.
