The US Military Is Building Gangs of Autonomous Flying War Bots

An MQ-9 Reaper sits on the flight line at Hurlburt Field, Fla., on May 3, 2014.

U.S. Air Force photo illustration/Staff Sgt. John Bainter


Flying swarms of semi-smart drones are coming to a war near you. By Patrick Tucker

For the Pentagon, drones are cheaper to buy and to operate than regular fighter jets. An armed MQ-9 Reaper drone runs about $14 million, compared to $180 million or more for an F-35 Joint Strike Fighter. But unlike barrel-rolling a jet, the business of actually operating an unmanned aerial vehicle, or UAV, for the military is lonely, thankless, and incredibly difficult. It’s no wonder the Pentagon doesn’t have enough drone pilots to meet its needs, a problem certain to persist as the military increases its reliance on unmanned systems, especially in areas where it has no interest in putting boots on the ground, like Pakistan or Iraq. The solution that the military is exploring: increasing the level of autonomy in UAVs to allow one pilot to manage several drones at once.

The Defense Advanced Research Projects Agency, DARPA, put out a call for ideas this week as part of the “Collaborative Operations in Denied Environment,” or CODE, project. Today, the majority of the drones that the military is using in the fight against ISIL require two pilots. The agency is looking to build packs of flying machines that communicate more with one another than with their operator, which, in turn, would allow a single operator to preside over a unit of six or more drones. Together, the flying robot pack would “collaborate to find, track, identify and engage targets,” according to a press release.

It’s the “engage” portion of that release that rings of Skynet, the robotic tyrant system made famous by the “Terminator” movie franchise. But the drones that DARPA is envisioning would not be firing on human targets without approval from another human. The request also states that the targeting would be “under established rules of engagement.” What are those rules when it comes to robots? In deciding what drones should and should not be allowed to do, the Defense Department relies on a 2012 directive that states that autonomous weapons “shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.” DOD officials are always eager to remind reporters that, when it comes to armed robots being allowed to select and fire on targets, the department doesn’t want to take humans out of the loop.

Even so, the legality of U.S. drone strikes, particularly those in which civilians die as a result of the strike, remains a matter of some dispute. Ben Emmerson, the United Nations’ special rapporteur on human rights and counter-terrorism, authored a report in 2013 that found that 33 drone strikes may have violated international humanitarian law.

A separate U.N. report by Christof Heyns, the U.N. special rapporteur on extrajudicial, summary or arbitrary executions, noted that improvements to military drones would inevitably trickle down to civilian systems. The report questioned whether any government could hold true to a promise to never allow a robot to pull the trigger.

“Official statements from governments with the ability to produce [lethal autonomous robots or LARs] indicate that their use during armed conflict or elsewhere is not currently envisioned. While this may be so, it should be recalled that aeroplanes [sic] and drones were first used in armed conflict for surveillance purposes only, and offensive use was ruled out because of the anticipated adverse consequences. Subsequent experience shows that when technology that provides a perceived advantage over an adversary is available, initial intentions are often cast aside.

Likewise, military technology is easily transferred into the civilian sphere. If the international legal framework has to be reinforced against the pressures of the future, this must be done while it is still possible,” the report says. It goes on to call on “all States to declare and implement national moratoria on at least the testing, production, assembly, transfer, acquisition, deployment and use of LARs until such time as an internationally agreed upon framework on the future of LARs has been established.”

Even proponents of greater military investment in unmanned systems have cautioned that increasing the amount of autonomy in armed flying robots carries some big risks. In a recent report for the Center for a New American Security, 20YY Warfare Initiative director Paul Scharre notes that “autonomy in the use of force raises the dangerous specter of ‘flash wars’ initiated by autonomous systems interacting on the battlefield in ways that may be unpredictable. While militaries will need to embrace automation for some purposes, humans must also be kept in the loop on the most critical decisions, particularly those that involve the use of force or movements and actions that could potentially be escalatory in a crisis.”

Despite that, Scharre says the benefits of greater autonomy far outweigh the risks.

“By embracing uninhabited and autonomous systems, the United States can disperse its combat capabilities, increasing resiliency, and expand its offensive striking capacity, all within realistic budget constraints,” he writes.

Increasing drones’ ability to “collaborate… find, track, identify and engage targets” will surely ring alarm bells among those seeking to ban killer robots. But that’s unlikely to curb the military’s interest, for a variety of reasons. Aside from the savings in operator costs, more autonomous drones might actually be safer.

Jean-Charles Ledé, the DARPA program manager, explains the program this way in the press release: “CODE aims to decrease the reliance of these systems on high-bandwidth communication and deep crew bench while expanding the potential spectrum of missions through combinations of assets—all at lower overall costs of operation. These capabilities would greatly enhance survivability and effectiveness of existing air platforms in denied environments.”

As Defense One has noted, improving the autonomy of armed drones reduces their vulnerability to having their communications uplink hacked. The only thing scarier than a heavily armed robot that can do (some) thinking for itself is a flying missile-tipped droid that’s been hijacked by the enemy.
