An Italian Carabinieri explosives expert gives the thumbs-up sign near a bomb-disposal robot after it detonated an unattended bag near Grazioli palace, the residence of former Italian Premier Silvio Berlusconi, in Rome, Oct. 8, 2016. Gregorio Borgia/AP

What Happens When Your Bomb-Defusing Robot Becomes a Weapon

Treating a technology as a “platform” has consequences.

Micah Xavier Johnson spent the last day of his life in a standoff, holed up in a Dallas community-college building. By that point, he had already shot 16 people. Negotiators were called in, but it was 2:30 in the morning and the police chief was tired. He’d lost two officers. Nine others were injured. Three of them would later die. In the early hours of July 7, 2016, the chief asked his SWAT team to come up with a plan that wouldn’t put anyone else in Johnson’s line of fire.

Within 30 minutes, their Remotec Andros Mark 5A-1, a four-wheeled robot made by Northrop Grumman, was on the scene. The Mark 5A-1 had originally been purchased for help with bomb disposal. But that morning, the police attached a pound of C4 explosives to the robot’s extended arm, and sent it down the hallway where Johnson had barricaded himself. The bomb killed him instantly. The machine remained functional.

Johnson had served in Afghanistan before being discharged. It’s possible that he recognized the robot before it blew him up.

Nearly 20 years earlier, a young roboticist named Helen Greiner was lecturing at a tech company in Boston. Standing in front of the small crowd, Greiner would have been in her late 20s, with hooded eyes, blonde hair, and a faint British accent masked by a lisp. She was showing off videos of Pebbles, a bright-blue robot built out of sheet metal.

For many years, the field of AI struggled with a key problem: How do you make robots for the real world? A robot that followed a script was simple, but to handle the unforeseen (say, a pothole or a fence), programmers would have to code instructions for every imaginable scenario. To engineers, that meant creating devices with ever more complex brains.

Greiner’s professor, Rodney Brooks, thought that approach was a dead end. Instead of trying to engineer a model of the real world, what if you used the world as its own model? If you wanted your robot to find open doors without bumping into things, you shouldn’t have to give it a detailed map of the room; you should just tell it to move in a straight line until it senses something in front of it. These two separate goals—going straight, and not hitting things—don’t have to be explicitly coordinated in an onboard brain. Instead, you can have two simple subsystems that work independently, with the subsystem that drives the robot forward overridden, when needed, by the subsystem that notices obstacles nearby.

That “subsumption architecture” was what allowed Pebbles to be both simple and adept. Each of its onboard systems was lean, designed for a straightforward task; the device’s ability to react to the world around it emerged, naturally, from the interaction of its parts.
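To make the idea concrete, here is a minimal sketch of a subsumption-style controller in Python. It is purely illustrative: the layer names, sensor fields, and commands are invented for this example, not drawn from Brooks’s or iRobot’s actual software.

```python
# Illustrative sketch of Brooks-style subsumption control.
# All names and thresholds here are hypothetical.

class DriveForward:
    """Lowest layer: always wants to keep moving in a straight line."""
    def act(self, sensors):
        return "forward"

class AvoidObstacles:
    """Higher layer: has an opinion only when something is close."""
    def act(self, sensors):
        if sensors["range_cm"] < 30:  # obstacle within 30 cm
            return "turn_left"
        return None  # no opinion; defer to lower layers

def control_step(layers, sensors):
    """The highest-priority layer with an opinion wins each tick."""
    for layer in layers:  # ordered from highest priority to lowest
        command = layer.act(sensors)
        if command is not None:
            return command
    return "stop"

layers = [AvoidObstacles(), DriveForward()]
print(control_step(layers, {"range_cm": 120}))  # -> forward
print(control_step(layers, {"range_cm": 12}))   # -> turn_left
```

Neither layer knows the other exists; the robot’s apparent competence comes from a fixed priority ordering over simple reflexes, not from a central model of the room.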

Greiner and Brooks spent years trying to create practical, marketable robots. On the side, she would give lectures like this one, demonstrating her work in the hopes of drumming up interest in her product. During her presentation, Greiner noticed one man who couldn’t seem to sit still: a Special Forces operator who had a fascination with robotics. He watched as Pebbles moved fluidly while transmitting video and audio to an operator across the room. “We have to get these robots to the Air Force Academy,” he told Greiner.

Soon, Greiner was invited to join a military training exercise with other robotics companies—the first of its kind. With that invitation, she saw an opportunity she sorely needed: Her company, iRobot, was going broke. “We were all worried about funding,” she says. “We didn’t have the money in the bank to make payroll at the end of the year.”

She immediately booked a flight to Denver with a souped-up Pebbles robot, called ROAMS, and an engineer named Rosario Robert. Robert led the two-person ROAMS project, handling the mechanical side: drawing plans, soldering, drilling. She added a nicer camera and GPS (a brand-new technology at the time), and lifted the body higher off the ground. She stayed up late the night before the exercise, hand-painting camouflage patterns on the metal frame.

They drove to a parking lot an hour south of Denver. When they arrived at three in the morning, a group of soldiers materialized out of the dark, picked up ROAMS, and disappeared back into the night. The exercise involved sending robots to surveil planes parked in a nearby hangar, all without being detected by a soldier posted in the hills. Earlier that week, a major had gone to Walmart and bought a dozen foam pool noodles. He used them to craft fake nuclear missiles hung with a small, hand-painted sign: “Your Geiger counter is now pegged.” The idea was to get your robot close enough to send a clear video of the sign back to the base.

Summer nights in alpine deserts are always cool. Some evenings, you can even see your breath fogging the jagged outline of Pikes Peak against the sky. Robert remembers sitting in the dark, feeling anxious. ROAMS was what they called a “lab queen,” a machine built for a controlled environment. There was no guarantee that it would hold up in the prairie outside the Air Force base, over 6,000 feet above sea level. The landscape lay dry and flat, nearly featureless but for low rabbitbrush and soapweed yucca. It was the kind of cold, open country that makes it particularly hard to surprise someone.

At the end of the exercise, the soldiers met with Greiner and Robert. They gave the women tips to make the robot into something the military might actually use in the field. Above all, they said, the next version had to be tough. “We’re going to throw it off the back of a C-130.”

ROAMS, apparently, had performed well enough. It could navigate a bumpy, open field. Because it ran cool—meaning it didn’t show up on thermal imaging—it managed to escape “enemy” detection. But the real distinguishing factor was its size. After millions of dollars in government-funded research, most military robots weighed thousands of pounds. ROAMS weighed less than 30, and rode low enough to the ground to move without drawing attention.

Weeks later, a general was shown a video of ROAMS at work. “It’s about fucking time,” he said.

So began the end of iRobot’s financial woes. Months later, the company landed a DARPA contract worth almost $740,000, and roughly $1.2 million more in 1998.

Greiner’s robots would go on to be more successful than anyone could have predicted. But the more they proved their worth to the military, the more the military adapted them to their own ends, until Greiner would have to face an unexpected question: What happens when the robots you built to save lives become weapons themselves?

With the expansion of autonomous weapons across the globe, the question has taken on a new urgency. Robots aren’t just being put on the offensive. They’re learning to pull their own trigger.


With twin tracks like a crane’s, and a long, maneuverable head, the PackBot looks almost cute. The robot is the newest iteration of ROAMS: It’s a GPS-wielding, self-righting, stair-climbing, portable machine that can survive a 10-foot drop onto a concrete floor. Greiner became fascinated with robots when, at age 12, she watched Star Wars: A New Hope. To her, Leia and Luke were supporting characters. Her most famous creations—the PackBot for the military, the Roomba for households—have the charm of R2-D2.

Greiner started iRobot in 1990, along with Brooks, her professor, and Colin Angle, another grad student. She and her teammates spent their days, and often nights, in a lab space in Somerville, Massachusetts. In a previous life, the building operated as a slaughterhouse. The transition from abattoir to a lab that built life-saving robots seemed fitting. At that time, Boston was in its own transition from a depressed city to a world-renowned center of science and technology.

The first PackBot prototypes were deployed in Afghanistan in 2002, just as Roombas were released in stores like The Sharper Image and Brookstone. They operated mostly in tunnels, rolling down the narrow passageways, looking for booby traps. A natural patroller, the robot could be sent on reconnaissance and surveillance missions. It could climb stairs, travel over mud, rocks, and snow, and be controlled from almost a kilometer away. But it wasn’t until the Iraq War, and the advent of the roadside bomb, that the robot became indispensable.

From 2006 to 2011, Master Sergeant Elizabeth Butler worked as an Air Force explosive ordnance disposal technician—but unlike other techs before her, Butler had a secret weapon: the PackBot.

Butler and her two teammates spent their days in a truck stocked with guns, body armor, and the robot. Butler wore an eye patch that fed her a direct video feed from the machine’s perspective. Her controller looked nearly identical to a PlayStation’s. With deft twiddling of her thumbs, she could guide the PackBot to investigate, and then defuse, an IED. (Butler is rare among Explosive Ordnance Disposal techs: She never lost a robot.)

Ellen Purdy, who led the Joint Ground Robotics Enterprise for the Department of Defense during the Afghanistan and Iraq wars, believes that the PackBot’s success lay in iRobot’s ability to quickly adapt. The Iraq and Afghanistan wars were a “counter-counter” scenario: a race between insurgent and American forces to design, then combat, increasingly complex IEDs. Greiner and iRobot would deploy PackBots that could snip wires. The insurgents would start burying wires underground. Purdy would ask the PackBot engineers for a version that could dig. Butler and the other EOD techs would update their models, learn the new specs, and deploy again.

“People don’t realize how many lives were saved because of Helen and her engineers,” Purdy tells me. “They really are some unsung heroes.” I spoke with one member of the DOD who received all Navy anti-IED robots that had been blown up during combat. From 1997 to 2012, he counted over 260 robots damaged by a blast. Every robot, he told me, translated to at least one less human casualty. Purdy, who oversaw joint programs that included not just the Navy but also the Air Force, Army, and Marine Corps, estimated that anti-IED robots saved thousands of lives. One group of soldiers was so distraught after their robot was blown up that they tried to hold a burial for it.

This March marked the 20th anniversary of the creation of the PackBot. More than 4,000 of them have been deployed. They’ve been bought by militaries all over the world. They were used in 9/11 search-and-rescue efforts, during the manhunt for Dzhokhar Tsarnaev, and at the Fukushima plant, rolling around in areas with radiation levels too high for human engineers.

Purdy believes that the PackBot’s success changed the way the military thought about technology. “The PackBot was instrumental in getting the military to be open to the idea that you can automate tasks,” she says—the idea that robots “can do jobs better than people.”

In 2007, the DOD released its first Unmanned Systems Roadmap, which budgeted $2.7 billion for all military robots. From 2014 to 2018, the Pentagon budgeted nearly $24 billion for unmanned systems. “There’s been a bit of a sea change within the army,” says Paul Scharre, a former DOD official who helped draft the department’s first directive on autonomous weapon systems.

Pentagon officials see a future where every soldier is augmented, where humans and machines are joined in a symbiotic relationship. They call it “centaur warfare,” a strategy championed by Robert Work, the deputy secretary of defense who served during the Obama and Trump administrations before leaving the position in 2017.

In a 2015 speech, Work outlined how machine-human symbiosis could help the United States compensate for its manpower deficit against countries like Russia or China. Centaur warfare, he argued, could be the new nuclear weapon.


The 2016 Dallas standoff was the first time a robot, drones included, killed someone on U.S. soil. Outside the United States, it had been happening for a decade. In the mid-2000s, a group of soldiers duct-taped a claymore mine to a small surveillance robot and sent it down an alleyway in Iraq, killing a man. It was the first time that soldiers realized robots built to save lives could end them, too.

Though neither robot used in the Dallas and Iraq incidents was created by iRobot, the company has experimented with weaponized machines. iRobot advertises itself as a kind of “platform”: Anyone can build additions to its robots, like Geiger counters, explosive “sniffers,” or facial-recognition cameras. In 2007, iRobot released a semi-autonomous robot equipped with a gun from the now-defunct Metal Storm. The gun could fire 16 rounds a second. (An average shooter can fire about two rounds per second with an AR-15.) The robot was never used in the field, but it was a turning point for the company.
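The consequences of that “platform” framing are easy to see in a sketch. The following hypothetical Python snippet (none of these class names come from iRobot) shows why a chassis designed around swappable payloads is indifferent to whether the payload is a sensor or a weapon.

```python
# Hypothetical sketch of the "platform" idea; not iRobot's code.
# The chassis neither knows nor cares what its payload does.

class GeigerCounter:
    def activate(self):
        return "radiation reading: 0.12 uSv/h"

class WireCutter:
    def activate(self):
        return "wire snipped"

class RobotPlatform:
    def __init__(self, payload):
        self.payload = payload  # any object with an activate() method

    def trigger_payload(self):
        return self.payload.activate()

# The same chassis accepts either attachment interchangeably,
# which is why a robot built for defusing can be rearmed.
print(RobotPlatform(GeigerCounter()).trigger_payload())
print(RobotPlatform(WireCutter()).trigger_payload())
```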

Greiner left iRobot in 2008 for CyPhy Works, where she served, until recently, as chief technology officer. The PackBot has since evolved without her. Endeavor Robotics (the iRobot spin-off that produces military robots) recently created a version that doesn’t require an operator. They’ve simulated sending it into buildings to map layouts and mark areas with high levels of radiation, chemicals, or enemy combatants. And they’ve developed a few weaponized versions, too.

Though Greiner left before those versions were designed, she thinks they’ll help soldiers by keeping them out of danger, just as earlier PackBots helped EOD techs. “If my son or daughter was out there, I wouldn’t want them to have to stick their head around the corner and take a shot,” Greiner tells me. She remains proud of her legacy: She made a machine that saved lives. She helped thwart IEDs, one of the most random and destructive forces in modern warfare.

Greiner, like everyone else I spoke to, was quick to insist that as long as a human pulled the trigger, even remotely, a weaponized robot was fair game. I was told over and over that autonomous robots—machines programmed to identify and attack without human operators—would never be a part of the U.S. military.

“I wish people understood just how blessed we are to live in the United States,” Purdy, the former head of DOD’s Joint Ground Robotics Enterprise, tells me. “Because our government has it exactly right. We abide by the law of armed conflict. We are never going to put a robot in charge of a life-altering decision.”

The Obama administration instituted a five-year ban on autonomous weapons back in 2012. Last year, the Trump administration upheld the ban. But there is an advocate for autonomous weapons in the White House. Steven Groves, who works as a special assistant to the president, wrote an op-ed for the Heritage Foundation that came out against the UN’s proposed ban on the technology.

Other countries are well on their way to creating autonomous weapons. Israel’s Harpy system identifies and then attacks foreign radars, and the Russian weapons manufacturer Kalashnikov announced last summer that it would create an autonomous machine gun. “The technology genie is out of the bottle,” Purdy says. But she, along with various other DOD officials I spoke with, insisted that American-made robots would never independently attack humans.

To Greiner, Purdy, and the DOD, there’s a critical ethical distinction between a robot deciding to kill somebody, and a robot being ordered to kill somebody. There’s something dehumanizing, they think, about a robot determining who lives and who dies. That type of decision should be left up to whoever is pulling the trigger—even if they’re thousands of miles away.

It all feels a bit anamorphic: Robots can kill, so long as they don’t make what the military calls a “kill decision.” Meanwhile, engineers are working on technology that takes soldiers further and further out of the field. Viewed from some angles, that’s lifesaving.

But from others, it’s not. One of the strangest things about the world we’ve created is when, and how, we allow one person to kill another. At the heart of those rules—in war, in policing—is the idea that when someone is trying to kill you, you have a right to kill her. But in a fight where one side can’t die or even feel pain, those rules become unclear. When should a robot be directed to disarm, or to kill?


It’s been ten years since Greiner’s new company, CyPhy, was founded. In that time, the company has raised over $30 million in venture capital. But even in her new role, Greiner stuck to her military roots. She couldn’t go into details, but said that a “special-forces operator” told her that CyPhy’s flagship product, the PARC drone, saves lives. The U.S. military uses the drones for surveillance operations and at dangerous checkpoints, alerting soldiers to would-be surprises.

Like PackBots, PARC drones operate as platforms. They can carry cameras, monitoring devices, even cell-phone network boosters. They can stay aloft for weeks at a time, which makes them excellent for surveillance and other long-term projects. They’ve been deployed by police departments during the Boston Marathon, Sail Boston, and July 4 celebrations. There’s even talk of putting them along the border.

In the future, Greiner envisions PARC drones spread across thousands of places all over the world. Eventually, they’ll be able to operate independently, collecting information and images from 400 feet above the ground. Instead of filtering data for humans to sort through, the drones could use that information to increase their own intelligence, ultimately replacing humans as a more efficient and effective monitor. An autonomous eye in the sky that never blinks.

As CTO, Greiner was not in charge of setting CyPhy’s policies or business strategy. But she did often give public talks about the promise of drone technology. In a speech at a Boston TEDx event in 2015, she expressed what might pass as mild annoyance at the fear people seem to have of drones. “I’ve noticed that people are a little bit more worried about flying robots,” she told a laughing audience. “I’d like to change this conversation.”

She’s slowly pacing across the stage, wearing a lime-green jacket and clicking through a slide show. One picture is of a drone delivering a tiny, neatly wrapped package. The next is of a man sitting at a computer in the middle of a field. (CyPhy has since built drones that act as mobile cell-phone towers.) The final slide is an image of Shibuya Crossing. “A flying robot could see a knapsack that’s been dropped in a crowd,” she says, evoking the kind of backpack bombs that detonated during the Boston Marathon. “It could swoop down and grab it with its talons.”

Many experts believe that it’s not too late to stop autonomous weapons. Last year, 116 robotics companies from around the world signed an open letter asking the United Nations to ban killer robots. iRobot, Endeavor, and CyPhy Works were all absent from the list.


This story was produced with support from The GroundTruth Project.
