Robot-maker Sean Bielat says he’s fine with the Dallas Police Department’s apparently unprecedented use of a police bomb-disposal robot to kill a gunman on Thursday. “A robot was used to keep people out of harm’s way in an extreme situation,” said Bielat, the CEO of Endeavor Robotics, a spinoff of iRobot’s military division. “That’s how robots are intended to be used.”
Joergen Pedersen, the CEO of RE2 Robotics and the chairman of the National Defense Industrial Association’s robotics division, concurred. “If these robots are used in manners for which they were unintended, we would expect that the officers who are there to keep citizens and themselves safe would use good judgment where the application of lethal force is a last resort,” he said.
On Sunday, speaking to Face the Nation, Dallas Mayor Mike Rawlings blessed the operation. “The chief had two options and he went with this one. I supported him completely because it was the safest way to approach it,” he said.
But some ethicists are worried.
“My initial reaction was that we have just got onto the slippery slope,” said Heather Roff, a senior research fellow at Oxford and a research scientist at Arizona State University’s Global Security Initiative. “This is going to be very hard to put back and that the militarization of police capabilities means that they may now feel that it is reasonable to use robotics in this way to ensure compliance…If one doesn’t have to talk to a subject and can demand compliance, then this may mean more forceful or coercive demands are made.”
As military research pushes robotics prices down and Pentagon policies push battlefield gear to domestic law enforcement agencies, expect to see more armed robots on American streets.
Bielat believes that incidents like the one in Dallas, in which police used a Northrop Grumman Remotec Andros F5 to carry explosives close enough to a gunman to kill him, won’t become “a common occurrence,” in part because the Andros F5, like his company’s own celebrated PackBot, costs upwards of $100,000. But he also believes that military-grade robots are on the cusp of getting a lot cheaper and more capable, due to decreases in the cost of processing power, advances in 3D printing, and other factors.
“A lot of the components have come down in price because of consumer applications,” he said. “You now have more capability in the camera on a cellphone than you did on the…camera on the PackBot when it was first built.”
That’s leading to programs like the Army’s Common Robotic System (Individual), or CRS(I), which aims to supply its infantry units with 4,100 sub-25-pound robots. Bielat anticipates that the Marine Corps will also sign onto the program, and perhaps other military groups as well. Many users will want to modify the robots to handle their own set of tasks — and that could include lethal ones.
“As you see robots move out of the EOD [bomb disposal] community and into the infantry, you’re going to have people saying, ‘Hey, these things are really useful. What would make them even more useful is x, y, and z,’ and that probably includes some level of armament.”
Another program, the Navy’s Advanced Explosive Ordnance Disposal Robotic System, or AEODRS, is meant to make this easy. It envisions paying one company to make a basic platform that other companies could sell add-ons for — say, a better camera or a different robotic arm. “By doing that, you will be able to rapidly evolve the robot and rapidly enable it to incorporate missions other than just EOD, and make it easy for other communities within the services to acquire the robot,” Bielat says.
These kinds of programs represent a big change from just a few years ago, when the military’s ground robotics purchases came primarily from overseas contingency operations funding.
The advances that such programs yield — combined with the 1033 program, which allows local and state law-enforcement agencies to request military technology — will make police robots useful, cheap, and ubiquitous, Bielat says. The 1033 program wouldn’t provide armed military robots to police, but police may decide to arm the robots themselves, he says.
“Just like you have a laptop in every squad car and cameras in every squad car, you would have a small robot, not an EOD robot, but a small robot in every squad car and maybe that thing has a taser device on it, or some other less-than-lethal capability,” he said. “And maybe that’s used to approach a motorist at night when a cop doesn’t want to go up and approach with their hand on their holster. Maybe the robot goes up instead.”
Police bots won’t necessarily be lethally armed, said Bielat, who is a major in the Marine Corps Reserve. The key consideration is what weapons would be appropriate. “Taking an M240 machine gun and attaching it to a robot may or may not make sense. That weapon was designed to be operated by two human beings and a bunch of other things. That may not be an appropriate thing to put on a robot platform, but that doesn’t mean that no weapon system would be appropriate. It just means you’re designing something that’s unique that takes advantage of the robot,” he said.
“A robot provides a variety of options. A robot adds time and distance to the equation. The operator can sit back and use two-way audio and say ‘drop your weapon.’ A robot can use less-than-lethal force. A Marine often can’t use a taser unless he’s willing to get shot. A robot could,” he said.
The decision to arm a robot is up to the customer, Bielat said.
“We aren’t the ones who are going to think of these use cases. It’s going to be the end users as they get closer to the technology, as it gets more capable and less expensive. It’s going to be the end user who says, ‘wow, this additional capability would really make a difference and would really make my job safer if it had some level of armament on it,’” he said.
Time to Set Some Rules — And Quickly
That notion, and the line crossed in Dallas, means it’s time to make some big decisions quickly.
“Now that we have crossed the Rubicon of robots used to kill in domestic applications, strict guidelines must quickly be set as to when this is acceptable,” said Wendell Wallach, a Yale ethicist and author of A Dangerous Master: How to Keep Technology From Slipping Beyond Our Control. “From what little we know, the Dallas PD’s use was warranted. We are now beyond warfare in the use of robots to kill, and should not waste time in addressing how these technologies could expand in totally unacceptable ways.”
Are armed police robots going to be a good thing? The answer depends on who you ask.
A police officer dies in the line of duty every 61 hours. It is dangerous work. But every day, three people in the United States are killed by police. The American public tends to view police shooting statistics through one of two lenses: one that minimizes the importance of police shootings, or one that ignores the real danger that police face every day. In fact, these trends are mutually reinforcing. A police officer who feels under constant threat also poses a greater likelihood of harm to the individuals he encounters.
In Warsaw, Poland, on Saturday, President Barack Obama said, “Part of what’s creating tensions between communities and the police is the fact that police have a really difficult time in communities where they know guns are everywhere. As I said before, they have a right to come home — and now they have very little margin of error in terms of making decisions.”
Robots won’t fix the underlying fundamentals that contribute to crime in America. They won’t amend outdated drug enforcement laws that contribute to unsustainable incarceration rates; they won’t fix discrimination or pervasive socio-economic inequality. They won’t remove guns from America’s streets, or teach people (including police) how to use guns with greater wisdom. But they can bring down the amount of pure adrenaline present in confrontations between citizens and cops, which could make these situations safer for everyone involved. Maybe, just maybe, they can increase the margin for error.