Here’s the Key Innovation in a DARPA AI Project: Ethics From the Start

A new effort to build patrol drones for urban combat began by forming an ethics advisory board.

A DARPA program seeks AI-infused drones that can help prevent friendly fire and civilian casualties in urban battles. But the truly innovative part of the URSA effort might be the inclusion of ethics advisors from the very start.

“It’s the first time we’ve considered this program approach. Certainly [before] we might consider it at the end,” said Lt. Col. Philip Root, program manager for the Urban Reconnaissance through Supervised Autonomy program. “It’s not that program managers shy away from that, but here we wanted to try something different which was invoke this analysis early on, and it’s proven to be absolutely essential.”

Root said URSA aims to collect information about people in complex warfighting environments to help humans understand who is a threat.

“We really want to try to ensure we allow non-hostiles, non-combatants, to move out of the way. Future urban conflict is going to take place in large cities where the population can’t just go to the mountains,” Root said. “So we have to consider that all this is going to occur around people who don’t want to be there.”

Root said the development of technology that interacts with humans is “fraught with legal, moral, and ethical implications,” which is why the ethics team was involved from the outset.

“We met [with the ethicists] even before we had technical performers on contract to begin thinking about the ethical problems we have and actually putting it on paper,” Root said. “Don’t know if the technical [side] will actually work, but we know it will be far more ethical and aligned with our national ethos.”

One of those advisors is Paul Scharre, who directs the Technology and National Security Program at the Center for a New American Security (and is an occasional contributor to Defense One).

“One of the things that DARPA is doing right here is reaching out early in the process to talk to a diverse array of experts on legal, moral, and ethical issues,” Scharre said. “If you reach out early and find out where there might be concerns, then that can drive actually how the technology develops, which is really critical.”

Scharre said he was surprised that URSA was the program to use this approach because supervised autonomy “doesn’t really seem to raise a lot of difficult legal, moral, and ethical quandaries.”

“But, when you get into issues of how you’re presenting information to humans, and what that information is conveying, those things matter quite a bit,” he said.

Here’s how URSA works. Root said DARPA seeks to “replicate a traffic control point, but do so in the wild.” Spotting an unknown person on the street, the drone swoops in to deliver a warning message, then observes how the person responds. It may repeat this several times, depending on whether the person follows the drone’s instructions (“turn north,” for example) or ignores them. All of the information about how the person responded to the drone’s verbal or alarm stimuli, along with video and location data, is passed to a human who will help decide what to do about the situation.
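URSA’s internals aren’t public, but the interaction pattern Root describes (issue an instruction, watch the response, repeat if needed, then hand the record to a person) can be sketched in a few lines of Python. Everything below is hypothetical: the names, the attempt limit, and the data fields are invented for illustration, not drawn from the program.

    # Hypothetical sketch of the interaction loop Root describes.
    # All names, limits, and fields are illustrative, not URSA's design.
    from dataclasses import dataclass, field
    from enum import Enum, auto

    class Response(Enum):
        COMPLIED = auto()   # person followed the instruction (e.g., "turn north")
        IGNORED = auto()    # person did not change behavior

    @dataclass
    class Encounter:
        """The record handed to a human, who makes the final judgment."""
        stimuli_issued: list = field(default_factory=list)
        responses: list = field(default_factory=list)
        video_clip: str = "clip_placeholder.mp4"  # stand-in for video data
        location: tuple = (0.0, 0.0)              # stand-in for location data

    def observe_response(person: dict) -> Response:
        """Stand-in for perception: did the person comply?"""
        return Response.COMPLIED if person.get("cooperative") else Response.IGNORED

    def engage(person: dict, max_attempts: int = 3) -> Encounter:
        """Issue warnings, watch the response, and compile the record.

        Note what is absent: the drone never decides whether the person
        is hostile. It only gathers stimulus/response evidence.
        """
        encounter = Encounter()
        for _ in range(max_attempts):
            encounter.stimuli_issued.append("turn north")  # example from the article
            response = observe_response(person)
            encounter.responses.append(response)
            if response is Response.COMPLIED:
                break                       # compliance: no need to repeat
        return encounter                    # forwarded to a human operator

    if __name__ == "__main__":
        record = engage({"cooperative": False})
        print(f"To operator: {[r.name for r in record.responses]}")

The last line of engage() is the part both Root and Scharre emphasize: the loop ends with a hand-off to a person, not a verdict.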

“We seek to prevent a case where a non-hostile and a soldier or marine come in contact with each other surprisingly. No one wins in that engagement. The longer we can provide information about those who are approaching, everyone in that engagement wins,” Root said.

Ultimately, the judgment on the person’s risk is left to a human, which is important to both Scharre and Root.

“We try to use the autonomy where appropriate, where suspicion is low, and when suspicion increases, revert to a more human-in-the-loop mode,” Root said.
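As a purely illustrative rendering of that rule (the suspicion score and threshold here are invented for this sketch, not URSA parameters):

    # Illustrative only: a threshold rule for shifting from autonomous
    # handling to human review as suspicion grows. The score and the
    # threshold are invented, not URSA parameters.
    def operating_mode(suspicion_score: float, threshold: float = 0.5) -> str:
        """Return who handles the next step at this suspicion level."""
        if suspicion_score < threshold:
            return "autonomous"         # low suspicion: routine drone warnings
        return "human-in-the-loop"      # rising suspicion: a person takes over

    assert operating_mode(0.2) == "autonomous"
    assert operating_mode(0.8) == "human-in-the-loop"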

“The idea of using autonomous systems to gain information about the battlefield or broader environment makes a tremendous amount of sense,” Scharre said. “For me at least, I would distinguish between automation as presenting factual information to people, and automation as making a judgment ... I’m less comfortable with the role of automation in judgments.”

Scharre and Root both said keeping humans on the loop is important because machines lack context.

“Things like ‘is that person a combatant?’ That might be a difficult question to answer … and machines aren’t very good at that today,” Scharre said.

Root said URSA is in development and will enter testing in fiscal 2021.

“It’s important to get the field trials so that we can technically see what it looks like but also ethically, legally, put ourselves in there and say ‘this is different than we thought,’” he said.