These might not be the droids you’re looking for, but these droids are looking for you. The Marine Corps Warfighting Laboratory recently tested a new robot team that works together to look for objects, provide situational awareness, and hunt for “bad actors” on the battlefield.
The test married a ground robot, like an iRobot PackBot, with a small six-rotor aerial drone that docked with it and launched from its back. Working with virtually no human guidance, the pair of robots hunted for objects and reported on their progress.
Defense One caught up with Marine Corps Lt. Col. James Richardson Jr. at the Navy League’s recent Sea-Air-Space conference outside Washington, D.C. He detailed the Unmanned Tactical Autonomous Control and Collaboration, or UTACC, project, and its late-February demo at Carnegie Mellon University.
“The robots were given the task to find [an] object, take a photo of it, send it back to me,” Richardson said. “First, the ground vehicle, after some time, found it, took a photo and sent it to me saying, ‘Hey, I found the target.’ Then we hid the object in a manner that the ground robot couldn’t find it.”
Unbidden, the robot team dispatched the unmanned aerial system, or UAS, to locate the object, he said.
“You didn’t have the Marine controlling it; you had the software actually controlling the UGV [unmanned ground vehicle] and UAS to do the mission. That frees up the Marine from having to sit and control every second,” Richardson said.
The test confirmed that the ground and aerial robot could create and instantly share a 3-D map of a given environment and then autonomously locate objects within it, working as an independent team with a human operator in a purely supervisory role: watching the mission unfold, but providing no guidance.
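UTACC’s actual software has not been published, but the behavior described above — each platform searching the parts of a shared map it can reach, with the team dispatching the next robot unbidden when one fails — can be sketched in a few lines. Everything here (class names, the region-based map, the reachability table) is invented for illustration, not drawn from the program itself.

```python
"""Hypothetical sketch of the UTACC demo's search-and-handoff behavior.
All names and structures are invented; the real system is not public."""


class SharedMap:
    """A jointly built 3-D map, reduced here to labeled regions."""

    def __init__(self, placements, reachability):
        self._placements = placements   # region -> set of object labels
        self._reach = reachability      # robot name -> list of regions

    def regions_reachable_by(self, robot):
        return self._reach[robot]

    def objects_in(self, region):
        return self._placements.get(region, set())


class Robot:
    def __init__(self, name):
        self.name = name

    def search(self, shared_map, target):
        # Each platform scans only the regions it can reach and reports
        # findings against the shared map.
        for region in shared_map.regions_reachable_by(self.name):
            if target in shared_map.objects_in(region):
                return region
        return None


def run_mission(team, shared_map, target):
    # The software engine, not the operator, decides which platform to
    # task next: if one robot fails to find the target, the team
    # dispatches the next one without being told to.
    for robot in team:
        found = robot.search(shared_map, target)
        if found is not None:
            return robot.name, found    # the "I found the target" report
    return None, None


# Mirroring the demo: the object is hidden where the ground robot
# cannot see it, so the aerial system is sent up automatically.
shared = SharedMap(
    placements={"rooftop": {"green marker"}},
    reachability={"UGV": ["hallway", "yard"], "UAS": ["rooftop"]},
)
who, where = run_mission([Robot("UGV"), Robot("UAS")], shared, "green marker")
```

Here the operator only states the goal (“find the green marker”); the order in which platforms are tasked, and the fallback to the drone, happens inside the loop.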
Brig. Gen. Kevin J. Killea called the demonstration the “next level” in unmanned teaming. “The unmanned systems must recognize what they’re being told to do, formulate a plan, and then execute a shared understanding of mission requirements,” he told a crowd at Sea-Air-Space.
In the test, the object in question was a green marker attached to a clipboard. The Marines, obviously, have bigger targets in mind, including people.
“In the future, where we would like to go is use one of the ground robots that we’re currently using in the Marine Corps and a UAV and, say, pick someone out in the crowd or look for a certain vehicle, or something like that … Say we know what Patrick looks like and have something flying around and on the ground. We can say, ‘We’re looking for Patrick because he’s a bad actor. Find him.’ Then the system can take a photo and say, ‘We found him. This is where they’re located.’”
While the most recent demo featured a ground robot and a simple propeller drone, Richardson emphasized that future tests could involve a variety of robots. Keeping UTACC “platform agnostic” is key to its utility.
“The point is to take any sort of UGV and any sort of UAV, with sensors and things of that nature and give them a mission. The meat and potatoes is the software engine that allows the robots to share information,” he said.
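One plausible reading of “platform agnostic” is that any UGV or UAV able to emit and consume a common observation message can join the team. As a rough sketch — the field names and format below are assumptions, not UTACC’s actual interface — the shared information might look like this:

```python
# Hypothetical illustration of a platform-neutral observation message.
# Field names and the JSON encoding are invented for this sketch.
from dataclasses import asdict, dataclass
import json


@dataclass
class Observation:
    platform_id: str   # e.g. "ugv-1" or "uas-1"; the vehicle type doesn't matter
    sensor: str        # camera, lidar, and so on
    label: str         # what was detected
    position: tuple    # location in the shared 3-D map's frame


def publish(obs: Observation) -> str:
    # Serialize to a neutral format so heterogeneous platforms, and the
    # shared software engine between them, can all parse the same report.
    return json.dumps(asdict(obs))


msg = publish(Observation("uas-1", "camera", "green marker", (4.0, 2.5, 10.0)))
```

The point of such a format is that the “software engine” Richardson describes would only ever see these messages, never the platform-specific hardware behind them.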
“Imagine: the Marine operator tells the unmanned systems what to do, not how to do it. This frees him up to work on other tasks while the autonomous systems collaborate together on tasks at hand to accomplish the mission,” Killea said.
Future UTACC projects will also use ground and environment sensors in addition to the sensors on the robots themselves.
Carnegie Mellon has a long history of robotics innovation in collaboration with the Department of Defense. In 2004, the Defense Advanced Research Projects Agency, or DARPA, sponsored its first self-driving car competition. No team finished the course that year, but Carnegie Mellon came the closest. Two Carnegie Mellon teams then took second and third place in the following year’s 2005 DARPA Grand Challenge, the event that ushered in the self-driving car era.
The Marines are planning another test for UTACC in the fall of this year, according to Richardson.