Marine lab is rolling out new robotic vehicles

The Corps' multi-tasking machines show how far robotics has come -- and how far it still has to go.


MAARS takes in some target practice on a firing range.


Humanoid robots are getting a lot of attention these days, particularly with next month’s DARPA Robotics Challenge on tap, but the military has long relied on radio-controlled “tractor-type” robots—IED-hunting mini-tanks, mostly—as regular tools. An entire Pentagon enterprise—the Joint Improvised Explosive Device Defeat Organization (JIEDDO)—exists solely to counter the IED threat. But the capabilities of these unmanned ground vehicles (UGVs) have moved well beyond familiar “find ‘em, explode ‘em” tasking.

The Marine Corps, for example, is working on several projects for ground robotic vehicles that show both how far robotics has come and how far it has to go.

Take MAARS. Built by QinetiQ North America/Foster-Miller, the U.S. arm of Britain’s QinetiQ, the Modular Advanced Armed Robotic System is a true multi-tasker. An advanced test bed dating to 2008 and laden with technology, it can carry various combinations of ISR sensors, such as pan-and-tilt day/night and zoom-lens cameras, a FLIR night camera, thermal imagers, front, rear and infrared drive cameras, and laser rangefinders that work out to 10 kilometers. It can also include a high-intensity spotlight, a siren, a “dazzler” to temporarily blind restive crowds, a voice projector, an M240 machine gun, a hostile-fire detection system, an Uzi submachine gun, even smoke generators if it needs to quit the battle space. Not to mention a quad 40-millimeter rocket or grenade launcher.

Like its Army counterparts, MAARS is tele-operated: the controller sits at a remove, tapping a keyboard and watching a computer monitor. A wearable controller is a possibility. For now, its range is line-of-sight.

The project comes under the Combat Robotics System (CRS) program.  “We started CRS to understand how the dynamics of man-machine interactions would work,” said Capt. James Piniero, a Marine Warfighting Lab robotics project lead handling multiple portfolios. “And essentially, we’re less concerned with the specific equipment piece than with concept-based experimentation.”

The focus is on how robots might help infantry Marines, just as they already assist static-post guards, explosive ordnance disposal crews and engineers. “MAARS acts as an advanced optics suite with a direct-fire weapon aboard,” said Piniero, who spoke with Defense Systems recently after an Association for Unmanned Vehicle Systems International convention, where MAARS was showcased.

Its multiple capabilities notwithstanding, MAARS isn’t perfect—though its limitations aren’t with the platform but with how it’s controlled. Marine ground robots in general “don’t yet have the level of autonomy required to navigate” glitchlessly, or understand certain basic commands, Piniero said.

That lack of autonomy increases the “cognitive load” on Marines when driving or controlling MAARS during offensive operations. “They just don’t have the time to deal with it, and it becomes a burden instead of something that helps,” he said. The Marines, like the other military services, want to include “intelligent behavior” in future autonomous systems, meaning systems that are easier to work with.

The Marines are still tweaking MAARS, expecting to improve its operations by year’s end. Meanwhile, the Marine Warfighting Lab has a “brand-new” tracked-vehicle project—the Robotic Vehicle-Modular (RV-M), Piniero said.


The Marines’ new Robotic Vehicle-Modular on display at a recent conference.


At about 800 pounds, the next-gen vehicle is smaller than a jeep but larger than MAARS. It’s also highly mission-configurable, like MAARS, but with lots more “elbow room.”

The RV-M project uses a one-off Polaris Defense/TORC Robotics vehicle. It carries the Marines’ Ground Unmanned Support Surrogate (GUSS) autonomy package, yielding, Piniero said, “a higher level” of autonomy than MAARS. He’s planning for a remote weapons station for direct-fire experiments, plus a targeting package with a laser designator. MAARS-type ISR pieces, optics and other components are also under discussion.

A real draw is the vehicle’s adaptability as a platform for the fearsome Javelin missile. (It can host a heavy machine gun, as well.) Another near-horizon prospect is communications and intelligence-sharing with other vehicles.

As part of another project, Unmanned Tactical Autonomous Control and Collaboration (UTACC), the robots share information. “If one identified a target it can pass it to another of the same or different type, say, aerial – for them to have a shared awareness of the battlespace. ... If the air robot sees something the ground can’t see, together they can solve a complex problem.”

A February demo validated this. A limited reconnaissance mission, Piniero explained, was done in a lab environment using a ground robot and a quad-copter, “and [organizers] placed a ball in there and the machines had to ID the object. It was in a place where the ground robot couldn’t see or find it, and because it couldn’t they launched the [unmanned aerial vehicle], which found it, snapped a picture, communicated with the ground saying, ‘yes I found it and pictured it,’ then alerted the Marine and said ‘is this the object you’re looking for?’ ”
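
The article doesn’t describe the software behind that handoff, but the pattern is easy to sketch: the ground robot searches, fails, tasks the air robot, and the air robot reports back with an image for the Marine to confirm. The Python sketch below is purely illustrative; the class names, fields and search steps are assumptions, not part of UTACC.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of the shared-awareness handoff from the February demo.
# None of these names come from UTACC; they only illustrate the pattern:
# the ground robot can't find the target, so it tasks the air robot, which
# reports an image back for the Marine to confirm.

@dataclass
class Detection:
    found: bool
    image: Optional[str] = None       # frame ID of the snapshot, if any
    position: Optional[tuple] = None  # rough location of the object

class GroundRobot:
    def search(self, target: str) -> Detection:
        # In the demo scenario, the ground robot's sensors can't see the ball.
        return Detection(found=False)

class AirRobot:
    def search(self, target: str) -> Detection:
        # The quadcopter spots the object and snaps a picture.
        return Detection(found=True, image="frame_042.jpg", position=(3.2, 1.7))

def reconnaissance(target: str, ground: GroundRobot, air: AirRobot) -> None:
    result = ground.search(target)
    if not result.found:
        # Hand the task off to the aerial platform for shared awareness.
        result = air.search(target)
    if result.found:
        # Alert the Marine and ask for confirmation, as in the demo.
        print(f"Found candidate for '{target}' at {result.position}.")
        print(f"Is {result.image} the object you're looking for?")
    else:
        print(f"'{target}' not located by either platform.")

reconnaissance("ball", GroundRobot(), AirRobot())
```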

Envisioned within several intersecting, parallel “future requirements” projects is “a robot or family of robots and then of course learning the man-machine interactions ... in support of a Marine infantry squad or company. CRS/MAARS was a piece of that.”

In a combat patrol, a “drives-by-itself” autonomous system frees the Marine, who can tell the machine to go ahead or return to its last known point, gaining crucial time and space. “So it eliminates the need to provide constant input so you can take your attention away from just controlling it and being in the fight.”

RV-M’s operational range is limited by the radio in use at the time, “but that’s where the autonomy takes over, because that doesn’t necessarily need direct input from the controller. Essentially, it’s line-of-sight,” he said.

Regarding so-called “swarming” or “ganging” of robots, he said, “Yeah – that’s exactly what we’re driving towards. That command-and-control project is called Unmanned Tactical Autonomous Control and Collaboration (UTACC) – and that’s the overarching backbone of all of this. RV-M hits on modularity and multi-mission packages for the infantry squad or company [with] a higher level of mission autonomy.” And, he added, with minimal Marine input.

“IROC, Intuitive Robotic Operator Control, focuses on [robots recognizing and obeying] hand/arm signals, possibly incorporating radio-frequency options.”

Basically, it’s about getting the robot to come, to back up or maneuver left or right. “So in case I lose radio frequency or the radio’s malfunctioning, can the robot receive my inputs and can I have basic control of it?” 
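
Piniero doesn’t detail how IROC translates signals into motion, but the fallback logic he describes, prefer the radio link and drop back to recognized gestures when it fails, can be sketched in a few lines. The gesture labels and command names below are invented for illustration, not IROC’s actual interface.

```python
from typing import Optional

# Illustrative only: a toy mapping from recognized hand/arm signals to the
# basic maneuvers Piniero describes (come, back up, maneuver left or right).

GESTURE_COMMANDS = {
    "beckon": "come_to_operator",
    "palm_push": "back_up",
    "point_left": "maneuver_left",
    "point_right": "maneuver_right",
    "fist": "halt",
}

def control_input(radio_ok: bool, radio_cmd: Optional[str], gesture: Optional[str]) -> str:
    """Prefer the radio link; fall back to gesture recognition if the radio drops."""
    if radio_ok and radio_cmd:
        return radio_cmd
    if gesture in GESTURE_COMMANDS:
        return GESTURE_COMMANDS[gesture]
    return "hold_position"  # safe default when no valid input is available

# Radio malfunctioning; the operator signals the robot to back up.
print(control_input(radio_ok=False, radio_cmd=None, gesture="palm_push"))
```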

Platform-to-platform data sharing, he reiterated, is central to UTACC: a robot that identifies a target can pass it to another of the same or a different type, an aerial platform, say, so the machines build a shared awareness of the battlespace, a “common picture,” and together solve problems neither could alone.

The Marines are hoping all the UGV initiatives pan out by 2018. “We finished year one in February with that UTACC demonstration at Carnegie Mellon, and now we’re in the second year and we’ll have another demo in the Washington, D.C., area around December.”

“The biggest thing to fight on the ground and be safe is to have time and space from the threat,” Piniero said. “And that’s what autonomous systems buy us, because you can put an enhanced camera on it; you can have software that does change detection, visual recognition and so on.”

It’s sensor fusion, he concludes, that creates intelligence. “And that allows Marines to understand the threat, understand the environment quicker, and basically see the threat before it sees them.”
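
Of the software capabilities Piniero mentions, change detection is the most concrete to picture: compare a current camera frame against a reference and flag what has changed. A minimal sketch follows, with an arbitrary threshold and toy frames standing in for real imagery; fielded systems layer image registration, noise filtering and object classification on top of this.

```python
import numpy as np

def change_mask(reference: np.ndarray, current: np.ndarray, threshold: float = 25.0) -> np.ndarray:
    """Flag pixels whose intensity changed by more than `threshold` between frames."""
    diff = np.abs(current.astype(np.float32) - reference.astype(np.float32))
    return diff > threshold

# Two fake 8-bit grayscale frames; a bright object "appears" in the second one.
ref = np.zeros((120, 160), dtype=np.uint8)
cur = ref.copy()
cur[40:60, 70:90] = 200

mask = change_mask(ref, cur)
print(f"{mask.sum()} changed pixels flagged")  # 20 x 20 = 400
```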
