This Could Be the Future of Battlefield Robotics
The competition is on for a better robot-steering system. By Patrick Tucker
The floor of the Walter E. Washington Convention Center in Washington, D.C., was an obstacle course this week, as the Association of the United States Army convention brought together, among other defense contractors, various robot makers from around the world to demo their goods to military leaders and the curious.
Piloting systems were a big draw.
Despite the drawdown in Afghanistan and the end of the war in Iraq, the military still needs robotic systems to detect improvised explosive devices, provide tactical intelligence, and look around the next corner, so to speak.
In the up-and-coming category is a young Israeli firm called Roboteam, which markets a variety of robots, the coolest of which is the Micro Tactical Ground Robot, or MTGR. The two-year-old MTGR made some news last summer as Israeli Defense Forces deployed the robots to search out Hamas tunnels in Gaza.
At around $70,000 for a no-frills unit, it’s much cheaper than competing systems like the recently unveiled QinetiQ Talon V or the iRobot Packbot, which are priced starting at $100,000. The MTGR is also less rugged, but that may not be a problem for a robot that’s been designed to find stuff that blows up.
Which system is most likely to show up on the battlefield of tomorrow? There are a number of different metrics and competitions that the military is using to determine that, everything from interoperability testing to the so-called Culvert Denial Challenge, a $50 million competition to find the best system to detect and inspect IEDs (and possible IEDs) that insurgents might place along roadside culverts in places like Afghanistan.
The competition took place on Oct. 10 at Fort Benning, Ga., and while the results have not yet been released, Corey Capone, a product manager from Roboteam North America, told Defense One that the company did “extremely well.”
The next step for robot manufacturers is to make their machines easier to operate in groups. Last week, iRobot released a new Android piloting system called uPoint Multi-Robot Control. Defense One tested it on site and found it fast, intuitive and not unlike an iPad-based first-person-shooter video game.
The view is what the robot sees from one of its designated cameras. Press a button and you can steer it without a joystick, simply by moving your finger on the screen. The Android tablet detects the finger movement and plots where the user might send the machine in the form of yellow trajectory lines that show up on the screen. These let the user know whether they are about to send their $100,000 robot off a cliff. You can also designate a spot within line of sight and the robot will move there automatically. And you can switch between the different robots in the network as easily as you might switch between apps.
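The idea behind those yellow trajectory lines can be sketched in a few lines of code. The following is a minimal, hypothetical illustration, not iRobot's actual implementation: it assumes a simple control mapping in which dragging up sets forward speed and dragging sideways sets a turn rate, then projects the resulting arc as the points a guide line would trace on screen. The function name, scaling constants, and motion model are all invented for illustration.

```python
import math

def predict_trajectory(dx, dy, steps=10, dt=0.5):
    """Project a guide-line arc from a finger drag of (dx, dy) pixels.

    Hypothetical mapping: dragging up (negative dy) means forward speed,
    dragging sideways (dx) means turn rate. Returns the (x, y) points a
    tablet UI could draw as the robot's predicted path.
    """
    speed = -dy * 0.1        # drag up = drive forward (assumed scaling)
    turn_rate = dx * 0.01    # drag right = turn right (assumed scaling)
    x, y, heading = 0.0, 0.0, math.pi / 2   # start facing "up" the screen
    points = []
    for _ in range(steps):
        heading -= turn_rate * dt           # integrate a simple unicycle model
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        points.append((round(x, 2), round(y, 2)))
    return points

# A straight upward drag predicts a straight-ahead path:
print(predict_trajectory(0, -100))
```

The point of drawing the prediction before committing is exactly what the article describes: the operator sees where the machine is headed, cliff included, before the command is sent.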
Roboteam’s response is its Tactical Situational Awareness, or TacSA, system, a piece of battlefield-intelligence software that manages camera and other data feeds on the MTGR and other Roboteam bots. The software is undergoing final beta-testing now and the company plans to officially unveil it in January. It’s a bit less fun but more informational, providing the user with a bird’s-eye view of the different robot assets in the area as well as their status. The company's ROCU7 controller allows for easy switching between assets, as many as 200 MTGRs, but without simultaneous control. (The video-game equivalent would be Age of Empires.)
Within a matter of months, the company wants to integrate multi-robot control into the interface so that a user could pilot two robots at the same time: an MTGR or other Roboteam bot and a drone from Stark Aerospace (aided by sensor support from Israel Aerospace Industries). The drone would hover in what they call “loiter mode,” collecting video or other data, while the operator controlled the MTGR and then switched back and forth.
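The control pattern described here, one operator, one actively piloted asset, everything else idling in loiter mode, can be sketched as a simple state machine. This is a speculative illustration of the concept only; the class and method names are invented and bear no relation to Roboteam's or iRobot's actual software.

```python
class OperatorConsole:
    """Hypothetical sketch of single-operator, multi-asset control:
    one asset is actively piloted while all others hold in loiter mode."""

    def __init__(self):
        self.assets = {}    # asset name -> current mode
        self.active = None  # name of the asset under direct control

    def register(self, name):
        self.assets[name] = "loiter"    # new assets idle until selected

    def take_control(self, name):
        if self.active is not None:
            self.assets[self.active] = "loiter"  # hand the old asset back
        self.active = name
        self.assets[name] = "piloted"

console = OperatorConsole()
console.register("MTGR-1")
console.register("drone-1")
console.take_control("MTGR-1")   # drive the ground robot...
console.take_control("drone-1")  # ...then switch; MTGR-1 drops to loiter
```

Switching back and forth is just repeated calls to `take_control`, which is why the article's app-switching comparison fits.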
At some point, the military will want robots capable of running their own missions, hunting for IEDs, looking around corners and sending visual data to the cloud for rapid—and robotic—visual analysis and all without direct piloting. A single operator would be able to control dozens of robots that weren’t just loitering but carrying out operations.
But that’s a technical challenge that vendors are unlikely to have to tackle themselves. The robotics community will do the work and contribute the relevant code to the Robot Operating System, or ROS, an open-source software library developed by Willow Garage.
Wireless ad hoc networks, also called wireless mesh networks or WMNs, will enable robots to keep in contact across greater distances. These are networks of radio nodes, routers, phones and laptops sharing encrypted communication. Because the communication is not point to point, it’s difficult to jam. Better mesh networks mean that the robots of the future will be able to move farther and operate more autonomously.
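Why a mesh resists jamming can be shown with a toy routing example. The sketch below, with an invented network layout, finds a path through a mesh by breadth-first search, assuming each node can relay traffic for its neighbors: knock out one relay and traffic simply flows around it. Real mesh protocols are far more sophisticated, but the redundancy principle is the same.

```python
from collections import deque

def find_route(links, src, dst):
    """Breadth-first search over a mesh adjacency map.

    Because any neighboring node can relay traffic, no single link
    is critical: losing one relay just forces a different path.
    """
    frontier = deque([[src]])
    seen = {src}
    while frontier:
        path = frontier.popleft()
        if path[-1] == dst:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None  # no route at all

# A toy mesh: the robot can reach base through either relay A or relay B.
mesh = {
    "robot": ["A", "B"],
    "A": ["robot", "base"],
    "B": ["robot", "base"],
    "base": ["A", "B"],
}
print(find_route(mesh, "robot", "base"))   # a relay path exists
jammed = {k: v for k, v in mesh.items() if k != "A"}  # relay A is jammed
print(find_route(jammed, "robot", "base")) # traffic reroutes via B
```

A point-to-point link offers a jammer one target; a mesh forces it to silence every relay at once.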
A little shop like Roboteam doesn’t need a bunch of artificial-intelligence geniuses on staff. They just need to figure out the best way to integrate peripheral components like chemical sensors, cameras and infrared scanners into the system, and to make all of that data accessible to users with less annoyance. Most importantly, they need to deliver an interface that will allow people with very little training to direct robots to perform vital and ever more complicated tasks.