A Ghost Robotics ground robot accompanies a group of soldiers during a demonstration of robot piloting via brain signals. Australian Army

Soldiers Can Now Steer Robot Dogs With Brain Signals

A small sensor tucked neatly behind the ear allowed soldiers to mentally guide robotic quadrupeds.

A breakthrough that enables a human to guide a robot merely by thinking could help troops on a future battlefield communicate with a wide array of sensors, vehicles, and robots, all while the enemy is looking to intercept radio communications.

A new paper, published this month in ACS Applied Nano Materials by Australian researchers working with the country’s Defence Department, documents how a test subject directed a ground robot to waypoints simply by looking at them through a Microsoft HoloLens.

The U.S. military has achieved some remarkable success with brain-computer interfaces. In 2015, a paralyzed woman with a brain chip developed through the Defense Advanced Research Projects Agency, or DARPA, was able to pilot a virtual F-35 using only brain signals. But such chips must be surgically implanted. And sensors that can be worn on the skin typically require gels for better electrical conduction. That just doesn’t work well for soldiers in helmets.

“The use of the gel contributes to skin irritation, risk of infection, hair fouling, allergic reaction, instability upon motion of the individual, and unsuitability for long-term operation due to the gradual drying of the gel,” notes the paper.

“Until now [the brain-computer interface or BCI] systems have only functioned effectively in laboratory settings, requiring [the user] to wear invasive or cumbersome wet sensors and remaining stationary to minimize signal noise. In comparison, our dry sensors are easy to wear in combination with the BCI. They work in real-world environments and users can move around while using the system,” Chin-Teng Lin, a professor at the University of Technology Sydney and one of the paper’s authors, explained to Defense One in an email.

The graphene-based sensor that the researchers developed works well inside a helmet. The researchers paired it with a Microsoft HoloLens. As the wearer looked at waypoint markers displayed in the HoloLens, his occipital lobe produced steady-state visually evoked potentials: brain responses that oscillate in step with the visual stimulus being watched, which makes them comparatively easy for a computer to pick out. The sensor collected these signals and fed them to a small Raspberry Pi 4B computer, which decoded them into instructions designating a specific waypoint. Those instructions went to a Q-UGV robot from Ghost Robotics, which proceeded to the point.
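The paper describes the decoding only at a high level, but a standard way to read out SSVEPs is canonical correlation analysis, which compares the recorded EEG against reference sine waves at each candidate flicker frequency and picks the best match. The sketch below is a minimal illustration of that general technique in Python, not the researchers’ actual pipeline; the sample rate, flicker frequencies, channel count, and all variable names are assumptions for the example.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

FS = 250                         # assumed EEG sample rate in Hz
FREQS = [7.0, 9.0, 11.0, 13.0]   # hypothetical flicker frequencies, one per waypoint
N_HARMONICS = 2                  # compare fundamentals plus one harmonic

def reference_signals(freq, n_samples, fs=FS, n_harmonics=N_HARMONICS):
    """Build sine/cosine references at freq and its harmonics."""
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)  # shape: (n_samples, 2 * n_harmonics)

def classify_ssvep(eeg):
    """Pick the flicker frequency whose references correlate best with the EEG.

    eeg: array of shape (n_samples, n_channels) from occipital electrodes.
    Returns the index of the winning frequency (i.e., the chosen waypoint).
    """
    scores = []
    for freq in FREQS:
        refs = reference_signals(freq, eeg.shape[0])
        cca = CCA(n_components=1)
        x_c, y_c = cca.fit_transform(eeg, refs)
        # First canonical correlation between the EEG and the reference set
        r = np.corrcoef(x_c[:, 0], y_c[:, 0])[0, 1]
        scores.append(r)
    return int(np.argmax(scores))

# Toy demo: two seconds of synthetic EEG dominated by a 9 Hz SSVEP plus noise.
rng = np.random.default_rng(0)
t = np.arange(2 * FS) / FS
eeg = np.column_stack([
    np.sin(2 * np.pi * 9.0 * t) + 0.5 * rng.standard_normal(t.size)
    for _ in range(4)             # four simulated channels
])
print("Selected waypoint:", classify_ssvep(eeg))  # expect index 1 (9 Hz)
```

In a typical SSVEP interface, each on-screen marker flickers at its own fixed frequency, so the winning index maps directly to whichever waypoint the wearer is looking at.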

The Australian military, which is working with the researchers, tested the system before the paper was published. A video posted to the Australian Army’s YouTube page a month ago describes the successful experiment. In a second demonstration, a commander issued instructions to both robots and fire-team members to conduct a security sweep of an area. The soldiers monitored the robot’s video feed via the HoloLens headset.

“This is very much an idea of what might be possible in the future,” Australian Army Lt. Col. Kate Tollenaar says in the video. “We’re really excited to see where the technology might go and to work with our stakeholders on…use cases.”