Army avatars have the human touch
An ARL-sponsored program is using natural language processing to create virtual humans who can interact with and even counsel soldiers.
Virtual Staff Sgt. Jessica Chen helps soldiers practice basic counseling skills.
Ellie is a good listener. She knows how to react to smiles, frowns and gestures, when to prod for more details and when to give a compassionate, empathetic response.
Ellie is a computer program, an avatar created by the Army Research Laboratory and the University of Southern California’s Institute for Creative Technologies as part of a project to improve human-computer communication for immersive training, education and even counseling.
Research at ICT, an Army-funded research center, combines natural language, text, image and video processing to develop multi-modal human-robot dialogue, according to an ARL news release. The project has produced a variety of autonomous intelligent agents—virtual humans—that can act as mentors, role players in virtual exercises or screeners for tasks such as teaching leadership skills or helping to prevent suicide, sexual assault and harassment.
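To make the idea of multi-modal dialogue concrete, here is a minimal, purely illustrative sketch of a policy that fuses a verbal cue with a nonverbal one to pick a response. All names and rules here (`PerceivedState`, `choose_response`, the keyword lists) are hypothetical simplifications, not ICT's actual system, which relies on far richer language and vision models.

```python
# Toy multi-modal dialogue policy: combine what the user said (text) with
# what a vision module perceived (facial expression) to choose a reply.
from dataclasses import dataclass

@dataclass
class PerceivedState:
    utterance: str      # transcribed speech from the user
    expression: str     # coarse vision-module label: "smile", "frown", "neutral"
    words_spoken: int   # running word count, a crude engagement proxy

def choose_response(state: PerceivedState) -> str:
    """Fuse verbal and nonverbal cues to pick a dialogue move."""
    text = state.utterance.lower()
    # A distress signal in either channel gets an empathetic follow-up.
    if state.expression == "frown" or any(w in text for w in ("sad", "alone", "stressed")):
        return "That sounds really difficult. Can you tell me more about that?"
    # Very short answers prompt an open-ended question ("prod for more details").
    if state.words_spoken < 5:
        return "I see. What happened next?"
    if state.expression == "smile":
        return "I'm glad to hear that. What made it a good experience?"
    return "Mm-hmm. Go on."

print(choose_response(PerceivedState("I've been feeling stressed", "neutral", 12)))
```

Even this toy version shows why fusing channels matters: a frown can override neutral-sounding words, which is the kind of reaction the article attributes to Ellie.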
Ellie, for example, has interviewed more than 600 people with an eye toward identifying depression and PTSD, as part of ICT’s SimSensei project, which is funded by the Defense Advanced Research Projects Agency. And according to a recent study, the Army said, people were more willing to reveal their feelings to Ellie than to a real person.
The idea is to enhance the military’s growing use of immersive training by building on research into realistic avatars, virtual humans and intelligent agent technologies that can provide easy, cost-effective access to training and other services.
"Research and technology are essential for providing the best capabilities to our warfighters," said Dr. Laurel Allender, director of the ARL Human Research and Engineering Directorate. "This is especially so for the immersive and live-training environments we are developing to achieve squad overmatch and to optimize soldier performance, both mentally and physically."
ICT’s projects include the Virtual Standard Patient, or VSP, in which medical students engage with virtual patients to practice their interview and diagnostic skills, and the Emergent Leader Immersive Training Environment. ELITE uses an ICT-developed natural language understanding technology for applications such as Sgt. Star, who takes questions about Army careers, and Staff Sgt. Jessica Chen, who helps soldiers practice basic counseling skills. The ELITE trainer can be downloaded at the Mil.Gaming portal and is used by the U.S. Military Academy, ROTC, the Basic Officer Leader Course and the Warrior Leader Course, ARL said.
The ARL/ICT project is one of several underway within the Defense Department to take human interaction with computers to the next level. DARPA, for instance, is working on something similar with its Communicating with Computers program, which aims to get humans and computers to collaborate verbally on solving medical and other problems.
ICT is combining basic and applied research in natural language processing with a variety of other areas, such as entertainment and storytelling technologies, to advance education, health and performance.
"Our scientists are leaders in the fields of artificial intelligence, graphics, virtual reality and computer and story-based learning and what is unique about our institute is that they bring their disparate expertise together to find new ways to solve problems," said Randall W. Hill Jr., ICT executive director. "Being managed by ARL also provides great opportunities for collaboration and for aligning our research priorities with Army needs."