The Future Of Flight Training Is A Gamer Headset That Watches You Play
Google and other companies are helping design a smarter, cheaper way to produce aviators.
In the 1986 film Iron Eagle, aspiring fighter pilot Doug Masters rehearses for a dangerous mission in a flight simulator the size of a barn. He can’t practice when no technician is around, or when other pilots need the simulator. Believe it or not, flight training hasn’t changed a great deal since, said Eric Frahm, a program manager with the Defense Innovation Unit, or DIU — though a new Air Force program involving Google and other companies looks to change that, enabling pilots to learn much faster through off-the-shelf gamer equipment and better collection of performance data.
Frahm says that the basics of flight training have remained largely static for decades. “You read about [how to fly]; you do it in a classroom; you take an academic test; you do it in a simulator; then you go do it in an airplane. So you really only get two to three looks at something before it’s time to really perform and essentially get evaluated on it,” he said. The result is “a lowest-common-denominator approach to training.”
The Joint Immersive Training System aims to train pilots-to-be using off-the-shelf gamer gear such as virtual reality headsets. This will enable pilots to train without big, expensive flight simulators. More importantly, it will allow trainers to gather and collate more data about their trainees’ progress on written tests and simulated flights.
“We’re moving that same data infrastructure to a place where we can collect it [from] the aircraft itself, despite the fact that these are very basic training aircraft that don’t have things like a data link or traditional data capture methods,” Frahm said.
That will give instructors insight into student performance that a single test score can’t capture, helping them see trainees’ knowledge and skill gaps beyond their explicit answers.
“We’re going the next step and saying, ‘They got that question right, but it took them five minutes,’” he said, “or, ‘They got that question wrong, and two-thirds of the time when they get that question wrong, this other thing happens three weeks later.’ And so we’re going to proactively let the instructor know that. When they get to that event, the instructor needs to cover that in detail so we know the student knows it.” That kind of analysis should give instructors a more sophisticated picture of a student’s strengths and weaknesses, and a clearer sense of what they need to teach.
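The kind of rule Frahm describes — flag a correct-but-slow answer, and warn the instructor when a missed question historically predicts trouble at a later training event — might be sketched as follows. Every name, threshold, and correlation here is an illustrative assumption, not the actual Joint Immersive Training System implementation.

```python
# Hypothetical sketch of an instructor early-warning rule.
# The question IDs, thresholds, and risk correlations below are
# invented for illustration; they are not real JITS data.
from dataclasses import dataclass

@dataclass
class QuestionResult:
    question_id: str
    correct: bool
    seconds_taken: float

# Assumed historical link: question -> (later training event, share of
# students who miss the question and then struggle at that event).
RISK_LINKS = {
    "Q-17": ("stall-recovery check", 0.66),
}

SLOW_THRESHOLD = 300  # five minutes, per the example in the article

def instructor_alerts(results):
    """Return human-readable alerts for an instructor dashboard."""
    alerts = []
    for r in results:
        # Right answer, but suspiciously slow: the concept may be shaky.
        if r.correct and r.seconds_taken > SLOW_THRESHOLD:
            alerts.append(f"{r.question_id}: correct, but took "
                          f"{r.seconds_taken / 60:.0f} minutes; review the concept")
        # Wrong answer on a question correlated with a later failure.
        if not r.correct and r.question_id in RISK_LINKS:
            event, p = RISK_LINKS[r.question_id]
            alerts.append(f"{r.question_id}: missed; {p:.0%} of students who "
                          f"miss it struggle later at {event}; cover in detail")
    return alerts
```

Feeding in one missed risk-linked question and one slow-but-correct answer, e.g. `instructor_alerts([QuestionResult("Q-17", False, 90), QuestionResult("Q-03", True, 320)])`, would yield two alerts for the instructor to act on before the student reaches the linked event.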
Google is providing the cloud infrastructure to power the platform. “This enables them to use commercial off-the-shelf equipment, virtual reality headsets, joysticks, gaming chairs, so the Air Force doesn’t have to build its own immersive training devices that can be inexpensively built, scaled out,” said Mike Daniels, vice president of Google Public Sector.
It’s a direction the Air Force has been moving toward for close to three years, said Frahm, but one it couldn’t fully realize until now. “What we’ve seen is enough to be convinced that we know this is technically possible to do. What we don’t have is the interconnects to actually start doing it yet,” he said. “That is what Google is helping us [with].”