The Next Step Toward Autopilot in Combat

A U.S. Air Force Captain with the 55th Fighter Squadron prepares his cockpit in his F-16 prior to takeoff from Shaw Air Force Base, South Carolina

Photo by Senior Airman Kenneth Holston, 20th Fighter Wing Public Affairs


The military is looking to roboticize the most exhausting aspects of flying jets.

By Patrick Tucker

Flying a military combat aircraft requires an exceptional amount of decision-making in a very short window, amid constant distraction. Now, the Defense Department wants the defense industry to build it a much better autopilot.

The Defense Advanced Research Projects Agency, or DARPA, announced a new program to build an automatic-pilot kit that could be installed in military planes. The kit, called the Aircrew Labor In-Cockpit Automation System, or ALIAS, would be “rapidly adaptable” to a variety of aircraft and would take on many of the tasks normally associated with piloting military jets.

One of the chief objectives of the program, according to a DARPA release, is to “reduce pilot workload,” which refers to the huge number of decisions pilots have to make when operating aircraft. “We have autopilot for standard stuff, the question is how do you extend those technologies for more complicated missions,” said Mary Cummings, a Duke University professor and former Navy fighter pilot.

“There’s lots of, ‘How do you change load-outs inside of the aircraft in case you’re carrying something and you want to switch which weapons?’ Even what mode, whether you’re in fight or attack mode,” said Cummings, who flew F/A-18 Hornets. Pilot workload is a specialized name for a condition that afflicts everyone: decision fatigue. Every decision a person makes carries a cost in mental resources, and the more decisions a person must make in a short period, the more likely that some of them won’t be quite right. It’s part of the reason human self-control is diminished after stressful interactions. That’s where the ALIAS program comes in.

Pilot workload is part of the reason that 80 percent of commercial aircraft accidents are caused by human error rather than mechanical failure, according to Boeing. For a pilot, decision fatigue can erode the ability to focus and fixate on a goal. It can distort a pilot’s understanding of where she is (situational awareness), alter perceptions of risk and add to distraction. High workload can even produce physical changes in pupil diameter, respiration and heart rate, according to one study at Shanghai Jiao Tong University’s School of Aeronautics and Astronautics. In some flight situations, the workload is split among two or more crew members, which adds to staffing costs.
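For readers curious what “measuring workload” might look like in software, here is a purely illustrative sketch of a composite workload index built from the three physiological signals mentioned above. The data structure, weights and thresholds are invented for this example; they are not drawn from the Shanghai Jiao Tong study or from any fielded cockpit-monitoring system.

```python
# Purely illustrative sketch: a toy composite "workload index" built from the
# physiological signals mentioned above (heart rate, respiration, pupil
# diameter). All names, weights, and thresholds are invented for this example.

from dataclasses import dataclass


@dataclass
class PhysiologicalSample:
    heart_rate_bpm: float        # beats per minute
    respiration_rate_bpm: float  # breaths per minute
    pupil_diameter_mm: float     # millimeters


def workload_index(current: PhysiologicalSample,
                   baseline: PhysiologicalSample) -> float:
    """Score from 0 to 1 comparing current signals against a resting baseline."""
    def rise(now: float, rest: float, assumed_max_rise: float) -> float:
        # Fraction of an assumed maximum rise above the resting value, clamped to 0..1.
        return max(0.0, min(1.0, (now - rest) / assumed_max_rise))

    heart = rise(current.heart_rate_bpm, baseline.heart_rate_bpm, 60.0)
    breath = rise(current.respiration_rate_bpm, baseline.respiration_rate_bpm, 15.0)
    pupil = rise(current.pupil_diameter_mm, baseline.pupil_diameter_mm, 2.0)

    # Equal weighting is an arbitrary choice for this sketch.
    return (heart + breath + pupil) / 3.0


resting = PhysiologicalSample(heart_rate_bpm=65, respiration_rate_bpm=14, pupil_diameter_mm=3.5)
in_flight = PhysiologicalSample(heart_rate_bpm=115, respiration_rate_bpm=24, pupil_diameter_mm=4.8)

if workload_index(in_flight, resting) > 0.6:
    print("High workload: an ALIAS-style system might take over routine tasks.")
```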

Neurological limitations are one reason why machines might, at times, be better pilots than humans. “You don’t want to take away the complex decisions so much as the complex maneuvering that goes along with the decisions,” Cummings said. “For example, it would be better if a computer could take over during air combat maneuvering because a computer can probably fly closer to exact limits than a human can. A computer could sense better that a human is about to pass out than a human can.”

Too much automatic-pilot technology in the cockpit can also make flying far more dangerous, according to experts, which is why getting pilots and automatic flight systems to communicate effectively remains such a hard problem. Barbara Burian, a senior research associate at NASA’s Ames Research Center, has shown that “the presence of advanced technology in the cockpit does not necessarily eliminate high workload events during a flight.” When poorly designed, autopilot tools are more likely to add to pilot workload than to reduce it, her research shows.

Autopilots can also create new dangers if the system has the authority to make too many decisions, as Air Transat Flight 236 showed in 2001. That plane’s automatic fuel system relayed only a very subtle message to the pilots that it was diverting fuel, and did not explain why. Fuel continued to be diverted toward a leaking engine, causing a total loss of engine power in mid-air. The pilots glided the plane to an emergency landing with no fuel remaining.

“The fuel leak that was unrecognized early on was aggravated by the fact that the fuel management system was pulling good, usable fuel from the tanks that weren’t leaking and shoving it into the tank that was,” Ohio State University aviation researcher Shawn Pruchnicki told Defense One. He calls it a “classic example” of how hard it is to design an automatic system that reduces workload but still communicates effectively without “bugging” the pilot.
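To make Pruchnicki’s point concrete, here is a minimal, hypothetical sketch of a fuel-balancing routine that checks for a leak signature before moving fuel and escalates to the crew rather than acting silently. It is not the logic of any real aircraft; the function, burn rates and thresholds are assumptions made up for illustration.

```python
# Minimal sketch, not Airbus or Air Transat logic: a hypothetical fuel-balancing
# routine that checks for a leak signature before transferring fuel and alerts
# the crew instead of acting silently. Every name, rate, and threshold here is
# an assumption made up for illustration.

EXPECTED_BURN_KG_PER_MIN = 40.0   # assumed combined engine burn rate
LEAK_MARGIN_KG_PER_MIN = 5.0      # tolerated discrepancy before flagging a leak
IMBALANCE_LIMIT_KG = 300.0        # imbalance that would normally trigger a transfer


def fuel_transfer_decision(total_drop_kg_per_min: float,
                           left_tank_kg: float,
                           right_tank_kg: float) -> str:
    """Decide whether to auto-balance fuel or hand the decision to the crew."""
    unexplained_loss = total_drop_kg_per_min - EXPECTED_BURN_KG_PER_MIN

    if unexplained_loss > LEAK_MARGIN_KG_PER_MIN:
        # Fuel is disappearing faster than the engines can account for:
        # announce a possible leak and do NOT feed the suspect side.
        return "ALERT CREW: possible fuel leak, automatic transfer inhibited"

    if abs(left_tank_kg - right_tank_kg) > IMBALANCE_LIMIT_KG:
        # Normal imbalance with no leak signature: balance the tanks and say so.
        return "Transferring fuel to balance tanks (crew advised)"

    return "No action required"


# A Flight 236-like situation: total quantity falling far faster than burn alone explains.
print(fuel_transfer_decision(total_drop_kg_per_min=75.0,
                             left_tank_kg=4200.0,
                             right_tank_kg=2600.0))
```

In this sketch the automation stays “strong” without being “silent”: the same rule that inhibits the transfer is the one that produces the crew alert.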

“It’s really difficult to establish that balance between more autonomy and less. Where do you put the human in the loop? That’s something that we’re struggling with in terms of the newer [autopilot] designs,” Pruchnicki said. “We’ve created automated systems that are powerful, strong and silent. And they’ve played a role in accidents where the crews have not been aware of what the automated system is doing.”

Is the ALIAS kit just an intermediate step toward a fully automated Air Force? Cummings says yes. We may live in the age of high-performance unmanned vehicles, but the military aircraft below the decks of aircraft carriers still need pilots. “The kit that they want to build would be adaptable across systems. It’s to fill the gap,” she said.

“What they’re effectively doing is building the kit for the F-35,” which Cummings calls a “ridiculously expensive, very marginally capable” airplane. “What are we going to do with all of those in the future? We might be able to make them more useful by turning them into partially piloted vehicles.” 
