The Next Step Toward Autopilot in Combat

A U.S. Air Force Captain with the 55th Fighter Squadron prepares his cockpit in his F-16 prior to takeoff from Shaw Air Force Base, South Carolina

Photo by Senior Airman Kenneth Holston / 20th Fighter Wing Public Affairs

The military is looking to roboticize the most exhausting aspects of flying jets.

By Patrick Tucker

Flying a military combat aircraft requires an exceptional amount of decision-making in very short windows, amid constant distractions. Now, the Defense Department wants the defense industry to build it a much better autopilot.

The Defense Advanced Research Projects Agency, or DARPA, announced a new program to build an automatic-pilot kit that could be installed in military planes. The kit, called the Aircrew Labor In-Cockpit Automation System, or ALIAS, would be “rapidly adaptable” to a variety of aircraft and would take on many of the tasks normally associated with piloting military jets.

One of the chief objectives of the program, according to a DARPA release, is to “reduce pilot workload,” which refers to the huge number of decisions pilots have to make when operating aircraft. “We have autopilot for standard stuff, the question is how do you extend those technologies for more complicated missions,” said Mary Cummings, a Duke University professor and former Navy fighter pilot.

“There’s lots of, ‘How do you change load-outs inside of the aircraft in case you’re carrying something and you want to switch which weapons? Even what mode, whether you’re in fight or attack mode,’” said Cummings, who flew F/A-18 Hornets. Pilot workload is a specialized name for a condition that afflicts everyone: decision fatigue. Every decision a person makes carries a cost in mental resources. The more decisions a person must make in a short period, the more likely it is that some of them won’t be quite right. It’s part of the reason human self-control is diminished after stressful interactions. That’s where the ALIAS program comes in.

Pilot workload is part of the reason that 80 percent of commercial aircraft accidents are caused by human error rather than mechanical failure, according to Boeing. For a pilot, decision fatigue can impair the ability to focus or cause fixation on a single goal. It can degrade a pilot’s understanding of where she is (situational awareness), alter her perception of risk and add to distraction. High workload can even produce physical changes in pupil diameter, respiration and heart rate, according to one study at Shanghai Jiao Tong University’s School of Aeronautics and Astronautics. In some flight situations, the workload is split between two or more crew members, which adds to staffing costs.

Neurological limitations are one reason why machines might, at times, be better pilots than humans. “You don’t want to take away the complex decisions so much as the complex maneuvering that goes along with the decisions,” Cummings said. “For example, it would be better if a computer could take over during air combat maneuvering because a computer can probably fly closer to exact limits than a human can. A computer could sense better that a human is about to pass out than a human can.”

Too much automatic pilot technology in the cockpit can also make flying far more dangerous, according to experts, which is why getting pilots and automatic flight systems to communicate effectively remains so difficult. Barbara Burian, senior research associate at NASA’s Ames Research Center, has shown that “the presence of advanced technology in the cockpit does not necessarily eliminate high workload events during a flight.” When poorly designed, autopilot tools are more likely to add to pilot workload than to reduce it, Burian’s research shows.

Autopilots can also create new dangers if the system has the authority to make too many decisions on its own, as evidenced by Air Transat Flight 236 in 2001. That plane’s automatic fuel system relayed a very subtle message to the pilots that it was diverting fuel, but did not explain why. Fuel continued to be diverted to a leaking engine, causing a total power failure in mid-air. The pilot glided the plane to an emergency landing with no fuel.

“The fuel leak that was unrecognized early on was aggravated by the fact that the fuel management system was pulling good, usable fuel from the tanks that weren’t leaking and shoving it into the tank that was,” Ohio State University aviation researcher Shawn Pruchnicki told Defense One. He calls it a “classic example” of how hard it is to design an automatic system that reduces workload but still communicates effectively without “bugging” the pilot.

“It’s really difficult to establish that balance between more autonomy and less. Where do you put the human in the loop? That’s something that we’re struggling with in terms of the newer [autopilot] designs,” Pruchnicki said. “We’ve created automated systems that are powerful, strong and silent. And they’ve played a role in accidents where the crews have not been aware of what the automated system is doing.”

Is the ALIAS kit just an intermediate step to a fully automated Air Force? Cummings says yes. We may live in the age of high-performance unmanned vehicles, but the military aircraft below the decks of aircraft carriers still need pilots. “The kit that they want to build would be adaptable across systems. It’s to fill the gap,” she said.

“What they’re effectively doing is building the kit for the F-35,” which Cummings calls a “ridiculously expensive, very marginally capable” airplane. “What are we going to do with all of those in the future? We might be able to make them more useful by turning them into partially piloted vehicles.”
