Automation refers to control of a process or system by a machine or electronic device. Each automated system requires a different level of monitoring by the user. Some require extensive operator input and monitoring, while others are almost completely independent. For example, entering an elevator car and selecting the desired floor requires minimal monitoring. Once the operator selects a floor, the elevator starts a complex process that delivers the car to the desired location and opens the doors when appropriate — all with minimal operator involvement.
Human-machine interaction researchers have defined eight levels of automation, ranging from systems where the operator must do everything with little help from the automation to those where the automation does everything, ignoring the operator.1 In aviation, automation designed for pilots falls in the middle of this spectrum. This automation level “executes the suggestion automatically, then necessarily informs the human.”1 Aviation’s position along the spectrum has fluctuated over time as avionics and airplane systems have advanced. Compare an early model Boeing 727 with the new Boeing 787. The 727, introduced into airline service in 1964, required extensive pilot involvement and contained modest automation. This level of automation tasked the pilots with computing almost every performance and navigation solution.
In comparison, the 787’s advanced flight management system (FMS) can compute solutions far more accurately than a human can, and is more in line with the machine performing the actions while advising the operator.
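The spectrum described above can be sketched in code. The following is a simplified, hypothetical model; the labels and the six-point scale are illustrative and do not reproduce the published eight-level taxonomy:

```python
from enum import IntEnum

# Illustrative sketch of an automation-level spectrum (hypothetical labels,
# not the published taxonomy): higher values shift work from the human
# operator to the machine.
class AutomationLevel(IntEnum):
    MANUAL = 1                 # operator does everything
    SUGGESTS_ALTERNATIVES = 2  # automation offers options, human decides
    EXECUTES_IF_APPROVED = 3   # human must approve before execution
    EXECUTES_THEN_INFORMS = 4  # executes automatically, then informs the human
    EXECUTES_IF_ASKED = 5      # informs the human only on request
    FULLY_AUTONOMOUS = 6       # acts without regard to the human

def requires_pilot_monitoring(level: AutomationLevel) -> bool:
    """The mid-spectrum levels keep the human in a supervisory role."""
    return AutomationLevel.MANUAL < level < AutomationLevel.FULLY_AUTONOMOUS

# A modern flight management system sits near the middle of the spectrum,
# at the "executes, then informs" level quoted above.
fms = AutomationLevel.EXECUTES_THEN_INFORMS
```

The point of the model is that the middle levels, where current airline automation sits, are exactly the ones that demand active monitoring from the pilot.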
As automation has gained in sophistication and systems integration, the role of the pilot has shifted toward becoming a monitor or supervisor of the automation. Instead of actively controlling many of the processes, pilots are increasingly tasked with evaluating the computed solution and either stopping automated control or allowing it to continue. The paradigm shift is significant, as it requires a different pilot skill set to be added to the traditional “stick and rudder” skills.
Pilots now need to learn new coping and automation management techniques to quickly and accurately interpret the high volumes of automation-generated data in real time and turn them into useful information. The trend toward higher levels of automation will continue in only one direction. With the proliferation of automation-centric technologies such as RNP/AR (required navigation performance/authorization required),2 any idea of “un-automating” aircraft will not be practical if the aviation industry is to meet its goals of increased airspace system capacity, noise mitigation and carbon-emission reduction.
Measuring pilot attitudes about automation and collecting information about automation coping strategies were part of a study by the author on how boredom affects automation complacency in modern airline pilots.3,4 The survey used in the boredom study contained several open-ended questions to pilots about how they perceived the automation and what individual coping strategies they used in connection with it.
The sample group of 273 airline pilots was roughly 4.5 percent of the total pilot population in the major airline from which the sample was drawn. Each pilot was experienced in a highly automated aircraft. The bulk of the sample group — 54.4 percent — were between the ages of 41 and 50, with the next highest group — 28.1 percent — between the ages of 51 and 60. Thirty-six percent flew wide-body aircraft internationally. Finally, 76.8 percent had flown their airplane type for more than two years — which, significantly, allowed time for the pilots to become comfortable in it and establish individual automation attitudes and coping strategies.
Attitudes About Automation
One of the dominant themes that emerged from the question about automation in general was a wide-ranging lament about the effect of automation on maintaining hand-flying skills.
Of the 105 responses to this question, 33 percent indicated that a degradation of traditional flight skills is a significant issue in their daily flying, including how they deal with increasingly complex aircraft and operations. One pilot wrote, “As I hand-fly less, I become more dependent on the automation.” Another pilot described a side effect of automation dependency: “When the automation screws up, trying to play catch-up is hard to do because most pilots have relaxed too much and are not 100 percent in the loop as to where they are.” Yet another wrote, “Too many of my co-pilots fly with automation way too much. Their skills suffer from not hand-flying as much as they should.”
A pilot said, “As experience levels decrease overall in many companies, the automation and the decrease in ‘hand-flying’ training will continue to kill crews and passengers.” One pilot described the role change: “It has forced us to become system monitors more than pilots. I must force myself to be actively engaged. Huge decrease in job satisfaction.”
A pilot summed up the unwanted effect of automation: “I am a line check airman with 36 years in high-performance jets. The majority of pilots that I fly with do not back up the automation with raw data. Basic airmanship has dropped out of the training program. This is reflected by complacency on the flight deck and an unwarranted trust in the automation.”
The level of trust that a pilot can place in automated systems emerged as an issue in roughly 16 percent of the 105 responses on the subject. One principal factor that influences the level of trust is the perceived reliability of the system in question.5–7
Reflecting on this issue, one pilot said, “I use automation but I don’t trust it.” Other pilots echoed this sentiment in comments such as, “I try never to totally trust the automation, and I make every attempt to verify that the automation is doing what I expect.” One pilot reported treating automation as if it were “a student pilot.” Another pilot’s attitude toward automation was to “very seldom let the aircraft automation fly the approach.”
The level of trust guides the level of automation usage when the complexity of a system or the time available prevents complete understanding of the nuances of an automated system. By deliberately mistrusting the automation, pilots bias their attention toward actively monitoring the automated system rather than assuming correct operation and focusing attention elsewhere. Many pilot comments reflected that the perceived reliability of the automated system directly affected both the trust they placed in that system and their level of vigilance. In situations prone to automation errors — in other words, poor reliability — trust decreased, leading to increased vigilance and monitoring. For example, automation mode transitions were reported to be a frequent source of error, prompting specific coping strategies.
One pilot spoke of “treating the automation like a bad copilot and watching everything the airplane is doing while in ‘transitional’ mode.” Another said, “Trust but verify, pay attention to detail, expect the unexpected, be suspicious when things are going too smoothly.”
Several pilots reported consciously verbalizing automated modes as a means of heightening their vigilance and automation situational awareness. They expressed this in such comments as: “With every button push, whether FMC [flight management computer] or autoflight, a confirmation is made verbally”; “Audible callouts, point and say”; “Verification for the other pilot, verbalizing what I observe”; and “Verbalize to the other pilot so he looks also.” This strategy effectively moves automation operation out of the automatic-task domain, where operation occurs subconsciously, into the high cognitive processing area of conscious thought.
Of all the comments by pilots, enhanced vigilance brought about by suspicions about reliability was the most common. A pilot commented, “I never trust automation for altitude capture. I assume it is going to fail.” Another pilot said, “I think it is very important to have personal cross-check and habit patterns where you program the FMS or MCP [mode control panel] and then verify on the FMA [flight mode annunciator]. I don’t think SOPs [standard operating procedures] do enough. Some pilots are very good at cross-checking, and some don’t perform it at all.”
Skepticism about automation led to various pilot coping strategies.
Automation Degradation and Hand Flying
Instead of trusting the automation always to work as advertised, many of the pilots in this study deliberately used less-complex alternative automation modes or different techniques to achieve the same result and remain actively engaged in the flight. One pilot said, “I like to use different modes of automation to monitor the progress of the flight. For example, on the B-737, one can engage the autopilot without the flight director on, using ‘control wheel–steering’ and pitch. I’ll use these modes to ‘capture’ the programmed VNAV [vertical navigation] and LNAV [lateral navigation] modes while monitoring the FMAs on the electronic attitude direction indicator. This requires more of my attention, is more ‘hands on’ and thus keeps my situational awareness at a high level.”
Other remarks added to the theme of downgraded automation usage. For example, “Very rarely do I let VNAV descend the plane. I will use vertical speed or level change”; “I fly with the flight directors off to stay mentally sharp and in the game. Also, autoflight and autothrust are off a lot, too”; and “I prefer VSPD [vertical speed] to VNAV for descents, utilizing the green arc [a display symbol that shows where the aircraft will reach the selected altitude].”
The benefits of such pilot strategies include less boredom and more vigilance, that is, maintaining attention for long, uninterrupted periods.8 Conventional theories on why vigilance suffers over time — the decrease begins after approximately five minutes — used to revolve around the monotony of the activity. Recently, cognitive scientists have determined that vigilance varies directly with the complexity of the task.7 The more cognitively demanding a task is, the more likely the user is to “load shed” and assume correct automation operation instead of allocating the necessary mental resources to monitor it.
Compared to hand-flying an aircraft, reading, interpreting and acting on automation-related information is a far more cognitively intensive process. Cognitive scientists consider reading and interpreting information a high cognitive task and hand-flying an automatic task. In the automatic-task realm, manual control occurs at a subconscious level, can occur in parallel with other activities and can occur very rapidly. For example, if airline pilots need to adjust pitch attitude during a hand-flown approach, they do not need to go through the entire decision-making process — the correction occurs subconsciously and automatically. Contrast this with high cognitive processing, which forces a pilot to think through each individual interaction with the automation. Interestingly, the task that requires the greatest amount of high cognitive function is monitoring items such as aircraft status.9
To lower the level of mental processing required, many pilots choose to hand-fly at times when they could rely on the automation. One pilot summed up this concept: “The more complicated the ‘button pushing’ becomes, the sooner I disconnect the auto systems, including the autothrottles.” Another wrote, “When I become task saturated with programming automation, I click off the autopilot and fly the airplane!”
The data in this survey support the anecdotal comments. Of the entire sample group, 85.3 percent said they hand-fly as much as possible, consistent with weather and fatigue factors. Only 17 respondents, or 6.2 percent, turned the automation on as soon as possible after takeoff, while 33 pilots, or 12.1 percent, kept the automation on as long as possible. Pilots who chose to hand-fly reported varying their autopilot engagement and disengagement altitudes.
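The quoted percentages can be checked against the 273-pilot sample with simple arithmetic; this short sketch reproduces the figures (the `pct` helper is ours, not part of the study):

```python
# Sanity check of the survey percentages quoted above (sample size 273).
SAMPLE_SIZE = 273

def pct(count: int, total: int = SAMPLE_SIZE) -> float:
    """Percentage of the sample, rounded to one decimal place."""
    return round(100 * count / total, 1)

print(pct(17))  # respondents engaging automation as soon as possible -> 6.2
print(pct(33))  # respondents keeping automation on as long as possible -> 12.1
```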
Embracing Traditional Skills
Despite their highly automated fleet, pilots surveyed often suggested a deliberate embrace of traditional aviation skills. Many said they are refocusing on their manual skills and leveraging their experience in less-automated airplanes to help them cope with the advanced automation. According to the pilot observations, an effective strategy has been to apply traditional skills as a backup to the automation. One pilot said, “I call it flying the autopilot. I don’t work as much when watching the flight director bars as I do watching the words and mode changes along with the mode control panel and mode settings/requested changes.”
One pilot recalled a lesson from instrument training: “At every point the aircraft changes course, speed or altitude, such as waypoints or TOD [top of descent] points, I do a ‘six T’ check. Time — is it accurate to the plan? Turn — what direction and NAV [navigation] mode? Throttles — are the autothrottles behaving as planned? Twist — is there something that needs to be programmed, such as the missed approach altitude at glide path intercept on the ILS [instrument landing system]? Track — what course am I tracking to, is NAV engaged correctly? Talk — is there a checklist the crew needs to run, is there a call to ATC [air traffic control], is there a frequency that needs to be preloaded in the radio?”
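The “six T” flow quoted above is essentially an ordered checklist, and can be sketched as a simple data structure. The item names follow the pilot’s description; the prompts and the `run_check` helper are illustrative, not part of any operator’s procedure:

```python
# Hypothetical sketch: the "six T" waypoint check as an ordered checklist,
# so each item can be confirmed in turn. Prompts are paraphrased from the
# pilot's description; the structure itself is our illustration.
SIX_T_CHECK = [
    ("Time", "Is the waypoint time consistent with the plan?"),
    ("Turn", "What direction, and which NAV mode?"),
    ("Throttles", "Are the autothrottles behaving as planned?"),
    ("Twist", "Is anything left to program, such as the missed approach altitude?"),
    ("Track", "What course is being tracked; is NAV engaged correctly?"),
    ("Talk", "Checklist to run, ATC call to make, or frequency to preload?"),
]

def run_check(confirmed: dict) -> list:
    """Return the items that have not yet been confirmed, in checklist order."""
    return [item for item, _prompt in SIX_T_CHECK if not confirmed.get(item, False)]
```

Expressing the mnemonic this way makes the underlying point of the strategy visible: every automation transition point gets the same fixed sequence of questions, regardless of workload.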
Many pilots referred to the fundamentals of flying in their comments regarding individual coping strategies. One wrote: “Cross-check left, right and center instruments. Read aloud FMAs, assigned climb and descent altitudes. Engage autopilot to improve monitoring ability. Disengage and hand-fly whenever I can’t immediately resolve why it’s not doing what I want it to do.”
Many pilots seem adept at blending non-automated habits with automated flight control. One pilot described using traditional methods of verifying waypoint arrival times and fuel burn to compare with the automated solutions. The pilot also uses them as a reminder to check other automation-generated solutions: “Cross-check and confirm glass [navigation display] and switch selection with clearance. Tie existing habits in with new automation requirements such as checking ACARS [aircraft communications and addressing system] ‘howgozit’ [an automated printout tracking waypoint arrival times and fuel burn] reasonableness along with fuel balance, RVSM [reduced vertical separation minimums] altimeter check (all three), and FMC waypoint clearances — all done at the same time.”
Crew Resource Management
One of the more common threads in the comments by the 273 pilots surveyed involved effective crew resource management in coping with the challenges of automation. In addition to verbalizing automation mode changes, many pilots in the sample deliberately sought confirmation and clarification from the other pilot about automation-related actions. This technique is useful in keeping both pilots aware of the current and impending actions of the machine, and provides an effective safety net against possible input errors. Moreover, this technique fosters open communication on the flight deck and enhances situational awareness. Pilots said, “If I’m unsure why the airplane is doing something, I make sure to verbalize it to the other pilot”; “Verify FMS with the other pilot every time a change is made”; and “Confirm proper programming with the other pilot.”
Addressing the Downside
The comments from this sample group indicated strong coping mechanisms and good automation habits to address the downside of advanced automation. Many of the pilots said they developed these strategies independently of airline- and airplane-specific training, reflecting the experience gained and lessons learned after years of daily usage.
Notes

1. Sheridan, T.B.; Parasuraman, R. (2006). “Human-Automation Interaction.” Reviews of Human Factors and Ergonomics 1, pp. 89–129.
2. RNP/AR is the International Civil Aviation Organization (ICAO) terminology for an RNP approach designed and approved according to U.S. Federal Aviation Administration Order 8260.52 or the ICAO RNP/AR manual.
3. Bhana, H.S. (2009). “Correlating Boredom Proneness With Automation Complacency in Modern Airline Pilots.” Unpublished master’s thesis. Grand Forks, North Dakota, U.S.: University of North Dakota.
4. Bhana, H.S. (2010). “Correlating Boredom Proneness and Automation Complacency in Modern Airline Pilots.” Collegiate Aviation Review 28 (1), pp. 9–24.
5. Prinzel, L.; DeVries, H.; Freeman, F.; Mikulka, P. (2001). Examination of Automation-Induced Complacency and Individual Difference Variates. Hampton, Virginia, U.S.: National Aeronautics and Space Administration Langley Research Center.
6. Lee, J.; See, K. (2004). “Trust in Automation: Designing for Appropriate Reliance.” Human Factors 46 (1), pp. 50–80.
7. Bailey, N.; Scerbo, M. (2008). “Automation-Induced Complacency for Monitoring Highly Reliable Systems: The Role of Task Complexity, System Experience, and Operator Trust.” Theoretical Issues in Ergonomics Science 8 (4), pp. 321–348.
8. Sawin, D.A.; Scerbo, M.W. (1995). “Effects of Instruction Type and Boredom Proneness in Vigilance: Implications for Boredom and Workload.” Human Factors 37 (4), pp. 752–765.
9. Dismukes, R.K.; Loukopoulos, L.D.; Jobe, K.K. (2001). “The Challenges of Managing Concurrent and Deferred Tasks.” Proceedings of the 11th International Symposium on Aviation Psychology. Columbus, Ohio, U.S.: Ohio State University.