
This article is the fifth in a series on landmark events in aviation since Flight Safety Foundation was founded in 1945.
American author and aviator Ernest K. Gann wrote of his experiences in what some call the golden age of the airlines — the late 1930s to the 1950s. He chronicled a world where captains were kings and rarely questioned. In Gann’s memoir Fate Is the Hunter, he described his role as a copilot: “I was expected to operate the landing gear and flaps on command, keep the log, the flight plan, and my mouth shut.”
But the golden age was not so golden. In 1959, in the United States alone, there were 40 fatal accidents per 1 million aircraft departures. Today, the fatal accident rate is a fraction of that; according to most calculations, it hovers not far above zero.
Improvements in cockpit dynamics and communication, championed by Flight Safety Foundation and other organizations, contributed to that progress. Copilots, or first officers, are no longer expected to keep silent. Captains are expected to encourage crew input to make better decisions and to prevent and mitigate errors. Crew resource management (CRM) can be defined as the effective use of all available resources by flight crew personnel to ensure safe operations, reducing error, avoiding stress, and increasing efficiency.
A 1979 workshop sponsored by the U.S. National Aeronautics and Space Administration (NASA) helped establish the CRM concept. The workshop, titled Resource Management on the Flightdeck, stemmed from NASA research on air transport accidents. Research presented at the event identified the types of human error that lead to accidents, including failures in interpersonal communication, decision-making, and leadership.
Examples included the deadliest accident in aviation history, the March 27, 1977, collision of two Boeing 747s on the ground at Tenerife in the Canary Islands. The accident killed 583 people.
A KLM 747 began a takeoff roll in low daylight visibility at the same time a Pan American World Airways 747 backtracked on the same runway. The investigation found that the KLM captain:
- Took off without clearance;
- Did not obey the “stand by for takeoff” instruction from the tower;
- Did not reject the takeoff when the crew of the Pan Am aircraft reported they were still on the runway; and,
- Replied emphatically in the affirmative when the flight engineer asked if the Pan Am 747 had cleared the runway.
The accident demonstrated in tragic terms what can happen when communication breaks down.
Early research referred to the CRM concept as cockpit resource management. By the time NASA held another workshop in 1986, the name had been changed to crew resource management. Training focused on topics such as team building, briefing strategies, situational awareness, and stress management.
By the 1990s, CRM training had evolved to better reflect the flight deck environment. Airlines began to include modules applying CRM concepts to the use of aircraft automation. This led to memory aids that help prevent mistakes in automation use. An example is CAMI, which stands for confirm, activate, monitor, intervene:
- Confirm the function the crew wants to use;
- Activate that function;
- Monitor aircraft performance; and,
- Intervene if the automation does not do what the crew intended.
Training also began to address human factors such as fatigue, as well as hazardous attitudes that can lead to accidents.
Those attitudes include:
- Anti-authority — Don’t tell me what to do.
- Impulsivity — The weather is marginal, but let’s just try it.
- Invulnerability — I never have a problem with this approach.
- Macho — The rules are for average pilots, and I’m better than average.
- Resignation — What’s the use?
Further refinements led the U.S. Federal Aviation Administration (FAA) to approve a major change in airline training: the advanced qualification program (AQP). AQP training includes CRM concepts put to use in a line-oriented flight training (LOFT) simulator session. The “LOFT ride” takes a captain and first officer on a normal line flight from departure to destination. Along the way, they encounter problems ranging from minor issues such as a runway change to major issues such as an engine failure or fire. For most of these problems, there is no right or wrong answer as long as the flight terminates safely. The pilots are graded not only on how they flew the aircraft but also on how they communicated and considered options.
Evolving CRM research brought an acknowledgement that human error cannot be eliminated entirely. This led to the threat and error management (TEM) concept: If we cannot eliminate error, then how do we minimize, mitigate, and manage it? The origin of TEM can be traced to line operations safety audits (LOSA) conducted by the University of Texas Human Factors Research Project (UT). During the 1990s, UT conducted jump-seat observations with Delta Air Lines and Continental Airlines. Trained observers categorized the origins of errors, the crews’ responses to them, and the outcomes.
This research led to a TEM framework model with three main components:
- Threats – events or errors beyond the control of line personnel. Threats can include a wide variety of things such as weather, malfunctions, air traffic congestion, and disruptive passengers.
- Errors – actions or inactions by line personnel that lead to deviations from intentions or expectations.
- Undesired states – conditions in which an unintended situation results in a reduced safety margin. These conditions can range from relatively minor mistakes, such as turning onto the wrong taxiway, to potentially disastrous situations, such as a runway incursion.
During TEM-oriented training, pilots are debriefed after simulator sessions and asked to identify the threats they faced, how they handled the threats, and the result. In addition, airlines have begun using threat-forward briefings to stay ahead of potential problems. For example, during approach briefings, instead of a rote recitation of data on an approach chart, a captain and first officer discuss anticipated threats: “There are low-level wind shear advisories for our destination. Let’s review the wind shear escape procedure.” Or for a departure briefing: “This is a low-visibility takeoff. Call out ‘center line’ if you see me drifting off it.”
This type of threat-forward briefing came about in the aftermath of the Aug. 14, 2013, crash of an Airbus A300-600 freighter during an approach to Birmingham-Shuttlesworth International Airport in Alabama, U.S. The crash killed the captain and first officer — the only people in the airplane — and destroyed the airplane.1 The U.S. National Transportation Safety Board (NTSB) report on the accident said that because the crew did not re-brief when elements of their planned approach changed, they “placed themselves in an unsafe situation because they had different expectations of how the approach would be flown.”
In 2017, an AeroSafety World article noted that the Birmingham accident was an example of how briefings had become a “one-sided, box-checking” event. The article advocated a more collaborative briefing concept.
To facilitate such collaborative briefings, airlines have begun using briefing aids. These are cards or placards that organize potential threats into categories, such as personal issues, weather, mechanical problems, and others. The aids encourage conversations between captains and first officers, such as “What other threats do you see? Did I miss anything?” Ideally, the most imminent hazards stay top of mind, leaving the crew better prepared to handle them.
In the decades since Ernest K. Gann’s career, the aviation community has learned much about how communication on the flight deck can enhance safety. Gann dedicated Fate Is the Hunter to “old comrades with wings … forever folded.” His list of those comrades runs for five pages and concludes, “Their fortune was not so good as mine.” Lessons learned from such fortunes continue to benefit the flying public today.
Thomas W. Young is a retired airline captain and a former instructor flight engineer with the West Virginia Air National Guard. Young has logged nearly 12,000 hours of pilot and flight engineer time.
Note
- The NTSB said the probable cause of the accident was “the flight crew’s continuation of an unstabilized approach and their failure to monitor the aircraft’s altitude during the approach, which led to an inadvertent descent below the minimum approach altitude and subsequently into terrain.” Among the contributing factors was “the captain’s failure to communicate his intentions to the first officer once it became apparent the vertical profile was not captured.”