Aviation is a visually guided behavior that requires constant application in the pursuit of perfection. Granted, perfection may never be reached, but a professional strives for it. Famed football coach Vince Lombardi touched on this when he said, “Perfection is not attainable, but if we chase perfection we can catch excellence.” The drive for perfection should not be confused with the precision that technology and automation bring to aviation operations. Even with advanced systems, any pilot can be “precisely imperfect” if the automation is not used correctly. Thus, each pilot’s pursuit of perfection defines his or her professionalism and, consequently, keeps airplanes flying safely even when technologies fail.
When a student pilot learns how to hand-fly the flare and touchdown during the landing phase, the words “aim point, airspeed” might be taught as a technique. This simple, easy-to-repeat phrase focuses on the essentials of landing an airplane. The aim point cue refers to a selected spot on the runway with which the pilot aligns a fixed point on the windshield (or other displayed indications) while maintaining the desired glide path. The airspeed cue is the target approach speed, at or above the minimum safe approach speed, to be maintained while flying the aircraft to that point.
Furthermore, each of the pilot’s hands performs one part of the phrase. Touchdown at the aim point requires one hand on the stick/yoke (controlling pitch) and the other hand on the throttles (controlling thrust/speed). Together, these control inputs produce the flight path vector toward the selected touchdown point. Touchdown at the aim point requires the appropriate airspeed, and the appropriate airspeed requires a stable aim point. Even if the autopilot is engaged, the pilot must monitor the automation against these same visual cues.
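As a rough sketch of the underlying geometry, assuming a standard 3-degree glidepath, the descent rate required to hold a stable aim point follows directly from groundspeed:

\[
\text{descent rate (ft/min)} = V_g\,[\text{kt}] \times \frac{6{,}076\ \text{ft/nm}}{60\ \text{min/hr}} \times \tan\gamma \;\approx\; 101.3\,V_g \tan\gamma,
\]

where \(V_g\) is groundspeed and \(\gamma\) is the glidepath angle. At 130 kt groundspeed, with tan 3° ≈ 0.052, this works out to roughly 690 ft/min, the basis of the familiar “groundspeed times five” rule of thumb. When descent rate or airspeed drifts from such targets, the aim point slides on the windshield, cueing the pilot flying, and the pilot monitoring, that pitch or thrust needs correcting.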
Despite everything else going on in a busy cockpit and runway environment, those two aspects of the landing are crucial priorities just before touchdown. In one sense, then, safety is not about the automation; it is about the effort by the pilot flying to achieve a perfect aim point and airspeed.
By some estimates, human factors play a causal role in 70 to 90 percent of aircraft accidents involving complex aircraft systems. So the “human in the loop,” often considered a point of strength in complex engineered systems, can become a rare but tragic source of failure. Automation and other advanced technology in the cockpit are intended to optimize human performance. Whether automation fulfills its promise to decrease workload and improve safety, however, continues to be debated.1 Pilot monitoring errors, mode confusion, unexpected automation behavior and human-computer interface issues have been around for decades, though they are often oversimplified as “pilot error.” What was intended to help pilots sometimes only changes the types of errors made; this is the paradox of technology.2
Regardless of the technology in the cockpit, pilots remain ultimately responsible for the safe operation of their aircraft. Navigation instruments and aids are just that: technology to assist the pilot or flight crew in controlling the flight path. Automation, likewise, is an aid, capable of precise flight path control, procedure coordination and communication. In the end, however, the pilot-in-command must make the appropriate decisions and visually guide the aircraft.
In 1994, a U.S. National Transportation Safety Board (NTSB) report addressed the pilot monitoring and challenging errors involved in 31 of the 37 major U.S. airline accidents between 1978 and 1990.3 These were not errors in monitoring the automation so much as failures of one pilot to effectively monitor the other pilot — failures to keep an eye on what the other person in the cockpit was doing.
Now, pilots may struggle with monitoring the autopilot as well as the other pilot. Should we be surprised? Looking at the accidents in commercial air transport today, we can conclude that automation is not the only problem. Maybe professionalism has been the issue all along, and automation has recently deflected our attention from it. Pilots must take the lead.
Analyses of human error in aviation accidents have been going on as long as airplanes have been flying, and crashing. Some recent theories center on the essential nature of aviation accidents: they occur not primarily because of significantly unusual human behavior, but in the course of everyday people doing their everyday jobs — so-called normal accidents.4 This view describes an incremental drift away from the safety margins originally established to protect the aviation system from the mistakes of individuals.
Under this newer theory, over time there is a normalization of deviance,5 a series of small, seemingly trivial decisions and actions that stray from the safety margins built into engineered equipment and system designs and into standard operating procedures.6 Years go by, and the non-standard becomes the standard, often under pressure from economic considerations, decentralized delegation of responsibility and limited resources — all drifting toward failure.7 Corners get cut, resources are shifted away from safety efforts,8 and latent errors develop within the organization.9
On the surface, things appear fine. The absence of accidents and incidents provides an illusion of safety. Then one day, while normal work is being performed by a typical employee, a “blind spot” for that person or some other “system trigger” causes the first domino to fall in a causal chain. Then — due to factors such as a slim margin for error, minimal resources, poor training, opaque systems and low accountability — an accident occurs, and we all ask, “Why?”
Automation is not the only problem. The common denominator underlying many aircraft accidents has been simply a lack of professionalism: people failing to strive for perfection. Mediocrity can become an accomplice in enabling unsafe practices.
The descent from excellence is not a matter of one unsafe act or a single instance of less-than-stellar performance; it results when professionalism is allowed to erode over time. Rarely do accidents occur because a person “gambles and loses”10 or because a person intentionally performs the incorrect procedure. Rather, the risk of an accident increases because people routinely do their usual work at a sub-par level, and “because people do not believe that the accident about to occur is at all possible.”11 Pilots, for example, have let their guard down and placed too much reliance on the automation, forgetting who is in charge. Again, technology will help pilots become more precise, but not more professional.
In flight operations, professionalism requires that we do the right thing, at the right time and in the right manner, regardless of whether the task is a manual or automatic procedure. Automation is just another tool, providing help but constantly in need of strict supervision.
In summary, pilots still need to fly the airplane. In doing so, they also must strive for perfection because the consequences can be catastrophic if they do not. If we can normalize deviance, we can normalize excellence.12 Instead of drifting toward failure, we purposely set our course toward success. Each person involved in the aviation industry is empowered to take responsibility — to study, train, prepare and deliver — and is accountable for his or her performance. As professionals, we need to strive for perfection and catch excellence on final for landing.
Randy Gibb, Ph.D., is the dean of the Colangelo College of Business at Grand Canyon University, in Phoenix, Arizona, U.S. He served for 26 years as a pilot in the U.S. Air Force. His aviation interests include human factors, safety and leadership development.
Notes
1. Dekker, Sidney. “On the Other Side of Promise: What Should We Automate Today?” In Human Factors for Civil Flight Deck Design, edited by Don Harris, 2004.
2. Norman, Donald. The Design of Everyday Things. 1988.
3. National Transportation Safety Board. A Review of Flightcrew-Involved, Major Accidents of U.S. Air Carriers, 1978 Through 1990. NTSB/SS-94/01, 1994.
4. Perrow, Charles. Normal Accidents: Living With High-Risk Technologies. 1999.
5. Vaughan, Diane. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. 1997.
6. Langewiesche, William. “The Lessons of ValuJet 592.” The Atlantic, March 1998.
7. Dekker, Sidney. Ten Questions About Human Error: A New View of Human Factors and System Safety. 2005.
8. Ibid.
9. Reason, James. Managing the Risks of Organizational Accidents. 1997.
10. Wagenaar, Willem; Groeneweg, Jop. “Accidents at Sea: Multiple Causes and Impossible Consequences.” International Journal of Man-Machine Studies, Special Issue: Cognitive Engineering in Dynamic Worlds, 1987, p. 596.
11. Ibid.
12. Kern, Tony. Retrieved from <www.convergentperformance.com>.