Although annual accident rates for U.S.-registered civil helicopters decreased and leveled off in the past decade, the role of human error, primarily pilot error, persists (see "Downward Trends").
Sixty-nine percent of the 1,653 accidents in the U.S. National Transportation Safety Board (NTSB) database involving U.S.-registered civil helicopters from 2001 through 2010 were attributed to pilot error.1 This implies that approximately seven of every 10 accidents were a consequence of human action, or lack of action, by pilots.
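For readers who want to see the arithmetic, the "seven of every 10" figure follows directly from the reported totals. The short sketch below simply restates the two numbers cited above from the NTSB data; it is illustrative only.

```python
# Illustrative arithmetic only; the totals restate the NTSB figures cited above.
total_accidents = 1653
pilot_error_share = 0.69

pilot_error_accidents = round(total_accidents * pilot_error_share)
print(pilot_error_accidents)          # about 1,141 accidents attributed to pilot error
print(round(pilot_error_share * 10))  # roughly 7 of every 10 accidents
```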
Exactly what constitutes human error? One formal definition is "an inappropriate action or intention to act, given a goal and the context in which one is trying to reach that goal."2
Human error can include any of the following:3
- Failing to perform, or omitting, a task;
- Performing a task incorrectly;
- Performing an extra or non-required task;
- Failing to perform a task within the required time limit; and,
- Failing to respond adequately to an emergency situation (which abruptly changes not only the goal but also the tasks required to achieve the new goal).
Humans are a remarkably robust species: creative, flexible and adaptive to our surroundings and the constantly changing demands placed on us. Our weaknesses include frequent inability to maintain alertness (attention) and to respond to a situation with the correct actions.4
Realistically, human error may be unavoidable. However, it can be reduced significantly through training, mistake-proofing designs and prevention strategies such as checklists. Errors can increase with fatigue, physical and emotional stress, use of alcohol and other drugs and medications, and a host of other environmental and psychosocial factors.5 These factors can negatively influence the ability to observe, detect and assess ongoing events, leading to slow reaction times and poor decision making, both of which can result in errors.
Cognitive science describes several primary types of human errors, each corresponding to different stages in the cognitive or decision-making process.6 In one model, human errors are typed as either slips and lapses or mistakes.
Slips and lapses correspond to errors in execution and/or recall of learned steps of an action sequence; for example, a person may intend to perform an action but actually does something else instead. Errors of this type also include forgetting to reposition a switch, or shutting off the wrong engine during an emergency.7 Slips and lapses usually occur when attention resources are insufficient for a task or are overwhelmed by other events. For example, in the Three Mile Island nuclear power plant accident in Middletown, Pennsylvania, U.S., in 1979, the attention resources of the safety monitoring personnel were overwhelmed by more than 100 simultaneous warning signals.
Mistakes are errors that correspond to incorrect intentions or plans. These are errors in choosing an objective or specifying a method of achieving it. Mistakes are identified as being rule-based or knowledge-based.8 Rule-based mistakes are made when the wrong rule is selected for action â that is, actions match intentions but do not achieve their intended outcome due to incorrect application of a rule. An example is the use of the wrong type of fuel in an engine. Knowledge-based mistakes are made when the wrong plan is created for a particular situation. In this type of mistake, the plan may suffer from a lack of knowledge or understanding of the situation. An example is when a pilot incorrectly diagnoses a problem with a new navigation system without having a full understanding of how the system works.
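The two-way split described above, slips and lapses versus rule-based and knowledge-based mistakes, can be summarized as a simple data structure. The sketch below is only an illustrative restatement of Reason's taxonomy as presented in the text; the example entries echo those given above.

```python
# Illustrative restatement of the slip/lapse vs. mistake taxonomy described above.
ERROR_TAXONOMY = {
    "slips_and_lapses": {
        "description": "failure in execution or recall of a learned action sequence",
        "examples": [
            "forgetting to reposition a switch",
            "shutting off the wrong engine during an emergency",
        ],
    },
    "mistakes": {
        "description": "incorrect intention or plan",
        "subtypes": {
            "rule_based": "wrong rule selected, e.g., using the wrong type of fuel",
            "knowledge_based": "plan built on incomplete understanding, e.g., "
                               "misdiagnosing an unfamiliar navigation system",
        },
    },
}

print(ERROR_TAXONOMY["mistakes"]["subtypes"]["rule_based"])
```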
Learning From Mistakes
To characterize the types of human error associated with individual accidents, it is necessary to apply a formal accident causal factor analysis and classification system to accidents in which the NTSB identified pilot error as the first event causal factor.
One system for analyzing human error in aviation accidents is the Human Factors Analysis and Classification System (HFACS). HFACS was originally developed for the U.S. Navy and Marine Corps as an accident investigation and analysis tool.9 Since its original application, it has been used worldwide by both military and civilian organizations as a supplement to standard accident investigation and analysis methods. HFACS is widely recognized for its ability to produce comprehensive human error data.
HFACS is a broad human-error approach for investigating and analyzing the human causes of aviation accidents. Based upon human performance specialist James Reason's Swiss cheese model of latent and active failures, HFACS addresses human error at all levels of the system, including the condition of the aircrew and organizational factors.
HFACS captures data for four top levels of human-related failure:
- Unsafe acts;
- Preconditions for unsafe acts;
- Unsafe supervision; and,
- Organizational influences.
These four top levels of human-related failure are expanded into 11 causal categories that are further expanded into 10 subcategories, described as follows:10,11
Unsafe Acts
The unsafe acts level is divided into two categories: errors and violations. These categories differ in "intent."
Errors are unintended mistakes and are further categorized as skill-based errors, decision errors and perceptual errors.
Examples of skill-based errors include inadvertently omitting an item on a checklist, failing to prioritize actions and omitting a procedural step.
One accident in which a skill-based error was cited as the major causal factor was the Nov. 10, 2002, collision of a Eurocopter AS 350B with a transmission line in Kingman, Arizona, U.S. The purpose of the flight was to film a traveling motor home for a television series. While maneuvering 60 to 75 ft above ground level to maintain the best angle for the camera, the pilot saw a cable in the flight path. He initiated a rapid deceleration, but the helicopter struck the cable and then the ground. The two passengers received minor injuries and the helicopter was substantially damaged in the crash. The NTSB report cited as the probable cause the pilot's "inadequate visual lookout and failure to maintain adequate clearance from transmission wires while performing low altitude operations."12
Examples of decision errors include using the wrong procedure, misdiagnosing an emergency and performing an incorrect action. One such accident involved a dynamic rollover during the attempted Oct. 29, 2002, takeoff of a Hughes 369D in Kaaawa, Oahu, Hawaii, U.S., because of what the NTSB called "the combined effects of the soft, sloping terrain and the pilot's failure to redistribute the passengers to a more favorable lateral [center-of-gravity] condition." The pilot and a passenger on the on-demand air taxi flight were seriously injured and the second passenger received minor injuries in the crash, which destroyed the helicopter.13
Perceptual errors are those made because of visual illusions or spatial disorientation. An accident attributed to perceptual error was the Sept. 17, 2010, crash of a Robinson R44 II into a lake near Duluth, Minnesota, U.S. The pilot said that, after taking off around midnight from a beach at the lake, he had "a sinking feeling in the seat all of a sudden" and saw that the vertical speed indicator displayed a descent. He could not determine the helicopter's height above the water because of the darkness, and the helicopter hit the water at about 60 kt and was destroyed. The pilot received minor injuries. The NTSB said the probable cause was the pilot's "failure to identify and arrest the helicopter's descent due to spatial disorientation."14
Violations are willful errors. Examples include violating training rules, performing overly aggressive maneuvers and intentionally exceeding mission constraints. Violations are subcategorized as routine violations, which tend to be habitual by nature and often tolerated by governing authority, and as exceptional violations, which are willful but rare departures from mandated procedures and are not necessarily indicative of an individual's typical behavior or condoned by management.15
The most common violations were improper preflight planning and inspections. Inadequate in-flight fuel management also was commonly cited in accident investigation reports. In an October 2002 accident involving a Bell 47G3B1 helicopter, a commercial pilot had completed a timber spraying operation near Highfalls, Georgia, U.S., and felt a surge of engine power, then a power loss. The helicopter struck trees during the autorotative landing. The NTSB said there was no fuel in the helicopter's fuel tanks and cited as the probable cause the pilot's "inadequate fuel management and subsequent loss of engine power due to fuel exhaustion, and an in-flight collision with trees."16
Preconditions for Unsafe Acts
The preconditions for unsafe acts level is divided into two major categories: substandard conditions of operators and substandard practices of operators.
The substandard conditions of operators category is divided into three subcategories: adverse mental states, such as complacency, "get-home-itis" and misplaced motivation; adverse physiological states, such as medical illness and physical fatigue; and physical/mental limitations, such as inadequate reaction time and incompatible intelligence/aptitude.
The substandard practices of operators category has two subcategories: crew resource management, including problems such as failure to use all available resources and failure to coordinate; and personal readiness, which includes problems of self-medication and violation of crew rest requirements.
Unsafe Supervision
The unsafe supervision level is divided into four categories: inadequate supervision, such as failure to provide training, failure to provide operational doctrine and failure to provide oversight; planned inappropriate operations, such as failure to provide correct data, failure to provide sufficient personnel and failure to provide the opportunity for adequate crew rest; failure to correct a known problem, such as failure to initiate corrective action and failure to report unsafe tendencies; and supervisory violations, such as authorizing an unnecessary hazard and failure to enforce rules and regulations.
Organizational Influences
The organizational influences level is divided into three categories: resource/acquisition management, including lack of funding, poor equipment design and insufficient manpower; organizational climate, including policies on drugs and alcohol, value and belief culture, and chain-of-command structure; and organizational process, including quality of safety programs, influence of time pressure and the presence or absence of clearly defined objectives. In this analysis of pilot error, there are no errors in this category.
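For reference, the HFACS hierarchy just described can be condensed into a single nested structure. The sketch below merely restates the levels, categories and subcategories from the preceding paragraphs; it is an illustrative summary, not a reproduction of the HFACS publications.

```python
# Condensed restatement of the HFACS levels, categories and subcategories
# described in the preceding paragraphs (illustrative only).
HFACS = {
    "Unsafe acts": {
        "Errors": ["Skill-based errors", "Decision errors", "Perceptual errors"],
        "Violations": ["Routine violations", "Exceptional violations"],
    },
    "Preconditions for unsafe acts": {
        "Substandard conditions of operators": [
            "Adverse mental states",
            "Adverse physiological states",
            "Physical/mental limitations",
        ],
        "Substandard practices of operators": [
            "Crew resource management",
            "Personal readiness",
        ],
    },
    "Unsafe supervision": {
        "Inadequate supervision": [],
        "Planned inappropriate operations": [],
        "Failure to correct a known problem": [],
        "Supervisory violations": [],
    },
    "Organizational influences": {
        "Resource/acquisition management": [],
        "Organizational climate": [],
        "Organizational process": [],
    },
}

# 11 causal categories and 10 subcategories, as noted earlier.
print(sum(len(cats) for cats in HFACS.values()))                            # 11
print(sum(len(subs) for cats in HFACS.values() for subs in cats.values()))  # 10
```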
Pilot Error Analysis
Most narratives in the NTSB accident database include causal factor statements that use key words and phrases described in the HFACS, but a significant number of narratives lack sufficient detail to allow indisputable classification. As a result, in the following accident analysis, some educated judgments were necessary. Determination of error type and category was based on the accident investigators' full narratives, with emphasis on the initial causal factor in the accident sequence.
Unsafe acts accounted for 91.3 percent of the accidents, 7.8 percent were classified as unsafe supervision, and 0.8 percent as preconditions for unsafe acts. As expected, because the HFACS analysis covered only pilot error, no accidents were placed in the organizational influences classification.
An examination of the decade percentages for all failure types shows that an overwhelming number of accidents each year were classified as unsafe acts; accidents in the other categories were recorded in far smaller numbers. Throughout the decade, the percentages for each failure type were fairly consistent, which implies that the human factors at the root of each error type have not changed over time.
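As an illustration of how such a distribution is tallied, the hypothetical sketch below (not the actual tool used in this analysis) maps each accident's first-event classification to an HFACS failure level and converts the counts to percentages; the sample data are invented.

```python
from collections import Counter

# Hypothetical sketch: tally first-event HFACS failure levels into percentages.
def failure_level_percentages(classified_accidents):
    counts = Counter(classified_accidents)
    total = len(classified_accidents)
    return {level: round(100.0 * n / total, 1) for level, n in counts.items()}

# Invented sample of 100 classified accidents, for illustration only.
sample = (["Unsafe acts"] * 91
          + ["Unsafe supervision"] * 8
          + ["Preconditions for unsafe acts"] * 1)
print(failure_level_percentages(sample))
# {'Unsafe acts': 91.0, 'Unsafe supervision': 8.0, 'Preconditions for unsafe acts': 1.0}
```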
Unsafe Acts
In the unsafe acts category, errors (83.4 percent) greatly exceeded violations (7.9 percent) for the decade. Within the errors category, skill-based errors (53.1 percent) exceeded decision errors (25.6 percent) by a factor of two. Perceptual errors averaged a relatively low 4.7 percent; however, perceptual errors are the most difficult type to discern, and their incidence most likely is underrepresented.
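As a quick consistency check (figures restated from the paragraph above, with small residuals due to rounding), the error and violation percentages sum to the unsafe acts total, and the three error subcategories sum to the errors total:

```python
# Consistency check on the percentages reported above (differences are rounding).
errors, violations = 83.4, 7.9
skill_based, decision, perceptual = 53.1, 25.6, 4.7

print(round(errors + violations, 1))                 # 91.3, the unsafe acts share
print(round(skill_based + decision + perceptual, 1)) # 83.4, the errors share
```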
As would be expected, most skill-based errors were failures by the pilot to perform at the subconscious skill level expected of a rated pilot and were dominated by failures to maintain adequate visual awareness. Decision errors were more difficult to generalize, with failures ranging from inappropriate responses to emergencies to continued visual flight into instrument meteorological conditions (IMC).
All of the violations were subcategorized as exceptional, meaning that the actions were determined to be intentional departures from authorized and recognized safe procedures.
Preconditions for Unsafe Acts
Accidents in which preconditions for unsafe acts were the initial causal factor averaged 0.8 percent of all accidents annually over the decade, and were fairly evenly distributed, at 0.4 percent each, between substandard operator conditions and substandard operator practices and among their respective subcategories.
Substandard operator conditions included both mental factors such as complacency and preoccupation with personal affairs and physiological factors such as impairment due to a recurring stroke.
Incidents of substandard operator practices involved lapses in personal readiness, including impairment due to use of medications or illegal drugs and fatigue caused by lack of sleep; in one incident, the pilot fell asleep at the controls.
Unsafe Supervision
In this analysis, the classification of pilot error at the unsafe supervision level (7.8 percent) was used almost exclusively to characterize failures of instructor pilots to maintain adequate supervision of student pilots during training or of rated pilots during check rides. The NTSB accident narratives repeatedly cited improper supervision or failure to take corrective action as the causal factor.
Human Error and Blame
Human error may be inevitable. But pilot action is seldom the sole factor in an aviation accident. Aircraft are complex, high-tech systems consisting of thousands of components. Weather conditions are equally complex and frequently changing. A pilot makes most flight decisions using cockpit displays that are intended to present aircraft and environmental condition statuses and trends. However, these displays and the transfer of flight status data from display to pilot often are fraught with human factors engineering challenges. No matter how skilled and experienced a pilot, how many fail-safe systems are employed in the aircraft, or how good an organizational safety culture may be, there is always a level of residual and random error.17
Although great strides have been made in reducing accident rates, in such a demanding setting as aviation, accidents will continue to occur. As such, it is important to understand that the pilot-related human error classification is not a statement of blame but an important step in understanding the role of human error and in identifying potential sources of systematic error.
Clarence E. Rash is a research physicist with 35 years of experience in military aviation research and development, and the author of more than 200 papers on aviation display, human factors and protection topics. His latest book is Helmet-Mounted Displays: Sensation, Perception and Cognition Issues, U.S. Army Aeromedical Research Laboratory, 2009.
Notes
1. Another 4 percent were attributed to maintenance error, and less than 1 percent involved errors attributed to manufacturer, ground or control tower personnel.
2. Felciano, R.M. Human Error: Designing for Error in Medical Information Systems. Stanford University. 1995.
3. Adams, C. What Is Human Error? ergonomics.about.com/od/ergonomicbasics/a/What-Is-Human-Error.htm.
4. Lee, C. Human Error in Aviation. carrielee.net/pdfs/HumanError.pdf.
5. Rash, C.E.; Manning, S.D. "Stressed Out." AeroSafety World Volume 4 (August 2009): 39–42.
6. Reason, J. Human Error. Cambridge, U.K.: Cambridge University Press. 1990.
7. Wildzunas, R.M. "They Shut Down the Wrong Engine!" Flightfax, 25(9), 1–3. 1997.
8. Reason.
9. Shappell, S.; Wiegmann, D. Human Factors Analysis and Classification System – HFACS. Washington: U.S. Department of Transportation, Federal Aviation Administration. DOT/FAA/AM-00/7. 2000.
10. Shappell, S.; Wiegmann, D. Unraveling the Mystery of General Aviation Controlled Flight Into Terrain Accidents Using HFACS. Presented at the 11th International Symposium on Aviation Psychology, Ohio State University, Columbus, Ohio, U.S. 2001.
11. Rash, C.E.; LeDuc, P.A.; Manning, S.D. "Human Factors in U.S. Military Unmanned Aerial Vehicle Accidents." In Cooke, N.J.; Pringle, H.L.; Pedersen, H.K.; Connor, O. (editors), Human Factors of Remotely Operated Vehicles, pp. 117–131. Elsevier. 2006.
12. NTSB. Report No. LAX03LA027. Nov. 10, 2002.
13. NTSB. Report No. LAX03LA017. Oct. 29, 2002.
14. NTSB. Report No. CEN10LA556. Sept. 17, 2010.
15. Reason.
16. NTSB. Report No. ATL03LA007.
17. Whittingham, R.B. The Blame Machine: Why Human Error Causes Accidents. Oxford, England: Elsevier Butterworth-Heinemann. 2004.