As reported in the first article in this series (ASW, 2/13), Flight Safety Foundation analyzed 16 years of aircraft accident data and found that the most common type of accident is the runway excursion. We noted that the near-total (97 percent) failure to call go-arounds (GAs) to mitigate the risk of continuing to fly unstable approaches (UAs) constitutes the no. 1 cause of runway excursions, and therefore of approach and landing accidents.1
In this second article, we report on a large study of pilots conducted by Presage Group Inc. as one part of the Foundation’s ongoing Go-Around Decision Making and Execution Project. The study was designed to aid in understanding the psychology of compliance and noncompliance with GA policies when pilots decide to continue to fly UAs rather than call for GAs. After briefly describing the research approach used in the Presage study, this article will discuss three aspects of the research results: the pilot characteristics that differentiate the two decisions, the objective conditions most associated with continuing to fly UAs and with GAs, and awareness competency differences as measured for each of the nine Presage Dynamic Situational Awareness Model (DSAM) constructs that we described in our previous article (Table 1). To connect pilot characteristics and objective conditions with our nine awareness constructs, we will report the results for the former and overlay, where appropriate, those awareness constructs that differ between the two event recall scenarios (GA versus UA), showing how they shape and ultimately drive the decision to continue to fly UAs instead of pursuing the GA option. Here’s our spoiler alert: In the moments leading up to a decision on whether to continue a UA or execute a GA, pilots reporting their recall of UAs were less situationally aware than pilots remembering GA experiences on every one of the nine DSAM constructs we assessed.
Pilot Survey
Presage conducted an online survey of more than 2,000 commercial pilots between February and September 2012. Pilot respondents for this Foundation-sponsored survey were solicited through direct communication with both safety personnel at various pilot associations and FSF-member and non-member airlines globally, as well as through various social media forums. The goal was to recruit and administer the survey to as many pilots as possible from around the world, representing a variety of fleets, aircraft types, flight operations, physical geographies, respondent experience levels, pilot nationalities and cultures. Participants’ anonymity was assured to inspire honest and complete self-reports of pilots’ experiences, as well as to stimulate participation.
Among the 2,340 pilots who completed the survey, we achieved a good range of pilot experience and operational types, as well as wide geographical representation, suggesting our results can be generalized to pilots worldwide (Table 2).
In the main part of this study, we asked pilots to recall specific instances of unstable approaches, at or below stable approach heights (SAH), that were recent and therefore highly memorable (in fact, we asked pilots to remember the last UA they had experienced within the previous five years). This special “situated recall” task was designed to elicit the vivid information we needed pilots to report in detail, namely, their experiences during the minutes leading up to and including a decision on whether to call for a GA while flying a UA.
These experiences include their own subjective states (their situational and risk assessments, social pressures, fatigue, beliefs about their companies’ GA policies, etc.) as well as their psychological representations of the objective factors characterizing the aircraft and the environment during their approaches (flight instabilities, visual reference conditions, environmental factors, etc.). These variables constitute a full and in-depth recounting of the objective factors in each situation and their resulting psychological representations during the critical time leading up to pilots’ decisions. These representations, which constitute pilots’ states of dynamic situational awareness, were hypothesized to be the main drivers of pilots’ assessments of the risks of continuing to fly a UA rather than conduct a GA (Figure 1). To encourage full reporting, pilots were guided through a set of structured questions to elicit their recall of events.
In addition, to help refine the analysis, pilots also reported a variety of basic demographic information (such as rank, time on type, base of operations, etc.) and flight operational characteristics (long haul versus short haul operations, aircraft type, etc.). The content of the entire survey was reviewed, commented upon and amended in accordance with the recommendations made by members of the Foundation’s International Advisory Committee, its European Advisory Committee and other advisory team members.
Among pilots who had experienced both GAs and UAs, we randomly assigned some to recall a UA, and others to recall a GA event. This random experimental assignment allowed us to more confidently identify those objective and psychological situational factors associated with noncompliance with GA policies. Pilots who reported they had only flown GAs or UAs simply recalled their last event of those respective types. While paying particular attention to the factors influencing a pilot’s decision to continue with a UA, in the results below, we discuss differences between GA and UA events independent of a pilot’s prior history of having flown them. Therefore, in the findings to be reported, 57 percent of the pilots gave accounts of a UA they had participated in, while 43 percent discussed a UA that resulted in a GA.
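To make this random-assignment design concrete, the sketch below shows, in Python, one way such a recall-condition assignment could be implemented. It is purely illustrative: the function name, respondent fields and example records are our own assumptions, not part of the Presage survey instrument.

import random
from typing import Optional

def assign_recall_condition(has_flown_ga: bool, has_flown_ua: bool,
                            rng: random.Random) -> Optional[str]:
    # Illustrative only; not the Presage survey implementation.
    # Pilots with both event types in their history are randomized, so the
    # recalled event type is independent of prior history; single-experience
    # pilots simply recall their only event type.
    if has_flown_ga and has_flown_ua:
        return rng.choice(["GA", "UA"])
    if has_flown_ga:
        return "GA"
    if has_flown_ua:
        return "UA"
    return None  # no qualifying event to recall

rng = random.Random(2012)  # fixed seed for a reproducible illustration
respondents = [("pilot_A", True, True), ("pilot_B", False, True), ("pilot_C", True, False)]
for pilot_id, flew_ga, flew_ua in respondents:
    print(pilot_id, assign_recall_condition(flew_ga, flew_ua, rng))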
UA Group Findings
First, our results showed that pilots flying UAs were more often first officers (FOs). Looking through our DSAM lens, and in particular at the lower scores on keeping each other safe (relational awareness) for UA events, it makes sense that FOs, who are vulnerable to the authority structure of the cockpit and therefore less likely to assume from the captain the authority to call a go-around, are more likely to continue with an unstable approach. It is important to note that pilots’ total flight hours reported at the time of the event (an average of 9,250 hours across the sample), as well as total time on type (average 3,000 hours), did not show any differences in the likelihood of having recalled a GA or UA event. This reinforces the argument that the differences between these groups lie not primarily in pilot characteristics, but in pilots’ situational awareness readiness to follow the procedures (hierarchical and task-empirical awarenesses) should an instability occur at or below decision height. Geographically, there was a strong tendency for pilots based in South America and Asia to report more GAs than UAs, while those from North America and Europe recalled more UA than GA events. This suggests that operational environments (such as airport elevation or complexity of approach procedures), cultural differences, or both may play a part in these findings.
Our results show a host of effects and non-effects associated with reporting UAs versus GAs (Table 3, p. 28). Among the flight characteristics more associated with choosing to fly UAs are approaches in visual meteorological conditions (VMC) and already being unstable when reaching SAH. By contrast, recalled GA events were more associated with becoming unstable after SAH, with instrument meteorological conditions (IMC) and with non-precision approaches. These findings suggest that VMC may trigger a lack of discipline across two of our awareness constructs, most notably the gut feel for threats (affective awareness) and seeing the threats (anticipatory awareness). It is as though the UA pilots are seduced into thinking that because of the VMC they can literally “see” their runway miles and miles from touchdown, and that a stable landing will not be problematic. IMC and more complex procedures such as non-precision approaches require, by definition, a heightened sense of situational awareness across a number of our dimensions in order to ensure the aircraft remains on profile.
In accordance with our DSAM explanatory model, the GA pilots will, in IMC with a pending non-precision approach, “see” these event characteristics as potential threats (anticipatory awareness) early in the descent profile, and should the aircraft become unstable after the SAH, an immediate “gut feel for this threat” (affective awareness) will be triggered with an accompanying compensatory action (compensatory awareness) initiating a go-around.
Pilots reporting on a UA experience noted more instances of excessive airspeed and inappropriate power settings. These results suggest that these pilots feel that although the aircraft is unstable on these flight parameters, they still have the ability to “manage” the aircraft energy prior to landing. Such a belief naturally requires the active suppression or silencing of a number of our situational awareness constructs, such as denying the alarms from the gut (affective awareness), not seeing the threat (anticipatory awareness) and dismissing the standard operating procedures ([SOPs], hierarchical awareness) that state that under these conditions one should be initiating a GA. Conversely, deviation in flight path and low airspeed were more often reported by pilots recalling a GA event. The DSAM model suggests that this makes sense given that their gut is actively engaged and sensing these threats, and that the risks associated with these factors are more accurately measured. These processes trigger the correct adjustment to compensate, that is, by initiating a GA.
Finally, all the environmental factors we assessed were more associated with the decision to go around: presence of tail wind, wind shear, turbulence, wake turbulence, insufficient visual reference and contaminated runways. Our intuitive assumption is confirmed empirically — the more complex the operational environment, the more engaged the pilot’s situational awareness. The fact that these environmental factors are less associated with UAs is consistent with the notion of the psychological seduction of fair-weather flying. Pristine flight conditions invite a greater tolerance for the belief that the absence of complex environmental factors equates with little or no risk to be managed, and suggest to the UA pilot that on one hand, there is a low probability of the aircraft becoming unstable, and on the other hand, should it become unstable, the environmental conditions nonetheless lend themselves to “managing” the instability correctly and landing uneventfully. The processes that lead to these seductive assumptions, however, require the active numbing or passive tuning out of the nine DSAM constructs.
Psychosocial Factors
Given that these objective, situational risk factors existed in the events pilots described, how were those factors perceived, explored and managed by flight crews prior to the decision to continue to fly UAs? What levels of situational awareness did pilots report that they and their crewmembers developed in these scenarios? Were these accurate, and how did they contribute to pilots’ assessments of the risks of continuing UAs? Finally, did aspects of situational awareness differ in the moments leading up to the GA–UA decision in a patterning that might help explain those decisions to call or forgo a call to go around?
The findings of the study on situational awareness and the other psychosocial variables we measured are pervasive and robust (Table 4, p. 30). Working backward from the decision to continue to fly a UA (Figure 1, p. 27), we found a highly significant difference between GA-recall and UA-recall pilots in the most immediate cause of their decision whether to call a GA, namely, their perception of the manageability versus unmanageability of the risk confronting them: UA pilots perceived far less risk lurking in the instabilities they were experiencing than did GA pilots. This difference is perhaps not so surprising given what they eventually decided to do (that is, to go around or not) based on these very risk assessments, but the strength of the difference is large. What our research sought to discover was why these strikingly different assessments of risk occurred. What factors were reported to be stronger or weaker in the situational awareness profiles of pilots leading up to their judgments of risk? Which of these could be implicated in leading directly to the lowered perception of risk among pilots in the group continuing to fly UAs? In fact, we see evidence that on all nine of the DSAM dimensions of situational awareness (the awareness competencies affecting their judgments of risk), pilots who continued a UA reported having less situational awareness than those who initiated a GA. Many of these effects are very strong (defined as a difference of half a point or more on our measurement scales) and were observed across the range of items used to assess each construct.
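As an illustration of how the effect-strength criterion mentioned above could be applied, the short Python sketch below compares hypothetical GA-recall and UA-recall group means on two DSAM constructs and flags differences of half a scale point or more as strong. The scores, scale range and construct selection are invented for illustration and are not data from the study.

from statistics import mean

STRONG_EFFECT_THRESHOLD = 0.5  # half a point on the measurement scale

def group_difference(ga_scores, ua_scores):
    # Mean GA-recall score minus mean UA-recall score for one construct.
    return mean(ga_scores) - mean(ua_scores)

# Hypothetical per-pilot ratings (e.g., on a 1-7 scale); not study data.
illustrative_scores = {
    "affective awareness": ([5.8, 6.1, 5.9], [5.1, 5.3, 5.0]),
    "relational awareness": ([6.0, 5.7, 6.2], [5.6, 5.5, 5.8]),
}

for construct, (ga, ua) in illustrative_scores.items():
    diff = group_difference(ga, ua)
    label = "strong" if abs(diff) >= STRONG_EFFECT_THRESHOLD else "weaker"
    print(f"{construct}: GA-UA difference = {diff:+.2f} ({label})")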
Certain elements of the psychological situation were or were not present in the two event recall cases in the moments leading up to pilots’ decisions. Whether these situational aspects were sufficiently pursued by conscious exploration and deliberation (and whether they were pursued alone or with other crewmembers) is likely to have played a key role in whether pilots and their crews developed the kind of complete, dynamic and shared picture of the situation that would have allowed them to reach full and accurate competency across each of the nine dimensions of situational awareness that we have described. For example, we established that fatigue, while present in many of the events, did not differ between the two types of scenarios. However, the effectiveness with which a pilot adjusted to the threat (compensatory awareness) by implementing proper fatigue management procedures did differ. This is another example of how, when situational awareness remains high, the pilot sees and feels the fatigue threat and then adjusts or compensates for it.
Similarly, the frequency of actual challenges to authority as reported by pilots in the UA or GA cases did not differ, whereas the quality of the influence the crew had on decision making did. When we examined our findings for the factor, “appropriate crew influence on GA decision making,” we saw that pilots who made a GA decision reported that in the moments leading up to the decision they experienced what we judge to be more appropriate crew discussion and behavior. Pilots reporting their experiences in flying a UA, on the other hand, were more likely to report that the authority structure in the cockpit was influencing their decision to call a GA or not; that they felt less comfortable in challenging or being challenged about conducting a GA; that they were feeling less support from their crewmembers for calling a GA; that they felt more pressure from other crewmembers to continue the approach and land; and that they were feeling more concern about a loss of face in calling a GA. In other words, unlike UA pilots, GA pilots had leveraged their relational awareness competencies to keep each other safe by creating a more supportive, non-judgmental and challenge-accepting cockpit environment and engaging in the appropriate conversations around operational and flight risks.
This awareness of keeping each other safe spilled into other areas of risk assessment for the GA pilots when we looked at how deliberately pilots recalled having “actively considered and discussed” various objective situational factors. There were no differences between GA and UA pilots in personal consideration and/or active crew discussion of any of the environmental or air traffic control factors assessed. For five of the seven instability factors we measured, however, GA pilots considered them more thoroughly and communicated more with their crewmembers than UA pilots did, providing information that most certainly would have better informed their situational awareness and influenced their assessments of risk and its manageability.
Also of interest are pilots’ perceptions of their companies’ attitudes about performing UAs and GAs. When asked in general whether their companies reprimand pilots for performing either UAs or GAs, pilots in the two event recall types reported no differences in the consequences their employers would impose for compliance or lack of compliance with their companies’ policies. But when asked about these matters in the context of the events they were recalling, UA pilots reported that in those moments, they anticipated less company support for a GA decision. In addition, they were less likely to agree with their companies’ UA/GA policies and procedures and reported more personal tolerance for deviations from them. Although this is a topic we will explore further in our next article, it is worth noting that if pilots perceive that there will be less support from the company for a GA decision, and basically disagree with that company’s GA/UA policies and are more tolerant of deviations, they are primed for non-compliance.
Normalization of Deviance
Deficits in situational awareness that would lead to continuing an unstable approach can now be seen more clearly, and they prompt the following conclusions about how objective situational factors are all too often inadequately translated into a psychological representation before a decision is made. A very specific situational awareness profile emerges for the pilot who continues an unstable approach. Within the UA pilot group, this profile was characterized by a consistent and comprehensive denial or minimization of situational awareness competencies. In much the same way a dimmer switch can be used to illuminate a room to varying degrees, the UA pilots have selectively turned down or dimmed their situational awareness competencies and, in so doing, dulled their sensory and cognitive processes when assessing and evaluating operational risks. Because our nine DSAM dimensions of situational awareness are by definition inseparable and intrinsically interactive, it is fair to ask the question, “Which of the nine gets turned down first?”
Well, on one level it doesn’t matter, principally because once one dims, the effect naturally ripples across all of the other constructs, dimming them all in the process. This is exactly what we see in the results, namely an effect across all nine dimensions of the DSAM. On another level, and taking into account the totality of our findings, it can be argued that the dimming of one’s situational awareness competencies actually begins with collective collusion among pilots in noncompliance with go-around policy and procedures. Others have referred to this type of comprehensive “buy-in” as an example of the “normalization of deviance.”2 Another way to state this is that a group’s noncompliance with a policy or procedure over time becomes the “new normal” within a culture or organization.
As lived through our DSAM for pilots reporting instances in which they continued to fly UAs, the normalization of deviance taps into the most fundamental level of situational awareness, namely the company’s support for safety (environmental awareness). When a pilot has the experience that his or her company and/or regulatory body is seemingly uninterested in protecting and monitoring compliance with procedures, he or she naturally personalizes this by becoming undisciplined or uncaring. The tendency then is for the pilot to be less strict about personal compliance with the company’s GA policy. There is a moment for every pilot flying a UA, whether at top of descent or at 5,000 ft, where his or her situational awareness competencies may begin to dim. Once a pilot’s commitment to a policy has shifted in general, almost immediately his or her gut feel for threats (affective awareness) shifts, too; with this now-absent awareness competency comes the pilot’s increasing inability to see (anticipatory awareness) and adjust (compensatory awareness) correctly to the threats. Added to that will be the pilot’s active denial of his or her professional experience bank (critical awareness) as a means to assess present risk, as well as the minimization of his or her need to keep each other safe (relational awareness). The psychological landscape now lends itself to the pilots being less disciplined about what their instruments are telling them (functional awareness) and less concerned about knowing the procedures (hierarchical awareness) and aircraft operational limits (task-empirical awareness).
So why do pilots forgo the GA decision in 97 percent of UAs? We have discovered that continuing to fly a UA is associated with a much lower level of perceived risk, that is, with the belief that the instabilities experienced at and below SAH are manageable. These lowered risk assessments are in turn associated with a lowered level of situational awareness on each of the dimensions of the DSAM we have described. For pilots continuing to fly UAs, the threats and risk associated with the objective flight conditions are inadequately translated into a compelling psychological understanding of risk through a comprehensive, up-to-date and accurate set of dynamic situational awareness competencies. Owing to their interdependent nature, weakness in any of these situational awareness competencies leads to a rapid undermining of the other dimensions and a fast deterioration in accurate risk perception. With lowered risk assessment comes the decision to continue to fly a UA rather than execute a GA. And because most of the time we “get away with it,” managing the aircraft’s energy to a successful landing, the outcome reinforces the belief that the risks of instability are manageable and perpetuates the cycle of chronically forgoing the GA.
The Presage Group specializes in real-time predictive analytics, with corrective actions, to eliminate behavioral threats among employees in aviation and other industries. Further details of the methodology of its survey, experiments and results are described on its website.
Notes
1. Burin, James M. “Year in Review.” In Proceedings of the Flight Safety Foundation International Air Safety Seminar. November 2011.
2. Vaughan, Diane. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago: The University of Chicago Press, 1996.