“It was just an isolated incident.” The last time I heard this phrase, I was sitting at a conference room table in a client hangar with two other consultants. As usual, it did not mean what the client thought it did.
Fortunately, the incident in question had not resulted in personal injury or damage to the aircraft (and so by definition was not technically an “incident”). In fact, at the time, the passengers were probably unaware that anything had happened. But within two weeks, it had led to the dismissal of the director of aviation and the chief pilot, created ill will among the members of the department, and left everyone so sensitive to criticism that merely mentioning it prompted the dismissive remark that it was “an isolated incident.” The group, we were assured, was conducting all its operations according to the book, and its heightened attention to safety and security since the event made further references to the incident unnecessary.
The truth of the situation was quite different. Responding to a request from senior management to evaluate the aviation department and its performance, we began by conducting one-on-one interviews with each member of the department — the flight staff, maintenance technicians, schedulers and other ground personnel. We also walked the hangar and made notes on safety equipment, cleanliness and security. Together with our interview notes, those observations made clear that, while the group’s intentions were good, its performance and the department’s levels of risk management were significantly below what its members believed. In most respects, the group’s operations met standard industry practice at best, with only a few areas in which it adhered to best practices or best practices plus, the level most corporate executives should demand.
We also observed several disturbing things about the culture of the organization. Like most aviation professionals, its members were convivial and energetic, results-oriented and completely at home in highly structured environments. But they were also so conflict averse that until it became absolutely necessary to address operational challenges or areas of disagreement, they went out of their way to preserve a collegial, amiable environment in the department — which depended heavily on an unexamined faith that most members of the group were doing a good job, agreed on fundamental principles for operating the aircraft safely and had each other’s best interests at heart. But in some cases, a profile like theirs can spell trouble.1
For one thing, confidence in the excellence of a group’s work and the good intentions of its members does not guarantee high performance. When we asked the members of the client group to rate the department’s performance on safety, customer service and financial effectiveness, they consistently rated themselves between 4 and 5, the two highest ratings, not just on safety but on service and efficiency as well. But it was apparent to us, even from a preliminary audit, that ratings of 2 to 3 would have been more realistic and that a full assessment would almost certainly support the lower ratings. In short, it was clear there was nothing isolated about the event in question.
A Cultural Marker
An isolated incident is always a cultural marker, a warning and an invitation to improvement. Here is why.
Over the past two decades, social psychologists have adopted a strikingly different approach to the study of organizational knowledge, one that has important implications for business aviation.2 As they see it, documentation of rules and practices, such as regulations, flight operations manuals and checklists, is only one type of organizational knowledge — and perhaps not the one that has the most impact on the way tasks are actually performed. In reality, they say, the organization’s “explicit” or objective, recorded knowledge — the set of regulations, practices and policies that most departments follow — is merely the written form of agreements, some explicit and some implicit, that the members of the department have made about the operation of the aircraft and the tasks it involves.
The scholars in question agree that rules and procedures, industry regulations and best practices are important, even indispensable, in effective risk management. In most cases, they reflect the industry’s best knowledge about professional aviation — including its experience with things that can go wrong, either with mechanical equipment or in the way people deploy it. Most aviation professionals know these rules and procedures well and do a good job of observing them. But in doing so, even the most professional of them mistakenly assume that safety and risk management are simply a matter of faithfully obeying regulations, following the flight operations manual, completing checklists, etc. And they also assume that the written rules accurately and fully represent the way tasks are (or should be) performed and the way decisions are (or should be) made.
Always Interpreting
But organization theorist Haridimos Tsoukas writes that there is more to performing even the most routine activity than simply following rules and procedures. He says that in making a decision of any kind, the members of the organization are, to some extent, always interpreting rules and guidelines, consciously or subconsciously deciding which rule applies, the limits of its application, and so forth. Such decisions and interpretations always bring into play information and considerations that have never been recorded as organizational knowledge, that lie outside the written rules, and that may vary considerably in their application from one member of the group to another, one trip and set of passengers to another, and one aviation department to another.
For example, stick and rudder skills vary from one aviator to another.
But Tsoukas’ observations go well beyond stick and rudder skills. “Any practice,” he reminds us, “is always richer than [the] formal representation of it.” In other words, no matter how detailed and comprehensive flight operations manuals and other departmental documentation may be, individual members of the group always understand and follow rules, procedures and other generalized instructions only “in practice” — that is, by applying them in specific, concrete circumstances. As a result, even the most routine activity is always “an interpretive act,” a situational application of established rules conditioned by the history of the individual who carries out the activity; the individual’s and the organization’s values; and a host of unforeseeable and therefore unanticipated factors that make application of the rules and procedures an art rather than a science.
Most aviation professionals find such observations disturbing. Because they prefer a highly structured, matter-of-fact environment, they believe that a good set of rules, operations manuals and checklists is the only reliable way to define the roles and activities for which they are responsible. For the same reason, the idea that the detailed, written processes and procedures laid out in the flight operations manual, minimum equipment list, safety management system and other documents are open to interpretation can strike them as wrong. After all, the whole point of standard practices, rules and regulations is to regularize operations and avoid the influence of individual quirks and idiosyncrasies.
Tsoukas would agree. But he would still insist that even the most exhaustive set of rules and processes requires interpretation.
First Line of Defense
Rules, manuals, checklists and guidelines alone cannot assure safe outcomes. At best, they can only prevent known types of failure, which means that the safety net of rules and regulations is not actually a safety net but a first line of defense. When everything goes according to plan, the safety net protects well enough against known risks and produces acceptable outcomes. But it is full of holes, any one of which could provide the opening for an unanticipated and potentially dangerous “isolated incident.” A healthy organizational culture, by comparison, underpins group understandings and interpretations, acting as an implicit but powerful presence that serves as the ultimate reference point, not for operations alone but also for strategic thinking, operational effectiveness and customer service.
The client group that experienced the “isolated incident” was not familiar with any of these views. They believed that because they had a detailed flight operations manual, had filled all the roles in the organization chart with experienced professionals, put the usual safety procedures and checklists in place, and were working hard to meet their responsibilities, they must be performing at or above the best standards for business aviation. But they had failed to consider several factors:
- With rapid department growth, it had become necessary to recruit several new pilots in a short period of time. The way they applied and interpreted department rules, systems and procedures — even the way they understood their roles in the organization — varied significantly from one person to another, sometimes unconsciously so.
- With an increased workload, a hurried move to a new airfield and hangar and the addition of new aircraft, the pressure on the department was greater than it had ever been. Consequently, there was little time to identify and fill in gaps in the safety net, discuss and agree on the alignment of the department with corporate goals, or even address the logistical and personal issues imposed by demanding schedules.
- Even before the department’s rapid expansion, its culture had been comparatively thin, based more on personal relationships than on shared professional and organizational goals and practices. In this respect, the culture of the parent organization was of little help. As is often the case, the aviation department was both physically and culturally removed from downtown, spending much of its time flying and little of it interacting with corporate personnel. That role was reserved for the director of aviation and the chief pilot, whose main concern, other than navigating the budget approval process, was taking care not to provoke the company’s CEO, who managed primarily on the basis of instinct, intimidation and rapid-fire decision-making.
As a result, on the day of the “isolated incident,” at the end of a long trip, in bad weather and with a contract pilot in the right seat, the gaps in the safety net made themselves felt in a way that was disturbing enough for at least one member of the flight crew to report it to the chief pilot. Astonishingly, there was no formal, structured discussion of the event, either by the crew in the immediate aftermath or subsequently by the department as a whole. When it was no longer possible to ignore what had happened, the group looked for a scapegoat. They assumed that since there was nothing wrong with the department’s standards of operation, the fault must lie with someone’s incompetence, a “personality problem” or a willful departure from best practices.
In fact, none of those factors caused the problem. The weather played a part: thunderstorms near the airport introduced unexpected uncertainty and risk into the return trip from the West Coast. And the more senior captain, who had been pilot in command on the outbound trip, was riding the jump seat on the return, leaving the outbound second in command to fly the aircraft, with the contract pilot, who had flown with them before but was not a full-time member of the department, in the right seat. When it became clear that the weather would be an issue, the senior pilot began issuing instructions from the jump seat — a significant departure from accepted crew resource management procedures. Ignoring those comments, the pilot in command decided to land if a hole opened in the clouds at the last minute. One did, but not before the disagreement between the jump seat pilot and the pilot in command had turned the cockpit into a nightmare of resentment and angry silence.
Doing Nothing
In the aftermath, the senior pilot reported the incident to the chief pilot, who did nothing. The senior pilot then took the issue to the director of aviation, who also did nothing, preferring not to create additional conflict in an already overburdened department. When neither of them took action, the senior pilot took his complaint directly to the CEO, who had been a passenger on the flight. Although the CEO had not sensed that anything was wrong, he was distressed to find that the director of aviation and the chief pilot had ignored a report of what he saw as a clearly life-threatening incident, and he fired them both.
What should the client organization have done? First, of course, the crew ought to have discussed the incident immediately. If they had, they might have seen that it represented an implicit challenge to their assumption that the organization was operating at the level of best practices. They might also have seen, as was apparent from our interviews, that it arose not from individual incompetence but from a failure on the part of everyone in the cockpit to identify the risk factors inherent in the situation and to agree in advance on the best way to deal with them. Instead, they depended on spur-of-the-moment decisions, incompatible assumptions and individual interpretations.
Given enough time and enough help from outside sources, they might also have avoided the impulse that led them to turn away from the event without discussion, to see it as an isolated incident, and nothing more. They apparently perceived it, consciously or unconsciously, as a threat to their reputations for professional competence, to the continuity of their personal and professional relationships inside and outside the department, and to the aviation department’s status in the corporation in general. But scholars and consultants would see the event as an invitation to broaden the organization’s practical knowledge and its commitment to a common interpretation and application of existing rules and procedures, and, beyond that, to strengthen the underlying fabric of values, judgments and agreements that defined the organization’s culture — including the addition of a mechanism to identify, discuss and repair tears in the organizational fabric rather than dismiss them as isolated incidents.
Coming to Terms
How, in practical terms, can aviation managers come to terms with organizational culture and use it to craft a department that operates in the safest possible way, pays close attention to its internal and external customers’ needs and provides the maximum financial and strategic return on the corporation’s investment in aviation? The necessary elements include:
- A detailed corporate strategy based on choices about what the corporation will not do as well as what it will and must do;
- Strategically aligned metrics for the aviation group that it can use to quantify and track the value it creates for the corporation and how effectively its activities contribute to achievement of key corporate objectives;
- A fully developed set of personnel policies and a comprehensive management development program, including plans for recruiting, screening, hiring, mentoring, training and promoting its members;
- An accurate psychological profiling instrument to support each of the steps in the management development program and help further define the corporate culture; and,
- Significant, mandatory and regularly scheduled time for group discussion, development of initiatives, and interaction, both among members of the group and with personnel outside the aviation department.
Needless to say, few aviation departments follow these practices, although doing so would not only improve their performance but also strengthen the aviation group by providing it with the tools to carry out its role as a full-fledged corporate entity with the same cultural, strategic and fiscal responsibilities as its counterparts downtown.
Jerry Dibble is an organization design consultant who has worked with numerous corporate flight departments.
Notes
1. Comments on the group’s profile are based on data from the Birkman Method, which has a large database of aviation professionals and allows users to work from a baseline of expectations about group and individual behaviors, spot distinguishing features of the department and use that knowledge to promote change and improve the quality of conversation and decision-making in the group.
2. Tsoukas, Haridimos. Complex Knowledge: Studies in Organizational Epistemology, Chapters 3 through 6. Oxford, England: Oxford University Press, 2005.