Paradoxes of Accountability
Just Culture: Balancing Safety and Accountability
Dekker, Sidney. Aldershot, Hampshire, England, and Burlington, Vermont, U.S.: Ashgate, 2012. Second edition. 171 pp. Figure, table, references, index.
It has been four years since the publication of the first edition of Dekker’s Just Culture (ASW, 4/08, p. 53). The author says that the first edition was partly a response to the trend of criminalizing aviation accidents. Although criminalization remains a serious issue and has a place in this second edition, he has shifted his emphasis to the struggle within organizations to understand, create and maintain a just culture.
“Many of these organizations have found that simplistic guidance about pigeonholing human acts does not take them very far,” Dekker says. “In fact, it leaves all the hard work of deciding what is just, of what is the right thing to do, fully to them. Just recommending these organizations to divide human behavior up into errors, at-risk acts or recklessness is really quite useless. Somebody still needs to decide what category to assign behavior to, and that means that somebody will have gotten the power to do so.”
The second edition is organized differently from the first, he says, around the many issues arising from a nurse’s error that resulted in the death of a girl. Besides the reorganization, he says, “I have written new material on ethics, and on caring for the second victim [i.e., the person who committed the error]. Taking care of the professional who was involved in the incident is at least as important as anything else you might do to create a just culture in your organization.”
The last subject — a humane concern for the person who unwittingly was involved in causing an accident or serious incident — is unusual in discussions of just culture.
“For most professionals, an error that leads to an incident or death is antithetical to their identities,” Dekker says. “In fact, it could be argued that people punish themselves quite harshly in the wake of failure, and that you or your organization or society can hardly make such punishment any worse. The research is pretty clear on this: Having made an error in the execution of a job that involves error management and prevention is something that causes excessive stress, depression, anxiety and other psychological ill-health.”
For an example of how devastating such feelings can be, recall the fatal accident on Jan. 8, 2003, at Charlotte-Douglas International Airport, Charlotte, North Carolina, U.S. The Beechcraft 1900D crashed on takeoff, killing 21 people in the impact and post-crash fire. The U.S. National Transportation Safety Board, in its report, said that the probable cause was “the airplane’s loss of pitch control during takeoff. The loss of pitch control resulted from the incorrect rigging of the elevator system compounded by the airplane’s aft center of gravity, which was substantially aft of the certified aft limit.” Among the contributing causes was the “quality assurance inspector’s failure to detect the incorrect rigging of the elevator control system.”
So at least two people, the technician who performed the elevator system rigging and the inspector who signed for it, were directly connected with the disaster. We do not know what happened to them or what their subsequent status would be under a just culture. But it is safe to assume that even if they were not formally penalized, they were emotionally blighted for a long time, perhaps for life.
Dekker says, “In the best case, professionals seek to process and learn from the mistake, discussing details of their error with their employer, contributing to its systematic investigation and helping with putting safety checks in place. The role of the organization in facilitating such coping (e.g., through peer and managerial support and appropriate structures and processes for learning from failure) is hugely important. …
“If this condition is met, employee support, and particularly peer support, appears to be one of the most important mediating variables in managing stress, anxiety and depression in the aftermath of error, and one of the strongest predictors of coming out psychologically healthy.”
The author is sensitive to the many paradoxes involved in seemingly simple, clear-cut safety-related concepts. In other books, he has expressed skepticism about programs and campaigns intended to warn employees against committing errors, because almost no one chooses error; knowing an action is erroneous means not doing it. The employee who does the wrong thing believes, at the time, it is the right thing.
Setting aside questions of punishment, it seems plain common sense that an employee should be accountable for his or her actions. But in Dekker’s view, here is another paradox: “If your job makes you responsible for a number of things, then you might be held accountable for not living up to that responsibility. The question is — who is going to hold you accountable, and by what means? This is where the processes of putative justice and learning can start to diverge.
“Suppose you are held accountable by somebody who has no knowledge of the messy, conflicted details of your responsibilities. That sort of accountability might not help you and your organization learn and improve. And it might not even be seen as just. Accountability that works for safety, and that is just, should be intimately informed by the responsibilities for which you are being held accountable.”
One of Dekker’s typically provocative chapter titles is, in presumably ironic quotation marks, “You Have Nothing to Fear if You’ve Done Nothing Wrong.” In it, he discusses the idea — central to most notions of just culture — of drawing a line between honest mistakes and unacceptable behavior. The latter cannot mean only deliberate misbehavior; if it did, the aviation industry, steeped in professionalism and ideals of responsibility, would scarcely need to worry about just culture. It must also include negligence, which is obviously unacceptable.
Now, however, the subject begins to blur. Dekker cites a definition of negligence from the Global Aviation Information Network that reads as if its author were trying to capture every nuance:
“Negligence is conduct that falls below the standard required as normal in the community. It applies to a person who fails to use the reasonable level of skill expected of a person engaged in that particular activity, whether by omitting to do something that a prudent and reasonable person would do in the circumstances or by doing something that no prudent or reasonable person would have done in the circumstances.
“To raise a question of negligence, there needs to be a duty of care on the person, and harm must be caused by the negligent action. In other words, where there is a duty to exercise care, reasonable care must be taken to avoid acts or omissions which can reasonably be foreseen to be likely to cause harm to persons or property. If, as a result of a failure to act in this reasonably skillful way, harm/injury/damage is caused to a person or property, the person whose action caused the harm is negligent.”
All that verbiage should nail it. In a courtroom it would serve its purpose — which is to say, it would give attorneys on both sides of the case plenty of room to debate whether the accused was negligent. But the point of just culture is to get away, insofar as possible, from legalistic judgments.
Dekker isn’t buying the definition. “It does not capture the essential properties of ‘negligence,’ so that you can grab negligent behavior and put it on the unacceptable side of the line,” he says. “Instead, the definition lays out a whole array of questions and judgments that we should make. Rather than this definition solving the problem of what is ‘negligence’ for you, you now have to solve a larger number of equally intractable problems instead:
- “What is ‘normal standard’?”
- “How far is ‘below’?”
- “What is ‘reasonably skillful’?”
- “What is ‘reasonable care’?”
- “What is ‘prudent’?”
- “Was harm indeed ‘caused by the negligent action’?”
Of course, any definition of an abstraction requires interpretation or judgment, as the author acknowledges. But he adds, “It is, however, important to remember that judgments are exactly what they are. They are not objective and not unarguable. … What matters are which processes and authorities we in society (or you in your organization) rely on to decide whether acts should be seen as negligent or not.”
Although Dekker goes to great lengths to point out the ambiguities inherent in the idea of just culture, he supports the principle. The rest of the book is concerned with making it work — not perfectly, which is impossible, but as well as intelligence and goodwill allow.
Here are some of his suggestions:
- “A single account cannot do justice to the complexity of events. We need multiple layers of description, partially overlapping and always somehow contradictory, to have any hope of approximating a rendition of reality”;
- “A just culture accepts nobody’s account as ‘true’ or ‘right’ and others [as] wrong”;
- “Disclosure matters. Not wanting to disclose can make a normal mistake look dishonest, with the result that it will be treated as such. … Disclosing is the practitioner’s responsibility, or even duty”;
- “Protecting those who disclose matters just as much. Creating a climate in which disclosure is possible and acceptable is the organization’s responsibility”; and,
- “Proportionality and decency are crucial to a just culture. People will see responses to a mistake as unfair and indecent when they are clearly disproportionate.”
When a Boeing 747 captain was found guilty of negligently endangering his aircraft and passengers by almost striking an airport hotel on the approach to London Heathrow Airport — he conducted a go-around and no one was injured — he was fined £1,500 by a court and reduced in rank to first officer by his airline. As Dekker tells the story, a pilot friend of the former captain asked what the captain had been found guilty of. “Endangering the passengers,” he replied. The friend laughed and said, “I do that every day I fly. That’s aviation.”
Dekker urges organizations that are serious about instilling just culture to begin immediately with the following steps:
- “An incident must not be seen as a failure or a crisis, neither by management nor by colleagues. An incident is a free lesson, a great opportunity to focus attention and to learn collectively”;
- “Abolish all financial and professional penalties in the wake of an occurrence. Suspending practitioners after an incident should be avoided at all cost. These measures serve absolutely no purpose other than making incidents into something shameful, something to be kept hidden”;
- “Implement, or review the effectiveness of, any debriefing programs or critical incident/stress management programs to help practitioners after incidents. Such debriefings and support form a crucial ingredient in helping practitioners see that incidents are ‘normal,’ that they can help the organization get better, and that they can happen to everybody”;
- “Build a staff safety department, not part of the line organization, that deals with incidents. The direct manager (supervisor) of the practitioner should not necessarily be the one who is first to deal with that practitioner in the wake of an incident”;
- “Aim to decouple an incident from what may look like a performance review. Any retraining of the practitioner involved in the incident will quickly be seen as punishment (and its effects are often quite debatable), so this should be done with utmost care and only as a last resort”; and,
- “Be sure that practitioners know their rights and duties in relation to incidents. Make very clear what can and typically does happen in the wake of an incident. … Even in a climate of anxiety and uncertainty about the judiciary’s position on occurrences, such information will give practitioners some anchor, some modicum of certainty about what may happen. At the very least, this will prevent them from withholding valuable incident information because of misguided fears or anxieties.”