Books
Slips, Mistakes and Violations
A Life in Error: From Little Slips to Big Disasters
Reason, James. Farnham, Surrey, England, and Burlington, Vermont, U.S.: Ashgate, 2013. 149 pp. Figures, references, index.
A teapot and a cat set James Reason on the path to becoming a renowned expert on error in complex industries such as aviation.
As Reason tells the anecdote in his new book, one afternoon in the early 1970s, he was brewing tea and about to put the tea leaves in the pot when the cat — “a very noisy Burmese” that slightly intimidated him — showed up and howled insistently to be fed. Reason opened a tin of cat food and spooned some out … into the teapot.
“I little realized at the time that this bizarre slip would change my professional life,” Reason says. A lecturer in psychology at the University of Leicester, he was going through a dry spell in research topics.
“I started to reflect upon my embarrassing slip,” he says. “One thing was certain: There was nothing random or potentially inexplicable about these actions-not-as-planned. They had a number of interesting properties. First, both tea-making and cat-feeding were highly automatic, habitual sequences that were performed in a very familiar environment. I was almost certainly thinking about something other than tea-making or cat-feeding. But then my attention had been captured by the clamouring of the cat … .
“This occurred at the moment I was about to spoon tea into the pot, but instead I put cat food into the pot. Dolloping cat food into an object (like the cat bowl) affording containment had migrated into the tea-making sequence.”
A Life in Error is not, despite the title, an autobiography, although Reason occasionally describes personal events to illustrate points. Unlike his more academic books that have made him one of the most influential theoreticians in risk management, this new publication (so far available only in paperback and as a Kindle download) is written in an informal style without sacrificing scientific rigor.
A simplified version of the theories that have been the basis of his career, A Life will be especially valuable to operational personnel whose jobs bear on safety but who do not specialize in it. Such readers could range from top-floor executives to frontline workers. Even specialists who are familiar with most of the content should appreciate the book’s concision, smooth flow and glimpses into its author’s personality.
Most people believe they know what an error is, yet the concept can be surprisingly hard to pin down. “Dictionaries send us on a semantic circular tour through other [similar] terms such as mistake, fault, defect and back to error again,” Reason says. “That dictionaries yield synonyms rather than definitions suggests that the notion of error is something fundamental and irreducible. But we need to probe more deeply into error’s psychological meaning.”
Error is connected in the human mind to other constructs — plan, intention, action and results: “The success or failure of our actions can only be judged by the extent to which they achieve, or are on the way to achieving, their planned consequences.”
Reason suggests the following as a useful working definition:
The term “error” will be applied to all those occasions in which a planned sequence of mental or physical activities fails to achieve its desired goal without the intervention of some chance agency.
Only the last phrase might raise a question for some people. Reason takes into account the possibility that a goal might be achieved, but not because the plan worked as intended.
“If you were struck down in the street by a piece of returning space debris, you would not achieve your immediate goal [to cross the street], but neither would you be in error, since this unhappy intervention was outside your control,” Reason says. “By the same token, achieving your goal through the influence solely of happy chance — as when you slice a golf ball that bounces off a tree and onto the green — could hardly be called correct performance.”
He distinguishes two basic ways in which an objective can fail to be achieved:
- “The plan of action may be entirely appropriate, but the actions do not go as planned. These are slips and lapses (absent-mindedness) or trips and fumbles (clumsy or maladroit actions). Such failures occur at the level of execution rather than in the formulation of intentions or planning.
- “Your actions follow the plan exactly, but the plan itself is inadequate to achieve its desired goal. These are termed mistakes and involve more complex, higher-level processes such as judging, reasoning and decision making. Mistakes, being more subtle and complex, are much harder to detect than slips, lapses, trips and fumbles. … It is not always obvious what kind of a plan would be ideal for attaining a particular objective. Thus mistakes can pass unnoticed for long periods — and even when detected they can be a matter of debate.”
Reason adopted the error-type classification formulated by Jens Rasmussen, a Danish control engineer, that recognized three performance levels — skill-based (SB), rule-based (RB) and knowledge-based (KB). Reason says, “Using his framework, I was able to distinguish three distinct error types: skill-based slips, rule-based mistakes and knowledge-based mistakes.”
The cat food/teapot error was an example of an SB slip. “Activities at the SB level involve routine and habitual action sequences with little in the way of conscious control,” Reason says. “There is no awareness of a current problem; actions proceed mainly automatically in mostly familiar situations.”
Such actions sound simple and easy. What can go wrong? Often the culprit is a distraction (an outside event, a stray thought or daydreaming) that leads to confusion about the context of the action. The routine is carried out as it is meant to be, but it is the wrong routine for the situation.
RB and KB mistakes differ from SB slips in that they come into play only when the person acting realizes there is a problem to be solved. Both require thinking about a solution, but insofar as they are mistakes, the solution chosen is incorrect.
“There are two kinds of problem: those for which you have pre-packaged solutions (RB level) and those for which you have not (KB level),” Reason says. In the former case, once a problem is recognized, the need is to determine the appropriate response that has already been formulated — for instance, a checklist published in a quick reference handbook. Devising KB solutions for problems whose exact nature may be unclear or for which no standard solution exists requires analysis and creativity.
Yet another class of unsafe act, violations, strongly impressed Reason as a result of the 1986 Chernobyl nuclear power plant disaster in Ukraine. The explosion of the reactor’s core could be traced to two distinct types of unsafe act. Reason says, “There was an unintended slip at the outset of the fatal experiment that caused the reactor to operate at too low a power setting … . Unable to bring the power level up to the desired 20 percent, the operators deliberately persisted with the trial and in so doing made a serious violation of safe operating procedures.
“They did this partly because they did not really understand the physics of the reactor, but also because of their determination to continue with the testing of the voltage generator — which, ironically, was intended as a safety device in the event of a power loss.”
Chernobyl marked the beginning of a new, wider phase of Reason’s study of error. “Up to this time the focus had been very largely upon individual error makers,” he says. “But the appearance of violations required a shift away from a purely cognitive information-processing orientation to one that incorporated motivational, social and institutional factors, and paramount among the latter were rules and procedures.”
Why would anyone deliberately violate procedures designed, at least in theory, to safeguard people — including the operators committing the violations? Reason cites several motivations. Corner cutting, or routine violations, are intended to save time and effort, or to work around what are perceived as unnecessary limitations. Thrill seeking provides stimulation for some people as they “push the envelope.” Necessary violations are a counterweight to the tendency of organizations to issue ever-stricter rules, usually in response to the most recent accident, rules that operators at the sharp end find impractical.
Reason cites the “balance sheet” developed by German psychologist Petra Klumb in connection with violations:
- Perceived benefits: “An easier way of working; saves time; more exciting; gets the job done; shows skill; meets a deadline; looks macho.”
- Perceived costs: “Possible accident; injury to self or others; damage to assets; costly to repair; risk of sanctions; loss of job; disapproval of friends.”
Weighed against the potential costs, those benefits might strike most people, especially those not actively involved in the work, as a bad trade-off. But Reason points out that “the benefits of non-compliance are immediate and the costs are remote from experience: violating often seems an easier way of working and for the most part brings no bad consequences.”
He is skeptical that “get tough” policies toward rule violators who are caught will improve matters much. Most do not get caught; some believe they have no choice if they are to meet their productivity targets in spite of unfeasible and excessive requirements. Instead, Reason believes a better approach is “to try to increase the perceived benefits of compliance. That means having procedures that are for the most part workable, correct and available. They should describe the quickest and most efficient ways of doing the job.”
From studying violations, it was a short step for Reason to develop the ideas he is most associated with: “holes” in the layers of defense against error that occasionally align to create a path of vulnerability, latent threats that lurk undetected, and organizational factors. All these concepts are now recognized in orthodox risk management.
I counted three possible absent-minded slips by the author or editor in the book. He says, “Chapter 5 deals with absent-minded slips and lapses.” But they are the subject of Chapter 4. “In Chapter 10, we move from errors to violations,” he says, with the next page headed “Chapter 9, Violations.” In that chapter, he correctly gives the year of the Chernobyl nuclear accident as 1986, but in the chapter on organizational accidents, he dates it to 1985.
Reason has nothing to be embarrassed about. He has long made the point that everyone makes errors from time to time, including world-famous researchers in the field. Like many of the colleagues he has influenced, he is concerned not with the impossible task of eliminating all error but with building barriers against it and reducing its consequences.