The safety management system (SMS) is only the most recent in a long line of formal efforts to instill safety in aviation operations, safety specialists say, and they caution against viewing SMS as a system that emphasizes data above the human elements of creating and maintaining a just safety culture.
SMS is more flexible than most of the safety programs that preceded it, Wes Scott, consulting services director at the U.S. National Safety Council, told the CHC Safety and Quality Summit during its March meeting in Vancouver, British Columbia, Canada.
“Think of it as a pathway to continuous improvement,” Scott said during a panel discussion about ways of encouraging widespread use of SMS. “Figure out where you are, do what it takes to get to the next level, then start over.”
Graham Braithwaite, head of transport systems at Cranfield University, called SMS another phase in the continuing evolution of aviation safety.
“There was a time before SMS when there were certain companies doing some things that we might now recognize as SMS; it just wasn’t called that,” Braithwaite said. “Now there are some companies doing things they call SMS that still aren’t SMS because they haven’t really understood what it’s all about.”
The data-driven elements of SMS have added new levels of credibility to the safety process, he said, noting that the data have enhanced understanding of safety for everyone in the aviation community.
In the process, Braithwaite said, “SMS has become quite an industry in its own sense. The danger is this is going to grow into some giant mothership that undocks itself from reality and that the people who are not at a conference like this … not even interested in it — they’re the people we’re worried about. How do you get them enthused about it? …
“I worry that in professionalizing safety, we perhaps lose out on some of the enthusiasm that used to be the thing that kept us safe. The enthusiasts who were there because they genuinely cared about safety and wanted to spread the message — let’s not replace them with a system that’s just about the numbers.”
New and different ways of looking at safety situations can help retain interest and enthusiasm in safety, he said. For example, expressing losses in terms of dollars and work hours can make an impression, he said, citing the case of a New Zealand airline that calculated how many flights would be required to recoup financial losses stemming from ground damage to its airplanes. The answer: 22 days of Boeing 747 flights from Auckland to London.
Like other safety programs before it, SMS will necessarily evolve over time, Braithwaite said.
“We’ll add a letter to it, and it will become I-SMS or E-SMS,” he said. “It’s just another stage of evolution. Naturally there’ll be something else to come. … You’ve got to change the message now and then because if, in 10 years’ time, we’re all talking about SMS in the same way, then everybody will be completely bored with it. And we’ll lose the excitement that should come with it.”
Scott Shappell, professor and chair of the Department of Human Factors and Systems at Embry-Riddle Aeronautical University, agreed that, if the message is not modified from time to time, “you get numb to it — it’s meaningless.” An evolution of the message is required to ensure that SMS will not be seen as just “another three-letter program that we’re all going to have to do,” he said.
Addressing Noncompliance
Safety systems typically devote considerable time and effort to people who intentionally do not comply with safety requirements, Scott said. As an example, he cited a worker who did not wear safety glasses, even though they were required.
Instead of merely insisting that the worker wear his safety glasses, Scott said, managers questioned him about his reasons for disregarding the requirement. His answer: The high humidity made the lenses fog up. The issue was addressed, and the worker resumed wearing his safety glasses.
“If noncompliance can’t be addressed, it leads to more noncompliance,” Scott said.
Another method of addressing noncompliance is to develop new ways of thinking not only about compliance but also about conformance, or “wanting to belong, wanting to do what everyone does,” George Santos, an aviation safety consultant based in Calgary, Alberta, Canada, said during another CHC presentation.
Unintentional nonconformance can result from inappropriate training, irrelevant rules or inappropriate procedures, said Santos, who also is a line pilot in the offshore oil and gas industry in the North Sea. He added that operators should review their rules to ensure that they make sense and encourage personal responsibility.
Intentional nonconformance can stem from poor workplace discipline or situations in which employees get away with rule-breaking or are rewarded for noncompliance, he said.
Operators don’t necessarily need new, stricter rules, Santos said. Instead, they need a strong safety culture, which can be developed only over time, drawing on whatever motivates the individuals involved at that time.
One key is understanding why people behave the way they do, along with identifying the changes that must be made in their ways of doing things, implementing those changes and evaluating their effects, he said.
“What we want to do is develop a culture of people wanting to do what they have to do,” said Santos.
‘Bedrock’ of Safety Performance
That type of just safety culture is crucial in the development of safety programs, speakers agreed.
“You won’t get a proactive safety culture if you aren’t just,” said Keven Baines, director of aviation safety consulting firm Baines Simmons. “Being just is the bedrock of safety performance.”
He noted that a just culture — which acknowledges that an accident rate of zero is not possible but that the number of accidents can be reduced — is increasingly being required by aviation regulations, especially in conjunction with mandatory occurrence reporting intended to identify threats to safety before they cause accidents or incidents.
In the past, workers have failed to report safety issues, sometimes because they feared the consequences but more often because they found the reporting process difficult or confusing, Baines said, noting that at one time, one airline had eight different ways to file a report, depending on various circumstances.
His organization, which has developed a just culture tool kit known as the FAiR (Flowchart Analysis of Investigation Results) 2 System, has determined that, as part of the risk-reduction process, operators first need to identify effective means of preventing or reducing events in which “humans and systems fail to perform in the manner expected” and then take steps to balance personal accountability with “the desire for learning and improvement.”
The FAiR tool is a “handrail” for guiding an organization toward a just culture, Baines said. Throughout the process, the focus should be on “actions and interventions, not consequences,” he added.
In investigating an event, the process calls for use of an event review group, made up of trained investigators, to answer a series of questions to determine the causes of the event and to identify effective interventions to limit chances of a recurrence.
The questions include the following:
- “Was there a conscious and substantial and unjustifiable disregard for risk?
- “Were the consequences as intended? Did the individual deliberately set out to cause the outcome of the event?
- “Were rules intentionally broken?
- “Was a correct plan of action selected? Would the plan … ever have achieved its goal?
- “Under the circumstances, were all applicable rules available and workable and intelligible and correct?
- “Was the action at the time of the event beneficial to the organization? [and,]
- “Was the action at the time of the event outside of normal practice?”
The goal of the procedure is to identify risks and limit their recurrence, not to punish those who have made errors, Baines said.
As an example, he described what might happen after a maintenance technician forgets to properly close and fasten an engine cowling because he was distracted by a coworker with a question about an unrelated issue. Because a distraction caused the problem, steps might be taken to limit similar distractions in the future, perhaps by requiring technicians engaged in safety-critical tasks to wear signs to signal to others that they should not be interrupted. Some organizations have adopted policies for dealing with other forms of distraction by taking steps such as banning the use of cell phones, Baines said.
In this process, disciplinary actions are reserved for events involving sabotage, violations for personal gain or recklessness, he said.
Featured image: © Andres Meneses | AirTeamImages