SCHELL Game
SMS for Aviation — A Practical Guide, Module 6: Human Factors
Civil Aviation Safety Authority of Australia (CASA), 2012. 28 pp.
This is one of six modules published by CASA as a resource kit to acquaint its personnel and other aviation professionals with the components of a safety management system (SMS).1
Human factors (HF) “is an umbrella term for the study of people’s performance in their work and non-work environments,” the module says. “Perhaps because the term is often used following human error of some type, it is easy to think of it negatively. However, human factors also includes all the positive aspects of human performance: the unique things human beings do well.”
Although SMS is perhaps most often associated with technical and procedural metrics such as flight data monitoring and trend analysis, risk calculation and incident reporting, HF also plays an integral role. “It is unlikely that your SMS will achieve its full potential for improving safety performance without a full understanding and application of HF principles by all your staff to support a positive safety culture,” the module says. “Regulations and safety management systems are merely mechanical unless organisations understand and value safety behaviour.”
The Theory
The module begins with a widely accepted theoretical framework, the SHEL model, now often expanded into SCHELL. The latter term’s components are S (software — the procedures and other aspects of work design); C (culture — the organizational and national cultures influencing interactions); H (hardware — the equipment, tools and technology used in work); E (environment — the environmental conditions in which work occurs); L (liveware — the human aspects of the system of work); and L (liveware — the interrelationships between humans at work).
“The SCHELL model emphasises that the whole system shapes how individuals behave. Any breakdown or mismatch between two or more components can lead to human performance problems,” the module says.
“For example, an accident where communication breaks down between pilots in the cockpit, or engineers at shift handover, would be characterised by the SCHELL model as a liveware-liveware problem. Situations where pilots or engineers disregarded a rule would be characterised as liveware-software.”
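This interface view lends itself to a compact illustration. The sketch below is not from the module; it is a rough rendering, with all names and structure purely illustrative, of how the SCHELL components and a breakdown at the interface between two of them might be represented:

```python
from enum import Enum

class SchellComponent(Enum):
    """Components of the SCHELL model; 'liveware' covers both L's."""
    SOFTWARE = "S"     # procedures and other aspects of work design
    CULTURE = "C"      # organisational and national cultures
    HARDWARE = "H"     # equipment, tools and technology used in work
    ENVIRONMENT = "E"  # conditions in which work occurs
    LIVEWARE = "L"     # the humans in the system of work

def classify_mismatch(a: SchellComponent, b: SchellComponent) -> str:
    """Label a breakdown at the interface between two components."""
    return f"{a.name.lower()}-{b.name.lower()}"

# The module's own examples, restated as interface mismatches:
# a cockpit or shift-handover communication breakdown ...
print(classify_mismatch(SchellComponent.LIVEWARE, SchellComponent.LIVEWARE))  # liveware-liveware
# ... and a pilot or engineer disregarding a rule
print(classify_mismatch(SchellComponent.LIVEWARE, SchellComponent.SOFTWARE))  # liveware-software
```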
The Skills
Another way of looking at human factors is that “human factors training should focus squarely on providing aviation safety-critical personnel with the non-technical skills to manage the prevention/consequences of human error. This implies that making errors is normal and expected. The consequences of error are just as important as the causes.
“Non-technical skills are the decision making and social skills that complement technical skills. For example, inspecting an aircraft engine using a borescope is a technical skill performed by a licensed maintenance engineer. However, maintaining situational awareness (attention to the surrounding environment) during the inspection of a wing, to avoid tripping over hazards, is a non-technical skill.”
The module lists the main categories and elements of non-technical skills as managing fatigue; managing stress; alcohol and other drugs; team-based cooperation and coordination; decision making; situational awareness; communication; and leadership.
One key to developing non-technical HF skills is threat and error management (TEM), the module says. TEM begins with recognition — of human errors, of threats to safety and of undesired aircraft states.
A second key is professionalism, which encompasses these abilities and qualities:
- Maintain discipline — follow approved procedures to perform a given task.
- Assess situations — know what’s going on around you.
- Make decisions — take decisive actions.
- Set priorities and manage tasks — prioritize safety above personal concerns.
- Maintain effective communication and interpersonal relationships.
- Maintain currency.
The System
“If you want to find actual solutions for the problems human errors cause, you often need large systemic changes,” the module says. “For example, you might have to modify maintenance rostering to combat fatigue, or revise your flight manuals to make them easier to interpret.”
Beyond systemic changes, error tolerance can be built into the organization and its operating procedures. This is something like a human-centered version of the redundancy that engineers include in aircraft systems design, so that a single-point failure is extremely unlikely to be catastrophic.
“Error tolerance refers to the ability of a system to function even after an error has occurred,” the module says. “In other words, an error-tolerant system is one in which the results of making errors are relatively harmless. An example of building error tolerance is a scheduled aircraft maintenance program. Regular inspections will allow multiple opportunities for catching a fatigue crack in a wing before it reaches a critical length.
“As individuals we are amazingly error tolerant, even when physically damaged. We are extremely flexible, robust, creative and skilled at finding explanations, meanings and solutions, even in the most ambiguous situations. However, there is a downside: The same properties that give human beings such robustness and creativity can also produce errors.”
How can creativity and flexibility produce, as well as reduce, errors? Part of the problem is that we extrapolate from the known to the unknown — for instance, we fill in missing information. We surmise, especially in task-saturated or time-pressured situations. Usually, this creative response is rational, based on experience. Sometimes, though, the reality differs from the norm and the conventional assumption is mistaken.
“Our natural tendency to interpret partial/missing information can cause us to misjudge situations in such a believable way that the misinterpretation can be difficult for us to discover,” the module says. “Therefore, designing systems that predict and capture error — in other words installing multiple layers of defences — is more likely to prevent accidents that result from human error.”
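The logic of “multiple layers of defences” can be made concrete with a little arithmetic. Assuming — purely for illustration — that each inspection or defence independently catches a given error with probability p, the chance the error escapes all n layers is (1 − p)^n, which shrinks rapidly as layers are added:

```python
def escape_probability(p_catch: float, n_layers: int) -> float:
    """Chance an error slips past every defence, assuming each layer
    independently catches it with probability p_catch (an illustrative
    simplification; real defences are rarely fully independent)."""
    return (1.0 - p_catch) ** n_layers

# A fatigue crack that any single scheduled inspection catches only
# 60% of the time is still very unlikely to survive five inspections.
for n in (1, 3, 5):
    print(f"after {n} inspection(s): {escape_probability(0.6, n):.4%} escape chance")
```

On this reading, the value of a scheduled maintenance program lies less in any single inspection than in the accumulation of repeated chances to catch the error.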
Supplementing error tolerance is error containment. Error containment strategies include policies that “formalise acknowledgement that errors are ‘normal’; [include] regular systemic analysis to identify common errors and build stronger defences; identify risk of potential errors through normal operations behavioural observation programs; identify potential single-point failures (high risk) and build stronger defences; [and] include the concept of shared mental models in team-based training initiatives.”
The Fit
CASA recommends blending HF principles into at least these SMS elements:
- Hazard identification and reduction to as low as reasonably practicable;
- Change management;
- Design of systems and equipment;
- Training of operational staff;
- Task and job design;
- Safety reporting and data analysis; and,
- Incident investigation.
Each of these subjects is discussed with an explanation of its HF content, an example scenario and a checklist. To illustrate the methodology, here is how the module examines the first element, integrating HF into hazard identification and reduction.
“Your hazard identification program can reveal potential or actual errors and their underlying causes,” is the summary statement. An example follows:
“A pilot notices the mobile aircraft stairs being left unsecured and the potential for the stairs to hit the aircraft, particularly in strong wind. The pilot reports this concern via the company hazard reporting process. The company safety manager considers the human factors issues involved, and, in talking with ramp staff, finds out that sometimes people forget … to secure the wheel brake properly.
“On inspecting the stairs, the safety manager finds that there are no signs on them to remind operators to activate the wheel brake. Simple human factors solutions would be to install a sign prompting operators to secure the wheel brake, and to ensure that all airport staff are regularly reminded of the danger of unsecured stairs.”
The module’s SMS checklist for hazard identification and reduction includes items such as these:
- “Do you consider HF issues in general risk assessments where hazards are identified?”
- “Are the HF issues involved with hazards understood?”
- “Are different error types with hazards recognised? Are the workplace factors that increase error potential for hazards, such as high workload, or inadequate equipment availability or design, considered?”
- “Do you consider human performance issues in regular staff workshops identifying potential safety hazards?”
- “Is your hazard-reporting process user-friendly and does it prompt users to consider HF issues? What errors might result if the hazard is not managed well?”
Change Management
The module describes how HF fits in with the other identified SMS elements, noting, “Any major change within your organisation has the potential to introduce or increase human factors issues. For example, changes in machinery, equipment, technology, procedures, work organisation or work processes are all likely to affect performance and cause distractions.
“Carefully consider the magnitude of change: how safety-critical is it? What is its potential impact on human performance? Consider human factors issues especially during the transition period of the change.”
Design of Systems and Equipment
“Poorly thought-out equipment design can have a major impact on the performance of your staff, and you should ensure that there is a good fit between the equipment and those using it,” CASA says. “The design of equipment such as displays and control systems, alarm systems, signals and warnings, as well as automated systems, may involve significant human factors risks.”
Training of Operational Staff
“Before training operational staff in non-technical skills, do a training needs analysis, so that you know which error management measures to target to which groups — individuals and/or teams.”
Task and Job Design
“Tasks involving excessive time pressure, a complex sequence of operations, relying overly on memory, or that are physically or mentally fatiguing, are likely to negatively affect performance. Task design is essentially about task matching — make sure that tasks and activities are appropriate and suited to a person’s capabilities, limitations and personal needs.”
Safety Reporting Systems and Data Analysis
“Generally, the same decision-making, communication breakdown and distraction problems you see in a serious accident you will also tend to see in minor occurrences. Your safety reporting system should not only collect information about notifiable occurrences and incidents, but also hazards, near-misses and errors that otherwise might have gone unnoticed.”
Incident Investigation
“Make sure your investigation procedures detail clearly how human factors considerations are included. … Your investigators need to be trained in basic human factors concepts and design procedures to be able to establish which human performance factors might have contributed to the event.”
The Preconditions
As in other aspects of SMS, human factors goes hand-in-hand with a paradox: solutions cannot always be applied at the location or time when errors are made. Mitigation often resides at a different level, the conditions that predispose fallible humans to error. Those conditions can be far in time or distance from the “sharp end.”
For instance, take the issue of unsecured mobile stairs discussed earlier under hazard identification and reduction. The suggested solution is to install a sign reminding operators to secure the wheel brake.
But incident analysis might discover that a dozen kinds of errors have been made in connection with airstairs. Should management post signs warning of them all? How many signs can gate area personnel read while attending to their duties? If they forget to set the brake as they have been emphatically trained to do, will they remember the reminders?
These questions are not carping — they go to a fundamental issue in human factors. Telling people not to make mistakes probably does not help much. Personnel typically are trying to perform their work correctly. If distractions or time pressure cause them to forget to secure the stairs, they’re even less likely to remember a warning sign, let alone numerous signs.
If any conclusion can be drawn from this, it may be that HF — like the SMS itself — is an interrelated whole. Individual steps are useful, but should not lead to a “check-off” mentality that says, “We’ve done this, that and the other so we’re good to go.” SMS is above all a habit of mind.
Note
1. The other modules concern SMS basics; safety policy and objectives; safety risk management; safety assurance; and safety promotion. A DVD is included with the kit. All modules are available online.