Analysis of the findings of a maintenance line operations safety assessment1 (M-LOSA) at a U.K. facility has identified errors — largely procedural errors associated with non-compliance — in 86 percent of observations, British researchers say.2
Most of the errors discovered through the maintenance operations safety survey (MOSS) — as the assessment was called — were classified as inconsequential, according to the report by Marie Langer and Graham Braithwaite of Cranfield University and to a presentation by Langer at the 2012 seminar in Baltimore of the International Society of Air Safety Investigators (ISASI).
Nevertheless, their report said, 34 percent of observations involved errors that “resulted in undesired states mainly associated with aircraft areas not checked for damage at any point during the check, APU [auxiliary power unit] left running unattended or failure to complete all checklist items before certification.”
Although the undesired states did not contribute to accidents or incidents, they still must be addressed, the report said, noting that a similar undesired state could result in an accident or incident.
For example, the report cited an incident in which a large access door separated from a Boeing 777 after takeoff from Gatwick Airport, damaging cabin windows, the fuselage and the fin. Some pieces of the access door penetrated the 777’s cabin, while others landed near a couple who had been walking beside a wooded area.
The U.K. Air Accidents Investigation Branch (AAIB), in its final report on the June 26, 2003, event, attributed the door’s separation to a deviation from standard operating procedures during routine maintenance, and said it was “likely that only one of the 13 door catches had been fastened.”3
Despite 11 subsequent walk-around inspections, conducted by nine people, no one noticed that the door catches were unfastened, the report said, adding, “The inadequate fastening had apparently occurred during a routine maintenance check due to a deviation from standard procedures — a practice that reportedly had been fostered by features of the maintenance system and may have been commonplace.”
The Langer-Braithwaite report said that, had a MOSS program been in place where the 777 was being serviced, observations could have identified “specific threats contributing to the failed systemic defence (e.g., walk-around inspection) and opportunities for errors with the potential to result in similar incidents so these can be addressed and reoccurrence prevented.”
Although the principles underlying the LOSA that is commonly used to assess flight crews can be applied in aviation maintenance and on the ramp, difficulties abound in transferring LOSA to work environments that bear little resemblance to the flight line, Langer said.
Release of Langer’s report coincided with the issuance by the U.S. Federal Aviation Administration’s (FAA’s) Civil Aerospace Medical Institute (CAMI) of a document presenting guidelines for implementation of an M-LOSA program or a ramp line operations safety assessment (R-LOSA).4 Similar documents were published by Boeing5 and Airlines for America, formerly known as the Air Transport Association of America.6
11 Steps
Guidelines for implementing a maintenance line operations safety assessment program (M-LOSA) or a ramp line operations safety assessment (R-LOSA) program begin with (1) obtaining buy-in from senior management.1
Assuming that management approves, the next steps call for (2) forming an implementation team, (3) marketing the M-LOSA and/or R-LOSA programs and (4) integrating those programs with existing safety programs, as well as the safety management system.
Next, the guidelines prescribe (5) developing LOSA infrastructure, “including three parallel activities: adapt/customize LOSA database, conduct train-the-trainer training, [and] establish and maintain a virtual LOSA website.” The next step is (6) to customize and conduct training for LOSA observers.
After that, the guidelines call for (7) collecting data, (8) validating data, (9) populating and maintaining a database, (10) analyzing data and compiling a report and (11) providing feedback to employees.
—LW
Note
- Ma, Maggie J.; Rankin, William L. Implementation Guideline for Maintenance Line Operations Safety Assessment (M-LOSA) and Ramp LOSA (R-LOSA) Program, Report No. DOT/FAA/AM-12/9. August 2012.
The CAMI document — developed through a four-year effort to extend LOSA methodology to aviation maintenance and ramp operations — presented an 11-step process for program implementation (“11 Steps”).
“The goal was to capitalize on the successes of flight deck LOSA,” the CAMI report said. To accomplish that goal, the FAA’s researchers consulted with airline safety representatives worldwide who were involved in M-LOSA and R-LOSA efforts.
Threat and error management (TEM) is the underlying framework for LOSA data collection, the report said, adding, “The TEM model is aimed at understanding error management (i.e., detection and response) rather than solely focusing on error causality (i.e., causation and commission). Regardless of the error type, its effect on safety depends on technicians’ and ramp employees’ detection and response to avoid an undesired operational state and prevent a potentially unsafe outcome.”
Under the TEM framework, safety observers can detect threats and errors that might go unnoticed by maintenance personnel and ramp employees, the report said.
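As an illustration only, and not drawn from the CAMI report, the sketch below shows one way a TEM-oriented observation record might be structured, with separate fields recording whether each threat or error was detected and how it was managed, reflecting the model’s emphasis on detection and response rather than causality alone. All class, field and category names are assumptions.

```python
# Illustrative sketch only; field names and outcome categories are assumptions,
# not the data model used in the CAMI or Boeing tools.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ThreatOrError:
    description: str         # e.g., "APU left running unattended"
    detected: bool           # was it noticed by the technician or ramp employee?
    response: Optional[str]  # how it was managed, if detected
    outcome: str             # e.g., "inconsequential" or "undesired state"

@dataclass
class TEMObservation:
    task: str                # task being observed, e.g., "walk-around inspection"
    threats: List[ThreatOrError] = field(default_factory=list)
    errors: List[ThreatOrError] = field(default_factory=list)

    def undesired_states(self) -> List[ThreatOrError]:
        """Items coded as having led to an undesired operational state."""
        return [item for item in self.threats + self.errors
                if item.outcome == "undesired state"]
```

In this hedged sketch, an item that was detected and managed would carry an "inconsequential" outcome, while an unmanaged item could be coded as an "undesired state," mirroring the distinction drawn in the Cranfield findings.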
Despite different opinions about what the “A” in LOSA stands for, the CAMI report said the program is an “assessment” process and “should not be represented or used as an audit program. It focuses on observing normal operations by peers in a non-punitive environment to identify ‘at-risk’ behaviors to implement changes to get employees to work more safely, as well as capture information on effective countermeasures currently in place. LOSA samples activities in normal operations — the vast majority of these are well-managed and successful operations. Confidential data collection and non-jeopardy assurance for frontline employees are fundamental to the process.”
LOSA’s data-derived safety information is intended to lead to “continuous quality improvement over time,” the report said, likening a LOSA experience to a person’s annual physical examination.
“People have comprehensive checkups in the hope of detecting serious health issues before they become consequential,” the report said. “LOSA is built upon the same proactive and predictive notion. It provides a diagnostic snapshot of strengths and weaknesses that an aviation organization can use to bolster the health of its safety margins and prevent degradation.”
The report cited 10 “essential characteristics” for the success of LOSA, including peer observations during normal operations, “confidential and non-punitive data collection,” voluntary participation, use of observers who are “trusted and trained,” sponsorship by both management and labor, a “systematic observation instrument based on TEM model,” a secure repository for collected data, “data-verification roundtables,” “data-derived targets for enhancement” and feedback to workers.
In the report published by Boeing, the procedures used in M-LOSA and R-LOSA were described as being very different from those used in flight crew LOSA.
“Flight LOSA relies on trained pilots using open-ended text to record observations,” the Boeing report said. “Ramp LOSA and maintenance LOSA have structured observation checklists that are used by an airline’s own staff. The tools developed for ramp LOSA and maintenance LOSA include a ready-to-use database and data analysis software that are kept with the operator. There is no need for outside data storage and analysis. This ensures that company data are secure and that analysis does not require external consultants.”
An M-LOSA observation form contains nine specific items: planning, prepare for removal, removal, prepare to install, install, installation test, close-up and complete restore, fault isolation/troubleshooting/deferral, and servicing. M-LOSA observations generally are conducted by one trained maintenance peer observer, but two observers may be required if an especially complex task such as an engine change is being performed.
An R-LOSA form — for observations to be carried out during an airplane turnaround by a team of two or three trained ramp peer observers — can contain a varied number of items, depending on the organization’s choices. Among the possibilities are arrival, downloading, lavatory and potable water service, catering, cleaning service, fuel service, uploading, departure, deice and anti-ice, and pilot walk-around.
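As a rough, hypothetical sketch of how such structured observation forms might be represented in software, the Python fragment below lists the item names given in this article and builds an empty observation form from them. The function and field names are assumptions, not the Boeing-developed database or analysis tools described in the report.

```python
# Hypothetical representation of M-LOSA and R-LOSA observation checklists;
# the item names come from the article, everything else is an assumption.
M_LOSA_ITEMS = [
    "planning", "prepare for removal", "removal", "prepare to install",
    "install", "installation test", "close-up and complete restore",
    "fault isolation/troubleshooting/deferral", "servicing",
]

# R-LOSA items vary by organization; these are the possibilities named above.
R_LOSA_CANDIDATE_ITEMS = [
    "arrival", "downloading", "lavatory and potable water service", "catering",
    "cleaning service", "fuel service", "uploading", "departure",
    "deice and anti-ice", "pilot walk-around",
]

def build_checklist(items, selected=None):
    """Return an observation form: each item starts with no findings recorded."""
    chosen = items if selected is None else [i for i in items if i in selected]
    return {item: {"threats": [], "errors": [], "comments": ""} for item in chosen}

# Example: a ramp observation form limited to the items one organization chose.
ramp_form = build_checklist(R_LOSA_CANDIDATE_ITEMS,
                            selected={"arrival", "fuel service", "departure"})
```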
Measuring Readiness
The CAMI report said that, before beginning to implement an M-LOSA or R-LOSA program, organizations must have support for the idea from senior management, labor unions and other employee groups, and the workforce itself.
In addition, if the organization does not have at least some additional recommended items — familiarity with the LOSA concept and with safety management systems, other non-punitive safety programs such as an aviation safety action program, at least one formal safety data collection program, a human factors program and organizational support for a just culture — an M-LOSA or R-LOSA program is unlikely to succeed, the report said.
“Address any issues you identify first, and then come back to prepare for a LOSA implementation,” the document added.
In the beginning phases of program implementation, the main tasks for the implementation team should include publicizing LOSA within the organization — and especially among the employees who will be the focus of LOSA observers, the report said. The team also must decide on the focus of the LOSA — whether it will involve observations of a sample of the entire operation or of specific areas — as well as the timing of the observations.
The subsequent marketing of LOSA should involve multi-level, multi-strategy plans that use face-to-face meetings, printed material and websites to reach employee groups, frontline employees, managers and business partners.
“Organizations are naturally resistant to change,” the report said. “A good marketing plan should clearly define the safety value and benefits of a LOSA program.”
Later in the process, when the observations are about to begin, frontline employees should be reminded about the plan and the purpose of the LOSA and be given an opportunity to decline to be observed.
“Plan a reasonable number of observations per observer per day to allow sufficient time to complete the observation coding and write detailed comments,” the report advised. “Build some flexibility into the schedule to allow for the unexpected. Finally, do not let the observations continue indefinitely — schedule a set of observations within a one- to three-month period, if possible. The data need to be assessed and actions implemented in a timely fashion. This is not to preclude using LOSA observations as part of the overall SMS set of tools and conducting [the observations] if and when needed in your operations.”
After all data have been collected, validated and analyzed, the recommended procedures call for providing feedback — first to managers and labor leaders and then to frontline employees — about “what has been learned and action items derived from the initial round of LOSA observations.” Various departments within the operation may want to investigate further, and if so, data should be made available to them, the report said.
Later, the report added, “it is critical to continuously monitor the safety change process through implementing and monitoring actions resulting from the LOSA observations. Historically, organizational safety changes within aviation organizations have been driven by accident/incident investigation and intuition. Today, organizations must deal proactively with accident and incident precursors.”
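To make the analysis and feedback steps (steps 10 and 11 of the CAMI guidelines) concrete, the hypothetical sketch below tallies how many validated observations contained errors or undesired states, in the style of the percentages reported in the Cranfield MOSS. The record format and function are illustrative assumptions, not the actual LOSA database or analysis software.

```python
# Hypothetical summary of validated observation records (step 10 of the CAMI
# guidelines); the record format is an assumption, not the real database schema.
def summarize(observations):
    """observations: list of dicts such as
    {"area": "ramp", "errors": 3, "undesired_states": 1}"""
    total = len(observations)
    with_errors = sum(1 for o in observations if o["errors"] > 0)
    with_undesired = sum(1 for o in observations if o["undesired_states"] > 0)
    return {
        "observations": total,
        "pct_with_errors": round(100 * with_errors / total, 1) if total else 0.0,
        "pct_with_undesired_states": round(100 * with_undesired / total, 1) if total else 0.0,
    }

# Example feedback figures for managers, labor leaders and frontline employees.
sample = [
    {"area": "maintenance", "errors": 2, "undesired_states": 0},
    {"area": "maintenance", "errors": 1, "undesired_states": 1},
    {"area": "ramp", "errors": 0, "undesired_states": 0},
]
print(summarize(sample))  # {'observations': 3, 'pct_with_errors': 66.7, ...}
```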
Additional actions beyond the 11-step plan call for the integration of LOSA data with data derived from other safety programs and a return-on-investment analysis of the effects of LOSA and the resulting safety interventions.
Ultimately, information gathered through various LOSA programs should be made more widely available, the report said. “As more organizations implement LOSA programs, an industry-wide LOSA information-sharing meeting may be held biannually to exchange best practices and lessons learned, in addition to zooming in on fleet-wide problems. It is a priority to involve more airlines in the M-LOSA and/or R-LOSA initiative, as well as participants from the regional airlines and maintenance, repair and overhaul communities.”
Notes
- LOSA previously was widely known as a line operations safety audit. The FAA and other supporters of the new terminology say the word “assessment” was chosen to make clear that LOSA is distinct from the traditional airline safety audit process.
- Langer, Marie; Braithwaite, Graham. “The Development of the Maintenance Operations Safety Survey: Challenges in Transferring a Predictive Safety Tool From Flight Operations to Aircraft Maintenance.” Paper presented at ISASI 2012 Annual Seminar, Baltimore, August 2012.
- AAIB. Accident Report EW/C2003/06/04, published in AAIB Bulletin No. 3/2005.
- Ma, Maggie J.; Rankin, William L. Implementation Guideline for Maintenance Line Operations Safety Assessment (M-LOSA) and Ramp LOSA (R-LOSA) Program, Report No. DOT/FAA/AM-12/9. August 2012.
- Rankin, William; Carlyon, Bill. “Assessing the Safety of Ramp and Maintenance Operations.” Boeing Aero Quarterly (Quarter 2 2012): 10–15.
- Airlines for America. Implementation Guideline for Maintenance Line Operations Safety Assessment (M-LOSA) and Ramp LOSA (R-LOSA) Programs. Washington, 2012.