Reports
Untimely Action
FAA Needs to Improve Risk Assessment Processes for Its Air Transportation Oversight System
U.S. Department of Transportation Office of Inspector General. Report no. AV-2011-026. Dec. 16, 2010. 36 pp. Figures, tables, appendix. www.oig.dot.gov/library-item/5468.
“Quis custodiet ipsos custodes?” asked the Roman poet Juvenal: Who will guard the guards themselves? Had he been living today, he might ask how to oversee those who oversee aviation safety.
The U.S. Federal Aviation Administration (FAA) needs to provide more oversight of its Air Transportation Oversight System (ATOS), the Office of Inspector General (OIG) says. The latest report reaches essentially the same conclusion as OIG reports in 2002 and 2005.
The FAA uses the ATOS to conduct surveillance of nearly 100 airlines that transport more than 90 percent of U.S. airline passenger and cargo traffic.
“We have consistently reported that ATOS is conceptually sound because it is data-driven and intended to target inspector resources to the highest risk areas,” the report says. In the earlier reports, however, “we reported that FAA needed to strengthen national oversight of ATOS to hold field managers more accountable for consistently implementing effective oversight practices.”
The objectives of the most recent OIG audit were “to determine (1) whether FAA has completed timely ATOS inspections of air carriers’ policies and procedures for their most critical maintenance systems; (2) how effective ATOS performance inspections have been in testing and validating that these critical maintenance systems are working properly; and (3) how well FAA implemented ATOS for the remaining [U.S. Federal Aviation Regulations (FARs)] Part 121 air carriers and what, if any, oversight challenges FAA inspection offices face.” The audit was conducted from May 2008 through July 2010.
The ATOS has three primary functions, the report says. All involve FAA inspector oversight of airlines. The first function is assessments of air carrier system design — policies and procedures, typically analyzed through reviews of operational manuals. Second are performance assessments that confirm that the system is producing the intended results. Third is evaluation of risk management. Inspectors look at air carriers’ ways of dealing with hazards, as well as FAA enforcement actions and rule making.
The report says that “FAA has not completed timely ATOS inspections of air carriers’ policies and procedures for key maintenance systems.”
Incorporating the findings of the earlier audits, the report says that “over an eight-year period, inspectors at all eight major air carrier inspection offices in our review did not complete 207 key inspections of air carriers’ maintenance policies and procedures on time. This is despite changes FAA made to ATOS over the last 10 years that actually decreased the number of maintenance programs inspectors were required to review and increased the intervals between inspections.”
From federal fiscal year (FY) 2002 through FY 2009, each from Oct. 1 to Sept. 30, none of the eight FAA major air carrier inspection offices completed systemic ATOS reviews of maintenance policies and procedures — called design assessments — on time. “Four … completed less than 50 percent of their inspection workload at the required interval,” the report says. “As a result, any risks in the air carrier systems would remain ‘unknown’ until inspections are completed.”
The most common overdue inspection, at seven offices, was “required inspection item training requirements.” “Availability [of manuals],” “manual currency,” “supplemental operations manual requirements,” “parts/material control/suspected unapproved parts” and validation of the qualifications of the people holding the positions of chief inspector and director of maintenance were each overdue at six offices.
“Principal inspectors stated that they missed inspection intervals due to confusion over FAA’s guidance on when they should complete ATOS design assessments,” the report says. “According to these inspectors, the guidance only ‘suggested’ a five-year inspection cycle for this type of assessment. While this may have been the case when FAA issued guidance in 2001, it reissued the guidance in July 2007, in part, to clarify inspection requirements. The revised guidance explicitly stated that these assessments must be completed every five years.”
The FAA has a system to monitor field office inspections, known as the Quarterly ADI (Action, Determination and Implementation) Completion Report. But the OIG audit found that the FAA “does not track overdue and unassigned ATOS inspections. … We examined FAA’s quarterly reports from June 2008 through June 2009 and identified 237 scheduled inspections that were left unassigned and uncompleted. However, FAA did not use the Quarterly ADI Completion Report to track any of these to ensure they would be rescheduled and completed.”
Some inspection programs had previously been identified by the FAA inspectors as involving “elevated risk.” Of those, “engineering/major repairs and alterations” was four years past due; “other personnel with operational control” was three years and three months past due; “training of flight attendants” and “major repairs and alterations records” were each 18 months in arrears.
“ATOS was envisioned to be a risk-based oversight system, but we found the risk assessment process — the basis for prioritizing inspections for timely completion — does not give priority to maintenance programs where FAA inspectors found increased risk,” the report says. “Also, inspectors we interviewed were not analyzing voluntary disclosure data (i.e., maintenance errors that air carriers self-report) or industry events that could impact a carrier’s performance and stability. Voluntary disclosure data and changes in the airline industry are important indicators of whether air carriers are properly maintaining their aircraft during times of economic downturn.”
As examples of “high-criticality” inspections most often missed at major air carriers, the report cited 30 continuing analysis and surveillance system (CASS) inspections, 28 aircraft reliability program inspections and 21 airworthiness directive management inspections. At small carriers, these included 20 engineering/major repairs and alterations program inspections; 15 maintenance control program inspections; 15 CASS inspections; and 12 aircraft airworthiness inspections.
The report identifies what the OIG considers a flaw in the design of ATOS: it prompts inspectors “to place priority on programs that are not necessarily high risk. This is because ATOS disproportionately weights maintenance programs designated as high-criticality over lower-criticality maintenance programs, even though inspectors have identified deficiencies in the lower-criticality programs” based on “inspectors’ analyses of air carrier data and inspection reports, which indicate that the individual maintenance program is experiencing problems.”
The report cites as an example an air carrier’s general maintenance manual, which “by itself, will not result in an unsafe condition on an aircraft. However, as the foundation for an effective aircraft maintenance program, without accurate manuals, maintenance errors can occur. We agree that high-criticality programs warrant vigilant FAA oversight, but they may not always present the highest risk to safe air carrier operations if inspectors have not identified any hazards in the programs. … More emphasis on prioritizing programs with increased risk, regardless of the criticality designation, would allow FAA to better target inspector resources.”
ATOS can be a poor fit for smaller Part 121 air carrier inspectors, the report says: “Managers and inspectors at 12 FAA inspection offices for smaller air carriers that recently transitioned to ATOS cited concerns with the system’s design, such as inspection checklist questions, air carrier staffing limitations, confusion over how to record inspection findings, and insufficient data to effectively support the ATOS ‘data-driven’ approach.”
In addition, inspectors at smaller offices “expressed frustration with the gap between the time [when] they received system safety training and when they actually began using ATOS to oversee their assigned air carrier.” In some cases, the report says, the training occurred as much as six years before they began inspections under the system. “For those inspectors who had a gap of three or more years, nearly 70 percent reported being unable to recall and apply system safety concepts to answer ATOS inspection questions. Understanding and applying system safety principles is key to ensure that air carriers’ maintenance programs work effectively and that ATOS contains accurate data.”
OIG recommends that the FAA:
- “Redesign the Quarterly ADI Completion Report to include cumulative roll-up data from previous quarters and conduct trend analyses that could be used to hold regional and local inspection offices accountable for scheduling uncompleted inspections;
- “Develop procedures to document justification for significant changes to ATOS (i.e., planned changes to alter the number of data collection tools or prescribed inspection time intervals);
- “Redesign the current risk assessment process within ATOS so that it appropriately prioritizes maintenance programs with the greatest percentage of increased risk (regardless of criticality level) for inspector resources;
- “Provide training to inspectors to help them more accurately interpret data from all available sources … and apply the results more consistently when planning risk assessments;
- “Evaluate ATOS to determine if it is designed to accurately record inspection findings unique to smaller air carrier operations;
- “Evaluate whether ATOS is scalable across all Part 121 air carriers; [and,]
- “Expedite enhancement of ATOS training methods … to assist inspectors in understanding how to use ATOS data collection tools and increase their proficiency in using ATOS.”
The FAA concurred with four of the recommendations and partially concurred with three.
Variations on a Theme
U.S. Airline Transport Pilot International Flight Language Experiences, Report 5: Language Experiences in Native English-Speaking Airspace Airports
Prinzo, O. Veronika; Campbell, Alan; Hendrix, Alfred M.; Hendrix, Ruby. U.S. Federal Aviation Administration (FAA) Civil Aerospace Medical Institute. Report DOT/FAA/AM-10/18. December 2010. 23 pp. Tables, references. www.faa.gov/library/reports/medical/oamtechreports/2010s/2010/201018.
Previous reports in this series have documented the experiences of U.S. international airline pilots in airspace where the native language is other than English (ASW, 3/09, p. 49; 5/10, p. 54). But U.S. international pilots also fly in the airspace of countries such as Australia, Canada, Ireland, New Zealand and the United Kingdom, where English is the primary native language. There, too, local dialects may differ in pronunciation and even grammar from U.S. English. And regional variations in English speech are common within the United States.
“The [48] pilots who participated in this study were instructed to think about how hearing other native dialects of the English language affected their safety and communications between them and air traffic controllers,” the report says. As with the earlier reports, pilot responses were combined and condensed for the sake of readability.
Seventy-nine percent of the pilots said that they “rarely” had communication problems in native-English-speaking environments, while 15 percent “occasionally” had problems. Two pilots “frequently” experienced communication difficulty, one in the United Kingdom, the other in the eastern Caribbean. One pilot “often” had difficult interactions with controllers — on opposite coasts of the United States. That pilot mentioned differences in dialect between controllers in Los Angeles and Washington.
Asked which English dialect was hardest to understand, participating pilots mentioned U.K. English most often, perhaps because they fly there more frequently than to other native-English-speaking countries. The British Isles have many regional dialects; one pilot mentioned particular trouble understanding Welsh, Scottish and Irish pronunciation, as well as “a thick Southampton accent.”
“I notice that sometimes the controllers in New England talk too fast, as do the New Yorkers,” said a pilot. “I’ve experienced New York Approach fire instructions nonstop, and I know they’re not going to be happy if we miss one. When at [Chicago] O’Hare, I perceive controllers speaking like auctioneers, but I understand them. I think it would be difficult to operate in that environment as a foreign pilot.”
Said another: “I’ve made quite a few trips to Delhi [India] and was unprepared for the particular cadence in their speech.”
Although pilots who experienced difficulty mostly regarded it as an annoyance, one suggested a possible safety concern: “There are quite a few pilots [with seniority] who avoid flying to some parts of the world because of language concerns — they’re afraid of miscommunication. This forces junior pilots with the least experience to fly into these sections.”
Another pilot said, “The rigors of flying to Delhi or other destinations, going through Russia and all the [bordering nations] to get there, is professionally satisfying because it involves meters, QFE instead of QNH [altimeter settings], different accents and carbon microphones. But I no longer want to be challenged in that way.”
Pilots were asked how often controllers in native-English-speaking countries slipped out of standard International Civil Aviation Organization phraseology into “common” English, that is, normal or conversational speech. Eight percent answered “without fail,” 21 percent “often” and 19 percent “frequently.”
One pilot commented, “We get colloquial in the U.S. because we have that common understanding. After the initial clearance, we’ll ask for other stuff and it’s frequently in common English. It’s just easier to communicate in your native tongue. I think that’s why, internationally, controllers sometimes switch to their native language — it’s quicker and easier.”
The report recommends that “controllers should be discouraged from using local jargon, slang, idiomatic expressions and other forms of conversational communications when transmitting messages to pilots. Although colorful and fun, they have no place in air traffic control and diminish situational awareness, can lead to requests for [a] repeat and otherwise disrupt information transfer.”
One pilot dissented from that view, saying, “First of all, I’m building a bit of a rapport, should I need something. Second, since we’re human beings, we like to treat each other with some amount of respect, and that’s the way to do it on the radio.”