Data-Entry Errors Can Lead Aircraft Off Course

Distractions and heavy workload are often cited as contributing to programming mistakes.

by Robert Baron | December 28, 2020

In 1995, American Airlines Flight 965, a Boeing 757, struck a mountain near Buga, Colombia. A last-minute change in the arrival caused a rushed descent. The captain entered “R” for “Rozo” (a nondirectional beacon [NDB] radio navigation aid) into the flight management system (FMS). However, the Rozo NDB could be accessed only by entering its full name; entering only “R” accessed the Romeo NDB in Bogotá, 132 nm (244 km) away. The input was not verified by the first officer (FO), and neither pilot verified the effect on the flight path, which resulted in a loss of situation awareness. Once the pilots recognized the error, a recovery was initiated, but it was too late. The aircraft struck El Diluvio Mountain at approximately 8,900 ft above mean sea level. There were 159 fatalities; four passengers survived.
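
The mechanism of that error is worth spelling out: the FMS matched the abbreviated entry against its navigation database and returned a valid, but unintended, fix. The sketch below is purely illustrative, with a made-up two-entry database and approximate coordinates; it is not how any real FMS resolves identifiers. It shows how an abbreviated lookup can silently select the wrong beacon, and how a simple distance check against the aircraft's present position could flag the result for crew verification.

```python
# Illustrative only: a toy navigation-database lookup showing how an
# abbreviated identifier can resolve to an unintended fix. Coordinates
# and matching rules are simplified assumptions, not real FMS logic.
from math import radians, sin, cos, asin, sqrt

# Hypothetical database: identifier -> (name, lat, lon)
NAV_DB = {
    "R":    ("Romeo NDB (Bogota)", 4.70, -74.15),    # approximate, illustrative
    "ROZO": ("Rozo NDB (near Cali)", 3.90, -76.38),  # approximate, illustrative
}

def nm_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles (haversine)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * asin(sqrt(a)) * 3440.1  # mean Earth radius in nm

def lookup(entry, present_lat, present_lon, warn_nm=100):
    """Return the matching fix, flagging it if it is implausibly far away."""
    fix = NAV_DB.get(entry.upper())
    if fix is None:
        return None, "No such identifier -- re-enter full name"
    name, lat, lon = fix
    dist = nm_between(present_lat, present_lon, lat, lon)
    note = "VERIFY: fix is %.0f nm away" % dist if dist > warn_nm else "OK"
    return name, note

# Crew intends Rozo (near the aircraft) but types only "R":
print(lookup("R", 3.9, -76.3))     # resolves to the distant Romeo NDB -> flagged
print(lookup("ROZO", 3.9, -76.3))  # full name resolves to the intended fix
```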

Erroneous or incomplete data frequently are entered into the FMS, often resulting in a vertical, lateral or speed deviation. This article focuses on navigation-type (aircraft handling) FMS errors; however, the same principles apply to other types of FMS errors, such as those involving weight and balance or thrust settings.

[Image: Typical FMS]

While technology has reduced human error on the flight deck over the past few decades, it has also opened other pathways for error. An FMS is only as good as the human who interfaces with it. If pilots enter a wrong waypoint without verification, the computer will do exactly what it is programmed to do. Often, the first time the pilots become aware of the error is when it is pointed out by air traffic control (ATC).

For the purpose of this article, the following error types/definitions are used. Each of the error types includes a practical example, which was obtained through either a U.S. National Aeronautics and Space Administration (NASA) Aviation Safety Reporting System (ASRS) report or from a line operations safety audit (LOSA) observation:

Slip — An action that is performed incorrectly (error of commission).

The FO installed the correct SID [standard instrument departure] but with the incorrect transition. When comparing this to the FMS, the waypoint COREZ was not included … so I directed him to add it because it was the clearance. Simply we were to fly the WLKKR3 RNAV departure COREZ transition not the CSTRO transition. To add to the confusion, our filed flight plan included both SID ending waypoints, which almost never occurs (NASA ASRS Accession No. 1584377).

Lapse — An action that is not performed (error of omission).

Crew received a new clearance and entered it in the FMS, though NAV mode was not selected. The aircraft flew through the desired track in HDG mode; crew noticed the error when the aircraft was 2.4 nm [4.4 km] left of the track in controlled airspace (LOSA observation).

Intentional Violation — Intentional action (or inaction) that results in noncompliance with known rules, policies, procedures or acceptable norms.

During pre-departure briefing, the crew discussed the ORD 4 departure procedure with special attention paid by the SIC ([second-in-command], the pilot monitoring) to the 250 knots until advised notation on the SID. During taxi out, complicated taxi instructions from ground control on a large, unfamiliar, air-carrier centric airport led to substantial confusion during taxi and a breakdown in crew communication. Immediately following a normal departure, the PIC [pilot-in-command] deselected autothrottle and refused the SIC offer to select a vertical mode for climb while he hand-flew the climbout. The crew turned to the ATC assigned heading and acknowledged the climb instructions to an altitude above 10,000 feet. A few minutes later, the crew was handed off to a different departure controller, who informed the crew “resume normal speed” and assigned a climb to a higher altitude. It was at that point the SIC realized the aircraft was at 300 knots and had exceeded the 250 knots until advised limitation listed on the SID (NASA ASRS Accession No. 1586088).

Non-intentional Violation — Unintentional action (or inaction) that results in noncompliance with known rules, policies, procedures, or acceptable norms.

After a few minutes of setup, I briefed the RNAV [area navigation] approach, I confirmed IAF CLFFF at 11,000 ft on the FMS and LNAV and VNAV PTH [lateral and vertical navigation path] on the FMA [flight mode annunciator], but I didn’t properly VVM [Verbalize, Verify, Monitor] the rest of the approach on the FMS. As we crossed CLFFF and did not start descending, we realized something was not set up properly (NASA ASRS Accession No. 1590629).
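
For safety departments that code such reports, the four categories above map onto a small classification scheme. The sketch below is a minimal, hypothetical tagger; the ErrorType names follow the definitions in this article, while the FMSEvent fields are assumptions invented for illustration and are not part of any established taxonomy tool.

```python
# Minimal, hypothetical tagging scheme for FMS data-entry events, following
# the definitions above. Field names are illustrative assumptions only.
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class ErrorType(Enum):
    SLIP = "Slip (error of commission)"
    LAPSE = "Lapse (error of omission)"
    INTENTIONAL_VIOLATION = "Intentional violation"
    NON_INTENTIONAL_VIOLATION = "Non-intentional violation"

@dataclass
class FMSEvent:
    action_performed: bool          # was the required FMS action taken at all?
    performed_correctly: bool       # if taken, did it match the clearance/SOP?
    rule_not_complied_with: bool    # did the outcome breach a known rule, policy or SOP?
    noncompliance_deliberate: bool  # was the noncompliance intended?

def classify(event: FMSEvent) -> Optional[ErrorType]:
    """Return the error category, or None if no error occurred."""
    if event.rule_not_complied_with:
        return (ErrorType.INTENTIONAL_VIOLATION if event.noncompliance_deliberate
                else ErrorType.NON_INTENTIONAL_VIOLATION)
    if not event.action_performed:
        return ErrorType.LAPSE
    if not event.performed_correctly:
        return ErrorType.SLIP
    return None

# The LOSA observation above (route entered, NAV never selected) codes as a lapse:
print(classify(FMSEvent(action_performed=False, performed_correctly=False,
                        rule_not_complied_with=False, noncompliance_deliberate=False)))
```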

In most cases, FMS errors are unintentional. In other words, the pilot did not intend to enter an incorrect waypoint (slip) or to omit the waypoint completely (lapse). In other cases, FMS errors are classified as violations. Even then, the pilot typically did not enter an incorrect SID or waypoint with the intent of countering an ATC clearance; rather, the violation occurred because of how the error was handled. One example is the failure to cross-verify, per the company standard operating procedures (SOPs). In some cases, automation complacency can compound violations and lead to a complete loss of situation awareness, as in the following example.

On Oct. 21, 2009, Northwest Airlines Flight 188, an Airbus A320, overflew its destination airport in Minneapolis-St. Paul, Minnesota, U.S. (MSP), by about 150 mi (241 km) and was out of radio contact with ATC for 78 minutes. According to the U.S. National Transportation Safety Board report, the pilots were reviewing crew scheduling on their laptop computers. Neither pilot said he was aware of the airplane’s location until a flight attendant called the cockpit about five minutes before the scheduled landing time and asked for their estimated time of arrival. Once ATC verified that the pilots were in control of the aircraft, Flight 188 was cleared back to MSP and landed more than an hour past its scheduled arrival time.

Countermeasures

Automated systems must be managed and monitored by flight crews at all times. Effective management and monitoring significantly reduce the likelihood of automation events. Yet, completely preventable events continue to occur on a regular basis. Why? And what can be done?

FMS-related errors are typically influenced by environmental (error-provoking) conditions such as stress, pressure, distractions and high workload. These errors occur during all phases of flight; in fact, some of the most egregious errors occur during the pre-departure phase, before the aircraft even begins to move. Regardless of where and when these events occur, the consequences can range from a simple inconvenience to a controlled flight into terrain accident.

Flight crews are confronted with environmental influences on a regular basis and receive specific training, such as crew resource management (CRM) and threat and error management (TEM), to handle these influences effectively in a crew environment. Unfortunately, in many cases, pilots who successfully complete CRM/TEM training in the classroom or simulator fail to apply those principles in day-to-day line operations.

Flight crewmembers who do not effectively apply CRM/TEM principles will have a much more difficult time mitigating the effects of pressure, stress, high workload and distractions. Two common FMS errors that result from poor CRM are failure to verify inputs and failure to monitor. These errors may occur separately or together, as in the American Airlines Flight 965 accident.
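
One way to picture the “verify inputs” countermeasure is as a gate: no route change takes effect until the other pilot has compared the entry against the clearance. The sketch below illustrates that cross-verification step in schematic form; the RouteChange class and its methods are invented for this example and do not correspond to any real avionics or operator software.

```python
# Conceptual sketch of a two-pilot cross-verification gate for FMS route
# changes. Names are invented for illustration; this is not avionics code.
class RouteChange:
    def __init__(self, description, entered_by):
        self.description = description
        self.entered_by = entered_by
        self.verified_by = None

    def verify(self, crewmember, clearance):
        """Second crewmember compares the entry against the ATC clearance."""
        if crewmember == self.entered_by:
            raise ValueError("Verification must come from the other pilot")
        if self.description != clearance:
            raise ValueError(f"Mismatch: entered '{self.description}', cleared '{clearance}'")
        self.verified_by = crewmember

    def execute(self):
        """The change becomes active only after cross-verification."""
        if self.verified_by is None:
            raise RuntimeError("Not executed: entry has not been cross-verified")
        return f"Route change active: {self.description}"

change = RouteChange("WLKKR3 departure, COREZ transition", entered_by="FO")
# Calling change.execute() here would raise -- the gate holds until verified.
change.verify("CA", clearance="WLKKR3 departure, COREZ transition")
print(change.execute())
```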

When it comes to FMS error prevention, you already know what to do. Practice good CRM/TEM and follow your SOPs. That way, when errors do occur (and they will), you will be better able to identify and trap the errors before they become consequential and lead to undesired aircraft states.

Stay sharp. Avoid complacency. Manage your automation; don’t let it manage you!

Robert Baron, Ph.D., is the president and chief consultant of The Aviation Consulting Group, Inc., in Myrtle Beach, South Carolina, U.S. He conducts extensive training, research and program implementation in human factors, safety management systems, crew resource management and line operations safety audits. He consults with, and provides training to, hundreds of aviation organizations worldwide.

Photo credit:  © petroalexey | Adobe Stock

 
