
Learning From Others

By TSgt Ray Johnson, HQ AMC ASAP Program Manager
To err is human. It is a familiar phrase we have all heard. No matter how much we prepare, failures are inevitable; we are human, after all. What is important is that we learn from our mistakes and help others learn from them as well, so they do not recur. AMC's Aviation Safety Action Program (ASAP) does just that--it encourages crews to get the word out about threats, errors, unsafe conditions, and lessons learned. Among the most prominent ASAP lessons is the importance of monitoring the aircraft and of VVM (verbalize, verify, and monitor) procedures.
 
"Upon landing, the crew realized the difference between the altimeter reading and actual field elevation. After recalling the events and listening to ATIS again, the crew realized that the controller had reported the altimeter setting in MB. The phrase "millibars" was not used, and since the reported figure was below 1000 millibars, the crew's "confirmation bias" led to their setting the altimeter in inches. The interpretation error resulted in an incorrect altimeter setting that led to the aircraft being 500 feet lower than indicated ..." ASAP 583
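The magnitude of such a units mix-up is easy to quantify. The sketch below uses hypothetical values, not the actual figures from ASAP 583: it assumes a reported QNH of 992 hectopascals misheard as 29.92 inches of mercury, and the common rule of thumb of roughly 27 feet per hectopascal near sea level.

```python
# Hypothetical illustration of a millibar/inches-of-mercury altimeter mix-up.
# Values are examples only, not the actual figures from ASAP 583.

FT_PER_HPA = 27.3               # rough rule of thumb near sea level
HPA_PER_INHG = 33.8639          # 1 inHg is about 33.8639 hPa

reported_qnh_hpa = 992.0        # controller reports "niner niner two" (hPa)
misread_inhg = 29.92            # crew hears it as 29.92 inches of mercury

misread_hpa = misread_inhg * HPA_PER_INHG          # ~1013.2 hPa actually set
setting_error_hpa = misread_hpa - reported_qnh_hpa

# Setting the altimeter too high makes it over-read:
# the aircraft is lower than the altitude it indicates.
altitude_error_ft = setting_error_hpa * FT_PER_HPA
print(f"Altimeter over-reads by roughly {altitude_error_ft:.0f} ft")  # → roughly 579 ft
```

With these illustrative numbers the over-read comes out to several hundred feet, on the same order as the 500-foot discrepancy described in the report.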

Since its inception in 2009, AMC's ASAP office has received numerous reports like the one above from crews. In fact, altitude deviations are the number one ASAP submission, making up over 16 percent of all reports. These reports cover a wide range of circumstances: some involve leveling off at the wrong altitude, while others point to altimeter errors that caused altitude deviations, some of them over threatening mountainous terrain. Crew vigilance and effective VVM procedures can trap these errors--just one crewmember noticing and announcing a discrepancy will stop the error chain.

"We were cleared for an intermediate level off at FL270. Passing through FL260, both pilots acknowledge by stating passing 260 for 270 set and armed. Both pilots momentarily diverted attention from the instruments and then the PF recognized the aircraft approaching FL280. The PF noticed the AP had advanced to capture mode but didn't level off at the preset altitude." ASAP 459

By now, we are all familiar with what monitoring is and what the duties of the pilot monitoring are. Our culture reinforces it through the teaching of CRM/TEM and the principles of VVM, and we must use this skill every day in our flying duties. Even so, mistakes happen, and these examples describe situations that others may encounter and that are completely avoidable. In the latter case, the pilots set the automation and watched the aircraft begin the maneuver, but other flying priorities distracted them from monitoring to ensure it captured the correct altitude. In the former, the pilots allowed their expectations to drive their decisions and missed indicators that could have revealed the error. Both ASAPs are good examples of why continual monitoring of the aircraft is so important.

Even as we appreciate the importance of strong monitoring skills, it is worth taking a step back and maintaining a healthy amount of suspicion to keep ourselves alert. System automation has brought many benefits to aviation--improvements in safety, reliability, and efficiency--but it has also increased the risk of complacency. Many of our more experienced pilots may remember a time when system failures were commonplace and crews learned to expect them; newer pilots are entering the field at a time when the opposite is true. Modern automation is highly reliable, increasingly user friendly, and easy to operate. That very rarity of failures presents a hazard, because humans are inherently poor monitors for rare events: we recognize patterns and expect them to continue. When the pattern is interrupted, our natural biases lead us to ignore the conflict and cling to our expectations, which too often means handling the situation poorly or missing the discrepancy altogether. In this next example, the aircraft leveled off as the crew expected, but they missed that it was at the wrong altitude.

"... we were told to climb to FL330. As the aircraft approached FL320 it leveled off even though our MCP (Mode Control Panel) was programmed with FL330. Our FMC was programmed with FL320 as the cruise altitude. I had the autopilot coupled to the FMC, was flying in VNAV, and adjusting airspeed to meet mission timing. After cruising at FL320 for approximately 5-10 minutes, ATC asked us if we were supposed to be at FL330. We said yes and immediately climbed to FL330." ASAP 332
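The level-off in that report is consistent with typical VNAV behavior: in a climb, the autoflight system generally captures the lower of the MCP altitude and the FMC cruise altitude. The sketch below models only that one rule; the function names are illustrative and no specific aircraft's logic is being claimed.

```python
# Hypothetical sketch of why the aircraft in ASAP 332 stopped at FL320.
# Assumes (as is typical) that a VNAV climb levels at the lower of the
# MCP altitude and the FMC cruise altitude; names are illustrative only.

def vnav_climb_level_off(mcp_alt_fl: int, fmc_cruise_fl: int) -> int:
    """Flight level where a VNAV climb will capture and level off."""
    return min(mcp_alt_fl, fmc_cruise_fl)

# Crew cleared to FL330 and set the MCP, but the FMC cruise altitude
# still read FL320 -- so the aircraft leveled a thousand feet early.
print(vnav_climb_level_off(mcp_alt_fl=330, fmc_cruise_fl=320))  # → 320
```

Nothing in that logic failed; the mismatch was between what the crew expected and what the automation had been told, which is exactly the kind of discrepancy monitoring is meant to catch.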

ASAP submissions demonstrate that, despite their increased reliability, automated systems--and our interaction with them--are not infallible. We can make flying operations safer by self-identifying and reporting these unsafe conditions via ASAP so that others may benefit. Knowledge of the condition helps prevent similar incidents with other crews and, ultimately, provides the opportunity to make an inherently dangerous operation safer for us all.