(2) One minute or so after donning the mask, the pilot notices his or her motor and cognitive functions deteriorating. In an effort to get help, he or she drunkenly moves the flight controls or kicks off the autopilot while fighting to stay conscious. The aircraft goes out of control. Result: many fatalities.
Oxygen bottles have been installed in aircraft for at least 60 years. We mechanics have been servicing oxygen bottles for almost as long as we’ve been twisting safety wire. How could this happen?
Human factors case study
I believe that our Qantas incident is a maintenance error destined to become a classic human factors case study. I am also sure human factors specialists will identify additional factors beyond those I offer here. Disclaimers aside, I would like to use two approaches to analyze this problem. The first is to broadly identify how the error could have been prevented, using the Dirty Dozen list of error reduction and prevention strategies. The second is to take a closer look at the task performance and unsafe acts using the STAMINA method.
Dirty Dozen: Going down the list of Dirty Dozen factors, I have selected the following:
Norms: The nitrogen cart looked just like the rest of the existing oxygen carts, and nothing like the other nitrogen carts. Ergo, in the mechanics' minds, the cart must contain oxygen.
Lack of awareness: The mechanics did not seem to be aware that the bottles were marked with the word “nitrogen” and were of a different color.
Complacency: For almost 10 months, many mechanics used the cart before the error was discovered.
STAMINA is a human factors training program developed by Trinity College in Dublin, Ireland, and adopted by the JAAT, the training arm of the JAA. The STAMINA program is also taught in the United States by the Airworthiness Standards Institute out of Richmond, TX. The program uses the acronym “AWES_M,” or “Awesome,” to help human factors students remember the kinds of unsafe acts we mechanics are prone to, and to dig a little deeper into the cause of a problem. Using the Qantas oxygen/nitrogen incident as our example, let’s examine the incident using AWES_M.
A stands for action slip. The mechanic’s intent was to service the jet with oxygen, but it was serviced with nitrogen instead. The intent was correct but the outcome was dead wrong.
W stands for work-around. The fittings on the new nitrogen cart were designed for servicing aircraft tires, not oxygen systems, so they did not fit the oxygen service ports. The mechanics took the old fittings off the old oxygen cart and installed them on the new nitrogen cart’s line. Instead of figuring out why the new cart’s fittings did not fit, or checking the manual, the mechanics worked around the problem and devised a dangerous solution.
E stands for expertise error. This can mean two things. First, the original mechanics who brought the cart into service were not properly trained to service oxygen bottles, and so could not recognize the difference between oxygen and nitrogen bottles and their associated fittings. It may also indicate a lack of understanding on the part of the mechanics who followed them, even if they were given formal training on the oxygen servicing equipment.
S stands for situation awareness error. The critical factor in this incident was that the new nitrogen cart looked identical to the old oxygen cart. The only differences were that the bottles in the cart were not green and the word “nitrogen” was stenciled on them. While the nitrogen bottles were properly identified, all the mechanics saw was a cart identical in design and color to the old oxygen cart; therefore, there had to be oxygen in the bottles. This “looks like a duck” scenario is a classic situation awareness error: your brain fills in the blanks when you select the first, best solution in your mind, even if that solution is wrong.
M stands for memory error. Without prejudging the mechanics, and based on my own experience, I am willing to believe they fell headfirst into the complacency trap. I base this observation on the fact that there were groups of mechanics who must have done this particular task at least 100 times before. I could find no mention in my research that the mechanics had a manual or a work card to use as a memory guide. They performed the task with their memory running on autopilot instead of actively thinking the task through.