Human Factors: Programs Continue to Advance

March 29, 2011

The term “Human Factors” (HF) provokes a wide range of definitions and emotions in ramp safety circles. HF programs, simply stated, promote safe and efficient work with an eye on human performance. This article takes a quick look back at nearly 30 years of human factors in aviation. HF programs have evolved to be a critical part of Safety Management Systems by looking not only at the human, but at the organization. This is the first in a series of articles that will showcase applied human factors research products that you can implement today.

Historical Human Factors

Today’s ramp and maintenance HF programs have been around since the late 1980s, when the US Congress gave the FAA a mandate and funding to study and apply human factors to all aviation workers.  HF consultants and scientists have used a number of useful models, acronyms, and other memory aids to simplify HF concepts.  Table 1 lists the most famous ones, all of which can be found with a quick Google search.

Place Table 1 Here

These memory aids make an excellent human factors review.  Heinrich (1972) used a floating iceberg to suggest that the largest and most dangerous part of the iceberg is below the water line, as shown in Figure 1.  The tip of the error iceberg represents the accident that we investigate. However, there are thousands of small errors below the surface that really lead to problems.  For example, a cargo door may not have been sufficiently fastened.  While the unsealed door was the tip of the iceberg, the contributing causes were a lack of qualified personnel, a rushed departure, and a faulty belt loader that distracted the crew.

Figure 1: The Error Iceberg

In 1992 the Boeing Company created a system to investigate the many small events, often human errors, that contribute to a serious incident.  Boeing named its process the Maintenance Error Decision Aid (MEDA).  It is used by over 800 airlines worldwide to investigate ramp, maintenance, and other aviation events.

Many HF memory concepts emerged in the early to mid-1990s.  James Reason used a stack of Swiss cheese slices to represent all of the safety and redundancy in the aviation system.  When many slices of cheese are stacked together, none of the holes line up and the system is protected. Sometimes, however, the holes align, giving the stack a weakness that can be penetrated by the many threats surrounding the usually safe aviation system.  Figure 2 shows a Lufthansa Technical Training rendition of Reason’s Swiss cheese model, using terms like “Contributing Factors” and “Corrective Actions” from Boeing’s MEDA.

Figure 2:  The Swiss Cheese

No discussion of aviation HF is complete without mentioning Gordon Dupont’s “Dirty Dozen.”   The Dirty Dozen represents twelve common contributors to human error, including lack of assertiveness, poor communication, fatigue, lack of knowledge, lack of resources, and seven others. The concept has become a mandated part of human factors training by most aviation authorities.

Bill Johnson and Mike Maddox introduced “PEAR” as a way to recall and consider HF issues in any organization.  PEAR reminds you to consider the “People” in the organization; the “Environment” in which they work, both physical and social; the “Actions” they must perform; and all of the “Resources” necessary to complete the job. Figure 3 shows PEAR.

Figure 3: The Components of PEAR

Human Factors Today

Initially, aviation HF focused on the specific human characteristics of pilots and other aviation workers.  That singular focus expanded to the study of teams, and the term “Crew Resource Management” (CRM) emerged.  CRM concepts focused on communication, leadership, and how small groups work together to ensure flight safety, personal safety, and work efficiency.

Today’s HF looks beyond equipment design, individual performance, and teamwork to the concept of “Safety Culture.”   That means that everyone in the organization places safety as the highest priority, and each employee can explain their contribution to the safety of the organization.   Today’s HF focus is broadened to the entire organization, which has the responsibility to manage safety with Safety Management Systems.

Today’s safety cultures pay attention to HF, using all of the models mentioned above, and they take HF to a higher level by applying the concepts of risk analysis.   That means that everyone in the organization looks at their job and its processes to assess risk.  Everyone continuously conducts risk assessments, estimating the severity of the consequences of their actions weighed against the probability that their actions could lead to those consequences.  For example, a worker must ask questions such as the following:  What are the consequences if I damage a surface of the aircraft by scraping it with a tug, dock, or belt loader?  Then, what is the probability that the consequence will happen?  If both the severity and the probability are low, it is probably a low-risk situation.   Safe organizations must educate each worker to understand the procedures for risk assessment, and when a worker cannot assess the risk, they must have a way to get immediate assistance.
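The severity-versus-probability weighing described above is often visualized as a simple risk matrix. The sketch below shows the idea; the category names, matrix values, and function name are illustrative assumptions for this article, not part of any official SMS standard:

```python
# Minimal risk-matrix sketch: map a severity/probability pair to a
# qualitative risk level.  Categories and matrix values are illustrative
# assumptions, not taken from any official standard.

SEVERITY = ["negligible", "minor", "major", "catastrophic"]

# RISK_MATRIX[probability] is a row of risk levels, one per severity.
RISK_MATRIX = {
    "rare":       ["low",    "low",    "medium", "high"],
    "occasional": ["low",    "medium", "high",   "high"],
    "frequent":   ["medium", "high",   "high",   "extreme"],
}

def assess_risk(severity: str, probability: str) -> str:
    """Return the qualitative risk level for a severity/probability pair."""
    return RISK_MATRIX[probability][SEVERITY.index(severity)]

# Example: scraping an aircraft surface with a belt loader is a serious
# consequence, but on this ramp it is judged unlikely to happen.
print(assess_risk("major", "rare"))        # medium
print(assess_risk("negligible", "rare"))   # low
```

The point of the matrix is that neither severity nor probability alone decides the answer: a serious consequence with a very low probability can still land in an acceptable band, while a minor consequence that happens frequently may not.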

Today’s safety cultures rely on data and on open reporting of errors.   Workers must be able to report when they have made an honest mistake.   Such errors must become learning experiences that prevent the same error from happening to another worker.  That means that the organization must have a written policy that is fair and “just” to all workers.  A young ramp worker must have the same rights to report an error as those of a senior flight crew member.

Today the international industry is focusing on Safety Management Systems (SMS).   That places growing importance on data that not only tell why something went wrong but can also predict when something may go wrong, before there is damage, injury, or an accident.  In other words, SMS will address threats before they become errors.  Voluntary reporting systems, which capture very small threats, are a key to predicting and preventing the big events.

The Human Factors Side of Safety Management

The International Civil Aviation Organization (ICAO) has recommended that all member states mandate Safety Management Systems.   Governments and the industry are implementing SMS now.  Airlines, MROs, and aviation authorities are scrambling to establish SMS departments, and the SMS business is booming.  As often happens, the focus is primarily on process and on managing the new buzzwords and acronyms.  Increasingly, however, users are seeing the safety and financial payback of SMS.

While the SMS-building process accelerates, human error in aviation continues.  The percentage of accidents caused by human error remains around 80%.  Whether the primary accident cause is controlled flight into terrain, loss of control, or some other factor, human performance is usually a significant contributing factor.  SMS data will eventually return to that known conclusion, and that rediscovery will reinvigorate industry-wide attention to HF.

The human factors community has been instrumental in promoting voluntary reporting and the establishment of “just” cultures.  Many of the voluntary reports submitted to NASA’s Aviation Safety Reporting System, to Boeing’s MEDA, and to the FAA’s Aviation Safety Action Program are fraught with the language and challenges of human performance.   SMS will amplify that message and raise new attention to human factors across aviation working environments.

You should get involved with the SMS implementation in your organization. Continue your daily emphasis on risk assessment.  Train workers to understand risk assessment and to integrate it into their on-the-job activities.  Finally, stay tuned here for more information about applied human factors.

What Is Next in the Series of Articles?

Fatigue Risk Management Systems (FRMS) are a required part of some Safety Management Systems.   These systems are great for industry because they permit an organization to show how it will manage fatigue risk within the company, as opposed to following a one-size-fits-all duty time regulation.  In the next installment of this series, Dr. Katrina Avers, of the FAA Civil Aerospace Medical Institute, will describe FRMS and show the applied products of FAA’s fatigue R&D.