All you need to know about ‘Crew Resource Management’

The Asiana 214 fuselage, on its way to the hangar

Three died when Asiana 214 crashed just short of Runway 28L at San Francisco. It has been only ten days, but the proactive and transparent reporting we have seen from NTSB has helped the public to understand what happened. The reports, although only preliminary, have made it clear that the likely ‘causal factors’ will come down to a very disturbing pilot error, in which an entire flight crew failed to notice a dangerous loss of airspeed leading to an imminent stall. As such, this accident becomes another case study, soon to be dissected in training programs to depict the need for effective Crew Resource Management.

What exactly is Crew Resource Management?

Crew Resource Management (CRM) is essentially a cultural shift within the aviation work environment. It is a transition, from a culture where people are intimidated or otherwise fail to communicate, to a culture where all participants are actively engaged and clearly sharing information. CRM focuses on solid communications, sound leadership, and clear-minded decision-making. This makes for a safer work environment, and it thus saves lives.

The Problem (why we needed a CRM Solution)

Aviation has historically cultivated strong personality types, and that has created many cases where it is difficult for flight crew members to work together. This, of course, can be a problem when different personalities need to work together to manage a complex process such as landing a commercial jet. There are a few factors that reasonably feed into this problem, including:

  1. All pilots are deeply trained to understand that they are the final authority as to the safe operation of their flight. This pattern of responsibility and self-reliance builds habits that can sometimes include tuning out the input of others.
  2. Pilot certification requires an investment of years of training, through many certification steps. This can cause flight crew members to view each other as being at a different level, higher or lower, which in itself tends to discourage effective communications.
  3. Many commercial pilots previously spent years as pilots in the military, often flying single-pilot on dangerous missions. This experience also can work against open, cooperative flight crew communications and problem-solving.
  4. On-the-Job Training (OJT) is common in aviation. A new air crew member needs to prove to his/her peers that he/she can reliably perform the job. The fully certified crew members are thus empowered to make or break a new hire. This system works well for teaching the ropes, but on the flip side, it pressures the new air crew member to avoid alienating coworkers who might decide he or she is a training failure. So, if a fully certified crew member makes a mistake or engages in a practice that is dangerous or corrupt, new air crew members tend to stay silent.
  5. After all those years of effort to earn their certifications, many pilots hire on to the airlines, and then spend years doing routine and uneventful airline flying. They may become complacent, with a sense that nothing bad can happen. Complacency and boredom can lead to reckless decision-making or even corrupt practices.  But, since these poor decisions tend to be made by the more senior members of the air crew, it is common that the junior air crew members stay silent.
  6. Fatigue and language skills can undermine pilots. A pilot who is tired, or who may have difficulty phrasing a radio transmission to ATC (such as rejecting an ATC clearance they are uncomfortable with), will be more inclined to stay quiet rather than challenge a bad clearance.
  7. There are also cultural patterns. This can be a corporate culture based on the reputation or prevailing attitudes at the airline, or it can be a culture rooted in thousands of years of religion, values and philosophy. In either case, if the prevailing cultural values discourage a junior crew member from challenging an apparent problem initiated by a senior crew member, the problem may not be addressed, and it may escalate into an accident.

This problem has been thoroughly researched for flight crews, but it also applies to ATC. There are many control facilities where interpersonal conflicts have at times severely diminished the quality of ATC services. These conflicts are sometimes sourced in the management, sometimes sourced in the Union, and sometimes sourced in the individual employees, but they always impede communications, and thus diminish the quality of ATC service. The end result is towers with bad reputations … where controllers yell at pilots, where controllers lack flexibility, where controllers quit caring and turn on the DVD movie, where controllers fail to look out the window and then have to hide their failure when dozens die.

Whether it is for flight crews or ATC, CRM training directed at improving attitudes and communications is ALWAYS a good idea.

The Swiss Cheese Model

In 1990, James Reason and Dante Orlandella at the University of Manchester developed what is commonly called the Swiss Cheese Model. The concept is very simply presented as a row of cheese slices, with random ‘holes’. Accidents happen when an entire series of holes aligns, enabling the ‘error’ to pass straight through. A safe aviation system requires many redundant layers. Ideally, all layers will be solid and lack holes. Where the safety holes do occur, it is important that other redundant layers compensate. All of the many factors (proficiency, communications, weather, aircraft maintenance, pilot fatigue, pilot assertiveness, ATC engagement, etc.) provide opportunities to prevent safety failures.
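The multiplicative logic of the model can be made concrete with a toy calculation. This sketch is purely illustrative (the layer names and hole probabilities below are invented, and Reason’s model is qualitative, not a probability formula): if each independent layer has some small chance of a ‘hole’, an error reaches the accident only when every layer fails at once, so each added redundant layer multiplies the risk down.

```python
# Toy illustration of the Swiss Cheese Model: an accident occurs only
# when a hazard slips through the 'holes' in every defensive layer.
# Layer names and hole probabilities are invented for illustration.

def accident_probability(hole_probs):
    """Chance a hazard passes every layer, assuming independent layers."""
    p = 1.0
    for hole in hole_probs:
        p *= hole
    return p

layers = {
    "crew proficiency": 0.01,
    "crew communication (CRM)": 0.05,
    "aircraft warning systems": 0.02,
    "ATC monitoring": 0.10,
}

print(f"P(accident) with all layers: {accident_probability(layers.values()):.2e}")

# Remove one layer (e.g., ATC monitoring) and the risk rises tenfold:
reduced = {k: v for k, v in layers.items() if k != "ATC monitoring"}
print(f"P(accident) without ATC layer: {accident_probability(reduced.values()):.2e}")
```

The point of the arithmetic is only this: no single layer needs to be perfect, but every layer removed (a decommissioned glideslope, a disengaged controller, a silent first officer) makes the remaining holes far more likely to line up.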

How did CRM start?

Predictably, CRM began after a series of fatal aviation accidents. More precisely, these were accidents where the post-mortem analysis showed communications failures or poor task assignment, where important details like the decaying airspeed of Asiana 214 went unnoticed. In fact, it was a rash of such accidents in the 1970’s that led to a 1979 conference, and CRM was born. Here is a list of some of those accidents:

  • Eastern Airlines Flight 401 — on 12/29/72, a Lockheed L-1011 approaching Miami had a landing gear position indicator light problem. The crew was cleared to climb to 2,000′ and all crew members became immersed in trying to solve the problem. Nobody noticed until the very last moment that they were not maintaining 2,000′. The flight crashed in the Everglades, killing 101. NTSB Report
  • Eastern Airlines Flight 212 — on 9/11/74, the flight crew of a DC-9 on approach to Charlotte in low fog conditions was distracted; they failed to make altitude callouts, held non-pertinent discussions, and became focused on visually acquiring the runway. The flight crashed short of the runway, and 72 died. NTSB Report
  • National Airlines Flight 193 — on 5/8/78, a Boeing 727 on a radar approach to Pensacola crashed into Escambia Bay, three miles short of the approach end of Runway 25. The approach was in conditions with fog and a low ceiling. NTSB Report
  • United Airlines Flight 173 — on 12/28/78, a DC-8 approaching Portland, OR failed to produce a green landing gear light. The crew broke off the approach and, just as with Eastern 401, became preoccupied with fixing the problem but failed to monitor their remaining fuel. The flight reached fuel exhaustion and crash-landed in a residential area six miles southeast of PDX, with ten fatalities. NTSB Report

CRM training has been rising and falling for decades.

Here’s the interesting thing I found while researching CRM: it has been going on for decades, it has helped, but it is still failing, too. In fact, CRM training has tended to rise and fall in cycles, triggered by high profile accidents. And, the need for CRM is clearly not specific to any subset of pilots: it is for ALL pilots.

One veteran pilot told me about what appears to be a precursor to CRM. He explained that the safety record for airlines in Japan was so poor in the early 1960’s that (as he understood it) the insurance companies mandated the addition of a western (mostly American) pilot to each flight crew. This, coupled with training similar to CRM, helped solve a problem where junior Japanese pilots had difficulty challenging senior Japanese pilots. (…and, most likely, that intimidation also impeded a junior pilot’s willingness to challenge ATC communications…)

The rash of U.S. accidents in the seventies necessitated the formal CRM push that began in 1979. A comparable push was eventually made in the ATC workplace, where similar communications problems existed, and for the same general reasons. That is to say, we air traffic controllers have the same tendencies toward controlling personalities, deep training/focus, perceptions of ‘different’ authority levels, fatigue related to absurd work schedules, etc.

The next up-swing of CRM training came about in the late 1990’s, after accidents like the Korean Airlines Flight 801 crash in Guam. There was a special emphasis on CRM for pilots at the major airlines in Korea: Korean Airlines, and Asiana Airlines.

So, what might we learn from the Asiana 214 crash?

Probably the most important lesson is far deeper than dwelling on how Korean culture or lack of 777 experience (for a pilot with thousands of hours in other, similarly complex aircraft) might have contributed. The big lesson may be a simple wakeup call: that even with more than three decades of focused CRM training, people who fly airplanes can (and will) make mistakes, and those mistakes will not always be caught by the current set of redundancies. We are thus wise to always look for additional redundancies to prevent accidents.

There has been lots of media attention directed toward the pilot’s training hours and what bearing his Korean culture might have on his piloting or decision-making, but this is really far off target. The pilot — and the entire crew on the flight deck — are fully responsible for this crash, but ATC is equally responsible for failing to stop this crash. FAA’s radar collected lots of data on this flight, and FAA’s software could have detected (or maybe actually did?) the low airspeed trend a half minute or more before the crash. With the billions that FAA has spent on these systems, shouldn’t they include the sort of cost-effective software element that alerts the tower controller, so he or she can immediately transmit the one alert that saves the day?
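To show that such an alert is computationally trivial, here is a hypothetical sketch. This is NOT FAA’s actual software: the sample format, thresholds, and function are all invented for illustration, and radar-derived groundspeed is only a proxy for airspeed. The idea is simply that a handful of track samples is enough to flag a slow-and-decelerating trend on short final.

```python
# Hypothetical sketch (not FAA's actual software): flag a decaying
# speed trend from radar track samples on short final. Thresholds,
# sample format, and alert logic are invented for illustration.
# Note: radar yields groundspeed, not airspeed, so this is a proxy.

def low_speed_alert(samples, floor_kts=120, decay_kts_per_s=1.0):
    """samples: list of (seconds, groundspeed_kts), oldest first.
    Returns True if speed is below the floor and still bleeding off."""
    if len(samples) < 2:
        return False
    (t0, v0), (t1, v1) = samples[0], samples[-1]
    rate = (v0 - v1) / (t1 - t0)          # knots lost per second
    return v1 < floor_kts and rate >= decay_kts_per_s

# A half-minute of samples showing speed bleeding off on final:
track = [(0, 140), (10, 130), (20, 118), (30, 105)]
print(low_speed_alert(track))  # → True
```

A check this simple, running against data FAA already collects, could cue the tower controller to transmit the one warning that saves the day.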

And there is more to consider about ATC on this accident. What else could ATC have done to prevent this? Why was the Glideslope taken out of service for MONTHS, knowing that it does in fact serve flight crews to stabilize their approach? Did ATC challenge the airport authority’s plans for such a long decommissioning, or did ATC just accept it (…and maybe like the idea as a way to encourage more Visual Approaches)? If a flight crew is going to get a Visual Approach, should it be OK for ATC to do so with a quick clearance way out, on a 15-20 mile final? Would the system be safer (with more redundancy) if ATC instead ‘controlled’ the flights further in before issuing the Visual Approaches (perhaps, within a 10-mile final)? Will the next flight crew (perhaps Korean or Lufthansa or Southwest) be able to get more service from ATC, perhaps radar vectors and altitudes (active radar control) to a five-mile final, and THEN be cleared for their Visual Approach, if they want it that way?* Would this service aid them in achieving a safe and stabilized approach? Frankly, do we want a situation where competent pilots from other parts of the world might fail to ask for more ATC services, because their limited language skills make them intimidated by machine-gun-talking FAA controllers? Probably not…

*A paper by Robert Helmreich, et al in 2000 illustrated cultural differences. It presented a graph, showing mean scores of pilots from 18 countries on the FMAQ Automation Preference and Reliance Scale. The data suggested that not all pilot populations are equal; that, in some nations, pilots may prefer visual hand-flying, while in other nations pilots may prefer instrument flying and automation. In view of this, it is important for ATC service providers everywhere to be flexible, and try to accommodate pilot preferences. Here is a link to that graph.

Let me reiterate. The flight crew is 100% responsible for this crash, but both the flight crew and ATC share equal responsibilities for the failure to save the crash.

This crash should not have happened. FAA can do better here.

<< <> <<>> <> >>

Here are some notes and links to CRM reference materials:

A 1992 CRM Handbook. Prepared under contract to FAA.

A 1997 CRM Paper. Prepared by Robert Helmreich, et al.

A 2000 Paper: Culture, Error & CRM. Also prepared by Robert Helmreich, et al.

A CRM video posted by FAA on 4/5/12. It runs 24 minutes, but the portion well worth viewing is the first six minutes (most of the final 18 minutes is just repetitive). Starting at 1:10 is a discussion of Eastern Flight 401 at Miami, with a video simulation. Starting at 2:50 is a discussion of the rash of commercial aviation accidents in the 1970’s that led to the development of Cockpit Resource Management (which later became Crew Resource Management). The narrator presents an explanation of the SHEL model: software, hardware, environment, and liveware. As presented, ‘software’ refers to procedures and regulations; ‘hardware’ refers to the immediate environment for the pilot (the seat, flight deck indicators, computer systems, etc.); ‘environment’ refers to the whole aircraft, as well as weather and terrain conditions; ‘liveware’ refers to the people within this system, and includes pilots and other flight crew, ATC, maintenance, and ground crews.