16 Jul 2019 10:52

AUTHOR serkisFlight

The vast majority of airline accidents are attributed to flight crew error. However, the great majority of airline pilots have received rigorous training, are checked regularly, fly with advanced safety technology, and are highly experienced. They do their job according to a flight operations manual and checklists that prescribe carefully planned procedures for almost every conceivable situation, normal or abnormal, they may encounter.

How can all this expertise co-exist with the pilot error that we are told is a factor in more than half of airline accidents? Why do these experienced professional pilots make these errors?

“This well-known fact is widely misinterpreted, even by experts in aviation safety. Certainly, if pilots never made mistakes the accident rate would go down dramatically, but is it reasonable to expect pilots not to make mistakes? For both scientific and practical reasons, this expectation is not reasonable.”

“The accident rate for major airline operations in industrialized nations is already very low. This impressive record has been accomplished by developing very reliable systems, by thorough training, by requiring high levels of experience for captains, and by emphasizing safety. However, this accident rate can be reduced substantially further by understanding the underlying causes of human error, by finding ways of managing human error, and by changing how we think about the causes of error.”

“It is all too easy to say, because crew errors led to an accident, that the crew was the problem: they should have been more careful or more skilful. This “blame and punish” mentality or even the more benign “blame and train” mentality does not support safety—in fact, it undermines safety by diverting attention from the underlying causes.”

“Admittedly, in general aviation many accidents do show evidence of poor judgment or of marginal skill. This is much less common in airline operations because of the high standards that are set for this type of operation. Nonetheless, any discussion of airline operations may have implications for general aviation.”

“There are two common fallacies about pilot error:

    Fallacy 1: Error can be eliminated if pilots are sufficiently vigilant, conscientious, and proficient.

The truth is that vigilant, conscientious pilots routinely make mistakes, even in tasks at which they are highly skilled. Helmreich and his colleagues have found that on average airline crews make about two errors per flight leg and even more on challenging flights (Helmreich, Klinect, & Wilhelm, 1999; Klinect, Wilhelm, & Helmreich, 1999). And this is, if anything, an undercount because of the difficulty in observing all errors.

    Fallacy 2: If an accident crew made errors in tasks that pilots routinely handle without difficulty, that accident crew was in some way deficient—either they lacked skill, or had a bad attitude, or just did not try hard enough.

But the truth is that the most skillful, conscientious expert in the world can perform a procedure perfectly a hundred times in a row and then do something wrong on the 101st trial. This is true in every field of expertise—medicine, music, and mountain climbing just as much as aviation (Reason, 1990).”

“Something called “hindsight bias” must also be highlighted. After an accident, everyone knows the outcome of the flight. The thorough investigation by the authorities reveals many details about what happened leading up to the accident. Armed with this information, it is easy for everybody to say the crew should have handled things differently. But the crew in that airplane did not know the outcome. They may not have known all of the details later revealed, and they certainly did not realize how the factors were combining to create the conditions for an accident.”

“Experts do what seems reasonable, given what they know at the moment and the limits of human information processing. Errors are not de facto evidence of lack of skill or lack of conscientiousness.

In some accidents, crews may not have had access to adequate information to assess the situation and make prudent decisions on how to continue. Many bits and pieces of information may be available to the crew, who weigh the information as well as they can. But the question arises whether crews always have enough information in time to decide, and to be absolutely certain that the decision is correct.”

“It is ironic that in some wind shear accidents the crew was faulted for continuing an approach even though an aircraft landed without mishap one minute ahead of the accident aircraft. Both crews had the same information, both made the same decision, but for one crew luck ran the wrong way. We do not like to admit that any element of luck still pertains to airline safety—and in fact, the element of chance in airline operations has been reduced enormously since the 1930s, as described by Ernest Gann in Fate is the Hunter (1984). But there are still a few accidents in which we should admit that the crew made decisions consistent with typical airline practice and still met disaster because risk cannot be completely eliminated.”

“Tension and trade-offs between safety and mission completion are inherent in any type of real-world operation. Modern airlines have done an extraordinary job of reducing risk while maintaining a high level of performance. Nevertheless, some small degree of risk will always exist. The degree of risk that is acceptable should be a matter of explicit public discussion, which should guide policy. What we must not do is tell the public they can have zero risk and perfect performance—and then say when a rare accident occurs: “it was the crew’s fault”, neglecting to mention that the accident crew did what many other crews had done before.”

“If the investigation of an accident or incident reveals explicit evidence of deliberate misconduct the pilot obviously should be held accountable. If the investigation reveals a lack of competence the pilot obviously should not fly again unless retrained to competency. But with these rare exceptions, identifying “pilot error” as the probable cause of accidents is dangerous because it encourages the aviation community and the public to think something was wrong with the crew and that the problem is solved because the crew is dead or can be fired (or retrained in less serious cases).”

“Rather than labeling probable cause, it is more useful to identify the contributing factors including the inherent human vulnerability to characteristic forms of error, to characterize the interplay of those factors, and to suggest ways errors can be prevented from escalating into accidents. If probable cause must be retained, it would in most cases be better to blame the inherent vulnerability of conscientious experts to make errors occasionally rather than to blame crews for making errors.”

“To improve aviation safety we must stop thinking of pilot errors as the prime cause of accidents, but rather think of errors as the consequence of many factors that combine to create the conditions for accidents. It is easy in hindsight to identify ways any given accident could have been prevented, but that is of limited value because the combination of conditions leading to accidents has a large random component. The best way to reduce the accident rate is to develop ways to reduce vulnerability to error and to manage errors when they do occur.”


“The naïve view is that pilots who make an error are somehow less expert than others. That view is wrong. The pilot who makes an error (as seen in hindsight) typically does not lack skill, vigilance or conscientiousness. He or she is behaving expertly, in a situation that may involve misinformation, lack of information, ambiguity, rare weather phenomena or a range of other stressors, in a possibly unique combination.”

“No one thing “causes” accidents. Accidents are produced by the confluence of multiple events, task demands, actions taken or not taken, and environmental factors. Each accident has unique surface features and combinations of factors.”

Human cognitive processes are by their nature subject to failures of attention, memory and decision-making. At the same time, human cognition, despite all its potential vulnerability to error, is essential for safe operations.

“Computers have extremely limited capability for dealing with unexpected and novel situations, for interpreting ambiguous and sometimes conflicting information, and for making value judgments in the face of competing goals. Technology helps make up for the limitations of human brainpower, but by the same token, humans are needed to counteract the limitations of aviation technology.”

“Airline crews routinely deal with equipment displays imperfectly matched to human information-processing characteristics, respond to system failures, and decide how to deal with threats ranging from unexpected weather conditions to passenger medical emergencies. Crews are able to manage the vast majority of these occasions so skillfully that what could have become a disaster is no more than a minor perturbation in the flow of high-volume operations.”

“But on the rare occasions when crews fail to manage these situations, it is detrimental to the cause of aviation safety to assume that failure stems from the deficiency of the crews. Rather, these failures occur because crews are expected to perform tasks at which perfect reliability is not possible for either humans or machines. If we insist on thinking of accidents in terms of deficiency, that deficiency must be attributed to the overall system in which crews operate.”

“Six overlapping clusters of error situations have been described:

    Inadvertent slips and oversights while performing highly practiced tasks under normal conditions
    Inadvertent slips and oversights while performing highly practiced tasks under challenging conditions
    Inadequate execution of non-normal procedures under challenging conditions
    Inadequate response to rare situations for which pilots are not trained
    Judgment in ambiguous situations
    Deviation from explicit guidance or SOP

However, error is NOT just part of doing business; it must still be reduced, and to reduce it, the factors associated with it must be understood as well as possible.”

“Uncovering the causes of flight crew error is one of the investigators’ biggest challenges, because human performance, including that of expert pilots, is driven by the confluence of many factors, not all of which are observable in the aftermath of an accident. Although it is often impossible to determine with certainty why accident crewmembers did what they did, it is possible to understand the types of error to which pilots are vulnerable and to identify the cognitive, task and organizational factors that shape that vulnerability”. (Carl W. Vogt, 2007, in his foreword to the book The Limits of Expertise: Rethinking Pilot Error and the Causes of Airline Accidents. Burlington, VT: Ashgate.)

“Studies have identified the most common cross-cutting factors contributing to crew errors (3):

    Situations requiring rapid response
    Challenges of managing concurrent tasks
    Equipment failure and design flaws
    Misleading or missing cues normally present
    Plan continuation bias
    Shortcomings in training and/or guidance
    Social/organizational issues”


“Studies show that almost all experienced pilots, operating in the same environment in which the accident crews were operating and knowing only what those crews knew at each moment of the flight, would be vulnerable to making similar decisions and taking similar actions.”

“The skilled performance of experts is driven by the interaction of moment-to-moment task demands, availability of information and social/organizational factors with the inherent characteristics and limitations of human cognitive processes. Whether a particular crew in a given situation makes errors depends as much, or more, on this somewhat random interaction of factors as it does on the individual characteristics of the pilots.”

“The two most common themes seen in aviation accidents are Continuation Bias, a deep-rooted tendency to continue the original plan of action even when changing circumstances require a new plan, and situations that lead to Snowballing Workload, a workload that builds on itself and increases at an accelerating rate.”

Continuation bias

“Too often crew errors are attributed to complacency or intentional deviations from standard procedures, but these are labels, not explanations. To understand why experienced pilots sometimes continue ill-advised actions, it is important to understand the insidious nature of plan continuation bias, which appears to underlie what pilots call “press-on-itis”. This bias results from the interaction of three major components: social/organizational influences, the inherent characteristics and limitations of human cognitive processes, and incomplete or ambiguous information.”

“Safety is the highest priority in all commercial flight operations, but there is an inevitable trade-off between safety and the competing goals of schedule reliability and cost effectiveness. To ensure conservative margins of safety, airlines establish written guidelines and standard procedures for most aspects of operations.”

“Yet considerable evidence exists that the norms for actual flight operations often deviate considerably from these ideals. When standard operating procedures are phrased not as requirements but as strong suggestions that may appear to tacitly approve of bending the rules, pilots may (perhaps without realizing it) place too much importance on costs and scheduling.”

“Also, pilots may not understand why guidance should be conservative; that is, they may not recognize that the cognitive demands of recovering a plane from an unstabilized approach severely impair their ability to assess whether the approach will work out. For all these reasons, many pilots, not only the few who have accidents, may deviate from procedures that the industry has set up to build extra safety into flight operations. Most of the time, the result of these deviations is a successful landing, which further reinforces the deviant norms.”

“As pilots amass experience in successfully deviating from procedures they unconsciously recalibrate their assessment of risk toward taking greater chances.”

“Another inherent and powerful cognitive bias in judgment and decision making is expectation bias: when someone expects one situation, she or he is less likely to notice cues indicating that the situation is not quite what it seems. Human beings become less sensitive to cues that reality is deviating from the mental model of the situation.”

“Expectation bias is worsened when crews are required to integrate new information that arrives piecemeal over time in incomplete, sometimes ambiguous, fragments. Human working memory has extremely limited capacity to hold individual chunks of information, and each piece of information decays rapidly from working memory. Further, the cognitive effort required to interpret and integrate this information can reach the limits of human capacity to process information under the competing workload of flying an approach.”

Snowballing Workload

“Errors that are inconsequential in themselves have a way of increasing crews’ vulnerability to further errors and combining with happenstance events, sometimes with fatal results. Abnormal situations can produce acute stress, and acute stress narrows the field of attention (tunnel vision) and reduces working memory capacity. The combination of a high workload with many other factors, such as stress and/or fatigue, can severely undermine cognitive performance.”

“A particularly insidious manifestation of snowballing workload is that it pushes crews into a reactive, rather than proactive, stance. Overloaded crews often abandon efforts to think ahead of the situation strategically, instead simply responding to events as they occur, without considering whether their responses will work out.”

Implications and countermeasures

“Labelling crew errors simply as “failure to follow procedures” misses the essence of the problem. All experts, no matter how conscientious and skilled, are vulnerable to inadvertent errors. The basis of this vulnerability is in the interaction of task demands, limited availability of information, sometimes conflicting organizational goals and random events with the inherent characteristics and limitations of human cognitive processes. Even actions that are not inadvertent are the consequences of the same interaction.”

“Almost all airline accidents are system accidents. Human reliability in the system can be improved if pilots, instructors, check pilots, managers and the designers of aircraft equipment and procedures understand the nature of vulnerability to error.”

“For example, monitoring and checklists are essential defenses, but in snowballing workload situations, when these defenses are most needed, they are most likely to be shed in favor of flying the airplane, managing systems and communicating.”

“Monitoring can be made more reliable by designing procedures that accommodate the workload and by training and checking monitoring as an essential task rather than a secondary one.”

“Checklist use can be improved by explaining the cognitive reasons why effectiveness declines with extensive repetition, by showing how this can be countered by slowing the pace of execution to be more deliberate, and by pointing to or touching items being checked.”

“Inevitable variability in skilled performance must be accepted. That skilled pilots normally perform a task without difficulty does not mean they should be able to perform that task without error 100% of the time.”

“Plan continuation bias is powerful, although it can be countered once acknowledged. One countermeasure is to analyze situations explicitly: stating the nature of the threat, the observable indications of the threat, and the initial plan for dealing with it.”

“Questions such as “What if our assumptions are wrong? How will we know? Will we know in time?” are the basis for forming realistic backup plans and implementing them in time, before snowballing workload limits the pilots’ ability to think ahead.”

“Airlines should periodically review normal and non-normal procedures, looking for design features that could induce error. Examples of correctable design flaws are checklists conducted during periods of frequent interruption, critical items that are permitted to “float” in time, and actions that require the monitoring pilot to be head-down during critical periods such as taxiing near runway intersections.”

“Operators should carefully examine whether they are unintentionally giving pilots mixed messages about competing goals such as SOP adherence versus on-time performance and fuel costs. If a company is serious about SOP adherence, it should publish, train and check those criteria as hard-and-fast rules rather than as guidelines. Further, it is crucial to collect data about deviations from those criteria (LOSA & FOQA) and to look for organizational factors that tolerate or even encourage those deviations.”

“These are some of the ways to increase human reliability on the flight deck, making errors less likely and helping the system recover from the errors that inevitably occur. This is hard work, but it is the way to prevent accidents. In comparison, blaming flight crews for making errors is easy but ultimately ineffective.”


Excerpted from:

    Dismukes, R. K. (2001). Rethinking crew error: Overview of a panel discussion. In R. Jensen (Ed.), Proceedings of the 11th International Symposium on Aviation Psychology. Columbus, OH: Ohio State University.
    Darby, Rick, & Setze, Patricia (2007). Factors in vulnerability [Review of the book The limits of expertise: Rethinking pilot error and the causes of airline accidents, by R. K. Dismukes, B. A. Berman, & L. D. Loukopoulos (2007). Burlington, VT: Ashgate]. Aviation Safety World, May 2007, 53–54.
    Dismukes, R. K., Berman, B., & Loukopoulos, L. D. (2006, April). The limits of expertise: Rethinking pilot error and the causes of airline accidents. Presented at the 2006 Crew Resource Management Human Factors Conference, Denver, Colorado.