Human Error
To this day the majority of aviation accidents are attributed in some way to human error. This is surprising when you consider all the effort and expense put into management, research, training and the development of new technologies such as automation. Safety has vastly improved over the last 50 years, making flying one of the safest ways of getting around our planet, yet accidents involving human error still occur. For anyone interested in aviation safety, from pilots and trainers to managers and human factors researchers, a good understanding of what human error is and how it can manifest itself as an aviation accident is fundamental.
What is human error?
Errare Humanum Est: To Err is Human
The early psychological researcher Sigmund Freud saw error as a behavioural artefact of unconscious drives within a person's mind (Strauch, 2004). He regarded people who erred as being less effective, a view that tainted early research on human error: certain people were seen as 'accident prone' simply because of particular traits they displayed. Later research showed these ideas to be flawed.
Researchers such as Donald Norman and Jens Rasmussen instead focused on the cognitive and motor aspects of error, and on the setting in which errors occur (Strauch, 2004). Norman differentiated errors into slips and mistakes: slips are errors of execution, shaped by schemas, experiences, knowledge and memories, whereas mistakes arise from the underlying action or decision itself.
Rather than using the terms slip and mistake, Kern (1998) prefers to distinguish errors of omission from errors of commission. Errors of omission occur when crew members fail to carry out a required task. For example, the NTSB investigation into American Airlines flight 1420, which overshot the end of the runway at Little Rock, Arkansas, in June 1999, showed that a major contributor to the accident was the crew's failure to arm the MD-80's ground spoilers, which are designed to dissipate lift and improve braking (Dismukes, Berman & Loukopoulos, 2007). An error of commission occurs when a crew member carries out a task incorrectly, or does something that is not required. The crash of Southwest Airlines flight 1455 was attributed to the crew's execution of the approach: their excessive airspeed and incorrect flightpath angle were primary causes of the accident.
Reason’s ideas on human error
James Reason further developed ideas on error, defining slips as minor errors of execution and introducing the idea of lapses, which can occur when a pilot becomes distracted and doesn't complete a task or omits a step whilst performing it (Strauch, 2004). Reason is now often cited as providing the classic definition of human error:
“Planned actions that fail to achieve their desired consequences without the intervention of some chance or unforeseeable agency” (1990, p.17).
As well as slips and lapses, where actions deviate from a plan, Reason (1990) argues that mistakes, where actions conform to an inadequate plan, and violations, where actions deviate from safe procedures, standards or rules, whether deliberate or erroneous, can also be the underlying elements of a human error.
For instance, if a checklist item is missed or not performed, as occurred in 1996 when Continental Airlines flight 1943 landed gear up, a lapse has occurred (Dismukes et al., 2007). A mistake, on the other hand, is the result of a judgement failure, as happened in 1990 when Avianca flight 052 ran out of fuel because of the pilots' decision to persevere with a landing at JFK (Helmreich & Merritt, 1998). A violation occurs when a pilot contravenes a rule or standard operating procedure (SOP), as happened when the captain of Gulf Air flight 072 departed from SOPs by executing a non-standard and unplanned orbit to try to avoid a go-around (Wikipedia, 2010a).
Reason (1990) also affirmed the idea that operators, those who commit errors, do not do so in a vacuum. We can rightly assume that the vast majority of operators do not want to err, especially when they are sitting at the 'pointy end' high above the ground. Yet, as noted earlier, errors still occur, because other factors, or antecedents, influence the operator's performance (Strauch, 2004). These include, but are not limited to, the equipment being used, other operators in the system, and any cultural influences that may exist.
When a false stall warning occurred as TWA flight 843 lifted off from JFK in 1992, the equipment, an L-1011, became the antecedent to the pilots' erroneous decision to abort the takeoff, after which they were unable to stop the aircraft on the runway (Dismukes et al., 2007). When American Airlines flight 1572 struck trees on approach to Bradley International Airport, the antecedent was another operator in the system: the pilots' error in descending below the minimum descent altitude was exacerbated by the approach controller's failure to report a current altimeter setting at a time of rapidly falling barometric pressure. A cultural antecedent was present when Korean Air flight 801 flew into a Guam hillside (Stanley, 2006). The first officer's failure to monitor and challenge the captain's execution of the approach was put down to the high power-distance culture within the airline.
Human error or human reliability?
Not surprisingly, human error incidents like those noted above are often seen in a rather negative light. Traditional approaches to rectifying and managing human error tended to centre on blame, retraining and quite possibly punishment (McDonald, 2003). But as our understanding of human error, and of human factors in general, has improved, the term human error has been replaced by 'human reliability' (Wikipedia, 2010b). The reliability of human performers is seen as a major factor in the resilience of systems, especially large socio-technical systems such as aviation, and it is better to manage and train for improved reliability than simply to investigate why errors occur. Research therefore now tends to focus on more proactive and positive ways of understanding human error.
References
1. DISMUKES, K., BERMAN, B., & LOUKOPOULOS, L. (2007). The limits of expertise: Rethinking pilot error and the causes of airline accidents. Burlington, USA: Ashgate Publishing Co.
2. HELMREICH, R., & MERRITT, A. (1998). Culture at work in aviation and medicine: National, organizational and professional influences. Aldershot, England: Ashgate Publishing Ltd.
3. KERN, T. (1998). Flight discipline. New York: McGraw-Hill.
4. McDONALD, N. (2003). Culture, systems and change in aircraft maintenance organisation. In G. EDKINS & P. PFISTER (Eds.), Aviation: Selected contributions to the Australian Aviation Psychology Symposium 2000 (pp. 39-57). Aldershot, England: Ashgate Publishing Limited.
5. REASON, J. (1990). Human error. New York: Cambridge University Press.
6. STANLEY, B. (2006, January 9). Korean Air bucks tradition to fix problem (Electronic version). The Wall Street Journal. Retrieved August 26, 2010, from http://online.wsj.com/article/SB113676875085241209.html
7. STRAUCH, B. (2004). Investigating human error: Incidents, accidents and complex systems. Aldershot, England: Ashgate Publishing Ltd.
8. WIKIPEDIA (2010a). Gulf Air Flight 072. Retrieved August 26, 2010, from http://en.wikipedia.org/wiki/Gulf_Air_Flight_072
9. WIKIPEDIA (2010b). Human reliability. Retrieved August 26, 2010, from http://en.wikipedia.org/wiki/Human_reliability
Want to know more?
- Pilot Error: more examples of aircraft accidents attributable to pilot error
- Human Reliability: a Wikipedia page which introduces the concept
- Crew Resource Management (CRM): a good overview of CRM from AviationKnowledge