Does the term ‘human error’ just obscure the real problem?
Following most major accidents, one phrase is almost guaranteed to headline in the popular press: ‘human error’. The concept is also popular in our own discipline and profession; it is probably among the most profitable in terms of research and consultancy dollars. While seductively simple to the layperson, it comes with a variety of meanings and interpretations with respect to causation and culpability. With its evocative associations, synonyms, and position in our own models, ‘human error’ is a key anchoring object in everyday and specialist narratives. Despite our efforts to look at what lies beneath, explanations, both in industry and the judiciary, often float back to what people can see, with hindsight, at the surface. Our scientists’, designers’ and analysts’ perspectives are overpowered by common-sense, management and legal perspectives. And the pervasive nature of the term clouds other ways of seeing that recognise the need for adjustments, variability and trade-offs in performance.
Might ‘human error’ be the handicap of human factors? At Ergonomics & Human Factors 2014, over 70 people from various countries and industries attended a workshop discussion with this as the main question. The discussion was recorded with the agreement of the participants, and some of their insights are given in this article.
Reflections on industry perspectives
While E/HF specialists usually promote the narrative of ‘human error’ as a symptom of deeper trouble, or as something with systemic causes, one participant noted the difficulty of encouraging even this view, with reference to an industry occupational health research project: “They don’t understand or try to find an explanation of what is behind that human error… so in that sense it is a very dangerous term to use in conventional workplaces. It stops everything. It stops the discussion of what might be wrong in the system.”
Another participant noted a typical industry response to unwanted events involving ‘human error’: “In the railway if there was a human error, the only answer was more training. And the dangerous thing about that is that when you have found a solution you send the driver for more training and you think you’ve solved the problem.”
It was also suggested that ‘human error’ can be a political device and serves a business need: “Sometimes I get the feeling it might be politically expedient to use ‘human error’ … some of these helicopter crashes … if it was a mechanical error, before anything has been proved … all these vehicles would have to be grounded and people would be stuck in places. So at least there is maybe an interim judgement that human error may not be necessarily correct but an important step in the process of moving forward.”
Similarly, a participant suggested that ‘human error’ is a convenient, if flawed, shorthand: “Ultimately society doesn’t want an answer it can’t understand. So maybe it’s what society needs at the moment. It’s a shorthand version of what we all know is inadequate.”
The discussion also addressed emotive connotations and defensive reactions: “The FAA has a snitch patch in their system which reports infringements of separation … the pilot in question will then be questioned and asked what happened. I came across a report in a private pilots’ journal in America which said if you are rung up by the FAA, hang up your telephone and call your lawyer. This is the direct result of an assumption that it must be somebody’s fault. It also makes it damn hard to investigate incidents.”
A pilot participant went on to mention a sociological aspect: “We hold pilots in high esteem, we put them on pedestals, we put our lives in their hands, and the same with surgeons. But … you rarely hear us in a forum talking about mistakes we make. There is this standard that we have, if you are a ‘dangerous’ surgeon or a ‘dangerous’ pilot, for whatever reason, that’s the end of your career. I don’t think we do ourselves any favours.”
The notion of ‘just culture’ – an industry response to reduce blame associated with so-called honest mistakes – was also mentioned. One participant quipped: “And the other thing is just culture. We don’t blame anyone around here but the MD does like to know who we are not blaming.”
But some organisations are thinking carefully about language. One forward-thinking organisation was cited as having implemented a just culture initiative, and in doing so changed the language: “They latched onto this idea that language is important, that words matter. ‘Investigation,’ they said, ‘is a very criminal type of word to use so we are going to move to ‘review’ and would you please come and talk to our incident reviewers about the bit that still remains that is human. Because we know we shouldn’t blame people but they are still part of the system.’ So I think there is a broad spectrum of people understanding some of this and moving to it but not being able to articulate exactly what that means or apply it to themselves.”
Reflections on E/HF perspectives
Whether simplistic notions of human error are actually an issue for E/HF was subject to some discussion. One view was that, since E/HF takes a systems perspective, we do not contribute to the misuse of ‘human error’: “Chapanis’s original work said pilot error is normally designer error. Reason said what we call human error is the product of a system that has permitted the continuation of certain practices that seem to lead to what we call human error. That was repeated by Dekker about a decade later. So I don’t think anybody in the discipline really believes that. Sure, the other disciplines do, and people like the FAA and manufacturers are only too happy to blame pilots because then they don’t have to do anything. If you take that sociotechnical systems view then you see failure of systems as an emergent property of that system, which is due to complex interactions… I thought the profession had moved on and we were talking more of a system view nowadays and that the individual decomposition view is the ergonomics of the 1980s. The 2010 view is very much a systems perspective.”
But for some, the connotations of the term were seen as problematic and hard to undo: “Human error is automatically a negative. It’s like ‘near miss’, ‘whistleblowing,’ anything like that, it’s the attachment that anybody puts to that phrase or word…most of us are probably converted to where we need to be going, however we are dealing with society and we are also dealing with the safety world. Attach a label to it and it becomes an end state.”
Noting the issue of language, one participant actively tried to find an alternative causal narrative: “What I did was, don’t use the words ‘human error’… this is simply a social evaluation of the behaviour after the fact. It has nothing to do with the action as it was. ‘Human error’ is a normative evaluation of behaviour. It is not a description of what happened. And it doesn’t have any causal validity at all. So how can we promote a different narrative?… [it] is very difficult but it relies on communication ability. We need to describe what happened and why it happened. That is something different from saying that it is an error. And finding a good way to do this is very difficult.”
And finally, it appears that it is not just ‘human error’ that is problematic: “I find that human factors itself is a really difficult term. In the healthcare industry, when I talk about human factors I try to explain that human factors is about the system and about the tools and technology and the environment. But that doesn’t get across straight away because ‘human factors’ sounds like the factors of the human. And I think our dialogue around that is really challenging.”
Perhaps, by introducing other, non-binary vocabulary and ways of thinking, we can help to shape the popular narrative, moving the human from a hazard to a resource necessary for system flexibility.
Steven Shorrock is European Safety Culture Programme Leader at EUROCONTROL, France. A longer version of this article, with links to others, is available at http://humanisticsystems.com.