The normalisation of deviance: which human biases are involved, and how can we manage the risk?
Carrying out tasks and procedures onboard a vessel settles into routines that we work through. In this setting, we rarely question whether the work we actually carry out matches the work we imagine we are doing. Statistics, however, paint a different picture. If, for example, a written procedure does not fit the practical situations faced by the sharp-end operational crew, the crew may deviate from it.
The human brain is a remarkable tool and works well in many ways, but when it comes to managing operational risk it is susceptible to bias. Because the brain tries to conserve energy, we are prone to shortcuts in both how we reason and how we act; otherwise, it would have to process the same information over and over and analyse it from the bottom up each time. When it is not feasible to follow every step of a procedure or routine meticulously, we are smart and prioritise what seem to be the most important actions. After all, things may work out fine for the time being even if we skip a couple of steps.

As time goes on, we tend to take more shortcuts, and this operational drift usually happens gradually, without anyone noticing. Yet the original routines existed for good reason, and when external risks increase, or we run into some bad luck, an incident can occur. There is a natural human tendency to rationalise shortcuts under pressure, especially when nothing bad happens as a result. In fact, the absence of bad outcomes can reinforce the perceived “correctness” of past behaviour instead of prompting an objective assessment of the risks.
The Dunning-Kruger Effect
The psychologists David Dunning and Justin Kruger tested people’s skills in logic, grammar, and humour. Comparing the participants’ perception of their own performance with their actual results, they found that those who scored in the bottom quartile had estimated their performance to be above average. This finding has since been replicated in numerous studies.
Research has also shown that people with high skill in one area tend to overestimate their skill in other areas. Such overconfidence in one’s own performance can have dangerous consequences. Suppose a certain captain is good at manoeuvring a vessel, even in bad weather and low visibility. If the same captain assumes they must also be good at manoeuvring in ice, despite having little experience of it, operating in icy conditions can become a real risk.
Another risk is trusting one’s own ability instead of the procedures. Even highly skilled people perform better when a system backs up their blind spots. Overestimating one’s own skills drives an operational drift that can end in failure.
What can we do to manage the risk of normalisation of deviance?
- Establish psychological safety in our company: Create an atmosphere where it is safe to talk about mistakes and where employees can trust each other to give good and constructive feedback.
- Regularly evaluate procedures – how does the work as envisioned in checklists and instructions compare with the work as actually done?
- Practise feedback processes in crew briefings and debriefings.
- Make sure that your company doesn’t only depend on one or a few people. Delegating work means utilising the full capacity of the company, streamlining procedures and minimising the risk of bottlenecks.
- Use the Four Eyes Principle – let a second person review written instructions, preferably someone who has practical experience of using the instructions in their work.