Safety in mind: Normalisation of deviance

Main image: NASA #S81-30498, Space Shuttle Columbia (Wikimedia Commons, public domain)

Diane Vaughan is an American sociologist who has spent her career studying organisations where deviations from rules and practices become the norm—often with devastating consequences.

Your phone bleeps while you’re driving and you can’t resist the temptation to look; after all, it could be important! You check your messages and continue driving without incident. Given the frequency and banality of such occurrences, you might even start to tell yourself it’s perfectly safe to perform the behaviour regularly. Repetition breeds familiarity and ‘habit’, until the actions become a normal part of your driving routine.

The lack of bad outcomes can reinforce the ‘rightness’ of trusting past practices instead of objectively assessing the risk, resulting in a cultural drift in which circumstances classified as ‘not okay’ slowly come to be reclassified as ‘okay’. Diane Vaughan coined the term ‘normalisation of deviance’ and defines it as ‘the gradual process through which unacceptable practice or standards become acceptable. As the deviant behaviour is repeated without catastrophic results, it becomes the social norm for the organisation.’

Vaughan developed her theory while investigating the loss of the space shuttle Challenger, which broke apart shortly after lift-off on 28 January 1986. She observed that NASA officials had continued to approve shuttle missions despite repeated evidence of a known design flaw in the O-rings of the solid rocket boosters (SRBs). So why did NASA keep launching these missions when it had overwhelming evidence of the flaw?

The problem with the O-rings on the shuttle’s SRBs lay in the putty used to seal the booster joints: bubbles formed in it, and during launch hot gases from the SRB would pass through the bubbles and burn the O-ring. When a damaged O-ring was first observed after the second shuttle mission, NASA managers, who were under tremendous time and economic constraints, convinced themselves the problem could be fixed without grounding the shuttle fleet. Several more missions flew without any problems from the O-rings, so the ‘rightness’ of the decision to keep flying was reinforced. Over the years, more cases of O-ring sealing problems were observed in returning boosters, but with each successful flight the lack of a bad outcome reinforced the notion that it was safe to continue operations without addressing the issue.

As Vaughan observed, the fact that no negative consequence resulted from the inaction led to the deviance becoming normalised within the NASA culture.

During the launch of Challenger on 28 January 1986, hot gases burned through the O-rings and ignited the external fuel tank.

Despite the enormity of the disaster, Vaughan does not judge. She believes that no single individual was to blame; her investigation found that no malice had been involved in the decisions to keep flying with the flawed boosters. ‘Mistake, mishap and disaster are socially organised and systematically produced by social structures,’ Vaughan wrote in her 1996 book The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. ‘No extraordinary actions by individuals explain what happened, no intentional managerial wrongdoing, no rule violations, no conspiracy.’

Did NASA learn its lesson and change the culture of the organisation? Unfortunately not. In 2003, the space shuttle Columbia disintegrated over Texas as it re-entered the Earth’s atmosphere. Its heat shielding had been damaged during launch when insulating foam broke off the external fuel tank and struck the wing. As with the O-ring issue, this was a known problem that was not regarded as particularly serious: the lack of negative consequences over the previous 22 years of shuttle flight had led to an acceptance of foam strikes and heat-shield damage as part of the norm of missions.

Unfortunately, tragedies resulting from the same kind of cultural drift into deviance remain evident in transport and other industries. In January 2012, the Carnival-owned cruise ship Costa Concordia ran aground off the coast of Italy, and 32 people died in the ensuing shipwreck. The captain had consciously deviated from the approved course, as he and other captains had done many times before, passing close to an island to give people on shore a spectacle. This behaviour had become the norm for both the captain and the company. Investigative journalist Andrea Vogt wrote that Carnival’s directors ‘not only tolerated, but promoted and publicised the risky ship salutes off the island of Giglio and other tourist sites as a convenient, effective marketing tool.’

When speed, cost and efficiency become more important than safety and accuracy, it is easy to understand how once-safe practices are eroded by shortcuts, with the deviant practices subsequently becoming the norm. As the examples above show, we are frequently lucky: risk-taking behaviour does not usually have negative consequences.

As Professor Sidney Dekker says, ‘Murphy’s law is wrong. Everything that can go wrong usually goes right’.

However, as Reason’s ‘Swiss cheese’ model demonstrates, regardless of our best efforts to defend against bad outcomes, there are holes in our defences through which errors may pass and lead to serious incidents and accidents. Now consider the likelihood of such an error if the organisation or individual has not adopted any defence against a known threat. Reduced defences increase the likelihood of failure, as in the disasters noted above.
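To see why thinning the defences matters, a back-of-the-envelope sketch helps (our illustration, not part of Reason’s model itself). Suppose an accident occurs only when every one of $n$ independent defensive layers fails, with layer $i$ failing with probability $p_i$:

$$P(\text{accident}) = \prod_{i=1}^{n} p_i$$

With three layers that each fail 10% of the time, $P(\text{accident}) = 0.1^3 = 0.001$; remove one layer and the risk jumps tenfold to $0.1^2 = 0.01$. And the independence assumption flatters the organisation, since common causes can open holes in several layers at once.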

Physicist Richard Feynman made a similar point, in brutal, vivid language, in an appendix to the official Challenger inquiry report: ‘When playing Russian roulette, the fact that the first shot got off safely is little comfort for the next.’ He went on to say that when new materials, high-energy systems and thin technical margins were involved, ‘we don’t even know how many bullets are in the gun.’
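Feynman’s point can be made concrete with a simple worked example (the numbers here are illustrative, not from the report). If each launch fails independently with probability $p$, a string of successes does not lower $p$ for the next launch, yet the cumulative odds of disaster keep climbing:

$$P(\text{at least one failure in } n \text{ launches}) = 1 - (1 - p)^n$$

Taking $p = 1/100$ for illustration, the chance of at least one failure across 25 launches is $1 - 0.99^{25} \approx 22\%$, even though every launch so far may have gone right.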

So what are the solutions? After the two shuttle disasters at NASA, the key recommendations were:

  1. Don’t use past success to redefine acceptable performance.
  2. Require systems to be proven safe to operate to an acceptable risk level, rather than assumed safe until proven otherwise.
  3. Appoint people with opposing views or ask everyone to voice their opinion before discussion.
  4. Keep safety programs independent from those activities they evaluate.

In addition, Vaughan states that the best solution for the normalisation of deviance is ‘being clear about standards and rewarding whistle blowers. Also, create a culture that is team-based such that each person would feel like they were letting their colleagues down if they were to break the rules. Finally, the importance of a top-down approach to safety cannot be overstated. If the employees see executives breaking rules, they will feel it is normal in the company’s culture. Normalisation of deviance is easier to prevent than to correct.’

Comments

    • Practical? Surely you don’t mean that a mode of behavior that can lead to disaster is practical, meaning acceptable and worth following? Or perhaps you meant the opposite; it’s unclear.

  1. I have spent the majority of my career as a petroleum engineer in the upstream oil and gas industry, with a strong interest in safety. Normalisation of deviance is a recognized issue and played a major role in some of the most spectacular accidents in our industry; for instance, on Macondo (Gulf of Mexico, 2010), see the book Disastrous Decisions by Andrew Hopkins.* As a pilot, I have always been aware of the similarities in safety issues between the aviation and petroleum industries. I try to avoid falling into the normalisation trap through rigorous adherence to checklists, rules, regulations and standard operating procedures. Many of these have been written in blood!

    *https://www.amazon.com/Disastrous-Decisions-Organisational-Causes-Blowout/dp/1921948779

  2. I agree with the author. Normalization of deviance is rampant, and condoned by many (most?) organizations. I see “safety meetings” galore, but behavior changes? Nope. People (even self-described “members of the Safety Committee”) ignore the basic rules because… because… because.

    And the problem is… humans.

  3. Observe any airline in the world and you will find exactly what is quoted above: “When speed, cost and efficiency become more important than safety and accuracy, it is easy to understand how once-safe practices are eroded by shortcuts, with the deviant practices subsequently becoming the norm.”
    That’s why airlines lose aircraft, not “pilot error”, which is a meaningless cop-out phrase to get airline management off the hook.

  4. Just as the organisation has a Safety Officer who dishes out safety messages by the ream, there is at least one Unsafety Renegade within. He is the one who sustains the normalisation of deviance. The sustenance occurs in areas beyond the normal workplace (in the lunch room, etc.), where safety procedures are derided and scoffed at. I have found the Unsafety Renegade to be the very person who is supposed to promulgate safety. They dismiss unsafe acts and unsafe procedures with a wave of the hand or a shake of the head. They try to bully and suppress those who follow safety procedures by calling them wimps and bedwetters.

    Till someone gets injured.
