Flight Safety Australia begins a series on the ideas that spur safety.
Foodstuffs play a prominent role in Professor James Reason’s thinking about safety. In an anecdote up there with Isaac Newton’s apple moment, Reason turned his professional attention to safety in the 1970s after a domestic ‘disaster’ where he absent-mindedly put cat food in his teapot. (The cat and the pot are given equal billing with an airliner and nuclear power station on the cover of Reason’s memoir, A Life in Error.)
The model is a metaphor for the way circumstances arise and retreat like the holes in Swiss Emmentaler cheese. It springs from the understanding that at least four types of failure are required for an accident to happen: failures of organisational influences, supervision, preconditions and specific acts.
In the case of an aircraft crash, the specific acts would be crew actions, forgetting a checklist item, for example. Supervision might refer to pairing two inexperienced pilots together, or a shy first officer with an overbearing captain; preconditions could include fatigue, or a noisy radio channel with frequent interruptions; and organisational factors could include an airline culture that places great value on on-time departure, thereby creating subtle pressure to get through checklists quickly.
An early version of the model imagines each of these failure types as a hole in a slice of Swiss cheese. (Reason credits an Australian safety expert, Rob Lee, with the Swiss cheese name.) Reason says these holes are not fixed, as they would be in real cheese, but open and close as circumstances change. An accident happens when all four layers have holes through which the series of events Reason calls an ‘accident trajectory’ is able to pass. The slices are the layers of defence.
In subsequent versions of the model (co-developed with Dante Orlandella), Reason has simplified the picture and no longer specifically identifies these four failure types as the ‘cheese slices’; instead, the defences can be any action, policy or barrier put in place.
The language has changed too: since 1997, for example, latent failures have been described as latent conditions, making the point that a condition can exist without necessarily being a flaw. For the same reason, newer versions of the model do not refer to ‘unsafe decisions’ or managerial failures, but rather to organisational features. Reason now describes active failures and latent conditions as the two things that create holes in the cheese defences.
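The idea of layered defences with shifting holes can be sketched in a few lines of code. This is a purely illustrative model, not anything from Reason's papers: each defence layer is represented as a set of possible ‘holes’ (active failures or latent conditions), and an accident becomes possible only when the current circumstances open a hole in every layer at once.

```python
# Illustrative sketch of the Swiss cheese metaphor. Layer names and
# conditions are invented examples, loosely matching the article's
# four failure types; they are not a standard taxonomy.

def defences_breached(layers, conditions):
    """True only if every defence layer has at least one of its
    'holes' (failures/conditions) present in the current situation --
    i.e. the accident trajectory can pass through all slices."""
    return all(layer & conditions for layer in layers)

defences = [
    {"on-time pressure"},                 # organisational influences
    {"inexperienced crew paired"},        # supervision
    {"fatigue", "noisy radio channel"},   # preconditions
    {"checklist item skipped"},           # specific acts
]

scenario = {"on-time pressure", "inexperienced crew paired",
            "fatigue", "checklist item skipped"}
print(defences_breached(defences, scenario))  # True: holes align in all four layers

# Remove the precondition holes and one intact layer blocks the trajectory.
safer = scenario - {"fatigue", "noisy radio channel"}
print(defences_breached(defences, safer))     # False: the preconditions layer holds
```

The point the sketch makes is the same one Reason makes: a hole in one layer is not an accident; only an alignment across every layer is.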
James Reason is among the loudest voices arguing the limitations of his model, which he has come to see as a metaphor more than a precise description. In a 2005 paper he writes ‘The Swiss cheese model does not provide a detailed accident model or a detailed theory of how the multitude of functions and entities in a complex socio-technical system interact and depend on each other. That does not detract from its value as a means of communication.’
The model, and Reason’s accompanying work on human error and violations (which he finds are related to environment design and organisational policy), is useful as an easily understandable reminder to accident investigators and safety managers to look up and out beyond the immediate cause, to consider the context of an accident, and to put measures in place to stop it happening again.
In the oil and gas industry, a bowtie is not a somewhat antiquated item of male fashion, but a means of accident analysis and prevention. The concept first appeared in print at the University of Queensland in 1979, and came into more widespread use after the Piper Alpha oil rig fire of 1988. What it has in common with the Swiss cheese model is the idea that accidents can be defended against with appropriate measures put in place before the event. The bowtie also urges us to examine what can be done after the event to minimise bad consequences.
The bowtie consists of three main parts.
- In the centre is the hazard, the activity being analysed (flying an aircraft, for example). Categorised under the hazard is the ‘top event’, which is the ‘knot’ in the centre of the bowtie. It is the thing you want to avoid, because it signifies the worst has happened—the hazard has asserted itself. For flying an aircraft, one plausible top event might be ‘lose control in flight’.
- On the left side of the bowtie are the threats. These are things that might contribute to the top event. In our example they could be ‘pilot insufficiently trained’, ‘severe weather’ and ‘wake turbulence from larger aircraft’.
- On the right side of the bowtie are the consequences. In our example these could include ‘aircraft departs from assigned altitude’, ‘passengers injured by abrupt manoeuvres’ and ‘aircraft crashes into ground’.
With the hazard, top event, threats and consequences defined and set out, it becomes, if not a simple matter, at least a comprehensible one to assign defences to the threats and consequences.
A refinement to the model is to add escalation factors, sometimes called defeating factors. These are events or complications that can neutralise defences. If ‘pilot receives accurate weather forecast’ is a defence in our example, the countervailing escalation factor might be ‘internet coverage unreliable at isolated airport’, making the defence less robust.
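The bowtie's parts fit naturally into a small data structure. The sketch below is one plausible representation, not an industry-standard schema; the extra barrier names (‘ATC separation standards’, ‘upset recovery training’) are invented examples, and the rest of the names come from the article's worked example.

```python
# Illustrative bowtie structure: threats on the left of the top event,
# consequences on the right, each guarded by barriers that may carry
# escalation (defeating) factors. Schema and extra names are assumptions.
from dataclasses import dataclass, field

@dataclass
class Barrier:
    name: str
    escalation_factors: list[str] = field(default_factory=list)

@dataclass
class Bowtie:
    hazard: str
    top_event: str
    threats: dict[str, list[Barrier]]       # threat -> preventive barriers
    consequences: dict[str, list[Barrier]]  # consequence -> mitigating barriers

    def weakened_barriers(self):
        """Barriers carrying at least one escalation factor -- the
        defences an analyst should shore up first."""
        sides = list(self.threats.values()) + list(self.consequences.values())
        return [b.name for barriers in sides for b in barriers
                if b.escalation_factors]

bt = Bowtie(
    hazard="flying an aircraft",
    top_event="lose control in flight",
    threats={
        "severe weather": [
            Barrier("pilot receives accurate weather forecast",
                    ["internet coverage unreliable at isolated airport"])],
        "wake turbulence from larger aircraft": [
            Barrier("ATC separation standards")],  # invented example barrier
    },
    consequences={
        "aircraft crashes into ground": [
            Barrier("upset recovery training")],   # invented example barrier
    },
)
print(bt.weakened_barriers())  # ['pilot receives accurate weather forecast']
```

Laying the diagram out as data like this is essentially what dedicated bowtie software does, which is why larger analyses quickly outgrow a hand-drawn diagram.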
The bowtie method enables hazard analysis to be made visible, and clearly displays the links between threats, preventative measures, consequences and consequence-mitigating measures for any incident. A simple bowtie can be drawn on a piece of paper, but software is available for comprehensive and consistent application of the model. For organisations interested in using the bowtie, the United Kingdom Civil Aviation Authority is a good place to start, with extensive information on the bowtie model, including templates covering the ‘significant seven’ air transport hazards.
Rob Lee says the bowtie and Reason’s Swiss cheese are similar, in that both focus on conditions rather than accidents, with the bowtie expanding on the Swiss cheese model by including control of these conditions. This is the foundation of safety management.
Source: Flight Safety Australia