Gravity’s judgement

Image: Tim Felce (Airwolfhound) | CC 2.0

The Clutha Pub, 169 Stockwell Street, Glasgow, Scotland
29 November 2013

For a helicopter pilot there are many bad situations, but this would have to be one of the worst.

Judge for yourself.

You’re a police helicopter pilot with impressive qualifications and experience, flying the homeward leg in a modern helicopter. It’s got a well-trained crew and all the aviation mod-cons—digital displays, two reliable turbine engines and a stabilised automatic flight control system.

But then the ominous glow of a caution light:

Low Fuel 1.

And then another:

Low Fuel 2.

The danger implied by that caution instantly dominates the modern cockpit. Every pilot knows it’s one of the non-negotiables in aviation.

No fuel. No fly. Ever.

The thing is, though, there’s still another 76 kg (or 20 minutes of fuel) in a centre tank. Apparent ‘finger trouble’ has meant the transfer switch is off and the remaining fuel is in a ‘so close yet so far’ isolated state in the centre tank. Just one switch selection, just a few millimetres of movement of a small metallic rod from ‘off’ to ‘on’, makes the immediacy of the low fuel crisis go away. And yet, inexplicably, even with the pilot’s excellent record and substantial aircraft, command and instructional time, the switch never moves, and the pilot keeps on flying as though there is no ominous glow from those two fuel lights. Which is quite surprising, because every pilot should know a low fuel light is the light that rules them all. That’s because …

No fuel. No fly. Ever.

With low fuel lights on, the pilot has 10 non-negotiable minutes to find the nearest airport, suitable park or unoccupied footy ground and land. Since it takes only a few minutes to do that—it’s a vertically unchallenged flying machine that can hover after all, and this one has a pilot with night vision goggles—the low fuel light shouldn’t be too much of a drama. Land at said nearest airport, unoccupied park or unoccupied footy ground and sheepishly ring for a refuel truck (and then your boss to fess up to the low octane faux pas). Nothing more to be said. Unless you ignore the insistently glowing fuel cautions—and ignore them over the busy nightlife of a Glasgow pub district.

Inexplicably this is what the pilot of police helicopter G-SPAO, an EC135, did on the evening of 29 November 2013 in the vicinity of the packed Clutha Pub—not just once but three times. That’s how many times the master caution and aural warning were acknowledged while low fuel lights from the number one and two cells continued to glow.

This raises an obvious question to which we don’t have an answer: Why, oh why, would an experienced, above-average pilot-instructor push on and turn a bad situation into an appalling one?

Before we go any further, I wish to make a rather serious confession. I am a recovering ‘judgementalist’. Perhaps you don’t agree. Perhaps you don’t know enough about me to agree or disagree. Perhaps you wish I wasn’t using a neologism. Perhaps you’re now losing interest because I’ve used the technical term for a new word as well as a new word (‘judgementalist’). Perhaps you don’t really have the time or inclination to put up with cutesy prose like this and you’re about to move on. After all, aren’t we supposed to be talking about an aircraft accident? Or perhaps it’s none of the above. Which now makes it at least one of the above which means, in turn, I need to include you in my (and now our) confession.

We are all recovering judgementalists.

I know this because you, like me, are making judgements all the time (and then recovering from them). You have been doing it since you started the article. You were doing it through my ‘cutesy prose’ paragraph above. And you’re doing it now: Should I keep reading? Why has he said what he’s said? I don’t like what he’s said. He hasn’t even thought about …

Psychology tells us we are always making fast, intuitive judgements about pretty much everything and then, with a fair degree of lag, rationalising these judgements with time-delayed sophisticated thought. Psychologist Jonathan Haidt uses the metaphor of a rider and an elephant to illustrate. In The Righteous Mind he tells us ‘a six-word summary of the social intuitionist model’ is ‘intuitions come first, strategic reasoning second’. He likens these fast-acting intuitions to an elephant and the strategic reasoning to a rider pretty much going where the elephant wants to go. This means our reason is, more often than not, rationalising our intuitions rather than (as we’d probably like to think) our intuitions following the lead of our reason.

Let that just sit for a moment as you ‘judge’ the pilot in the final moments of G-SPAO.

Ten minutes after the low fuel warnings, the first engine flamed out. A single engine failure in a highly capable two-engined machine isn’t a good thing, but it isn’t a catastrophic thing—unless the engine failure is due to fuel starvation.

A minute or so later the second engine flamed out. Without engine power to the rotor, the pilot needed to immediately and assertively lower the collective to the floor. This would have flattened the blade pitch, which would have allowed the rotor blades to enter autorotation. If the pilot could have got the machine into ‘auto’, the rate-of-descent airflow would have acted as a propulsive, aerodynamic force, keeping the rotor spinning and holding the descent rate to a hair-raising but controllable 2500 feet per minute.
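To put a rough number on how little time that leaves (a back-of-the-envelope sketch only; the 500 ft height is an illustrative assumption, not a figure from the report):

\[ 2500\ \text{ft/min} \approx 42\ \text{ft/s}, \qquad t \approx \frac{500\ \text{ft}}{42\ \text{ft/s}} \approx 12\ \text{seconds} \]

Even a well-flown autorotation from a few hundred feet is over in well under half a minute, which is why lowering the collective has to be immediate.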

From there, if there happened to be a nearby space (which Google Earth shows there was), the rotational energy could have been converted via a flare and assertive pitch-pull to allow a survivable and even (if it was done well) relatively gentle landing—something the pilot had been trained to do at least every six months for pretty much his whole career.

But, perhaps because of the night context and the overwhelming cognitive effects of multiple emergencies at once (engine failure, then dual engine failure, electrical failure caused by the engine failures, etc.), the collective of G-SPAO was lowered, raised, lowered slightly and then raised (where it remained) with the machine still hundreds of feet above the ground.

The rotor blades quickly became the victim of their own unpowered drag: coning and then rapidly decaying into aerodynamic irrelevance. Witnesses who couldn’t see the machine, but could hear it, described hearing whip-crack and ‘backfire’ noises. If they could have peered into the night sky and seen the EC135’s rotor blades, they would have been able to literally count each individual blade as it slowly flopped around its arc—each blade smashing against its lead-lag stop, generating the ‘whip-cracks’ and ‘backfires’.

Ask any helicopter pilot about their worst nightmare and this is it. A helicopter with barely rotating blades is 3000 kg of dead-weight metal. The controls are unresponsive. Most of the displays are dark and the pilot is just a passenger living out a nightmare in which they are the pilot-in-command and the pilot-out-of-control all at the same time.

It took only seconds from cruise height for the three metric tonnes of dead-weight helicopter to ballistically impact the crowded Clutha Pub. Inside, general pub banter, Friday night ales and end-of-week conversations were violently and cruelly cut short. Seven patrons of the pub were killed, while 11 others suffered critical debris and crush injuries. All three of the crew also died. In the sad aftermath, investigators drained 76 kg of unburnt fuel from the centre tank.

The implications of this accident can be seen in any number of ways, but let’s stick with how we as readers are ‘judging’ it.

As Haidt points out, our gut reaction, that is, our intuited initial assessment, is literally judgemental: an assessment leveraging off intuition, not methodical or factual analysis. And we are doing this ‘instantly and constantly’ while our slower, more comprehensive thinking plays catch-up—often unsuccessfully. Hence my neologism and my confession: we are all recovering ‘judgementalists’.

This probably means right now you’ve already got a gut feel, an intuited value assessment of the pilot on that awful night, even though you have only the most minimal of facts. The internal judgement might range from the partially gracious to the flat-out bolshie: ‘What kind of pilot ignores two low fuel lights multiple times?’ ‘How could he leave the centre tank switch off?’ ‘How incompetent do you have to be to so badly mismanage fuel?’

On and on we could judgementally go even though, out of a 180-page report and some 50,000 words of techno-causative analysis, I have given a mere 483 words of initial narrative—less than one per cent of the gathered facts, which are replete, by the way, with the clause ‘for unknown reasons’. Moreover, as at publication of this article, a Scottish Fatal Accident Inquiry (similar to a coronial inquest) intends to take at least six months to further investigate.
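For what it’s worth, the fraction those word counts imply is

\[ \frac{483}{50\,000} \approx 0.0097 \approx 0.97\ \text{per cent} \]

so the judgement is being made on less than one per cent of the available analysis.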

If we are honest, not just with serious accidents but with nearly everything, our intuited judgement is always ready to push open the door to a judgemental conclusion. The sad fact is that as aviation professionals, or even just as commonsensical human beings, we like to think we reason in a pleasant union with the facts, but often, demonstrably, we don’t—not initially. Study after study shows this. And this isn’t just for the cockpit. Throw in some good old-fashioned groupthink, some social-cultural power dynamics and the vagaries of human nature, and you can see why some boardroom (and crew room) decisions don’t fit neatly into a good, bad and ugly taxonomy.

So, the next question I think we should ask is, what can each of us do to recover from our continuing judgementalism?

I have what I think are three helpful questions aimed at encouraging me to ‘judge justly’ and deterring me from ‘judging judgementally’. I’m hoping they will be helpful to you as well, although you should know I’m not saying they will necessarily tame the elephant; they should, however, make it obvious there is an elephant in the room in the first place.

Firstly, I ask myself to ask more questions. There are always more facts, more information out there. For example, if, as I read this accident account, my mental elephant stampedes its way into ‘I would never be so dumb as to run out of fuel’, I forgo the opportunity to get curious, ask deeper questions and therefore, perhaps, get closer to a hitherto unrevealed cause. The accident report mentions that the fuel indicating system had been written up several times for erroneous indications. All pilots understand the possible implications of a warning that isn’t really a warning. It’s the case of the aero-boy who called aero-wolf. Of course, we don’t know for sure if this contributed to the catastrophe, but it surely couldn’t have helped.

We could ask other questions. What if the pilot was cognitively compromised through fatigue, startle or some other psycho-social duress? Again, we don’t really know but a questioning trajectory like this is likely to bear more fruit than a judgemental ‘that was dumb, and I surely wouldn’t’ sentiment.

The second question I ask is ‘what if the accident pilot is actually a way better pilot than me?’ Our elephant is likely to assume it is bigger, better and smarter than anyone else’s elephant, which means I’m likely to assume the accident pilot was way worse than me, which again provides a ‘rationale’ for the accident. Again, this closes down curiosity and any possibility of deeper analysis. On the other hand, if I assume the pilot is better than me, I get to ask questions such as ‘how could a good pilot become such a “bad” pilot?’ How could a professional with as much acumen, training and as many CRM courses as I have wind up in such a terrible situation? Again, such questions don’t bring us to investigative nirvana, but they increase the potential for more comprehensive evaluation.

The third question I ask is ‘what happens when I put myself there?’ I don’t just mean in the cockpit. I mean the pressures and conditions of a busy cockpit and airspace, as well as the pressures and conditions of police helicopter culture, or the culture of the country, or even (though not limited to) the upbringing, nurture and psycho-social background of the pilot. When I do this, I may not get precise, quantitative causal-factor answers, but I might at least get some broader qualitative indicators.

There’s so much more that could be asked but since I’ve already exceeded my word count, I’ll finish with this quote: ‘When you judge others, you do not define them, you define yourself’. Rudolf Jerome Ragay is echoing two-thousand-year-old wisdom that says:

Do not judge, or you too will be judged. Do not condemn, or you too will be condemned. With the measure you use, it will be measured to you.

Judge for yourself. I just ask you to judge justly.

Further Reading:

Aircraft Accident Report AAR 3/2015 – G-SPAO, 29 November 2013

The Righteous Mind: Why Good People Are Divided by Politics and Religion by Jonathan Haidt 

Comments:

  1. So very well articulated, thank you. Yes, we can judge (it is human nature) but, as so eloquently pointed out, we should put ourselves in the situation of this pilot and understand the reasons for the accident, not judge the reasons.

    Any number and type of events can lead to a fateful (or survivable) outcome.

  2. Thank you for an excellent article and lesson. For many years now I have worked as a professional investigator working to understand the causes of industrial workplace accidents. Very often when I get a call asking me to investigate an incident, the caller will start telling me ‘why’ they think it happened. I always stop them – and ask them to tell me nothing more than ‘what’ happened. This is entirely about stopping me from making judgements or assumptions about causes. A great challenge for any investigator is to keep an open mind. If we judge too early, we will go looking for evidence to support our judgements, rather than to truly understand what happened.

    • Laurie,
      I think I would much rather you had written this article above, as you have not been judgmental, whereas our well articulated author has been from the outset. We the readers have been hounded and accused of this judgmental attitude, right or wrong, and with no knowledge of us as individuals. While we are all fallible, as it is in human nature to be, we can only hope that fallibility does not descend upon us at a critical moment, as it apparently did for the pilot on that fateful flight. Be questioning of yourself and your actions at critical times; whether it be while driving, flying or operating machinery, human fallibility can creep in at any time, with unfortunate consequences.

  3. In my “judgement” this article is unreadable. So much dribble that the incident is just an excuse to drone on about the author’s favorite whipping post. Blah, blah, blah!
