A psychological perspective on VFR-into-IMC accidents makes some alarming findings, but suggests new solutions
By Anthony Stanton and Robert Wilson
One flight lasted just over a minute, the other spent the best part of an hour blindly flying in increasingly tense circles. Both ended the same way, with the destruction of the aircraft and death of everyone on board.
The crash in 2017 of a Socata TB-10 Tobago at Mount Gambier, South Australia, just after take-off and the 2012 destruction of a restored de Havilland Dragon near Gympie, Queensland, were 2 of 107 occurrences of VFR flight into IMC recorded by the Australian Transport Safety Bureau (ATSB) between 2010 and 2019.
VFR into IMC, that is, flying into cloud or darkness and losing control or making a controlled flight into terrain, is a longstanding killer. Reports from the 1970s through to the 21st century cite similar occurrence figures, confirming that little headway has been made on this safety issue.
A recent attempt to make sense of why these crashes continue, despite pilot training, weather information, electronic flight bags and extensive safety promotion, looked beyond these factors. The study is based on PhD research exploring the applied psychology of rule-related behaviour of pilots. It was conducted by CASA Sport and Recreation Aviation Branch Manager, Anthony Stanton, under the supervision of the founder of Griffith University’s Safety Science Innovation Lab, Professor Sidney Dekker, Professor Patrick Murray and Professor Gui Lohmann. One of the outcomes of the research has been the clear identification of cognitive biases associated with pilots’ intention to conduct VFR flight into IMC.
Pilot, know thyself: understanding built-in biases
‘You’ve got to ask yourself one question: “Do I feel lucky? Well, do ya, punk?”’
– Clint Eastwood, in Dirty Harry, 1971
‘Ego is not a dirty word’
– Skyhooks, 1975
Cognitive biases have been part of our make-up for as long as human beings have existed. They have come to prominence as an academic speciality since the Nobel prize-winning work of Daniel Kahneman and Amos Tversky, popularised as the book Thinking, Fast and Slow.
Biases and shortcuts are an inherent part of what Kahneman and Tversky call fast thinking, the quick reactive and often unconscious processes we use to get about in the world and make sense of it. And biases and shortcuts are by no means inherently bad things. Without them, the prospect of choosing from a multitude of similar-looking products in the supermarket or hardware store might prove paralysing, as it does for some unfortunate people. Most of us effortlessly resort to biases and shortcuts (such as a preference for brand names, the second-cheapest product, or most expensive) for a quick and easy way to make decisions. But, while some biases get us back from Bunnings or Woolworths without mental exhaustion, others in aviation can kill us. To control them, we need to understand them.
The study found 5 biases that are particularly relevant to the problem of VFR flight into IMC:
- Confirmation bias – the tendency for a person, when confronted with unusual situational factors, to seek out information that is consistent with their existing beliefs or expectations. Confirmation bias reflects an adaptive strategy of positive testing and confirmation. In the setting of VFR flight into IMC, confirmation bias might lead a pilot to subconsciously search for environmental cues suggesting the weather conditions are slightly above the minimum required, steady, or improving, when the opposite is true.
- Anchoring bias – the tendency for a person to rely substantially on the first piece of information received (the anchor) and to make estimates or judgements based on it. This first piece of information becomes an arbitrary benchmark for all other information. In the setting of VFR flight into IMC, an anchoring bias might lead a pilot to place too much emphasis on an earlier (good) weather forecast and to evaluate the actual weather, through the lens of the original forecast, as better than it actually is. A pilot may perceive a ceiling of 500 ft as good after many days of 200 ft ceilings, or bad after many days where the cloud and visibility have been unlimited. A more relevant reference point for decision-making would be fixed minimums, whether personal, operator or regulatory.
- Framing bias – the tendency for a person to respond differently to the same information and choices, depending on how the information is presented to, or received (framed) by, the decision-maker. Simply put, a decision can be framed as a gain or loss. Kahneman and Tversky showed that when a decision is framed positively, as a gain, a person is likely to be more risk-averse. When the same decision is framed as a loss, people tend to exhibit more risk-seeking behaviours. They called this prospect theory.
In the setting of VFR flight into IMC, the framing effect plays a role when pilots are considering whether to divert or continue, when faced with adverse weather. If a pilot perceives a diversion as a gain (safety is assured), they are more likely to adopt a risk-averse decision and divert.
Alternatively, pilots who perceive a diversion as a loss (hassle and inconvenience) tend to adopt greater risk-seeking behaviours. In a 1995 simulator experiment, researchers O’Hare and Smitheram found pilots did exactly as prospect theory suggested.
- Sunk cost bias – interacts with framing bias and prospect theory; it is the tendency to persist with a decision, endeavour or effort in order to preserve an investment of money, effort or time already made.
As the goal, such as arrival at the destination, becomes closer, people may tend to change their decision frame from a gain frame to a loss frame. The mid-point of a flight can be a significant psychological turning point for pilots when faced with adverse weather decisions, regardless of the distance flown. An analysis of 77 general aviation cross-country accidents in New Zealand between 1988 and 2000 found weather-related accidents occurred further away from the departure aerodrome and closer to the destination than other types of accidents.
- Self-evaluation bias – the tendency for a person to mistakenly overestimate or underestimate their ability, attributes or personality traits. The best-known example is the Dunning-Kruger effect, in which people with relatively low knowledge and skill tend to substantially overestimate their ability, while those who are highly competent tend to slightly underestimate theirs.
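The gain/loss asymmetry that underlies the framing and sunk cost biases above is usually summarised by prospect theory’s value function. The formulation and parameter estimates below come from Tversky and Kahneman’s later (1992) work, not from the study discussed here:

```latex
v(x) =
\begin{cases}
x^{\alpha}, & x \ge 0 \quad \text{(gains)} \\[4pt]
-\lambda\,(-x)^{\beta}, & x < 0 \quad \text{(losses)}
\end{cases}
\qquad \alpha \approx \beta \approx 0.88,\ \ \lambda \approx 2.25
```

With a loss-aversion coefficient λ of roughly 2.25, a loss is felt about twice as strongly as an equivalent gain, which is why a diversion framed as a loss (hassle and inconvenience) invites risk-seeking behaviour while the same diversion framed as a gain (safety is assured) does not.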
The FIGJAM problem: a layer of Dunning-Kruger
‘You are a legend. Your self-invention matters. You are the artist of your own life.’
– Lady Gaga, 2017
FIGJAM is a distinctively Australian acronym, first deployed as bitter sarcasm, but is also, alas, relevant to aviation safety. The second to sixth letters stand for ‘I’m good, just ask me’. The first letter is a vulgarism to add emphasis, and not hard to guess.
The study found disturbing levels of FIGJAM among general aviation pilots (although the authors do not use this term). The results showed that pilots who believed their skill was above average outnumbered those who considered themselves below average by more than 8 to 1. ‘A large proportion of respondents have a mistaken, elevated appreciation of their own skill levels,’ the authors remark.
The study explored the relationship between 3 psychological constructs – the intention to continue into adverse weather, the perception of risk and perceived ability.
The study asked 419 pilots about:
- their past VFR flight-into-IMC behaviour
- whether they perceived VFR flight into IMC was safe for them
- whether they perceived VFR flight into IMC was easy for them
- whether they were confident they could safely perform VFR flight into IMC
- whether they perceived they had the skills and ability required to safely conduct VFR flight into IMC.
These questions produced a disturbing and significant result: as a pilot’s perception of their own ability increased, so did the probability that they would say they would continue VFR flight into IMC in the adverse weather scenario presented.
‘These results, taken with the earlier findings of a misconceived, overinflated self-evaluation of ability, suggest a hazardous connection between self-evaluation cognitive biases that result in flawed perceptions of ability, that then led to decisions of intentional VFR flight into IMC,’ the authors say.
Dunning, Kruger and Catch-22
‘The best lack all conviction, while the worst are full of passionate intensity.’
– W. B. Yeats, The Second Coming, 1919
In Joseph Heller’s surreal anti-war novel Catch-22, airmen could be considered insane for willingly continuing to fly dangerous combat missions, but any request to be grounded, on grounds of insanity, was ipso facto proof of being sane.
Dunning-Kruger plus FIGJAM add up to a comparable paradox, and perhaps an explanation of why VFR-into-IMC tragedies persist despite decades of safety education and promotion.
The study authors describe a general aviation Catch-22 where, ‘Given their perception of ability and their lack of metacognition, it is reasonable to suggest that this at-risk group (of pilots with high self-reported ability) are also those most unlikely to attend safety seminars or read ATSB research reports.
It could be said that the current approach may well be campaigning to the wrong audience and does little to meaningfully change human behaviour.’
In other words: those who would most benefit from education and training are the most resistant to it.
New thinking: inside the Gympie Box
Therefore, the study proposes a multifaceted interventional strategy.
Among the recommendations are:
- treat intentional and unintentional flight into IMC as separate issues: unintentional entry results from inadequate training and a poor understanding of the confirmation and anchoring cognitive biases, while intentional entry is a conscious decision influenced by the framing, sunk cost and self-evaluation biases.
- introduce cue-based weather training in both high-fidelity simulation devices and the real world. Rather than suspend flight training on IMC days, training with IFR-qualified instructors should explore decision-making and cue identification.
- emphasise technology as well as behaviour change: ‘We should look at the systems, both inside and outside the aircraft, that might better support pilots in these settings,’ the authors say. ADS-B and electronic flight bags allow general aviation craft to transmit their location and be tracked. Safety systems could be developed by integrating disparate information sources to provide a proactive adverse weather advisory service to pilots considered at risk. This service should be initiated by the air traffic service provider rather than the pilot. This strategy is given further weight by the fact that a rectangular zone extending from Gympie in Queensland to Sydney (825 km by 380 km) encompasses more than half of all fatal VFR flight-into-IMC accidents since 2006. ‘In most cases, surveillance is potentially available in those locations presently,’ the study says.
The study concludes: ‘The safety message here is to encourage consideration of a solution that is location driven, technology focused, and system-based, rather than focusing on solutions that have people do better or perform work differently.’
Stanton, A. A., Dekker, S. W., Murray, P. S., & Lohmann, G. VFR flight into IMC, 2010 to 2019 and a focus on hazardous cognitive biases [Unpublished doctoral dissertation]. Griffith University.