By Kreisha Ballantyne
Every pilot, controller and engineer needs to know about these five psychological traps.
In aviation, knowledge is power. As pilots, engineers and controllers, we pride ourselves on knowing how our aircraft and systems work, but rarely consider how we ourselves function. Cognitive psychology—the study of how and why we think as we do—has some potentially life-saving insights.
Task saturation: enough already
‘Task saturation is having too much to do without enough time, tools or resources to do it,’ says Eric Barfield, chairman of the US National Business Aviation Association (NBAA) Safety Committee. ‘That can lead to an inability to focus on what really matters.’ As task saturation increases, a pilot, air traffic controller or maintenance engineer may start shutting down, unable to continue to perform safely.
Consider this report from a Beech Bonanza pilot:
‘I was on short final, the gear was down and I suddenly realized: I couldn’t remember if I’d been cleared to land. I mean, I just couldn’t be sure. I radioed the tower, and they confirmed I had, but I had absolutely no memory of the readback. The weather was marginal, but within my capabilities. On board were my father-in-law and his partner, and we were flying an approach I’d not conducted before. I wouldn’t call it a high stress situation, but there were unfamiliar factors, which distracted me from the usual routine.’
This 3000-hour commercial pilot had reached saturation point.
To experience task saturation myself, I took part in an experiment at Macquarie University, conducted by master’s student Danielle Moore and aviation psychologist Professor Mark Wiggins. I was given a challenging approach in a C172, including a strong crosswind, over five consecutive attempts. Unsurprisingly, my performance deteriorated significantly with each subsequent flight. By the end of the fifth, not only was I ready for a stiff drink, I felt so utterly overwhelmed that I wanted to give up the controls altogether.
The aim of the study is to understand differences between the approaches pilots take to different problems. ‘Once we’ve identified the differences, we can intervene with a training or procedural strategy that’s unique for that pilot,’ Wiggins says. This would allow researchers to calculate pilots’ individual limitations in order to identify key areas that can be improved in future training sessions.
The obvious suggestion for me would be time with an instructor, focused on crosswind landings.
‘The idea is to use psychological methods to improve safety in the aviation industry by ensuring pilots, particularly those who don’t fly regularly, are receiving the appropriate training,’ Wiggins says. ‘We are looking for factors which determine the ways in which pilots can maintain their skills, so we can then identify loss of skills, particularly of the cognitive skills: the decision-making, situational awareness and information processing procedures.’
Three ways to ease task saturation
Communication: The NBAA Safety Committee lists task saturation among its top ten issues requiring raised awareness. ‘Until you recognise the risk, you can’t effectively address it,’ Barfield says.
Self-assessment: Barfield suggests ‘an honest self-assessment to gauge your ability to properly compartmentalise and focus on the critical task at hand.’
Collaboration: Offloading tasks to a co-pilot, crewmember or colleague is another way to relieve task saturation.
Task fixation: what invisible gorilla?
Task fixation—the sister of task saturation—is another common psychological concept that demands further understanding in an aviation context. Fixation causes all cognitive capacity to be focused on one task. If this task is something other than flying the aircraft, then the potential for an accident rises sharply. Fixation is not just a single-pilot issue; it can befall entire crews, as demonstrated by Eastern Air Lines flight 401, in which the crew became preoccupied with a burnt-out landing gear indicator light, failed to notice the autopilot had disengaged, and the aircraft descended into the Florida Everglades.
In a study on pilot fixation as a factor in aviation accidents at Embry-Riddle Aeronautical University, author Timothy N. Timmons identifies three primary causes of fixation: equipment problems, abnormal situations and target fixation.
The Australian Transport Safety Bureau (ATSB) cites equipment problems as a common cause of pilot preoccupation. Malfunctioning gear indicator lights, pitot/static problems and erroneous instruments are just some of the things that cause pilots to fixate.
The second primary cause is an abnormal situation, which disrupts the orderly sequence of normal events. The pilot tends to focus all cognitive capacity on resolving the abnormality, such as an unsettled or airsick passenger, even when it is not an emergency.
The final primary cause of fixation is target fixation. In this case, the pilot concentrates exclusively on a task that is secondary to basic aircraft control, such as navigating around terrain, resulting in the aircraft being flown into the ground, or a mid-air collision.
The famous ‘invisible gorilla’ experiment demonstrates this vividly: asked to count basketball passes in a short video, around half of viewers fail to notice a person in a gorilla suit walking through the scene.
Attribution error: I could never be so silly
‘When you blame others, you give up your power to change.’ (Robert Anthony)
Fundamental attribution error is a person’s tendency to place undue emphasis on internal characteristics to explain someone else’s behaviour in a given situation, rather than considering external factors. In other words, when we see someone doing something, we tend to think it relates to their personality rather than the situation the person might be in. Many of us do this every day on the road, saying: ‘other drivers run red lights because they’re reckless—I only run red lights when they change on me’.
When aircraft accidents are viewed through this psychological lens, there is a tendency to blame the unfortunate pilot and minimise the role of external factors. Often, other pilots are quick to point out how they would have done things differently. While some consider this part of an unhealthy culture of blame, QBE Airmanship Ambassador Matt Hall believes that openly discussing airmanship is key to avoiding attribution error.
Speaking at the Australian Bonanza Society’s pilot proficiency program last year, Hall discussed the importance of disclosure of mistakes amongst pilots.
‘We have all had situations in our lives where things go horribly wrong,’ Hall says. ‘Making sure your mindset is correct before initiating a task, and talking openly about your mistakes when things go wrong, allows for a mindset routine and a sense of airmanship.’
Preparation and planning are the keys to a strong mindset, and a mindset routine is the goal of a successful pilot. ‘Just because nothing has happened to me in the last 1000 flights, doesn’t mean it’s not about to strike in the next 30 seconds. It’s important to bring the right level of focus and correct mindset to any task that offers a risk,’ Hall declares.
The point is it could happen to you. Of course you are not stupid, of course you’re not careless, or reckless—but neither were most of those who have crashed before you. It’s an illusion to think that they were.
Vigilance decrement: a watched pot never boils
Another common factor in flying operations—vigilance decrement—is defined as a deterioration in the ability to remain vigilant for critical signals over time, indicated by a decline in the rate of correct signal detection. Put simply, people do a poor job of monitoring automated systems because most of the time the task is so dull and undemanding that our minds start to drift.
In World War II, Norman Mackworth studied the tendency of sonar and radar operators to fail to detect events near the end of their watch. Mackworth had operators watch an unmarked clock face over a two-hour period. A single clock hand moved in small equal increments around the clock face, but occasionally made larger jumps. Participants were tasked to report when they detected the larger jumps.
Mackworth concluded that a vigilance decrement—the declining ability to detect events over time—was universal, and not a reflection of poor discipline or commitment.
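Mackworth’s clock task is simple enough to sketch in code. The following minimal Python simulation, written for illustration only, generates a clock-style stimulus stream of mostly small increments with occasional larger ‘signal’ jumps, then scores a hypothetical observer whose detection probability declines over the watch. The jump probability and hit-rate figures are illustrative assumptions, not Mackworth’s data.

```python
import random

def mackworth_stimulus(n_steps, jump_prob=0.05, seed=0):
    """Generate a Mackworth-style stimulus stream: the clock hand mostly
    moves in small, equal increments, but occasionally makes a larger
    'signal' jump that the observer must detect."""
    rng = random.Random(seed)
    stream = []
    for _ in range(n_steps):
        if rng.random() < jump_prob:
            stream.append(("signal", 3))  # the larger jump to be reported
        else:
            stream.append(("normal", 1))  # routine small increment
    return stream

def score_observer(stream, p_start=0.95, p_end=0.70, seed=1):
    """Score a simulated observer whose probability of detecting a signal
    declines linearly over the watch -- the vigilance decrement.
    The start/end detection probabilities are assumed values."""
    rng = random.Random(seed)
    hits = misses = 0
    n = len(stream)
    for i, (kind, _) in enumerate(stream):
        if kind != "signal":
            continue
        p_detect = p_start + (p_end - p_start) * i / max(n - 1, 1)
        if rng.random() < p_detect:
            hits += 1
        else:
            misses += 1
    return hits, misses
```

Splitting the scored signals into early and late halves of the run shows more misses accumulating toward the end of the ‘watch’, which is the pattern Mackworth observed in his operators.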
‘The classic example of vigilance decrement is the situation where the aircraft is on autopilot, and in the cruise,’ says Wiggins. ‘There’s a natural disinclination to complete a scan on a regular basis. It’s perfectly normal that this would occur, as there’s a lack of stimulation in the environment, which basically causes the brain to seek stimulus from other sources. We don’t take account of the fact that the human brain is a finely tuned organism. It doesn’t need too much load, because of course you reach overload, but we require a degree of load to remain energized, to remain attentive to a particular task.’
Confirmation bias: jumping to conclusions
In his book, Thinking, Fast and Slow, Nobel Prize winner Daniel Kahneman summarises the research he conducted on cognitive biases, including confirmation bias. The book focuses on two modes of thought: System 1, which is fast, instinctive and emotional; and System 2, which is slower, more deliberative and more logical.
The book draws on several decades of academic research to argue that people place too much confidence in human judgment. The most important thing to understand about confirmation bias, in aviation terms, is that you don’t control it consciously.
‘Humans like to have control,’ says Wiggins. ‘When we have a hypothesis, we like to confirm that hypothesis because it gives us the illusion of control. We like predictability. In the absence of predictability, we feel a degree of uncertainty, and that’s not comfortable.’
‘Bias occurs when prior knowledge, or an expected outcome, influences the perceptions, interpretations and decisions we make in relation to what we are observing,’ says David Wiman, Senior Safety Programs Specialist at Airservices Australia. ‘It can lead a person to see what they expect to see, or hear what they expect to hear. Recent aviation incidents, where expectation bias was a factor, include aircraft crossing runways when told to hold short, and responding to take off clearances directed to other aircraft.’
Airservices Australia’s air ground communications active listening tips
- When you listen—just LISTEN. Do not listen while performing unrelated concurrent tasks.
- Always use standard phraseology.
- If in doubt—CHECK.
- Listen before you speak—pause before transmission.
A broader view of human factors is now standard practice for ATSB investigators when examining the chain of events that led to an accident.
‘First of all, we need to identify loss of the cognitive skills: the decision-making, situational awareness, information processing,’ Wiggins says. ‘Then we must develop cost-effective training strategies that will allow those pilots to maintain their skills, so that they’re not going to become a statistic.’
Being aware of your limitations can identify your risk areas. The first line of defence is acknowledgement—accept that these concepts exist, question yourself and maintain a conscious awareness of your skills, currency and decision-making.
And remember the words of the 19th century German poet Goethe: ‘The man with insight enough to admit his limitations comes nearest to perfection’.
Further reading: Thinking, Fast and Slow, by Daniel Kahneman, Allen Lane, 2011