The fallibility of the expert

Image: Civil Aviation Safety Authority

In aviation, we are trained to become experts, whether as pilots or air traffic controllers. By definition, experts have developed over time the ability to recognise highly complex patterns of cues, which makes them superior at solving problems in familiar, domain-relevant circumstances.

Experts can easily apply prior learning to new situations and make automatic assessments with little conscious consideration, a process known as ‘involuntary-automaticity’. This automatic processing is an important factor in our drive to become more efficient as an industry and to cope with future traffic growth.

While ‘involuntary-automaticity’ allows us to make decisions quickly and efficiently, repeated reliance on it, or failure to appropriately question automatic thoughts, can leave us prey to cognitive biases that impair our decision-making. It’s a catch-22.

The most important thing to understand is that we process information automatically, with little conscious consideration; we are not aware it is happening. We can become aware of it if we are specifically looking out for it, or if someone else points it out, but most of the time we only recognise cognitive biases after the fact. We are also more likely to fall victim to them when we become comfortable, and it can happen to the most experienced experts.

As airspace users, we need to be aware of the most prevalent cognitive biases.

Reductive bias: the tendency to interpret a highly complex system in a simplistic manner based on prior experience.

For example, a controller may expect the same projected speed for two aircraft due to familiarity with the types. This bias (same type = same speed) can be compounded by associating a particular flight number call sign with a particular aircraft type (see expectation bias below) without cross-checking the respective aircraft types. It’s an easy mental shortcut, and it may well have been correct on multiple occasions, so the shortcut is reinforced.

Carrying on from the previous example, if both aircraft are on climb, one following the other with more than the minimum spacing required, the controller may assume the separation standard will continue to exist because they expect the aircraft to operate at similar speeds with negligible closing. The controller may not perceive or comprehend when one aircraft’s radar return shows a significant increase in speed, because it goes against their expectation. This is called expectation bias.

Expectation bias: where previous patterns and information held in memory influence what we currently perceive.

Another example: a pilot may hear the controller instruction ‘Airline 123 hold short of runway 06’ as ‘Airline 123 line up runway 06’ if they expect to be next in sequence to take off. The expectation can be so strong it influences what we see and hear.

Availability heuristic: the tendency to adopt recent information or strategies to solve complex problems, despite their potential lack of suitability.

For example, a controller may instruct an aircraft to deviate 10 NM right of track to avoid weather because the previous aircraft required the deviation; however, the weather may have moved and the deviation may no longer be required.

Plan continuation bias: the tunnel-vision-like tendency to adhere to a current plan of action without sufficient consideration of the benefits of alternative plans. For example, a tower controller may persist with a pre-determined landing sequence even though aircraft are slow to depart due to wind conditions.

So what can we do?

Being aware of our limitations can help us identify our risk areas. The first line of defence is acknowledgement: accept that these biases exist, question yourself and maintain a conscious awareness of your skills, currency and decision-making.

Ask yourself the following questions:

  • Am I oversimplifying this situation?
  • What assumptions have I made and how can I validate them?
  • Have I missed any significant cues?
  • Am I confident that my solution is most appropriate for this scenario, or am I being influenced by recent events?
  • Is sticking to my first plan of action really the best course of action?

Further suggestions:

  • Cross-check information: be sensitive to the perils of expectation bias and be disciplined in the application of basic scans and processes to validate assumptions (e.g. call sign/type correlations)
  • Ensure correct read-backs between pilots and air traffic controllers (this requires your active and focused attention; avoid engaging in other activities while listening to read-backs)
  • Take note of lessons learnt from incident investigations
  • Manage physiological and workplace factors or influences that are associated with, and can exacerbate, expectation bias (e.g. fatigue, shift handover)
  • Maintain a healthy level of self-questioning and checking (‘healthy paranoia’): always seek to make sense of all the available information by staying as consciously engaged as you can in a process of ‘conscious enquiry’
  • Draw a ‘line in the sand’ (e.g. ask for a pilot report) at an appropriate time to prompt a review of the situation and validate your plan and any assumptions made

6 COMMENTS

  1. No such thing as an expert, just someone who knows more than the next person! Don’t forget the very guys that keep these hunks of junk in the air, the ENGINEERS!

  2. Nice article. Expectation bias is often evident in accident reports where automation does uncommanded and/or unexpected things. Reductive bias is also a frequent ‘between the lines’ factor in mishaps. Maybe your next article might discuss cognitive traps such as risk compensation and risk homeostasis.

  3. ‘Involuntary-automaticity’ – thanks for that. So much of it around. I appreciate these types of exploratory articles on human behaviour and would like more. If we all deconstructed this advice, it would make for safer practices on the ground and in the air.

  4. Walter raised an interesting point – don’t forget the engineer, or more especially the RPT LAME who is called to answer to the operations manager for delaying a commercial flight.
    All those company dollars squandered on technical training while the LAME attempts to resolve an elusive major defect in the interest of sound risk management.

  5. Two of the above comments are relevant and valid. The other two are not. Anyone can work it out. This is a flying safety forum predominantly related to the application of human factors. Please stop trolling the authors, as I can’t see the point. Get a life.
