Who’s the boss?

The human side of automation.

How do we safely use aviation automation when the entire world is becoming addicted to letting machines do the hard work?

Whether you can have too much of a good thing is no mere philosophical question. It has a direct bearing on aviation safety when the good thing in question is cockpit automation.

Automation in aviation can mean many things, from a simple wing-leveller autopilot to a fully integrated suite of electronics that manages the entire flight from climb-out to touchdown. Most modern discussion of automation refers to integrated avionics combining the functions of an autopilot with a flight management system: an aircraft that knows where it’s going and how to get there.

On balance, automation is a good thing. Even reports that point out its problems say so.

‘These systems have contributed significantly to the impressive safety record of the air transportation system,’ the US Federal Aviation Administration says in its 2013 report on Operational use of flight-path management systems.

But the FAA report goes on to say ‘flight crews sometimes have difficulties using flight path management systems’.

The FAA’s Flight Deck Automation Working Group concluded: ‘modern flight-path management systems create new challenges that can lead to errors. Those challenges include complexity in systems and in operations, concerns about degradation of pilot knowledge and skills, and integration and interdependence of the components of the aviation system’.

Pilots and regulators are not the only ones with a simmering mistrust of automation. Down on the ground there’s a sceptical school of thought that says automation is having insidious effects on everyday life and that aviation should not think it can be immune to these.

Writing in The Atlantic Monthly, Nicholas Carr, an author on technology, business and culture, said: ‘Automation turns us from actors into observers. Instead of manipulating the yoke, we watch the screen. That shift may make our lives easier, but it can also inhibit the development of expertise.’

Carr has emerged as a fierce critic of technological utopianism with magazine articles such as Is Google making us stupid? and his 2011 critique of the internet, The Shallows: What the Internet is Doing to Our Brains. His latest book, The Glass Cage, to be published in September, looks at aviation automation, among other targets.

Speaking to Flight Safety Australia, he emphasised the importance of practice, both to develop skills, and to maintain them.

To become skilled, ‘One has to be deeply engaged in a difficult task under a wide range of circumstances’, Carr says.

‘As commonly designed, automation ends up reducing both the challenges a pilot or other worker faces and the intensity of his or her engagement in the work. Essentially, it turns the person into more of an observer and less of an actor. Rich skills aren’t given an opportunity to develop, or, if already developed, they start to become rusty.’

Carr is concerned that systems are becoming focused on technology, rather than people. ‘People are being pushed further out of the loop,’ he says. ‘The dominant design ethic is what human factors experts describe as “technology-centered automation.” Designers and programmers try to shift as many tasks as possible to the computer, and then allocate to the human worker only those tasks that the computer is not yet capable of doing. Considerations of computer capabilities are given precedence over considerations of human capabilities.’

‘What drives this is a desire to gain the efficiencies that high-speed computers can provide, even if it means a loss of human talent. It’s a trend that’s very hard to reverse or even resist, particularly for profit-seeking businesses, which naturally seek to make their operations as efficient as possible.’

Human skill decay becomes a serious problem when automation fails. Carr points to researchers who have found instances of this on flight decks, at sea, in factories and power station control rooms.

‘When automation fails, we often see a worker not only experiencing a sudden and unexpected spike in workload, but also a sense of disorientation. Because the worker has not had enough to do, he tunes out and loses situational awareness, and when something goes wrong he often reacts in a slow or incorrect way. The combination of decayed skills and lost situational awareness can increase the odds of an accident.’

Another problem is what is called ‘the automation paradox’. Even when an automated system operates properly, it can overload a worker when a crisis occurs, Carr says.

‘In addition to dealing with the crisis, a worker often has to monitor computer screens and enter data into a computer. That ends up overloading the worker at the worst possible moment. Both these phenomena have been seen in aviation (such as Qantas flight 32, where the electronic centralized aircraft monitor, ECAM, overloaded the crew with an estimated 120 messages) and elsewhere, such as in factory control rooms or on battlefields.’

Even in our personal lives, we often experience the automation paradox, Carr says. ‘If you’re using a GPS system while driving and the system gives you an incorrect or confusing instruction, you often find yourself wrestling with the GPS unit even as you’re driving—a dangerous example of overload.’

There are some philosophically different models of automation out there. Carr notes that over the last decade electronic stability control systems have become widespread on motor vehicles. These systems are ‘automation in the background’. They automatically monitor the vehicle’s intended and actual paths over the road, but only intervene when these paths diverge, such as when the vehicle is skidding. Until then the driver has control.

‘Unfortunately, there are other new automotive technologies, such as adaptive cruise control and automated lane centering, that may have the opposite effect, encouraging the driver to tune out and lose situational awareness (and, in the long run, skill).’

Professor Sidney Dekker is not convinced that having automation ‘in the background’ overcomes the fundamental problems arising from the failure of humans and automation to communicate. ‘The problem with automation in the background is the “strong and silent” issue. If it has a lot of authority but does not involve the human user in what it is doing (because it sits in the background), that can actually amplify automation surprises’, he says.

Dekker notes that both Boeing (Turkish Airlines flight 1951) and Airbus (Air France flight 447) aircraft have had crashes following the crew being surprised and confused by the behaviour of automated systems. ‘If we want to train for these things, we should probably first have a better idea of how to represent automation failures in the cockpit. This is where human factors or cognitive systems engineering research comes in.’

And on the issue of how to keep basic skills sharp in an age of automated flight, Dekker, a glider pilot since he was 14, has an old-school solution. ‘Deliver, with each jet that leaves the factory, a couple of gliders to the purchasing airline, while extracting the promise that its pilots will be put through basic glider training up to solo, so that they know how aerodynamics are supposed to work again.’

Further information:

Federal Aviation Administration, 2013, ‘Operational use of flight path management systems’
