Design for living—why human machine interface matters


Long before the concept of human machine interface (HMI) emerged in computer engineering, it was an important topic in aviation.

At first, aviation HMI related only to how the pioneering pilot’s hands and feet worked the controls—by 1914 this had shaken down into the stick and rudder arrangement still used today. As aviation has developed, HMI has expanded to cover not just the physical interface, but also the mental engagement between humans and flying machines that seem to think for themselves.

Modern challenges in HMI include cockpit design, which covers how information is presented to the pilot, and automation design, which covers who should do what when the task of flying is divided between humans and computers.

In short, HMI covers both physical and cognitive interface with the machine.

Cartoon: ‘Darn these hooves! I hit the wrong switch again! Who designs these instrument panels, raccoons?’ © www.CartoonStock.com | W.B. Park

A breakdown in communication between pilots and machines has been implicated in crashes ranging from American Airlines flight 965 in Colombia in 1995 (the crew selected a waypoint coded ‘R’ in the flight management computer, believing it was the Rozo beacon when it was in fact the Romeo beacon, and the aircraft turned off the approach path towards high terrain), to the China Airlines flight 140 crash at Nagoya in 1994, to the uncontrolled descent of Air France flight 447 into the Atlantic Ocean in 2009. Human machine interface in its crudest sense was implicated in the Brazilian mid-air collision of 2006, when, in all probability, a footrest allowed a pilot’s boot to switch a brand-new business jet’s transponder off, and the warning of this state was just one small alert among many indicators.

Beyond these headlines are events known only to those involved, their managers and, in a just aviation culture, regulators. There are innumerable instances of air transport pilots expressing bafflement, frustration and, occasionally, fear at the apparently inscrutable behaviour of autopilots, flight management computers, flight modes and control law regimes. While few of these incidents are serious in themselves, any one of them, in combination with other difficulties, might be the catalyst that turns an incident into a tragedy.


Ghost in the machine—lessons from industry

Industrial experts have known the principles underlying HMI for decades. In a much-referenced 1983 paper, The Ironies of Automation, industrial psychologist Lisanne Bainbridge challenged the popular view that human operators could be engineered out of industrial processes.

Bainbridge wrote of ‘the irony that the more advanced a control system is, so the more crucial may be the contribution of the human operator’. Her reasoning was that, in an automated system, the human operator would be called on only when things went wrong, and would therefore require more skills and knowledge than an operator doing ordinary work.

However, Bainbridge noted that the move towards automation would weaken rather than strengthen a machine operator’s skills. She cited studies going back to the 1960s showing that operators lost their previously highly refined skills when automation was introduced, and in this deskilled state had difficulty intervening effectively when automation failed.

Bainbridge also noted how ‘automatic control can “camouflage” system failure by controlling against the variable changes, so that trends do not become apparent until they are beyond control’.

Two years after her paper was published, this precise sequence played out in the near-crash of China Airlines flight 006. The Boeing 747SP rolled into a steep dive after an outboard engine lost power and the autopilot quietly applied aileron to hold the wings level against the asymmetric thrust. When the autopilot was switched off the hidden corrective input disappeared, and one of the closest calls in the history of passenger aviation began.
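Bainbridge’s ‘camouflage’ effect is easy to reproduce in a toy control loop. The Python sketch below is purely illustrative, with invented gains, limits and dynamics that model no real aircraft: a proportional controller quietly absorbs a slowly growing disturbance, so the monitored deviation stays small until the controller’s authority is exhausted, and only then does the trend become visible.

```python
# Toy demonstration of automation 'camouflaging' a failure: a limited
# controller absorbs a growing disturbance until it saturates.
# All numbers are invented for illustration; no real dynamics are modelled.

AUTHORITY = 10.0  # maximum corrective input available to the controller
GAIN = 2.0        # proportional gain

def clamp(x: float, limit: float) -> float:
    return max(-limit, min(limit, x))

deviation = 0.0
for t in range(161):
    disturbance = 0.1 * t                             # the fault grows slowly
    correction = clamp(-GAIN * deviation, AUTHORITY)  # controller fights it
    deviation += 0.1 * (disturbance + correction)     # simple integrator plant
    if t % 20 == 0:
        note = "  <- authority exhausted" if abs(correction) >= AUTHORITY else ""
        print(f"t={t:3d}  deviation={deviation:6.2f}  correction={correction:6.2f}{note}")
```

An operator watching the deviation trace sees nothing abnormal for most of the run; by the time the number moves, the correction budget is already spent, which is essentially what flight 006’s autopilot did with its aileron authority.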

Aviation specialists persist in seeing the problem as one of human failings, perhaps because it is easier and quicker to retrain humans than to redesign multi-million-dollar systems.

Returning primacy to the pilot

Returning primacy to the pilot is a theme of recent discussion in HMI. FAA chief human factors advisor, Kathy Abbott, told a CASA workshop on human factors and the automated flight deck that the entire system of aviation needed to make more room for human pilots.

‘One of the challenges we’re seeing in airspace design is we’re seeing these fantastic procedures using RNAV but one of the questions we need to ask when we’re designing these procedures is whether or not we really need that level of precision,’ she said. ‘That precision may eliminate the opportunity for pilots to fly without the autopilot and removing that level of precision may permit manual operations.’

International Federation of Air Line Pilots’ Associations human factors chairman, Dave McKenney, told the workshop the issue was best viewed not in terms of the intrusion of automation, but as flight path management. ‘We need a broader perspective that covers design philosophy, operational policy and regulatory standards,’ he said.

‘Most importantly, we need to change the focus from “manage the automation” to “fly the flight path”. We need to re-engage pilots in flight path management at all times: the job of flying hasn’t changed; it’s just that the tools have changed.’

New tools: new risks?

But have new tools brought new risks? Cognitive scientist and design engineer Donald Norman is scathing in his critique of computer design. In The Design of Everyday Things (1988, revised 2013) he thunders, ‘Now turn to the computer, an area where all the major difficulties of design can be found in profusion … designers of computer systems seem particularly oblivious to the needs of users …’.

Norman’s proposed solution is user-centred design, a term introduced in the first edition of The Design of Everyday Things. User-centred design means putting the needs of the user before other considerations, Norman says.

Design should:

  • make it easy to determine what actions are possible at any moment
  • make things visible, including the conceptual model of the system, the alternative actions, and the results of actions
  • make it easy to evaluate the current state of the system
  • follow natural mappings between intentions and the required actions; between actions and the resulting effect; and between the information that is visible and the interpretation of the system state.

In other words, make sure that

  1. the user can figure out what to do, and
  2. the user can tell what is going on. (Norman, 1988, p. 188)
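As an entirely hypothetical illustration, the Python sketch below applies these rules to the kind of state that mattered in the 2006 Brazilian collision: a transponder mode. The class and alert names are invented; the point is only that the state is always easy to read, every action echoes its result, and a safety-critical state announces itself instead of hiding among other indications.

```python
from enum import Enum

class XpdrMode(Enum):
    ALT = "ALT"          # normal: replying with altitude
    STANDBY = "STANDBY"  # powered but not replying
    OFF = "OFF"

class TransponderAnnunciator:
    """Hypothetical display logic following Norman's rules."""

    def __init__(self) -> None:
        self.mode = XpdrMode.ALT

    def set_mode(self, new_mode: XpdrMode) -> None:
        previous, self.mode = self.mode, new_mode
        # Make the result of the action immediately visible.
        print(f"XPDR {previous.value} -> {new_mode.value}")
        # A state that removes the aircraft from radar gets a distinct,
        # persistent alert, not one small caption among many indicators.
        if new_mode is not XpdrMode.ALT:
            print("CAUTION: TRANSPONDER NOT REPORTING")

    def status(self) -> str:
        # Make it easy to evaluate the current state at any moment.
        return f"XPDR {self.mode.value}"

panel = TransponderAnnunciator()
panel.set_mode(XpdrMode.STANDBY)  # an inadvertent selection is hard to miss
print(panel.status())
```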

Norman sees aircraft automation and flight deck design as falling short of these standards. On automation he has this to say: ‘There is a new breed of strong, silent types now flying our airplanes … Strong silent types that take over control and then never tell you what is happening until, sometimes, it is too late. In this case, however, I am not referring to people, I am referring to machines.’

He is also critical of the physical aspects of flight deck design.

‘Even the comfort of the flight crew is ignored. Only recently have decent places to hold coffee cups emerged. In older planes the flight engineer has a small desk for writing and for holding manuals, but the pilots don’t. In modern planes there are still no places for the pilots to put their charts, their maps, or in some planes, their coffee cups. Where can the crew stretch their legs or do the equivalent of putting the feet up on the desk? And when it is mealtime, how does one eat without risking spilling food and liquids over the cockpit? The lighting and design of the panels seem like an afterthought, so much so that a standard item of equipment for a flight crew is a flashlight.’

Norman says the cramped, inconsiderate design of cockpits is a symptom of a more serious problem. ‘If comfort is ignored, think how badly mental functioning must be treated?’ he asks.

Noting the many clearances, reports and reminders pilots must handle during a flight, he asks:

‘Why hasn’t this need been recognised? The need for mental, cognitive assistance should be recognised during the design of the cockpit. Why don’t we build in devices to help the crew? Instead, we force them to improvise, to tape notes here and there, or even to wedge pieces of paper at the desired locations, all to act as memory aids. The crew needs external information, knowledge in the world, to aid them in their tasks. Surely we can develop better aids than empty coffee cups?’
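What might such an aid look like? Here is a deliberately simple sketch, with hypothetical names and fields throughout, of a reminder board that keeps ‘knowledge in the world’: crew notes live in the interface, sorted and resurfaced by altitude, rather than in memory or on wedged scraps of paper.

```python
from dataclasses import dataclass, field

@dataclass(order=True)
class Reminder:
    trigger_altitude: int            # altitude (ft) at which the note matters
    text: str = field(compare=False)

class ReminderBoard:
    """Keeps crew notes in the interface rather than in anyone's head."""

    def __init__(self) -> None:
        self.items: list[Reminder] = []

    def add(self, trigger_altitude: int, text: str) -> None:
        self.items.append(Reminder(trigger_altitude, text))
        self.items.sort()  # lowest trigger altitude first

    def due_on_descent(self, current_altitude: int) -> list[str]:
        # Surface every note whose trigger altitude has been reached,
        # so nothing depends on the crew remembering to look for it.
        return [r.text for r in self.items
                if current_altitude <= r.trigger_altitude]

board = ReminderBoard()
board.add(10000, "Report passing 10,000 ft to approach")
board.add(5000, "Confirm landing clearance received")
print(board.due_on_descent(9500))  # the 10,000 ft note is now showing
```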


Pride and prejudice: why we tolerate poor design

Any pilot who has talked frankly with others will have a list of terrible aircraft features: the coat-hanger-material door latches on the Piper PA-38 Tomahawk, the groin-grabbing throttle on early Jabirus, the wall-like instrument panels of high-performance Cessna singles. Then there are, even in 2015, turbine engines that can still be hot-started because of manual ignition controls, console-mounted trackballs that can be a long reach away, and cabin pressure alerts that sound like take-off configuration warnings. If these alerts ever go off, they will do so at precisely the time when oxygen-starved aircrew are at greatest risk of confusion.

Pride and commitment, often cited as vital pilot virtues, may be part of the reason these horrors persist.

Norman takes a systems approach. ‘One of the things that stands out when talking to designers and long-term users of poorly designed systems is that these people take great pride in their skills. They had to go through great difficulties to master the system, and they are rightfully proud of having done so. That, by itself, is all right. The problem is that the difficulties become a test of the person or group. Then, rather than ease the situation for the next people, it is used as a sort of initiation rite. The hardy survivors of the experience claim to share a common bond and look with disdain upon those who have not been through the same rites. They share horror stories with one another.’

Norman argues people who have already mastered a bad system end up with a vested interest in its continuation.

‘“Tough,” they will say (or maybe just think to themselves). “It is supposed to be difficult, it is how we separate those with ability from those without. Besides, all of us had to spend a lot of time learning it, losing a lot of work along the way, so why shouldn’t you?”’

A better way

Norman argues that automation has been designed without regard to interaction with humans.

In Turn Signals Are the Facial Expressions of Automobiles he says, ‘Alas, there is too much tendency to let the automatic controls do whatever activities they are capable of performing, giving the leftovers to people. This is poor system design. It does not take into account the proper mix of activities, and it completely ignores the needs and talents of people.’

Instead, humans are required to do something that experiments have repeatedly shown we are no good at: staying vigilant when nothing is happening.

The proper role of the system operator in the sky, the factory or the power station, Norman argues, should be one of making continuous high-level decisions. ‘The principle here is that people are good at the high-level supervision, so let them do it. Machines are good at precise, accurate control, so let them do that. This way, until the machines can do everything, the person is always in the loop, always exerting high-level control. Instead of calling people into the situation only when it is too late, have them there all the time.’
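Translated into software terms, Norman’s allocation reads like a two-layer loop. The sketch below is illustrative only, with invented names and numbers: the machine closes the fast, precise inner loop, while the human sets the goal and is shown the loop’s state on every cycle, not just after a failure.

```python
# Minimal sketch of Norman's allocation principle: precise machine control
# inside a continuous human supervisory loop. Hypothetical throughout.

def machine_inner_loop(state: float, target: float, gain: float = 0.5) -> float:
    """Machine: fast, precise correction toward the supervisor's target."""
    return state + gain * (target - state)

def human_supervisor(state: float) -> float:
    """Human: the high-level decision, here a placeholder target policy."""
    return 320.0 if state > 250.0 else 300.0

state, target = 200.0, 300.0
for step in range(6):
    state = machine_inner_loop(state, target)  # machine does the flying
    target = human_supervisor(state)           # human stays in the loop
    # The interface reports every cycle, so supervision is continuous.
    print(f"step {step}: state={state:6.1f}  target={target:5.1f}")
```

However stylised, the sketch captures the point: the human’s contribution is continuous and high-level, not a residual duty activated only by failure.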
