The unseen hand: automation, human factors and drones

Technical illustration of a drone
image: Adobe Stock (modified) | blacklight_trace | conceptcafe

Margo Marchbank talks to industry professionals about how the powerful combination of automation, human capabilities and human limitations affects uncrewed aviation in unique ways.

There’s an ambivalence in aviation about automation, expressed in a joke so old it probably predates most modern flight deck automation systems. In this hoary old gag, the ideal flight crew is said to be a pilot and a dog. The pilot is there to feed the dog and the dog is there to bite the pilot if they touch anything. Accidents such as Air France Flight 447 and the Lion Air and Ethiopian Airlines Boeing 737 MAX crashes make the point in no uncertain terms that the mix of machine and human can be disastrous.

By contrast, some form of automation has been part of uncrewed aviation from the beginning. But the same questions apply. What role does the human operator play in uncrewed operations, what human factors issues are at play and how does this affect the safety of these operations?

In the loop

As an air transport first officer and chief remote pilot for the University of Adelaide, Mitchell Bannink has a foot in both camps; he is well placed to understand the role of human factors in both traditional aviation and uncrewed operations. He says traditional crewed-aviation human factors concepts, such as threat and error management and situational awareness, are readily transferable to drone operations, where you must understand airspace parameters and identify and manage risks.

The new issues for drones involve the human-machine interface and automation. ‘You can’t hear the rushing wind to tell you you’re in a dive [unlike a piloted aircraft] and you have to counteract that by automation, which then leads to greater automation reliance,’ he says.

Automated systems for drones are in transition, he says, and so is the role humans play in their operation. Although civilian drone operations are still mostly at the human in-the-loop stage, where humans pilot and operate them remotely and make decisions at all stages of the flight, ‘we are one or 2 steps into the journey’ towards human on-the-loop and, ultimately, human out-of-the-loop operations. On-the-loop means the person is not in direct control at all times but retains the authority to override any decision the machine makes. ‘It pushes human control further from the centre of the automated decision-making,’ Bannink says. ‘While there is still human oversight, artificial intelligence [AI] initiates action without needing human pre-approval, as it would in a human in-the-loop operation.’
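The three stages Bannink describes can be pictured as a simple decision-routing rule. The following is a minimal Python sketch, not any vendor’s implementation; the ControlMode names and the handle_decision routine are invented purely for illustration.

```python
from enum import Enum, auto

class ControlMode(Enum):
    IN_THE_LOOP = auto()      # human approves every action before it happens
    ON_THE_LOOP = auto()      # automation acts first; human supervises and can veto
    OUT_OF_THE_LOOP = auto()  # automation acts alone and reports back afterwards

def handle_decision(mode, action, human_approves, human_vetoes):
    """Route an automation-proposed action according to the control mode."""
    if mode is ControlMode.IN_THE_LOOP:
        # Nothing happens without explicit pre-approval from the remote pilot.
        return action if human_approves(action) else None
    if mode is ControlMode.ON_THE_LOOP:
        # The action proceeds immediately, but the human can still override it.
        return None if human_vetoes(action) else action
    # OUT_OF_THE_LOOP: execute autonomously; the outcome appears in the
    # post-flight report rather than on a live approval prompt.
    return action
```

The further an action can travel through this routine without a human callback, the further the human sits from the centre of the decision-making.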

And human out-of-the-loop? AI-powered drones are expected to fly autonomously, without human intervention, only reporting back after an operation is complete. ‘It’s not too far in the future,’ Bannink says. In a March 2023 update, Wing said it was looking to expand its model so its drones could deliver, travel and charge throughout the day in whatever pattern was most efficient, without needing to return to a central point of origin to recharge their batteries.

AI-powered drones are expected to fly autonomously, without human intervention, only reporting back after an operation is complete.

With the increasing automation of uncrewed systems, the role of human beings is changing and, with it, the human factors focus. Human factors aims to ‘optimise the relationship between the human operator and other elements of the system’ and has traditionally concentrated on issues such as situational awareness, human performance and physiology, and threat and error management. But uncrewed operations demand a refined focus.

Command and control

At a very high level, operations can be segmented into visual line of sight and beyond visual line of sight (where the operator can’t see the drone).

Dr Alan Hobbs, a human factors researcher at San Jose State University and NASA’s Ames Research Center, focuses on uncrewed systems that are capable of operating in all classes of civil airspace alongside conventional aircraft. ‘The Federal Aviation Administration has said that for this to occur, these aircraft would need to be IFR-equipped, controlled by a pilot from the ground, comply with ATC and meet other requirements,’ he says.

Remotely piloted aircraft systems (RPAS) have a higher accident rate than crewed aircraft.

Unique identifiers

Hobbs says that while there are some parallels to crewed aviation – there is still a pilot in command – the way humans interact with uncrewed systems is different in several critical ways. ‘The first is the reduced sensory information available to the pilot,’ he says. ‘With the lack of sensory clues, the [drone] pilot may have no idea they are hitting turbulence, heavy rain or hail and they can’t smell smoke or feel the buffeting of the airframe in a storm.

‘Second, command and control is via a radio link – some people even refer to this as ‘fly-by-wireless’. Remote pilots have to be prepared for a potential loss of link – no link can be 100% effective all of the time, so learning to manage that is critically important.’
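The defensive pattern behind lost-link management is a watchdog on the command-and-control (C2) link. Below is a minimal, hypothetical Python sketch of the idea; the 5-second timeout and the class and method names are invented, and real lost-link behaviour is type-specific and pre-programmed in accordance with the flight manual.

```python
import time

LINK_TIMEOUT_S = 5.0  # invented value; real timeouts are type- and operation-specific

class LostLinkMonitor:
    """Watchdog over the C2 radio link ('fly-by-wireless')."""

    def __init__(self, lost_link_procedure):
        # lost_link_procedure: the pre-programmed fallback, e.g. return to base
        self._lost_link_procedure = lost_link_procedure
        self._last_heartbeat = time.monotonic()

    def on_heartbeat(self):
        """Call whenever a C2 packet arrives from the remote pilot station."""
        self._last_heartbeat = time.monotonic()

    def poll(self):
        """Call periodically from the flight controller's main loop."""
        if time.monotonic() - self._last_heartbeat > LINK_TIMEOUT_S:
            self._lost_link_procedure()
```

The point is less the code than the posture: the aircraft, not the pilot, must carry a rehearsed answer to ‘what if the link drops now?’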

Third is the high – and increasing – reliance on automation. Hobbs says this raises the issue of the human ability to monitor such systems. We are simply not that good at monitoring automation during times of low workload. ‘There’s a risk of seeing people in a low workload situation – when nothing much is going on – being the victim of the startle effect when they jump from that monotonous ‘ops normal’ to an emergency.’

Rather than designing remote pilot stations with comfortable chairs and subdued lighting, perhaps consideration could be given to creating an environment which could counter the danger of low workload sleepiness, he says.

With the lack of sensory clues, the [drone] pilot may have no idea they are hitting turbulence, heavy rain or hail.

The in-flight transfer of control is another factor to be considered. With RPAS operations, the pilot shift handover is more dynamic than the handover in a crewed aircraft. ‘The remote pilot may be handing over to another pilot on a different continent and, after the handover, may go home – that doesn’t happen in flight in crewed operations,’ Hobbs says.

Finally, there is the remote pilot station, to use ICAO terminology, although some refer to it as the ground control station. It is very different to the flight deck of a crewed aircraft, with more scope for interruptions and managerial staff entering and ‘putting their oar in’. ‘The sterile cockpit concept could be applied to the remote pilot station,’ he says. ‘However, rather than using altitude as a marker as happens in airlines, for example, enforcing a sterile cockpit below 10,000 feet, the principle could be applied to phase of flight or times of crew transfer.’
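One way to picture Hobbs’ suggestion is as a rule keyed to phase of flight rather than altitude. The following is a minimal Python sketch under that assumption; the phase names and the set of ‘sterile’ phases are invented for illustration, not drawn from any regulation.

```python
from enum import Enum, auto

class FlightPhase(Enum):
    PREFLIGHT = auto()
    TAKEOFF = auto()
    CRUISE = auto()
    APPROACH = auto()
    LANDING = auto()
    CREW_TRANSFER = auto()

# Phases during which non-essential conversation and interruptions would be
# barred from the remote pilot station - the analogue of 'below 10,000 feet'.
STERILE_PHASES = {
    FlightPhase.TAKEOFF,
    FlightPhase.APPROACH,
    FlightPhase.LANDING,
    FlightPhase.CREW_TRANSFER,
}

def station_is_sterile(phase: FlightPhase) -> bool:
    """True when the sterile cockpit rule should apply to the station."""
    return phase in STERILE_PHASES
```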

Size matters

There are different human factors considerations for smaller uncrewed systems such as the Aerosonde. Cameron Devries is a senior program manager with Textron Systems Australia, which in 1995 pioneered the Aerosonde uncrewed aerial system – a simple, robust, ruggedly designed drone with a small operational footprint. Devries says the company’s philosophy is for current automation to keep humans in the loop – enabling more effective human decision-making by reducing the cognitive load of the remote pilot.

He says this takes 3 different tacks:

  1. Reducing human cognitive load by automating regular checklist items: checking system A for condition Z and then having the system check the check (see the sketch after this list).
  2. Automating emergency procedures in case of system failures. For example, if an engine fails, emergency procedures will be triggered, such as minimising the electrical load and returning to base. When the operator sees the error signal pop up, they know the system will take some automated emergency procedure steps up front, giving them cognitive space to deal with the emergency.
  3. Using the increasingly complex and powerful artificial intelligence (AI) and machine learning in the uncrewed system for real-time in-flight monitoring that can warn the operator something may be about to occur. These tools are very good at processing vast amounts of data and can recognise failures seconds, minutes or even hours before they occur. The operator, armed with the knowledge that widget X may fail in future, can then decide what action, if any, to take.
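Devries’ first tack – run the check, then have the system verify it – can be pictured in a few lines. This is a minimal Python sketch, assuming invented telemetry names and limits; it is not Textron’s implementation.

```python
# Hypothetical telemetry snapshot; field names and values are invented.
telemetry = {"battery_volts": 23.1, "gps_satellites": 11}

def automated_check(name, condition):
    """Run a checklist item, then have the system check the check."""
    first = condition(telemetry)   # the automated check itself
    second = condition(telemetry)  # an independent re-read confirming the result
    return name, "PASS" if (first and second) else "FAIL"

checklist = [
    ("battery voltage >= 22 V", lambda t: t["battery_volts"] >= 22.0),
    ("GPS satellites >= 8",     lambda t: t["gps_satellites"] >= 8),
]

# The operator sees only a short pass/fail summary, reducing cognitive load.
for item, result in (automated_check(n, c) for n, c in checklist):
    print(f"{item}: {result}")
```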

Pilot or controller?

Devries believes humans will remain in the loop, or on the loop, for some time. ‘The ongoing role of automation will be to support human decision-making,’ he says. ‘The way [artificial intelligence and machine learning] systems are built currently means we don’t get detail on the AI thought process. Until there is sufficient trust in how the AI reaches a decision, there will be a need for human intervention in making the final decision.

‘As the systems mature and there is movement from automation to autonomy, and from one remote pilot operating one uncrewed aircraft to operating many uncrewed aircraft, it is conceivable that the pilot almost becomes an area air traffic controller. We’re very good at ATC and although there may be additional human factors issues in the high-stress ATC environment, we can learn from the lessons of the past.’

Learning from the past is important, Devries says. Just as there was collaboration when crewed flight was first introduced into civil airspace, now, as autonomous systems and systems with autonomy are coming of age in civil airspace, ‘we have a unique opportunity for the civil regulator and the uncrewed systems industry to work together to develop a safe ecosystem’.

Technical illustration of a drone
image: Adobe Stock (modified) | blacklight_trace | conceptcafe

Automation and the human-machine relationship

The case of an RQ-4B Global Hawk, which crashed 6.8 miles from Grand Forks Air Force Base, North Dakota, on 6 August 2021, highlights the problematic nature of automation and the human-machine relationship. According to the US Air Force Accident Investigation Board report released in April 2022, the aircraft had been in the air for 14 hours when the ground control workstation locked up. In the event of such a failure, the RQ-4 was pre-programmed to return to base autonomously. However, in this case, the remote pilot did not sever the ground link to the aircraft, leaving the aircraft at a higher altitude than it should have been. The aircraft attempted a missed approach but, because of its altitude, missed the runway and made a controlled flight into terrain north of the base. The report found that if the remote pilot had severed the link to the RQ-4, it ‘would have descended in accordance with published procedures and been on a normal approach and route to landing’.

CASA’s RPAS and AAM Strategic Regulatory Roadmap

Learn about our long-term plan for safely integrating emerging technologies into Australia’s airspace and future regulatory system, alongside traditional aviation. This includes our plan to publish acceptable industry consensus standards for highly automated RPAS (2026–2031).

Read CASA’s RPAS and AAM Strategic Regulatory Roadmap.
