Humans retain a vital role in most high-risk industries, particularly through their ability to problem-solve and respond to unanticipated events. The demands on these operators have been increasing and are likely to reach a point where current human and technological capabilities will be severely strained. This situation has in the past necessitated, and will in the future necessitate, greater levels of automation.

Automation currently provides numerous efficiency and safety gains across high-reliability industries. Fortunately, there is a wealth of international lessons learnt from previous attempts to introduce automation that we can leverage.

Definition of automation

Automation: The performance of functions by a system (hardware and/or software) that were previously performed by human operators.

This definition is deceptively simple, as automation also often changes the operators’ roles, introducing new tasks in unexpected ways. An irony of automation is that as it increasingly supplants human control, it becomes increasingly important to consider the contribution and role of the human operators.

The key to improved safety and effectiveness of automation lies in its design: specifically, considering the wider system, i.e. machines, procedures, roles and tasks, rather than assuming that a human task can be neatly and simply substituted by an automated system (the ‘substitution myth’). It is therefore important that appropriate guidelines based on research and industry experience are applied when designing, introducing and altering systems.

Additionally, both the implementation and the associated training are critical for ensuring that automation is fit for purpose.

Pitfalls in automation

Automation has many potential benefits, including reducing operator workload and increasing operational efficiencies. However, there are numerous potential pitfalls with automation that are often only discovered during its implementation and operation.

Unknowingly introducing automation with such latent deficiencies often leads to operators losing a critical aspect of their cognitive loop, struggling to use the system, or using it in ways that were not intended. There are three key categories of automation pitfalls that require mitigation during design and implementation: feedback changes, task structure changes and system trust changes.

Feedback changes

Feedback changes occur when automation diminishes or eliminates operator feedback, leaving the operator less prepared to deal with failures. From an operator’s perspective, this feedback deficiency often leads to out-of-the-loop unfamiliarity, where the ability to detect automation failures and deviations from expected performance parameters, and to resume manual control, is diminished.

Automation tends to distance operators from the ‘process’, which often frees operator cognition for other tasks. Though system designers often view this extra availability of cognitive resources as the major benefit of automation, they sometimes fail to consider that some of the previous manual tasks actually supported operators in ‘building the picture’. The result is that an operator working manually tends to perform better in cases of failure, owing to the greater cognitive effort of manually engaging with the system and the improved ‘picture’ that effort builds. Such distancing from the process is a major contributor to out-of-the-loop unfamiliarity.

Furthermore, critical cues are often unavailable as the system takes over parts of the process, allowing the operator to shift their focus onto other tasks. There is often a consciously designed decrease in operator feedback in order to free up operator cognition. This may be an acceptable operational trade-off during normal operations, but it makes failure recovery (the operator getting back in the loop) difficult and time consuming. Couple this with research consistently showing operators’ tendency to rely complacently on the automation (either through a high degree of trust, or simply not having time to monitor it: Li et al., 2014; Parasuraman and Manzey, 2010; Sanchez et al., 2014), and there is a significant risk of operators not only being out of the loop, but also having to struggle to get back in. Additionally, operators are often unaware of the degree to which they are out of the loop.

Skill loss is another result of feedback changes via automation. In this case operators’ skills atrophy as they go unexercised because the automation has taken over control. This is a well-known phenomenon in the aviation industry, for example, where pilots intentionally disengage the autopilot in an attempt to maintain their skills.

Task structure changes

Task structure changes often occur with the introduction of automation and are associated with what is known as clumsy automation.

Clumsy automation refers to the situation where automation makes easy tasks easier and hard tasks harder. It is a common phenomenon, as designers often find it far more cost effective to automate the easier tasks than the more complex ones. With clumsy automation, the operators’ understanding of the context and their situation awareness are reduced due to the out-of-the-loop unfamiliarity noted above. Additionally, operators often lack awareness of the task structure changes associated with clumsy automation. Consequently, workload is reduced in an already low-workload environment and increased during high-workload situations.

The unfortunate tendency of operators to delegate tasks to automation more willingly during low-workload situations than during high-workload situations increases the effects of clumsy automation. For example, in the Air Traffic Management (ATM) field, automated conflict detection systems can sometimes exhibit signs of clumsy automation. Controllers tend to rely on the system during low-workload periods and then, as the conflict list becomes difficult to manage, revert to traditional methods of conflict detection and scanning. In this situation the controller is merely adapting to the system’s deficiencies, inadvertently masking the presence of the clumsy automation. Superficially, the system looks as though it is behaving correctly. This becomes a significant operational risk during abnormal and high-workload situations.

Sophisticated automation often eliminates a large number of manual tasks, but it also introduces complex cognitive tasks that may appear superficially easy. This leads to less emphasis on training and a subsequently poor understanding of the automation. It is also relatively common for operators to use automation to conserve cognitive effort, rather than to improve overall performance (i.e. efficiency and effectiveness) by shifting that cognitive effort to other tasks.

System trust changes

System trust changes relate to the amount of trust the operators place in the automation and the resulting impact on their behaviour.

Overreliance and complacency are inherent in the category of system trust and arguably have the most profound impact on automation effectiveness. At the core of both overreliance and complacency is operator trust: the greater the operators’ trust in the automation, the greater their tendency to rely on it. The appropriateness of this level of reliance depends on the match between operator trust and the true capabilities of the automation.

Over-trusting automation, where trust exceeds system capabilities, can lead to misuse of the system and, with time, overreliance. Distrust, where trust falls short of system capabilities, can lead to disuse, with operators under-relying on the system.

Ultimately, either under-reliance or overreliance may be appropriate given the cost to the operators of monitoring the accuracy of the automation. If the monitoring cost is extremely high and the automation has proven highly reliable, then reliance on the system may be warranted. For safety-critical tasks, however, overreliance may have catastrophic consequences if automation failures are missed by the operators.
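To make this trade-off concrete, the minimal sketch below models the expected cost of relying on automation versus actively monitoring it. It is an illustrative, assumption-laden model only; the parameter names, values and the simple cost formula are ours for the purpose of this example and are not drawn from the research cited above.

```python
# Minimal, hypothetical sketch of the reliance trade-off described above.
# All parameter names and values are illustrative assumptions, not measured data.

def expected_cost(reliability: float, monitoring_cost: float, failure_cost: float,
                  detection_rate_when_monitoring: float) -> dict:
    """Compare the expected cost of relying on automation versus actively monitoring it.

    reliability: probability the automation performs correctly on a given task
    monitoring_cost: cost (e.g. attention/time) the operator pays to monitor each task
    failure_cost: cost incurred when an automation failure goes undetected
    detection_rate_when_monitoring: probability a monitoring operator catches a failure
    """
    p_failure = 1.0 - reliability

    # Relying without monitoring: every failure goes undetected.
    cost_rely = p_failure * failure_cost

    # Monitoring: pay the monitoring cost every time, but catch most failures.
    cost_monitor = monitoring_cost + p_failure * (1.0 - detection_rate_when_monitoring) * failure_cost

    return {"rely": cost_rely, "monitor": cost_monitor}

# Routine task: modest failure cost and highly reliable automation -> reliance looks reasonable.
print(expected_cost(reliability=0.999, monitoring_cost=1.0, failure_cost=50.0,
                    detection_rate_when_monitoring=0.9))

# Safety-critical task: catastrophic failure cost -> monitoring dominates despite its cost.
print(expected_cost(reliability=0.999, monitoring_cost=1.0, failure_cost=1_000_000.0,
                    detection_rate_when_monitoring=0.9))
```

Under these assumed numbers, reliance is cheaper for the routine task, but for the safety-critical task the expected cost of an undetected failure dwarfs the cost of monitoring, which mirrors the point made above.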

Overarching principles of human-centred automation

The late Dr. Charles Billings of the NASA Ames Research Center developed a set of principles for designing and implementing automation. These principles are used as the basis for the information in Table 1 and are augmented with the latest research and lessons learnt in the field of automation. Table 1 provides a high-level view of the most critical aspects of automation and is intended to be used as an input to capability definition documentation such as the Concept of Operations.

Table 1 – Principles of human-centred automation

Principle: The operator remains in control
Rationale: Research has consistently evidenced the importance of the operator having decision-making authority over the system, given that it is not possible to anticipate every failure mode in highly integrated systems.

Principle: The operator remains actively involved
Rationale: Almost all research into effective automation has evidenced the importance of ensuring that the operator remains an active participant rather than a passive monitoring agent. This makes the operator less likely to suffer from fatigue, low vigilance, complacency, overload and underload.

Principle: The operator remains informed
Rationale: As the ultimate decision maker, it is important that the operator has the necessary information to assess the situation and the effectiveness/performance of the automation itself. Additionally, ‘automation surprises’, where operators are surprised by the unexpected actions (or inaction) of the automation, are a known hazard potentially leading to degraded performance.

Principle: The impact of technical failure upon the operator is tolerable
Rationale: Highly automated technology is less reliable than is desirable.

Principle: Automation is designed to be tolerant of human error
Rationale: The operator cannot be expected to operate equipment properly all the time.

Principle: Automation operates transparently
Rationale: Complex automation is difficult to use safely because it is difficult to anticipate the consequence of every action in a highly integrated set of systems.

Principle: The operator is trained to understand how to perform the tasks and use the technology
Rationale: Automation cannot mitigate a lack of experience.

Principle: The operator is not overloaded or underloaded for extended periods
Rationale: As automated technology is added, new tasks (monitoring and control) are also added.

Principle: The differential impact of automation introduced over time is planned for
Rationale: Deployment of automation is not linear across a whole system – different parts of the system operate at different levels of automation.

Principle: The division of authority and responsibility between automation and operator is clearly defined, tenable and effective
Rationale: The division of authority and responsibility between different parts of the system needs to be clear, explicit, understood and implementable.
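For software-based systems, some of these principles can be made concrete at the design level. The sketch below is purely illustrative; the class, method and parameter names are our assumptions rather than anything drawn from Billings’ work. It shows an automation component in which operator commands always take precedence (the operator remains in control) and every state change is reported back to the operator (the operator remains informed; automation operates transparently).

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class AutomationStatus:
    """Snapshot of automation state, pushed to the operator so they stay informed."""
    mode: str            # e.g. "engaged", "disengaged"
    last_action: str     # most recent action taken by the automation or operator
    confidence: float    # self-assessed confidence in the current plan (0-1)

class SupervisedAutomation:
    """Illustrative automation component that keeps the operator in authority.

    - Operator commands always override automated ones.
    - Every state change is reported via a feedback callback, reducing the
      chance of 'automation surprises'.
    """
    def __init__(self, report: Callable[[AutomationStatus], None]):
        self._report = report
        self._operator_override: Optional[str] = None

    def operator_command(self, action: str) -> str:
        # The operator remains in control: their command pre-empts the automation.
        self._operator_override = action
        self._set_mode("disengaged", last_action=f"operator: {action}", confidence=1.0)
        return action

    def automated_step(self, proposed_action: str, confidence: float) -> str:
        if self._operator_override is not None:
            # Honour the standing operator override rather than the automated plan.
            return self._operator_override
        self._set_mode("engaged", last_action=f"automation: {proposed_action}", confidence=confidence)
        return proposed_action

    def _set_mode(self, mode: str, last_action: str, confidence: float) -> None:
        # Transparent operation: every change is surfaced to the operator.
        self._report(AutomationStatus(mode=mode, last_action=last_action, confidence=confidence))

# Usage: print status updates so the operator can see what the automation is doing.
auto = SupervisedAutomation(report=lambda s: print(f"[{s.mode}] {s.last_action} (confidence {s.confidence:.2f})"))
auto.automated_step("maintain heading 270", confidence=0.95)
auto.operator_command("turn to heading 180")
auto.automated_step("maintain heading 270", confidence=0.95)  # operator override still wins
```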

Please click here to read more about Cortexia’s Automation services.
