Saturday, May 7, 2016

Beware of Automation

    Automation has been used throughout history to reduce the equipment operator's workload; autopilots on aircraft and cruise control in vehicles, for example, were designed to lighten the load during operation.  Prinet, Terhune, and Sarter (2012) compared three levels of automation: manual, intermediate, and full automation.  Manual control is simply the absence of automation; the system is under the operator's complete control.  This level is best suited to simple tasks in low-workload environments: performance suffers, but losses of equipment due to automation failures are reduced.  Intermediate control combines manual and automated control operating simultaneously, and it was the level participants preferred over full automation (Prinet et al., 2012).  Full automation is control of a system in which the operator has no control or influence over its operation (Prinet et al., 2012).
Except for the UAV losses, full automation resulted in the highest performance on target detection and re-planning tasks combined. Still, participants overall preferred the intermediate LOA. This suggests that their preference was a combined function of performance and trust, both of which ranked between full automation and manual mode, as well as the potential consequences of automation failures. The loss of even one UAV can jeopardize the operator's ability to successfully complete the mission. Only during high workload, when it was extremely difficult to divide their attention between the re-planning and target detection tasks, did participants prefer full automation (Prinet et al., 2012).
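For readers who think in code, the three levels described above can be sketched as a simple decision rule. This is a minimal illustration of the taxonomy only (the names and function below are my own, not anything from the cited study); the key point it captures is that intermediate automation keeps the operator in the loop, while full automation removes operator influence entirely.

```python
from enum import Enum

class LevelOfAutomation(Enum):
    """Three levels of automation, following Prinet et al. (2012)."""
    MANUAL = 1        # no automation; operator performs every task
    INTERMEDIATE = 2  # automation and operator share control
    FULL = 3          # automation acts with no operator influence

def who_acts(level: LevelOfAutomation, operator_consents: bool = True) -> str:
    """Return which agent executes a control action at a given level.

    At the intermediate level, the automation's action stands only if
    the operator consents; otherwise the operator takes over. This is
    what keeps the human "in the loop."
    """
    if level is LevelOfAutomation.MANUAL:
        return "operator"
    if level is LevelOfAutomation.INTERMEDIATE:
        return "automation" if operator_consents else "operator"
    return "automation"  # FULL: operator cannot intervene
```

Note that under `FULL`, `who_acts` ignores the operator entirely, which mirrors the complacency risk discussed below: there is no branch through which the human can intervene.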
    Automation has both advantages and disadvantages for performance.  Automation is not always reliable; it infrequently fails due to hardware or software issues, or simply does not perform as desired or expected.  Without a doubt, when automation does not fail, automated performance exceeds human performance and reduces workload.  When automation fails to perform properly, however, the results are often catastrophic (Onnasch, Wickens, Li, & Manzey, 2014).
These catastrophic effects may result from humans' reduced monitoring of highly reliable automation at the time it fails, trusting it too much (Parasuraman & Riley, 1997) and losing situation awareness (Endsley & Kiris, 1995). This is sometimes described as a form of complacency (Parasuraman, Molloy, & Singh, 1993) or an automation-induced decision bias (Mosier & Skitka, 1996). Indeed, operators occasionally over-rely on automation and exhibit complacency because the highly (but not perfectly) reliable automation functioned properly for an extended period prior to this first failure (Parasuraman et al., 1993; Parasuraman & Manzey, 2010; Yeh, Merlo, Wickens, & Brandenburg, 2003) (Onnasch et al., 2014).
    When the operator is taken out of the loop under full automation, the possibilities of failure and complacency increase.  To get the most from automation, the operator must use it for its intended purpose: reducing workload on mundane tasks so that attention can be focused on more complex cognitive tasks (Onnasch et al., 2014).
    There has been a new focus on making flight deck automation more supportive, prompted by pilots' misdiagnoses of automated flight information and automated warnings.  It is suggested that the interaction between human and automation must begin well before a failure occurs, because recovery depends solely on the quick and accurate intervention of the operator (Geiselman, Johnson, Buck, & Patrick, 2013).
    Any system in which automation is to be used must embrace both sound interface design and effective operator training to realize the benefits of automation for both workload and safety.  The goal should be to improve the interface between human and machine in order to reduce error and the compounding effects of automation surprise and confusion, which have the potential to lead to catastrophic conclusions (Geiselman et al., 2013).

References:
Geiselman, E. E., Johnson, C. M., Buck, D. R., & Patrick, T. (2013). Flight deck automation: A call for context-aware logic to improve safety. Ergonomics in Design: The Quarterly of Human Factors Applications, 21(4), 13-18.
Onnasch, L., Wickens, C. D., Li, H., & Manzey, D. (2014). Human performance consequences of stages and levels of automation: An integrated meta-analysis. Human Factors: The Journal of Human Factors and Ergonomics Society, 56(3), 476-488. doi:10.1177/0018720813501549
Prinet, J. C., Terhune, A., & Sarter, N. B. (2012). Supporting dynamic re-planning in multiple UAV control: A comparison of 3 levels of automation. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 56(1), 423-427.
