Handbook of Human Factors & Ergonomics, second edition, G. Salvendy (Ed.), Wiley, 1997
AUTOMATION SURPRISES
N.B. Sarter, D.D. Woods, and C.E. Billings
Cognitive Systems Engineering Laboratory
The Ohio State University
The road to technology-centered systems is paved with user-centered intentions. (Woods, 1994)
1 INTRODUCTION
In a variety of domains, the development and introduction of automated systems has been successful in terms of improving the precision and economy of operations. At the same time, however, a considerable number of unanticipated problems and failures have been observed. These new and sometimes serious problems are related for the most part to breakdowns in the interaction between human operators and automated systems. It is sometimes difficult for human operators to track the activities of their automated partners. The result can be situations in which the operator is surprised by the behavior of the automation, asking questions like: What is it doing now? Why did it do that? What is it going to do next? (Wiener, 1989). Thus, automation has created surprises for practitioners, who are confronted with unpredictable and difficult-to-understand system behavior in the context of ongoing operations. The introduction of new automation has also produced surprises for system designers and purchasers, who experience unexpected consequences because their automated systems fail to work as team players.
This chapter describes the nature of unanticipated difficulties with automation and explains them in terms of myths, false hopes, and misguided intentions associated with modern technology. Principles and benefits of a human-centered rather than technology-centered approach to the design of automated systems are explained. The chapter points out the need to design cooperative teams of human and machine agents in the context of future operational environments.
Automation technology was originally developed in the hope of increasing the precision and economy of operations while, at the same time, reducing operator workload and training requirements. It was considered possible to create an autonomous system that required little if any human involvement and therefore reduced or eliminated the opportunity for human error. The assumption was that new automation could be substituted for human action without any larger impact on the system in which that action or task occurs, except on output. This view is predicated on the notion that a complex system is decomposable into a set of essentially independent tasks. Thus, automated systems could be designed without much consideration for the human element in the overall system.
However, investigations of the impact of new technology have shown that these assumptions are not tenable (they are what could be termed the substitution myth). Tasks and activities are highly interdependent or coupled in real complex systems.
Introduction of new automation has shifted the human role to one of monitor, exception handler, and manager of automated resources.
As a consequence, only some of the anticipated benefits of automation have, in fact, materialized — primarily those related to the improved precision and economy of operations, i.e., those aspects of system operation that do not involve much interaction between human and machine. Other expectations were not met, however, and unanticipated difficulties were observed. These problems are primarily associated with the fact that even highly automated systems still require operator involvement and therefore communication and coordination between human and machine. This need is not supported by most systems, which are designed to be precise and powerful agents but are not equipped with communicative skills, with comprehensive access to the outside world, or with complete knowledge of the tasks in which they are engaged. Automated systems do not know when to initiate communication with the human about their intentions and activities or when to request additional information from the human. They do not always provide adequate feedback to the human, who, in turn, has difficulty tracking automation status and behavior and realizing there is a need to intervene to avoid undesirable actions by the automation. The failure to design human-machine interaction to exhibit the basic competencies of human-human interaction is at the heart of problems with modern automated systems.
Another reason that observed difficulties with automation were not anticipated was the initial focus on quantitative aspects of the impact of modern technology. Expected benefits included reduced workload, reduced operational costs, increased precision, and fewer errors. Anticipated problems included the need for more training, reduced pilot proficiency, overreliance on automation, or the presentation of too much information (for a more comprehensive list of automation-related questions, see Wiener and Curry, 1980). Instead, it turned out that many of the consequences of introducing modern automation technology were of a qualitative nature, as will be illustrated in later sections of this chapter. For example, task demands were not simply reduced but changed in nature. New cognitive demands were created, and the distribution of load over time changed. Some types of errors and failures declined, whereas new error forms and paths to system breakdown were introduced.
Some expected benefits of automation did not materialize because they were postulated based on designers' assumptions about intended rather than actual use of automation. The two can differ considerably if the future operating environment of a system is not sufficiently considered during the design process. In the case of cockpit automation, the actual and intended use of automation are not the same because, for example, air traffic control procedures do not match the abilities and limitations designed into modern flight deck systems, and the various operators of highly advanced aircraft have different philosophies and preferences for how and when to use different automated resources.
Finally, design projects tend to experience severe resource pressure, which almost invariably narrows the focus so that the automation is regarded as only an object or device that needs to possess certain features and perform certain functions under a narrowed range of conditions. The need to support interaction and coordination between the machine and its human user(s) in the interest of building a joint human-machine system becomes secondary. At this stage, potential benefits of a system may be lost, gaps begin to appear, oversimplifications arise, and boundaries are narrowed. The consequences are challenges to human performance.
Actual experiences with advanced automated systems confirm that automation does, in fact, have an effect on areas such as workload, error, and training. However, its impact turns out to be different and far more complex than anticipated. Workload and errors are not simply reduced but changed. Modified procedures, data filtering, or more-of-the-same training are not effective solutions to the observed problems. Instead, the introduction of advanced automation seems to result in changes that are qualitative and context-dependent rather than quantitative and uniform in nature. In the following sections, some unexpected effects of automation will be discussed. They result from the introduction of automated systems that need to engage in, but were not designed for, cooperative activities with humans.
2 UNEXPECTED PROBLEMS WITH HUMAN-AUTOMATION INTERACTION
2.1 Workload – Unevenly Distributed, Not Reduced
The introduction of modern technology was expected to result in reduced workload. It turned out, however, that automation does not have a uniform effect on workload. As first discussed by Wiener (1989) in the context of modern technology for aviation applications, many automated systems support pilots most in traditionally low-workload phases of flight but are of no use or even get in their way when help is needed most, namely in time-critical, highly dynamic circumstances. One reason for this effect is the automation’s lack of comprehensive access to all flight-relevant data in the outside world. This leads to the requirement for pilots to provide automation with information about target parameters, to decide how automation should go about achieving these targets (e.g., selecting the level and type of automated subsystem to invoke), to communicate appropriate instructions to the automation, and to monitor the automation closely to ensure that commands have been received and are carried out as intended. These task requirements do not create a problem during low-workload phases of flight, but once the descent and approach phases of flight are initiated, the situation changes drastically. Air traffic control (ATC) is likely to request frequent changes in the flight trajectory, and given that there is not (at this stage) a direct link between ATC controllers and automated systems, the pilot has the role of translator and mediator. He needs to communicate every new clearance to the machine, and he needs to (know how to) invoke system actions. It is during these traditionally high-workload, highly dynamic phases of flight that pilots report an additional increase in workload. Wiener (1989) coined the term “clumsy automation” to refer to this effect of automation on workload – a redistribution of workload over time rather than an overall decrease or increase, because the automation creates new communication and coordination demands without supporting them well.
Workload is not only unevenly distributed over time but sometimes also between operators working as a team. For example, the pilot-not-flying on many advanced flight decks can be much busier than the pilot-flying as (s)he is responsible for most of the interaction with the automation interface which can turn a simple task (such as changing a route or an approach) into a “programming nightmare.”
The effect on workload was also unexpected in the sense that the quality rather than the quantity of workload is affected. For example, the operator’s task has shifted from active control to supervisory control with the introduction of automated systems. Humans are no longer continuously controlling a process themselves (although they still sometimes need to revert to manual control); instead, they monitor the performance of highly autonomous machine agents. This imposes new attentional demands, and it requires that the operator know more about his systems in order to be able to understand, predict, and manipulate their behavior.
2.2 New Attentional and Knowledge Demands
The introduction of modern technology has created new knowledge and attentional requirements. Operators need to learn about the many different elements of highly complex systems and about the interaction of these elements. They need to understand input-output relationships to be able to anticipate effects of their own entries. In addition to knowing how the system works, they need to explore “how to work the system”, i.e., operators must learn about available options, learn and remember how to deploy them across a variety of operational circumstances, and learn the interface manipulations required to invoke different modes and actions. Finally, it is not only the capabilities but also the limitations of systems that need to be considered.
Empirical research on human-automation interaction (e.g., Sarter and Woods, 1994a) has shown that operators sometimes have gaps and misconceptions in their model of a system. Sometimes operators possess adequate knowledge about a system in the sense of being able to recite facts, but they are unable to apply the knowledge successfully in an actual task context. This is called the problem of “inert” knowledge. One way to eliminate this problem is through training that conditionalizes knowledge to the contexts in which it is utilized.
Since the complexity of many modern systems cannot be fully covered in the amount of time and with the resources available in most training programs, operators learn only a subset of techniques or “recipes” to be able to make the system work under routine conditions. As a consequence, ongoing learning needs to take place during actual operations and has to be supported to help operators discover and correct bugs in their model of the automation. Recurrent training events can be used to elaborate their understanding of how the automation works in a risk-free environment.
Another problem related to knowledge requirements imposed by complex automation technology is that operators are sometimes miscalibrated with respect to their understanding of these systems. Experts are considered well calibrated if they are aware of the areas and circumstances for which they have correct knowledge and those in which their knowledge is limited or incomplete. In contrast, if experts are overconfident and wrongly believe that they understand all aspects of a system, then they are said to be miscalibrated (e.g., Wagenaar and Keren, 1986).
A case of operator miscalibration was revealed in a study on pilot-automation interaction in which pilots were asked questions such as, “Are there modes and features of the Flight Management System (FMS) that you still don’t understand?” (Sarter and Woods, 1994a; these kinds of questions were asked in an earlier study by Wiener, 1989). When their responses to this question were compared with behavioral data in a subsequent simulator study, there was some indication that these “glass cockpit” pilots were overconfident and miscalibrated about how well they understood the Flight Management System. The number and severity of pilots’ problems during the simulated flight were higher than was to be expected from the survey. Similar results have been obtained in studies of physician interaction with computer-based automated devices in the surgical operating room (Cook et al., 1991; Moll van Charante et al., 1993).
Several factors contribute to miscalibration. First, areas of incomplete or inaccurate knowledge can remain hidden from operators because they have the capability to work around these areas by limiting themselves to a few well-practiced and well-understood methods. In addition, situations that force operators into areas where their knowledge is limited and miscalibrated may arise infrequently. Empirical studies have indicated that ineffective feedback on the state and behavior of automated systems can be a factor that contributes to poor calibration (e.g., Wagenaar and Keren, 1986; Norman, 1990; Cook et al., 1991).
The need for adequate feedback design is related not only to the issue of knowledge calibration but also to the attentional demands imposed by the increased autonomy exhibited by modern systems. Operators need to know when to look where for information concerning (changes in) the status and behavior of the automation and of the system or process being managed or controlled by the automation. Knowledge and attentional demands are closely related, because the above-mentioned mental model of the functional structure of the system provides the basis for internally guided attention allocation. In other words, knowing about inputs to the automation and about the ways in which the automation processes these inputs permits the prediction of automation behavior, which, in turn, allows the operator to anticipate the need for monitoring certain parameters. This form of attentional guidance is particularly important in the context of normal operations.
In the case of anomalies or apparently inconsistent system behavior, it can be difficult or impossible for the user to form expectations. Under those circumstances, therefore, the system needs to provide external attentional guidance to help the user detect and locate problems. The system interface needs to serve as an external memory for the operator by providing cues that help him realize the need to monitor a particular piece of information or to activate certain aspects of knowledge about the system.
Two frequently observed ways in which attention allocation can fail are (a) a breakdown in the “mental bookkeeping” required to keep track of the multiple interleaved activities and events that arise in the operation of highly complex technology, and (b) a failure to revise a situation assessment in the presence of new, conflicting information. In the latter case, called a fixation error, evidence that is not in agreement with an operator’s assessment of his situation is missed, dismissed, or rationalized as not really being discrepant.
The above problems – gaps and misconceptions in an operator’s mental model of a system as well as inadequate feedback design – can result in breakdowns in attention allocation which, in turn, can contribute to a loss of situation awareness or, more specifically, system and mode awareness.
2.3 Breakdowns in Mode Awareness and “Automation Surprises”
Norman (1988, p. 179) explains device modes and mode error quite simply by suggesting that one way to increase the possibilities for error is to “. . . change the rules. Let something be done one way in one mode and another way in another mode.” Mode errors occur when an intention is executed in a way appropriate for one mode when, in fact, the system is in a different mode. In simpler devices, each system activity was dependent upon operator input; as a consequence, the operator had to act for an error to occur.
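To make the mechanism concrete, the following minimal sketch (in Python, with hypothetical names; it corresponds to no actual fielded device) shows how the same user action produces different effects depending on the active mode:

    # A minimal illustration of mode error (hypothetical device, not a
    # real system): the same key press is interpreted differently
    # depending on which mode is currently active.

    class TwoModeDevice:
        def __init__(self):
            self.mode = "EDIT"    # actual mode; the operator may believe otherwise
            self.text = ""

        def press_d(self):
            """The same action has mode-dependent effects."""
            if self.mode == "EDIT":
                self.text += "d"  # intended effect: insert a character
            else:                 # "COMMAND" mode
                self.text = ""    # same key: delete the entire buffer

    device = TwoModeDevice()
    device.mode = "COMMAND"       # mode changed earlier; feedback was missed
    device.press_d()              # intention appropriate for EDIT mode...
    assert device.text == ""      # ...but the buffer is now gone

The action itself is executed flawlessly; the error lies in the mismatch between the mode assumed by the operator and the mode the system is actually in.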
With more advanced systems, each mode itself is an automated function that, once activated, is capable of carrying out long sequences of tasks autonomously in the absence of additional commands from human supervisors. This increased autonomy produces situations in which mode changes can occur based on situational and system factors. This capability for “indirect” mode changes, independent of direct and immediate instructions from the human supervisor, drives the demand for mode awareness. Mode awareness is the ability of a supervisor to track and to anticipate the behavior of automated systems (Sarter and Woods, 1995).
Breakdowns in mode awareness result in “automation surprises.” These automation surprises have been observed and reported in various domains (most notably flightdeck and operating room automation; e.g., Sarter and Woods, 1994a; Moll van Charante et al., 1993) and have contributed to a considerable number of incidents and accidents. Breakdowns in mode awareness can lead to mode errors of omission, in which the operator fails to detect and intervene against uncommanded and/or undesirable system behavior.
Early automated systems tended to involve only a small number of modes that were independent of each other. These modes represented the background on which the operator would act by entering target data and by requesting system functions. Most functions were associated with only one overall mode setting. Consequently, mode annunciations (indications of the currently active as well as planned modes and of transitions between mode configurations) were few and simple and could be shown in one central location. The consequences of a breakdown in an operator’s awareness of the system configuration tended to be small, in part because of the short time-constant feedback loops involved in these systems. Operators were able to detect and recover from erroneous input relatively quickly.
The flexibility of more advanced technology allows and tempts automation designers to develop much more complex mode-rich systems. Modes proliferate as designers provide multiple levels of automation and various methods for accomplishing individual functions. The result is a large number of indications of the status and behavior of the automated system(s), distributed over several displays in different locations. Not only the number of modes but also, and even more importantly, the complexity of their interactions has increased dramatically.
The increased autonomy of modern automated systems leads to an increase in the delay between user input and feedback about system behavior. These longer time-constant feedback loops make it more difficult to detect and recover from errors and challenge the human’s ability to maintain awareness of the active and armed modes, the contingent interactions between environmental status and mode behavior, and the contingent interactions across modes.
Another contributing factor to problems with mode awareness relates to the number and nature of the sources of input that can evoke changes in system status and behavior. Early systems would change their mode status and behavior only in response to operator input. More advanced technology, on the other hand, may change modes based on sensor information concerning environment and system variables as well as on input from one or multiple human operators. Mode transitions can now occur in the absence of any immediately preceding user input. In the case of highly automated cockpits, for example, a mode transition can occur when a preprogrammed intermediate target (e.g., a target altitude) is reached or when the system changes its mode to prevent the pilot from putting the aircraft into an unsafe configuration.
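The altitude-capture example can be sketched as a simple event-driven state machine in which the mode transition is triggered by sensed data rather than by operator input. This is a schematic illustration only; the names and the transition logic are hypothetical, not those of any certified autoflight system:

    # A schematic sketch of an "indirect" mode transition (hypothetical
    # names and logic): the active mode changes when a sensed value
    # crosses a preprogrammed target, without any immediately preceding
    # pilot input.

    class AutoflightSketch:
        def __init__(self, target_altitude_ft):
            self.mode = "CLIMB"
            self.target_altitude_ft = target_altitude_ft

        def on_sensor_update(self, altitude_ft):
            # The transition is driven by the environment, not the operator.
            if self.mode == "CLIMB" and altitude_ft >= self.target_altitude_ft:
                self.mode = "ALTITUDE_HOLD"   # uncommanded mode change

    autoflight = AutoflightSketch(target_altitude_ft=10000)
    for altitude in (8000, 9500, 10000):
        autoflight.on_sensor_update(altitude)
    # autoflight.mode is now "ALTITUDE_HOLD", although the pilot has issued
    # no command since initiating the climb; without salient annunciation,
    # the crew may fail to notice the transition.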
Indirect mode transitions can arise in another way, as side effects of a direct operator input to automated systems. This potential is created by the fact that the effects of operator input depend on the status of the system and of the environment at the time of input. The user intends one effect, but the complexity of the interconnections between different automated subsystems and modes may mean that other unintended changes occur automatically. Thus, an action intended to have one particular effect can have a different effect or additional unintended side effects due to automation designs that increase system coupling. Missing these side effects is a predictable error form that is accentuated by weak feedback concerning mode status and transitions and by gaps or misconceptions in the user’s mental model of the system. Incidents and accidents have shown that missing these side effects can be disastrous in some circumstances.
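In the same schematic style, such a side effect might be sketched as follows: a direct operator input intended to change only one target parameter also changes the active mode because of coupling between subsystems (again, hypothetical names and logic, not any actual system):

    # A schematic sketch of a side effect of direct input (hypothetical
    # names and logic): the effect of an entry depends on the current
    # mode, and coupling produces an additional, unintended mode change.

    class CoupledAutoflightSketch:
        def __init__(self):
            self.mode = "VNAV_PATH"      # vertical path flown automatically
            self.target_speed_kt = 280

        def set_speed(self, knots):
            self.target_speed_kt = knots  # the intended effect
            if self.mode == "VNAV_PATH":
                # Coupling: a manual speed entry silently reverts the
                # vertical mode, a change the operator never commanded.
                self.mode = "VNAV_SPEED"

    autoflight = CoupledAutoflightSketch()
    autoflight.set_speed(250)             # pilot intends only a speed change
    # autoflight.mode is now "VNAV_SPEED"; with weak annunciation of mode
    # transitions, this side effect is easy to miss.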
2.4 New Coordination Demands
When new automation is introduced into a system, or when the autonomy of automated systems is increased, developers often assume that adding “automation” is a simple substitution of a machine activity for human activity (the substitution myth). Empirical data on the relationship between people and technology suggest that this is not the case. Instead, adding or expanding the machine’s role changes the cooperative architecture, often changing the human’s role in profound ways. Creating partially autonomous machine agents is, in part, like adding a new team member. One result is the introduction of new coordination demands. When it is hard to direct the machine agents and hard to see their activities and intentions, it is difficult for human supervisors to coordinate activities. This is one factor that may explain why people “escape” from clumsy automation as task demands escalate. Designing for coordination is a post-condition of introducing more capable machine agents. Because of the substitution myth, however, development projects rarely include specific consideration of how to make the automation an effective team player, or evaluation of proposed systems along this dimension.
