HUMAN FACTORS ISSUES IN IMPLEMENTATION OF AA TO COMPLEX SYSTEMS
Kheng-Wooi Tan, David B. Kaber, and Jennifer M. Riley
Mississippi State University
Mississippi State, MS
Mica R. Endsley
SA Technologies
Atlanta, GA
This paper presents a constrained review of human factors issues relevant to adaptive automation (AA), including
designing complex system interfaces to support AA, facilitating human-computer interaction and crew interactions
in adaptive system operations, and considering workloads associated with AA management in the design of human
roles in adaptive systems. This work is intended to complement earlier reviews, which have offered detailed
information on topics central to AA (including dynamic function allocation strategies and triggering methods).
The review demonstrates the need for research into user-centered design of dynamic displays in adaptive systems.
It also points to the need for discretion in designing transparent interfaces to facilitate human awareness of modes
of automated systems. Finally, the review identifies the need to consider critical human-human (or crew)
interactions, as well as AA-induced operator workload, in designing adaptive systems. This work describes
important branches of a developing framework of AA and contributes to the general theory of human-centered
automation.
INTRODUCTION
Adaptive automation (AA) has been described as a form of
automation that allows for dynamic changes in control
function allocations between a machine and human operator
based upon states of the collective human-machine system
(Hilburn et al., 1997; Kaber & Riley, 1999). Interest in
dynamic function allocation (or flexible automation) has
increased in recent years as a result of hypothesized benefits of AA over traditional, technology-centered automation, such as alleviating the operator out-of-the-loop performance problem and associated issues, including loss of situation awareness (SA) and high mental workload. Though the
expected benefits of AA are encouraging, there are unresolved
issues regarding its use. For example, there is currently a lack
of common understanding of how human-machine system
interfaces should be designed to effectively support
implementation of AA.
A theory of human-centered automation closely related to
AA is that complex systems should be designed to support
operator achievement of SA through meaningful and taxing
involvement of operators in control operations. In fact, AA has
been proposed as a vehicle for moderating operator workload
or maintaining it within predetermined acceptable limits,
based on task or work environment characteristics, in order to
facilitate and preserve good SA. Therefore AA can be
considered a form of human-centered automation. Human-centered automation is concerned with SA because SA has been found to be critical to successful human operator performance in complex and dynamic system operations (Endsley, 1995).
Unfortunately, the relationship between SA and workload
presents a conundrum for the design of automation. Under low
workload conditions associated with high levels of system
automation, operators may experience boredom and fatigue due to lack of cognitive involvement in, or interest in, control tasks. Operators of autonomous systems are often forced into
the task of passive monitoring of computer actions rather than
active task processing. Decreased task involvement can
compromise operator SA, and consequently, system
performance under failure modes, because operators with poor
SA may find it difficult to reorient themselves to system
functioning during unpredicted events. Conversely, cognitive
overload may occur when operators must perform very
complex, or a large number of, tasks under low levels of
system automation (e.g., manual control). Increasing task
requirements beyond that which the human is cognitively
capable of can lead to feelings of frustration and defeat, as
well as a loss of confidence in ability to complete the task.
The challenge for AA research is to identify the optimal
workload level, or functional range, under which operator SA
and total system performance will not suffer.
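To make the notion of a workload "functional range" concrete, the following minimal sketch (our illustration, not a design from the literature) shows one way AA triggering logic might hold an estimated workload within a predetermined band by shifting the level of automation; the band limits, workload estimate, and level-of-automation scale are hypothetical placeholders.

```python
# A minimal sketch of workload-band triggering logic for AA. The level-of-
# automation (LOA) scale, workload estimate, and band limits are
# hypothetical placeholders, not values from the literature.

LOA_MIN, LOA_MAX = 1, 5      # 1 = manual control, 5 = full automation
WL_LOW, WL_HIGH = 0.3, 0.7   # assumed acceptable workload band (0-1 scale)

def adapt_loa(current_loa: int, workload_estimate: float) -> int:
    """Raise automation when the operator is overloaded, lower it when the
    operator is underloaded, and otherwise leave the allocation unchanged."""
    if workload_estimate > WL_HIGH and current_loa < LOA_MAX:
        return current_loa + 1   # offload control functions to the machine
    if workload_estimate < WL_LOW and current_loa > LOA_MIN:
        return current_loa - 1   # re-involve the operator to preserve SA
    return current_loa
```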
The key issues that must be addressed to meet this need
include determining how the design of automation or AA methods affects operator workload and how system information
should be communicated to operators to facilitate SA. In
general, optimal levels of workload may be defined as those
resulting in good SA and associated with positive task
outcomes including meaningful operator work experiences,
improved system performance, and increased system safety.
Studies have demonstrated positive results in terms of
operator SA when applying AA as an approach to human-centered automation of complex systems (Kaber, 1997).
Unfortunately, at this point there exists no theory of AA that
can optimally address SA and workload tradeoffs across all
types of complex systems (e.g., air traffic control, production
control, and telerobotic systems). This paper seeks to address
this issue by supporting the concept of human-centered
automation offered above and presenting an understanding of
aspects of the relationship of AA to SA and workload not
previously explored in detail.
WORKLOAD AND AA
Unfortunately, it has been observed through empirical study of AA that operators of complex, dynamic systems may experience workloads above desired levels, based on system design, as a result of concentrating on control function allocations and maintaining task responsibilities simultaneously (Kaber & Riley, 1999; Scerbo, 1996).
There are two general cases in which workload increases
may occur in applications of AA. First, operators may perceive increased cognitive load in monitoring computer management of function allocations between themselves and automated subsystems, in part because of anxiety about the timing of allocations and the need to complete a particular task during system operations, or because of an additional load on the visual channel in perceiving task-relevant information on "who is doing what". The second case of additional operator workload due to
AA may involve the human managing function allocation or
dynamically modulating the degree of system autonomy in
addition to performing routine operations. Under these circumstances, workload increases may be greater than those associated with monitoring computer-based dynamic control allocations (i.e., AA).
Kaber & Riley (1999) objectively assessed operator
workload in dual-task performance using a secondary task
measure. They found that AA of the primary task was
associated with workload levels (secondary task performance)
slightly greater than expected based on primary task control
allocations. The authors' objective was to maintain secondary
task performance as part of dual-task functioning within 20%
of optimal secondary task performance observed during
operations not requiring primary task control. Average
experiment performance levels were within approximately
30% of optimal secondary task performance. Kaber & Riley
(1999) attributed this to the need for subjects to monitor
automated dynamic control allocations or to manage them,
which was not considered in establishing optimum secondary
task performance baselines or the design of the dual-task
paradigm.
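As a hedged, worked illustration of the criterion just described (the numbers below are hypothetical, not data from Kaber & Riley, 1999), the check amounts to comparing dual-task secondary performance against a single-task baseline:

```python
# A worked illustration of the 20% secondary-task criterion. The scores are
# hypothetical, not data from Kaber & Riley (1999).

def within_tolerance(baseline: float, observed: float, tol: float = 0.20) -> bool:
    """True if observed secondary-task performance has degraded by no more
    than `tol` (as a proportion) from the optimal single-task baseline."""
    return (baseline - observed) / baseline <= tol

baseline_score = 100.0   # optimal secondary-task performance, single-task
observed_score = 70.0    # dual-task performance, 30% below baseline
print(within_tolerance(baseline_score, observed_score))   # False: 30% > 20%
```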
This is an important issue in the implementation of AA
that needs to be considered by future research in order to
ensure that AA achieves the objectives of human-centered
automation (i.e., moderating workload and maintaining SA).
Methods for dealing with AA-induced workload must be
devised. A critical step to developing such techniques would
be to evaluate operator workload associated with the
implementation of general AA strategies separate from system
task workload. These workload components could then be
used to drive AA design.
INTERFACE DESIGN FOR AA
Implementation of AA may introduce added complexity into
system functioning and control. Consequently, operators
require advanced interfaces that support them in dealing with this complexity in order to enhance system performance.
There are many general human factors interface design
principles for complex systems that may have applicability to
interfaces for AA. However, what is needed at this point are both high-level and context-specific interface design recommendations, presented in the context of the systems in which AA is most common, such as aircraft.
The application of AA to complex systems, like aircraft,
often increases the amount of information an operator must
perceive and use for task performance, including data on
system automation configuration and schedules of control
function allocations. On the basis of the recommendations of Palmer et al. (1995), interfaces for AA must support integration of such data regarding "who is doing what" with task-relevant data, and they should ensure that all information is presented in a cohesive manner; that is, function allocation information should have meaning for current task performance.
Designing AA interfaces in this way can aid in the
development of complex and accurate situation models critical
to operator SA. Both SA and mental model development can
be affected by system response feedback to a user's actions
through an interface in addition to consistently displayed
system state information. Feedback allows the operator to
evaluate the system state in relation to his or her control
actions, goals, and expectations of system functioning. Both individual and team feedback on system states and responses has been shown to improve human-machine performance (Krahl et al., 1999). Lack of feedback forces the
human into an open-loop processing situation in which
performance is generally poor (Wickens, 1992).
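As one hypothetical illustration of these recommendations, the sketch below pairs each task-relevant datum with its current allocation ("who is doing what") and the system's feedback on the last control action, so all three are displayed cohesively; the structure and field names are assumptions, not a published design.

```python
# A hypothetical display model integrating task data with allocation
# ("who is doing what") and feedback. Field names are illustrative only.

from dataclasses import dataclass

@dataclass
class TaskElement:
    name: str        # e.g., "altitude hold"
    value: float     # current task-relevant datum
    agent: str       # "human" or "automation": who is doing what
    feedback: str    # system response to the last control action

def render(elements: list[TaskElement]) -> None:
    # One cohesive line per task: data, allocation, and feedback together,
    # so allocation information has meaning for current task performance.
    for e in elements:
        print(f"{e.name}: {e.value}  [{e.agent}]  {e.feedback}")

render([TaskElement("altitude hold", 31000.0, "automation", "capturing FL310"),
        TaskElement("heading", 270.0, "human", "turn accepted")])
```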
Introducing dynamic displays into adaptive system
interface design is currently a critical research issue. A
number of AA research studies have been conducted to
establish interface design approaches to offset some
performance disadvantages associated with dynamic displays,
specifically mode awareness problems (Sarter, 1995). Ballas et
al. (1991) studied direct manipulation, which was anticipated
to improve performance through: (1) reduced information
processing distance between the users’ intentions and the
machine states; and (2) direct engagement without undue
delay in system response and with a relatively transparent
interface. Ballas et al. (1991) found that negative effects of
changing controls/displays (dynamic displays) could be offset
by using direct manipulation and maintaining a consistent
interface style. They also speculated that direct manipulation would enhance SA in assessment tasks in particular and, consequently, would have the potential to reduce automation- and dynamic display-induced performance disadvantages. These
findings are supported by other research that has shown that
adaptive systems providing indirect manipulation and opaque
interfaces have negative effects on human-computer
communication and overall system performance (Ballas et al.,
1991; Scerbo, 1996).
On the basis of research on dynamic displays and direct and indirect manipulation interfaces in the context of adaptive systems, Scerbo (1996) encouraged designers of AA interfaces
to include as many information formats as possible to allow
data to flow more freely between the human and system. In
this way, operators may be able to communicate more
naturally since information translation would not be limited to
one or two formats.
AA AND HUMAN-COMPUTER INTERACTION
Communication is a critical factor in achieving effective
human-automation integration (Scerbo, 1996). To optimize
performance, the human operator and machine must be able to
communicate and exchange information about goals, plans,
and intentions freely and effortlessly. This sharing of
information between parties may change with changes in
system function allocation and levels of automation associated
with flexible automation systems (i.e., AA) and this must be
considered in AA design and possibly interface design.
In order to ensure effective human-automation
communication under AA, a working relationship between the
human and the system must be developed (Bubb-Lewis &
Scerbo, 1997). Muir (1987) offered some suggestions for
developing this relationship in adaptively automated systems,
including providing operators with information such as the
machine’s areas of competence; training operators in how the
system works; providing them with actual performance data;
defining criterion levels of acceptable performance; and
supplying operators with predictability data on the reliability
of the system. Also, appropriate feedback mechanisms and
reinforcement during training are important. According to
Woods, Roth and Bennett (see Bubb-Lewis & Scerbo, 1997),
presenting the machine’s current state, goals, knowledge,
hypotheses, and intentions to the human in a clear and
accurate manner will serve to improve human-machine communication and prevent potential information misinterpretation. This is important because information
misinterpretation has been identified as a key problem in
performance that may serve to undermine human-machine
interaction.
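A minimal sketch, assuming a simple disclosure structure of our own devising, of the kinds of information Muir (1987) and Woods, Roth, and Bennett recommend presenting to operators: the machine's areas of competence, its reliability record, and its current state, goals, and intentions.

```python
# A hypothetical "disclosure" structure bundling the information Muir (1987)
# and Woods, Roth, and Bennett recommend presenting to operators. All names
# and values are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class AutomationDisclosure:
    competencies: list[str]   # tasks the machine performs well
    reliability: float        # observed success rate, 0-1
    state: str                # current operating mode
    goal: str                 # what the machine is trying to achieve
    intention: str            # what it plans to do next

    def summary(self) -> str:
        return (f"Mode {self.state}: goal '{self.goal}', next '{self.intention}' "
                f"(reliability {self.reliability:.0%}; competent in "
                f"{', '.join(self.competencies)})")

panel = AutomationDisclosure(["route tracking", "altitude hold"], 0.97,
                             "LNAV", "follow flight plan", "descend at waypoint")
print(panel.summary())
```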
In order to address potential operator lack of mode awareness in controlling adaptive systems through a consistent approach to HCI, and to encourage effective attention allocation strategies in using adaptive interfaces, two recommendations can be made in regard to structuring human-computer interaction. First, as advocated in the interface
design for AA section, the behavior, purpose, and intentions of
adaptive systems should be made transparent to users
(Billings, 1997). That is, communication between the human
and automation should be open and clear. Billings (1997) also
offered that information relevant to combined human-automation task performance and the underlying functions or
relationships should be presented clearly. In some situations, however, adaptive system behavior can hardly be anticipated, limiting the possibility of conveying it. Second, Sarter (1995) recommended that external attentional guidance be facilitated for users through timely, salient system indicators that enable them to project future system states, particularly in circumstances when system behavior may be unclear or in adaptive systems exhibiting silent behavior (mode transitions without salient signals).
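The following sketch illustrates Sarter's (1995) recommendation against "silent" mode transitions; the event structure and announcement channel are our assumptions, not an interface from the cited work. Every transition emits a timely, salient indication that also supports projection of the next system state.

```python
# A sketch of a non-silent mode transition: the system announces the change,
# the reason, and the projected next action before switching. The announce()
# channel is a stand-in for a salient visual or aural cue.

import time

def announce(message: str) -> None:
    # Placeholder for a salient, timely multimodal indicator.
    print(f"[ALERT {time.strftime('%H:%M:%S')}] {message}")

def transition(current_mode: str, new_mode: str, reason: str, next_event: str) -> str:
    """Switch modes only after announcing what changed, why, and what the
    system will do next (supporting projection of future system states)."""
    announce(f"Mode change: {current_mode} -> {new_mode} ({reason}); "
             f"next: {next_event}")
    return new_mode

mode = "HDG SEL"
mode = transition(mode, "LNAV", "flight-plan capture", "turn at next waypoint")
```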
AA AND CREW INTERACTION
The introduction of AA in dynamic, complex systems not only can affect the interaction between the human operator and machine, but may also impact the social interactions of human operators, such as flight crewmembers in an aircraft. Norman
(see Petridis et al., 1995) pointed out that crew coordination
among pilots is supportive of vigilance and SA and may lead
to improvements in system status monitoring and early
detection of system performance deviations from desired
levels. Petridis et al. (1995) offered that the key element of
effective complex system performance is crew coordination
through (verbal) communication.
AA may radically and qualitatively alter the
communication structures between humans in controlling
complex systems. Using the example of aviation automation, Wiener (1985) noted mounting evidence that crew coordination and interaction in advanced automated commercial aircraft cockpits are qualitatively different from those in traditional cockpits. There is an increased requirement for operators to actively interact with automated
systems in operations such as programming the system for
control allocations via dynamic data entry and interfaces
(Bowers et al., 1996). Consequently, human-human
communication may occur at lower rates (see Bowers et al.,
1996). Another characteristic of such systems potentially
inhibiting effective crew coordination is that operators can
usually collect system information without vocal
communication among each other (Mosier & Skitka, 1996).
In flight observations, Costley, Johnson, and Lawson found less crew verbal communication under advanced cockpit automation (see Billings, 1997). They also noted that crewmembers were apt to communicate via automated systems instead of person-to-person (see Bowers et al., 1996).
Contrary to these observations, Straus and Cooper (see
Petridis et al., 1995) found that verbal communication was
more prevalent among flight crews under automated
conditions. However, human task experience may interact
with the implementation of AA to impact crew coordination.
Straus and Cooper also found that more task-related
communication resulted in better pilot performance (see
Petridis et al., 1995). However, instead of permitting more
effective team planning and coordination, an increased
frequency of verbal communication among pilots (especially
heterogeneous aircrews) might pose an additional cognitive
load as a result of, for example, aiding each other in
interacting with automated systems (see Billings, 1997). For
example, the aircrew in the American Airlines Flight 965 accident in Cali, Colombia, in 1995 communicated continuously toward resolving a Flight Management System (FMS) programming problem, unfortunately to the neglect of monitoring primary flight displays and aircraft altitude.
Segal & Jobe (1995) made observations similar to Wiener's (see Billings, 1997) that cockpit automation in general made it difficult for flight crews to communicate nonverbally, specifically to observe each other's actions (e.g., the copilot's view of the pilot programming the FMS may be obstructed), due to the design of the flight deck layout. They also observed that automation created new task demands for crews that limited interaction during certain system operations. Segal
noted that automated systems have the potential to modify or
alter the nature of nonverbal information available to flight
crewmembers, for example, by simply reducing the control
actions of operators (see Bowers et al., 1996). Additionally,
some automation interfaces allow one crewmember to give
instructions to the system without permitting the other to infer
the nature of the input from observable behavior alone (Sarter,
1995). This may compromise nonverbal communication and
isolate operators from each other’s situation assessment and
decision-making processes and responses, thus increasing the
possibility of errors in crew coordination and flight
performance (Mosier & Skitka, 1996).
SUMMARY
In this paper, we presented a review of the literature on issues in the implementation of AA and related the work to the objectives of a theory of human-centered automation. Automation research has yet to define optimal strategies for AA across a broad spectrum of systems that keep workload in check and facilitate and support operator system performance and situation awareness. We believe the hurdles to this include designing appropriate mechanisms for human-machine communication in adaptive systems, structuring
human and computer communication in adaptive system
control to ensure effective performance under any mode of
automation, designing AA to support crew communications
for team coordination, and designing AA to consider operator
workload in function allocation management and system/task
responsibilities.
With respect to system interface design and AA, general
human factors guidelines can, and should, be applied to the
design process to account for human information processing
limitations in the context of specific tasks. Dynamic displays
linked to changes in adaptive system function allocation have
resulted in problems due to clutter and difficulty of use.
Interfaces of adaptively automated systems that use a
consistent style regardless of the level of automation but that
provide operators with direct control of system functions and
make automation states transparent to users have shown
promise. Further applied work is needed in this area.
Research has pointed to a lack of operator awareness of modes of automation when the purposes, states, intentions, and actions of computer control are not predictable or clearly communicated to the human. Studies also identify
system automation transparency as the key to improving
operator mode or situation awareness. Dynamic displays that
meet operator information requirements under various modes
of automation have been recommended. The critical issue
associated with both of these initiatives is designing interfaces
that do not pose high information processing loads on
operators in terms of data perception and translation due to
high information density, and that actually meet operator SA
requirements.
In general, automation has often been found to increase human-human verbal communication because functions are off-loaded to computers and the complexity of an operator's role in system control increases. Unfortunately, the
capability of flight crews to communicate electronically through automated systems seems to have detracted from a potential increase in crew verbal communication and from other forms of nonverbal communication. With this in mind, future
work is needed to identify critical crew interactions in
complex adaptive systems and to design AA strategies to
support them.
Unfortunately, in designing AA strategies, most research has not considered the additional workload imposed on
operators in interacting with computer systems to manage
dynamic function allocations. For AA to be effective in terms
of workload moderation this must be a concern in the design
of adaptive systems.
ACKNOWLEDGEMENTS
This work was supported by the National Aeronautics and Space Administration (NASA) Langley Research Center under Award No. NCC1330-99050488. The opinions expressed are those of the authors and do not necessarily reflect the views of NASA.
REFERENCES
Ballas, J. A., Heitmeyer, C. L., and Perez, M. A. (1991). Interface styles for
adaptive automation. In Proceedings of the 6th International Symposium
on Aviation Psychology (Vol. 1, pp. 108-113). Columbus, OH.
Billings, C. E. (1997). Aviation automation: The search for a human-centered
approach. Mahwah, NJ: LEA.
Bowers, C. A., Oser, R. L., Salas, E., and Cannon-Bowers, J. A. (1996). Team
performance in automated systems. In R. Parasuraman & M. Mouloua
(Eds.), Automation and human performance: Theory and applications
(pp. 243-263). Mahwah, NJ: LEA.
Bubb-Lewis, C., & Scerbo, M. (1997). Getting to know you: Human-computer communication in adaptive automation. In M. Mouloua & J.
M. Koonce (Eds.), Human-automation interaction: Research and
practice (pp. 92-99). Mahwah, NJ: LEA.
Endsley, M. R. (1995). Towards a theory of situation awareness in dynamic
systems. Human Factors, 37(1), 36-64.
Hilburn, B. J., Byrne, E., and Parasuraman, R. (1997). The effect of adaptive
air traffic control (ATC) decision aiding on controller mental workload.
In M. Mouloua & J. M. Koonce (Eds.), Human-automation interaction:
Research and practice (pp. 84-91). Mahwah, NJ: LEA.
Kaber, D. B. (1997). The effect of level of automation and adaptive
automation on performance in dynamic control environments. (Tech.
Work. Doc. ANRCP-NG-ITWD-97-01). Amarillo, TX: Amarillo
National Resource Center for Plutonium.
Kaber, D. B., & Riley, J. M. (1999). Adaptive automation of a dynamic
control task based on secondary task workload measurement.
International Journal of Cognitive Ergonomics, 3(3), 169-187.
Krahl, K., LoVerde, J. L., and Scerbo, M. K. (1999). Skill acquisition with
human and computer teammates: Performance with team KR. In M. K.
Scerbo & M. Mouloua (Eds.), Automation technology and human
performance (pp. 144-148). Mahwah, NJ: LEA.
Mosier, K. L., & Skitka, L. J. (1996). Human decision makers and automated
decision aids: Made for each other? In R. Parasuraman & M. Mouloua
(Eds.), Automation and human performance: Theory and applications
(pp. 201-220). Mahwah, NJ: LEA.
Muir, B. M. (1987). Trust between humans and machines, and the design of decision aids. International Journal of Man-Machine Studies, 27, 527-539.
Palmer, M. T., Rogers, W. H., Press, H. N., Latorella, K. A., and Abbott, T. S.
(1995). A crew-centered flight deck design philosophy for high-speed
civil transportation (HSCT) aircraft. (NASA Technical Memorandum
No. 109171). Hampton, VA: NASA Langley Research Center.
Petridis, R. S., Lyall, E. A., and Robideau, R. L. (1995). The effects of flight
deck automation on verbal flight-relevant communication. In
Proceedings of the 8th International Symposium on Aviation Psychology
(Vol. 1, pp. 216-220). Columbus, OH.
Sarter, N. B. (1995). ‘Knowing when to look where’: Attention allocation on
advanced automated flight decks. In Proceedings of the 8th International
Symposium on Aviation Psychology (Vol. 1, pp. 239-242). Columbus,
OH.
Scerbo, M. W. (1996). Theoretical perspectives on adaptive automation. In R.
Parasuraman & M. Mouloua (Eds.), Automation and human
performance: Theory and applications (pp. 37-63). Mahwah, NJ: LEA.
Segal, L., & Jobe, K. (1995). On the use of visual activity communication in
the cockpit. In Proceedings of the 8th International Symposium on
Aviation Psychology (Vol. 1, pp. 712-717). Columbus, OH.
Wickens, C. D. (1992). Engineering psychology and human performance (2nd
ed.). New York: HarperCollins.
Wiener, E. L. (1985). Beyond the sterile cockpit. Human Factors, 27(1), 75-90.