
COGNITIVE CONFLICTS IN AVIATION:
MANAGING COMPUTERISED CRITICAL
ENVIRONMENTS
Denis Besnard & Gordon Baxter
School of Computing Science
University of Newcastle upon Tyne
Department of Psychology
University of York
DIRCshop, March 2005, Edinburgh
Background
• Purpose of automation in critical environments: increase the reliability of the system
• But there have been side effects: unpredictability
• HMI can be the weak link: the operator might not understand what the system is doing
• Let’s have a look at some cognitive mechanisms involved
Outline
• Human-machine interaction in aviation
• What is a cognitive conflict?
• Illustration: The Cali crash & Analysis
• Aligning the mental model and the process
• Glass cockpit assistants
• Transparent flightdeck
• Conclusion
Human-machine interaction in aviation
• A glass cockpit aircraft is mainly piloted through the FMC and the autopilot
• Airmanship is now only ONE of the required skills
• Pilots depend more and more on automation
• Dependability then becomes a matter of understanding the automation’s behaviour
What is a cognitive conflict?
• It is an incompatibility between an operator’s mental model and the behaviour of the system under control
• Two dimensions in a conflict (sketched below):
– Nature (the event is expected or unexpected)
– Status (detected or hidden)
• Conflicts often manifest themselves as a surprise: “What’s going on?”
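To make the taxonomy concrete, here is a minimal sketch in Python, assuming nothing beyond the two dimensions listed above; the names Nature, Status and CognitiveConflict are illustrative and not part of the original talk.

# Hypothetical sketch of the two-dimensional conflict taxonomy (illustrative names).
from dataclasses import dataclass
from enum import Enum

class Nature(Enum):
    EXPECTED = "expected"        # the event matches the operator's mental model
    UNEXPECTED = "unexpected"    # the event contradicts the mental model

class Status(Enum):
    DETECTED = "detected"        # the operator has noticed the mismatch
    HIDDEN = "hidden"            # the mismatch exists but goes unnoticed

@dataclass
class CognitiveConflict:
    """Incompatibility between the operator's mental model and the system's behaviour."""
    nature: Nature
    status: Status

    def is_surprise(self) -> bool:
        # Conflicts often surface as a surprise: an unexpected event that is detected.
        return self.nature is Nature.UNEXPECTED and self.status is Status.DETECTED

# Example: an unexpected, eventually detected veer off course.
print(CognitiveConflict(Nature.UNEXPECTED, Status.DETECTED).is_surprise())  # True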
The Cali B757 crash, 1995
• B757 approaching Cali (Colombia) from the North
• Crew intending to fly around the airport and land on the Northbound runway
• ATC suggests a direct landing on the Southbound runway
• Crew has to reprogram the descent: big increase in workload
• Wrong beacon entered in the Flight Management Computer (FMC)
• Aircraft veers towards the North-East for about a minute
• Crew recovered, but the aircraft was on a collision course
• Crashed into a 12,000-foot mountain, killing almost all of the 163 people on board
[Approach chart for Cali, © Jeppesen Inc]
Analysis
• The aircraft’s unexpected veering off course created the cognitive conflict
• Wrong beacon selected partly because of a frequency heuristic (picking the most frequently used option)
• Combination of one slip and several mistakes:
– acceptance of the ATC guidance without the charts at hand
– continuation of the initial descent
– not rejecting the approach despite the lack of time
Aligning the mental model and the process
• Contributing factors to conflicts: high tempo, low predictability, undetected errors, mistakes/erroneous knowledge
• Modern aviation is the result of an automation philosophy (FMS, ACAS, EGPWS, …)
• Side effects (e.g. mode confusion, opacity, …) were not always anticipated
• Automation is a matter of distributing decisions across agents: importance of putting the operator and the system in step
• Two design options: glass cockpit assistants & transparent systems
Glass cockpit assistants
• Joint cognitive systems (Hollnagel & Woods, 1999): the automation has to maintain a model of the operator
• Some experimental systems have been around for years: Hazard Monitor (1997), CASSY (1997), CATS (1994; 2001), GHOST (2003)
• The assistants interpret human actions against the operational context and send proactive advice/warnings
• Dimensions of importance: timeliness, intention, integration, troubleshooting support, evolution
Callantine, 2001
• Data gathered during flight
• Interprets the pilot’s actions
• Derives the operational context
• Considers the various ways to perform the task
• Mismatches? Flags the error and displays a few lines of text to cue the pilot (see the sketch below)
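As an illustration of this tracking idea, here is a minimal sketch of one comparison step, assuming a hypothetical FlightContext and a hand-picked set of acceptable actions; it is not Callantine’s actual implementation.

# Minimal sketch of an activity-tracking step in the style described above
# (hypothetical names and data).
from dataclasses import dataclass
from typing import Optional, Set

@dataclass
class FlightContext:
    phase: str       # e.g. "descent"
    clearance: str   # e.g. "direct Rozo, runway 19"

def acceptable_actions(ctx: FlightContext) -> Set[str]:
    # Derive the operational context: the pilot actions consistent with the
    # current phase and clearance (several ways to perform the task may be valid).
    if ctx.phase == "descent":
        return {"enter_waypoint_ROZO", "select_runway_19_approach", "reduce_speed"}
    return set()

def track(ctx: FlightContext, observed_action: str) -> Optional[str]:
    """Interpret an observed pilot action; return a short textual cue on mismatch."""
    if observed_action not in acceptable_actions(ctx):
        return f"'{observed_action}' does not match the current clearance ({ctx.clearance})"
    return None

# Example: entering the wrong beacon during the descent would be flagged.
cue = track(FlightContext("descent", "direct Rozo, runway 19"), "enter_waypoint_ROMEO")
if cue:
    print(cue)  # a few lines of text displayed to cue the pilot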
Dehais et al., 2003
• Tested a PC-driven interface that prevents fixation errors (GHOST)
• Results show that blanking, blinking and fading catch pilots’ attention
• Text-based messages are then displayed (see the sketch below)
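A minimal sketch of the intervention sequence just described, assuming a hypothetical Display object; the effect names simply mirror the blanking/blinking/fading findings and are not the actual GHOST code.

# Hypothetical sketch of a GHOST-style intervention: catch the pilot's attention
# on the fixated display item, then show a text-based message.
class Display:
    def apply(self, item: str, effect: str) -> None:
        print(f"{effect} applied to {item}")

    def show_text(self, message: str) -> None:
        print(f"MESSAGE: {message}")

ATTENTION_EFFECTS = ["blanking", "blinking", "fading"]  # found to catch pilots' attention

def counter_fixation(display: Display, fixated_item: str, message: str) -> None:
    for effect in ATTENTION_EFFECTS:
        display.apply(fixated_item, effect)  # break the perseveration on one item
    display.show_text(message)               # then give the advice as text

counter_fixation(Display(), "navigation display", "Conflict detected: reconsider the approach")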
Transparent systems
• HMI is more reliable if the operator understands the behaviour of the system
• Dimensions of importance:
– Predictability (reduce the complexity induced by the technology)
– Computers as monitoring-advisory systems (automation should only take last-resort decisions; see the sketch below)
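One way to read the monitoring-advisory point is as a decision policy in which the automation advises first and intervenes only as a last resort. The sketch below is a hypothetical illustration of such a policy, not a description of any real avionics logic.

# Hedged sketch of an "advise first, intervene only as a last resort" policy.
from enum import Enum

class Response(Enum):
    MONITOR = "monitor"        # nominal situation: stay quiet and predictable
    ADVISE = "advise"          # deviation detected: warn and explain, the pilot decides
    INTERVENE = "intervene"    # imminent hazard and no pilot response: last resort

def choose_response(deviation: bool, hazard_imminent: bool, pilot_responding: bool) -> Response:
    if hazard_imminent and not pilot_responding:
        return Response.INTERVENE
    if deviation or hazard_imminent:
        return Response.ADVISE
    return Response.MONITOR

# A detected deviation with the pilot still in the loop only yields advice.
print(choose_response(deviation=True, hazard_imminent=False, pilot_responding=True))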
Conclusion
• Failure of socio-technical systems (STSs) is rarely a mere technical issue
• The automation’s decisions must be understood by humans
• The glass cockpit has reduced the number of tasks but has increased their complexity: the cognitive demand has only shifted
• Cognitive mismatches can be facilitated by computerised environments because of the opacity of the automation’s decision rules
This is the end of the talk
Questions?