Recommended Practices for Human Factors Evaluation Development
Process for Advanced Avionics
Lisette Lyne and R. John Hansman
MIT International Center for Air Transportation
Department of Aeronautics & Astronautics
Massachusetts Institute of Technology
Cambridge, MA 02139 USA
July 2001
Table of Contents
1. Introduction
2. Objectives
3. Recommendations
   3.1 Integrated Human Centered Systems Approach
      3.1.1 Model the System and Operator (or Operators) as a Closed Loop Feedback Process
      3.1.2 Determine the Information that the Operator Requires to Perform the Task
      3.1.3 Use the Information Requirements to Determine the Display/Automation Requirements
      3.1.4 Develop Prototype Systems
      3.1.5 Perform Simulation Evaluations
      3.1.6 Integrated Simulation Testing
      3.1.7 System Evaluation
      3.1.8 Field Development Phase
   3.2 Operator Directed Process
      3.2.1 Functional Analysis
      3.2.2 Automation Model
      3.2.3 Incremental Instantiation
      3.2.4 Training Material
      3.2.5 Certification
      3.2.6 Configuration Management
4. Recommended Practices for Fulfilling FAR Part 23 Regulations
   4.1 Human Factors Requirements
   4.2 Proposed Human Factors Evaluation Plan
5. Other Issues/Comments
   5.1 Limitation of Using Low Fidelity Simulators to Capture Accurate Performance Data
   5.2 Evaluation Measures
6. References
Appendix A - Guidance for Reviewing Certification Plans to Address Human Factors for Certification of Transport Airplane Flight Decks (Policy Statement Number ANM-99-2)
Appendix B – Human Factors in Certification Presentation
Appendix C - Application Guide for Obtaining a Supplemental Type Certificate (AC 21-40)
1. Introduction
Advanced avionic systems are currently being developed for use in general aviation aircraft. The
avionics include both primary flight displays and multi-functional displays. In order to support
the human factors development of such displays, a research project was undertaken to review
current FAA guidelines relating to human factors requirements necessary for certification, and
other relevant FAA documentation. FAR Part 23 is commonly used for certifying avionics for
general aviation aircraft. Specifically, Part 23 is used for normal, utility, acrobatic, and commuter
category aircraft. Part 23 was reviewed and the human factors requirements are listed in this
document. The human factors requirements are presented in the regulations in very general
terms, and focus primarily on presentation of warning information, location of instruments,
visibility of instruments, pilot workload, and warning, caution, and advisory light color schemes.
Little guidance is given in the regulations on how to measure, test and satisfy the human factors
related regulations.
A variety of FAA documents were reviewed in order to find further guidance for developing a
human factors evaluation plan. The list of documents reviewed can be found in the reference
section. The “Human Factors and Operations Checklist for Standalone GPS Receivers” [8] gives
a very detailed series of bench tests and flight tests to be performed, including detailed test
procedures and evaluation considerations. Although this document provides valuable test
procedures, methods of evaluation and measures, the measures and procedures are targeted at one
piece of equipment and have not been developed in a way that can be generalized. The document
also considers the human factors testing as a final task to be performed after the unit has been
designed and built, rather than an on-going process during the design and development stages.
The most comprehensive document identified was “Guidance for Reviewing Certification Plans
to Address Human Factors for Certification of Transport Airplane Flight Decks” by Sharon Hecht
of the Northwest Mountain Region [9]. The purpose of the document is to provide guidance to
FAA certification teams that will enable them to conduct an effective review of an applicant’s
Human Factors Certification Plan, submitted at the beginning of a certification application. The
document outlines nine steps, which provide the suggested format for a certification plan.
Sections 3 and 4 of the document provide an outline of the certification requirements and
methods of compliance using the example of FAR 25. Various methods of compliance are
suggested such as engineering evaluations or analyses, mock-up evaluations, part-task
evaluations, simulator evaluations and in-flight evaluations. Appendix A of the document also
provides discussions on how the human factors requirements specified in FAR Part 25 can be
satisfied. Although the document provides a more detailed description on how to satisfy the
certification requirements for Part 25, the descriptions lack specific guidance and the document
does not assist the developer in feeding human factors considerations into the design and
development phases.
This document is intended to be a working document and will be updated as further relevant
documentation is found. The current state of the document reflects the compilation of material
found as a result of an initial literature review. There may however be other relevant documents
that have not been included due to time constraints. This information will be incorporated in
future updates.
2. Objectives
In the absence of a comprehensive FAA human factors guide, this document has been compiled
with two aims. The first is to provide the user with an approach to incorporate human factors into
the design and development phases, by using an integrated human centered systems approach and
an Operator Directed Process. The second is to provide a suggested evaluation plan in
accordance with Part 23 regulations, which can be partially or fully implemented during the
various phases as suggested by the integrated human centered systems approach. The overall aim
of performing human factors evaluations and incorporating the results into the design and
development phase is to ensure the equipment is developed in accordance with the following
principles: to meet the intended function, prevent designs that are susceptible to misuse, identify
usability issues, minimize input error and enhance safety.
3. Recommendations
This section provides a detailed description of two methods of incorporating human factors into
the design and development phases of a product. The first is the Integrated Human Centered
Systems approach and the second is the Operator Directed Process, the latter of which focuses
primarily on complexity.
3.1 Integrated Human Centered Systems Approach
The Integrated Human Centered Systems Approach applies known techniques of human centered
design but maintains a Systems Engineering methodology in the development process. The
human is considered as a functional component of the closed loop information system. System
level trades are considered to evaluate the allocation of capability and responsibility between the
human and other components of the information systems such as the sensors, displays, or
automation systems.
A key element of the integrated approach is practical consideration for the actual operating
environment. Many proposed information system elements that look good on paper, in theory,
and in static models fail in dynamic operation. A simple example would be a decision aid that
does not consider real-world behavior such as the variability in pilot response time to controller
instructions or the possibility of a blocked communication.
The key steps in the Integrated Human Centered Systems Approach are described below.
3.1.1 Model the System and Operator (or Operators) as a Closed Loop Feedback Process
The first step in the process is to create a model of the system with the operators represented as
single elements, or as more complicated subsystems if necessary. These elements will have
inputs in the form of sensory data, and outputs in the form of control actions on various other
system elements. Fig. 1 shows an example of a closed loop model of a portion of the current
aerospace system. In this model, Air Traffic Control, the Pilot and the Airline Operations Center
(AOC) are all considered functional elements.
[Figure: a closed loop block diagram linking ATC (flight strips, decision aids, ATC displays), the Pilot, and the AOC (Airline Operations Center) to the aircraft. ATC issues initial clearances, vectors, and flight plan amendments to the Pilot by voice; the AOC communicates via ACARS (datalink); the Pilot commands the Flight Management Computer through the CDU and MCP (trajectory and state commands) and flies through manual control, autopilot, and autothrust; aircraft state is fed back to the Pilot through navigation and cockpit displays, and to ATC through surveillance (en route: 12.0 s, terminal: 4.2 s, ADS: 1 s).]
Fig. 1. Simple Closed Loop Model of the ATM System.
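To make the modeling step concrete, the sketch below (Python) represents a simplified version of the Fig. 1 loop as functional elements connected by links with latencies. The element names and the zero-latency links are illustrative assumptions; only the en-route surveillance update interval is taken from the figure.

from dataclasses import dataclass, field

@dataclass
class Element:
    """A functional element of the closed-loop system (human or machine)."""
    name: str
    outputs: dict = field(default_factory=dict)  # downstream element -> link latency (s)

def build_atm_model() -> dict:
    """Return the elements of a simplified ATM loop, keyed by name."""
    elements = {name: Element(name) for name in
                ["ATC", "Pilot", "AOC", "FMC", "Aircraft", "Surveillance"]}
    elements["ATC"].outputs["Pilot"] = 0.0               # voice vectors and clearances
    elements["AOC"].outputs["FMC"] = 0.0                 # ACARS datalink (latency not modeled)
    elements["Pilot"].outputs["FMC"] = 0.0               # CDU/MCP commands
    elements["FMC"].outputs["Aircraft"] = 0.0            # autopilot/autothrust
    elements["Aircraft"].outputs["Surveillance"] = 12.0  # en-route radar update (from Fig. 1)
    elements["Surveillance"].outputs["ATC"] = 0.0        # ATC displays
    return elements

def loop_latency(elements: dict, path: list) -> float:
    """Sum the link latencies along one feedback path through the loop."""
    return sum(elements[a].outputs[b] for a, b in zip(path, path[1:]))

if __name__ == "__main__":
    model = build_atm_model()
    path = ["ATC", "Pilot", "FMC", "Aircraft", "Surveillance", "ATC"]
    print(f"{' -> '.join(path)}: {loop_latency(model, path):.1f} s")

A representation of this kind makes it easy to ask closed-loop questions early, for example how stale the surveillance picture driving a controller instruction can be.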
3.1.2 Determine the Information that the Operator Requires to Perform the Task
Information requirements are defined by the inputs necessary for the operator to perform and
manage the necessary tasks. Typically a functional analysis and time line analysis of the
operation are conducted to determine a base set of information requirements. For evolutionary
systems it is more applicable to identify key issues and obtain operational insight by conducting
focused interviews and surveys of operators currently using similar systems.
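A minimal sketch of such a time-line analysis is shown below in Python. Each task carries a time window and the information items the operator needs for it; from these, a base information set and the most heavily loaded instant can be derived. The task names, times, and information items are hypothetical.

from typing import NamedTuple

class Task(NamedTuple):
    name: str
    start: float            # seconds from scenario start
    end: float              # seconds
    info_needed: frozenset  # information items the operator must have

TASKS = [
    Task("monitor flight path", 0, 120, frozenset({"attitude", "altitude", "airspeed"})),
    Task("respond to ATC vector", 30, 45, frozenset({"heading", "assigned heading"})),
    Task("configure for approach", 90, 120, frozenset({"flap setting", "target speed"})),
]

def information_requirements(tasks):
    """Union of all information items needed across the task set."""
    required = set()
    for task in tasks:
        required |= task.info_needed
    return required

def peak_concurrency(tasks, step=1.0):
    """Return the time and task count of the most heavily loaded instant."""
    horizon = max(task.end for task in tasks)
    best_time, best_count = 0.0, 0
    t = 0.0
    while t <= horizon:
        n = sum(1 for task in tasks if task.start <= t < task.end)
        if n > best_count:
            best_time, best_count = t, n
        t += step
    return best_time, best_count

if __name__ == "__main__":
    print("Base information set:", sorted(information_requirements(TASKS)))
    when, n = peak_concurrency(TASKS)
    print(f"Peak of {n} concurrent tasks at t = {when:.0f} s")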
3.1.3 Use the Information Requirements to Determine the Display/Automation Requirements
Once the information requirements have been identified, the functional requirements for the
display can be derived. These requirements often highlight issues that will need to be dealt with
before continuing. It is useful at this stage to identify information which may be unobservable or
difficult to display, and perhaps re-examine its impact on task requirements.
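This step can be framed as a traceability check, sketched below with hypothetical information requirements and display elements: every required item must be covered by some display element, and items known to be unobservable are flagged for re-examination against the task requirements.

# All requirement and display-element names below are hypothetical examples.
INFORMATION_REQUIREMENTS = {"attitude", "altitude", "airspeed", "traffic",
                            "wake vortex position"}
UNOBSERVABLE = {"wake vortex position"}  # cannot be sensed or displayed directly

DISPLAY_ELEMENTS = {
    "attitude indicator": {"attitude"},
    "altitude tape": {"altitude"},
    "airspeed tape": {"airspeed"},
    "traffic overlay": {"traffic"},
}

def check_coverage(requirements, elements, unobservable):
    """Return items no display provides, and unobservable items to re-examine."""
    displayed = set().union(*elements.values())
    missing = requirements - displayed - unobservable
    return missing, requirements & unobservable

if __name__ == "__main__":
    missing, revisit = check_coverage(INFORMATION_REQUIREMENTS,
                                      DISPLAY_ELEMENTS, UNOBSERVABLE)
    if missing:
        print("No display element provides:", sorted(missing))
    if revisit:
        print("Unobservable; re-examine task impact:", sorted(revisit))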
3.1.4 Develop Prototype Systems
Based on the results of the information requirements analysis and an assessment of technological
capability, prototype information systems are developed to explore various system options and to
address issues. The systems are typically developed on rapid prototyping part-task simulators
based on graphical workstations which allow easy exploration of different system options at the
cognitive level. In many cases, fundamental issues are identified and resolved in the prototyping
process when the degree of fidelity is matched to the functional requirements of the task.
3.1.5 Perform Simulation Evaluations
Iterative simulation evaluations of prototype information system options are conducted using
controller subject populations. Both performance metrics and subjective metrics are used for
evaluation purposes.
3.1.6 Integrated Simulation Testing
For some development systems, it is necessary to run more complex simulation studies to
investigate the interaction dynamics between multiple agents (controllers & pilots). This can be
conducted in distributed simulation facilities with combinations of real and simulated systems.
3.1.7 System Evaluation
Based on the result of the simulation evaluations, system level assessments are conducted with
regard to the potential benefits and impact of the information system. This would include
development requirements, system effectiveness, safety implications and cost-benefit analysis.
3.1.8 Field Development Phase
For those systems which have favorable cost-benefit profiles, preliminary systems are developed
for field studies with live controllers and ultimately live aircraft. The results of these field studies
are used to develop system specifications which are used to procure operational systems. In some
cases it has been found beneficial to transition the technology into the field in incremental stages.
The technology is first introduced to operating central facilities in a non-interfering “shadow
mode”. Once operational issues have been resolved and controller acceptance has been obtained
in this manner, the technology is more easily incorporated into operational facilities. This method
was used in the initial implementation of TCAS.
3.2 Operator Directed Process
The Operator Directed Process is another process for developing systems that use
automation with which humans interact. This process allows consideration of the human
operator early in the development process. This process is similar to the Integrated Human
Centered Systems Approach, but is expanded to incorporate training material, certification and
configuration management.
[Figure: a waterfall flow from Functional Analysis to Automation Model to Training Material to Software Specification to Software to Certification to Configuration Management. Annotations: the Automation Model is derived from the Functional Analysis and from operator and expert user input; training material and procedures are derived from the Automation Model, and a Training Representation is created; the software specification is derived from the training material, and specification changes must be consistent with the Automation Model; the system is certified against the Automation Model; Configuration Management verifies and maintains consistency with the Automation Model.]
Figure 2: Operator Directed Process (Waterfall Model)
The Operator Directed Process (ODP) is shown schematically in Figure 2. The major difference
in this process is that the training material is the source of the system specification rather than
vice versa. Developing training material early forces consideration of fundamental issues in
human-machine interaction early in the process. This contrasts with existing development cycles
that use training material to document system design. The intent is to develop a less error-prone
and more understandable system by requiring consistency between the training material,
procedural usage, and the software, and by limiting the complexity of the system through the
articulation of a model for the operator. This enables the explicit consideration of the human
operator early in the development process.
An iterative version of the ODP is shown in Figure 3.
Figure 3: Operator Directed Process (Iterative Model)
3.2.1 Functional Analysis
The first stage of this process is to determine the functionality that the automation system
requires. This analysis needs to be based on the existing environment in which the automation
must function and the anticipated operational and procedural usage of the automation. Several
other researchers have published work to guide this process (Boy 1998, Vicente 1999).
3.2.2 Automation Model
The key element of the Operator Directed Process is the creation of an Automation Model suitable
for the pilot. It is derived based on the functional analysis and input from current design
engineers, operators, and expert users. This is a representation of the automation which can be
articulated and used operationally by the pilot and is a necessary construct for effective
monitoring. The purpose of creating this model early in the process is to use it to limit the
complexity of the automation, either by limiting the behaviors and functionality of the system, or
by consistently abstracting the system at a higher level. This model is intended to be a high level
description of the system which captures the philosophical and design goals which lead to
specific design criteria at more detailed levels. The primary goal of the automation model is that
it must be capable of describing and explaining all the behaviors of the system that matter, and all
of the derived operational procedures.
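That coverage goal can be expressed as a simple set check, sketched below with hypothetical mode and procedure names: any behavior of the implemented system, or step of a derived procedure, that the Automation Model cannot describe is a gap to be closed either by simplifying the system or by enriching the model.

# Mode, behavior, and procedure names below are hypothetical examples.
AUTOMATION_MODEL_BEHAVIORS = {
    "hold altitude", "capture altitude", "track heading", "fly approach",
}
SYSTEM_BEHAVIORS = {
    "hold altitude", "capture altitude", "track heading", "fly approach",
    "revert to basic pitch mode",   # implemented, but absent from the model
}
PROCEDURE_STEPS = {"capture altitude", "fly approach"}

def unexplained(model, system, procedures):
    """Behaviors or procedure steps the operator's model cannot explain."""
    return (system - model) | (procedures - model)

if __name__ == "__main__":
    gaps = unexplained(AUTOMATION_MODEL_BEHAVIORS, SYSTEM_BEHAVIORS,
                       PROCEDURE_STEPS)
    # Any gap means either simplifying the system or enriching the model.
    print("Unexplained behaviors:", sorted(gaps) or "none")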
3.2.3 Incremental Instantiation
The waterfall model of system design consists of a linear set of steps, which are followed to
create a product. The waterfall model flows information and design considerations “downstream”
to be dealt with by the next stage. The major stages of this process are needs analysis and
specification, design, implementation, testing, and maintenance and upgrades. Similarly, for
software, these typically consist of the creation of functional analysis, followed by a software
specification, system instantiation and finally the development of documentation. Boehm (1983)
has shown that this development approach is inappropriate because of the unknowns in the
development process, which require systems to be designed without complete understanding of
the problem to be solved or of its solution. By contrast,
the Operator Directed Process utilizes Boehm’s (1983) “spiral” model. This consists of a series of
repeating stages of iteration, where updates are made to an operational prototype of the final
system. In Figure 3, the sections are delineated by gray boxes to indicate that these encompass the
necessarily iterative stages of design and require human-in-the-loop testing. It is recognized that
in order to effectively design, document and evaluate early revisions of a system, it may be
necessary to create and evaluate prototypes in a manner consistent with the spiral model. The
reverse arrows shown in Figure 3 show the manner in which “downstream” events can impact
earlier stages and result in another iterative cycle. Determining when to iterate is dependent on
the size of the system. Simple systems may be able to be validated by inspection, whereas more
sophisticated systems may require full simulations in order to determine their effectiveness.
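The skeleton below sketches this iterative control flow in Python. The evaluate and revise callables stand in for human-in-the-loop testing and upstream redesign; they are placeholders for illustration, not part of the ODP itself.

def spiral_development(prototype, evaluate, revise, max_cycles=10):
    """Iterate evaluate/revise; return the prototype and the cycle count."""
    for cycle in range(1, max_cycles + 1):
        findings = evaluate(prototype)           # e.g., human-in-the-loop test
        if not findings:
            return prototype, cycle              # validated; exit the spiral
        prototype = revise(prototype, findings)  # feed findings back upstream
    raise RuntimeError("Prototype not validated within the cycle budget")

if __name__ == "__main__":
    # Toy usage: findings shrink each cycle until the design is clean.
    proto, cycles = spiral_development(
        prototype={"open_issues": 3},
        evaluate=lambda p: ["issue"] * p["open_issues"],
        revise=lambda p, findings: {"open_issues": p["open_issues"] - 1},
    )
    print(f"Validated after {cycles} cycles")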
3.2.4 Training Material
One of the concerns is that any complex engineering model may not be an appropriate
representation for pilots. To overcome this issue, the ODP derives training material from the
automation model. This derivation assures that the proposed automation system can be
presented in a form amenable to training. The training material description of the system can then
be presented to pilots for feedback. In designing this process, few limitations have been placed on
the form or content of the training material. Rather than attempting to prescriptively specify the
form, structure, or nature of the training material, the goal is to explicitly require the
consideration of the specifics of knowledge transfer to the pilot. Domain-specific training experts
are likely to have an understanding of the appropriate material and how it should be presented.
For some applications the presentation of a structural model of the system may be sufficient
training. For others, a detailed explanation of how the system is to be used procedurally in various
operational scenarios may be more appropriate (Sherry 1999, Leveson 1998).
3.2.5 Certification
The current aircraft certification processes were originally designed for the mechanical and
electrical aspects of aircraft airframes. This approach has been successful, as shown by the
reduction in airframe-related incidents. Unfortunately, it does not appear that the approach is as
effective in the fields of software design or human factors, likely due to their implicit complexity.
The human factors aspects of certification have been recognized as being inadequate: “Current
standards for type certification and operations have not kept pace with changes in technology and
increased knowledge about human performance. For example, flight crew workload is the major
human performance consideration in existing Part 25 regulations; other factors should be
evaluated as well, including the potential for designs to induce human error and reduce flight
crew situation awareness” (FAA, 1996). Currently, certification authorities do not have the means
or criteria available to require aircraft designers to create systems which address human factors
issues. With the exception (noted above) of workload issues, certification authorities do not have
the means to conduct an evaluation of human factors issues early in design. This has resulted in
the evaluation of aircraft flight decks being conducted during flight tests when a design is nearly
finalized at the end of the development cycle. At this stage, changes are both expensive and
difficult to make. After design is completed, flight-testing is also able to consider human factors
issues. However, if problems are found at this stage, it is again too expensive to change the
automation, and procedures are often designed to compensate. By imposing a process-oriented
solution, it may be possible to minimize the use of procedures to fix design vulnerabilities.
3.2.6 Configuration Management
The concerns outlined above are focused on the type certification of aircraft and components. This
refers to the certification of initial equipment from the primary manufacturer. A secondary
concern is that changes made to the system need to be approved under a “Supplemental Type
Certificate” (STC). Any individual or company can apply to modify an existing type-certified
airplane through the STC process, but may not be aware of the design decisions made by the
original manufacturer. The “philosophy” of the flight deck, the operating assumptions, and other
consistencies designed into the system are not currently documented as part of the certification
process, and so cannot be considered during the STC process. As such, it is possible to approve
a flight deck modification that is not consistent with the original manufacturer’s design. This
lack of “rationale capture” is a concern in current aircraft and certification processes. The basis
for design decisions is not documented during development, nor is it required by certification.
This lack of documentation makes it difficult for inconsistencies to be discovered and evaluated
by regulatory agencies, and for the underlying basis for design to be used when upgrades and
changes are made to these systems. If the Automation Model can be captured during initial design
and made explicit to parties who modify aircraft, it may be possible to maintain more consistent
systems through the life cycle of the systems (Littman, 1987).
4. Recommended Practices for Fulfilling FAR Part 23 Regulations
This section outlines a number of human factors requirements as outlined in FAR Part 23 and
proposes a five step evaluation plan consisting of engineering evaluations and analyses, mock-up
evaluations, part-task evaluations, simulator evaluations and in-flight evaluations.
4.1 Human Factors Requirements
Table 1 lists human factors requirements from FAR Part 23. The discussion accompanying each requirement is drawn from the corresponding FAR Part 25 material in “Guidance for Reviewing Certification Plans to Address Human Factors for Certification of Transport Airplane Flight Decks” by Sharon Hecht [9].

General Human Factors Requirements

FAR Section: 23.1309(b)(3)
Requirement: Warning information must be provided to alert the crew to unsafe system operating conditions and to enable them to take appropriate corrective action. Systems, controls, and associated monitoring and warning means must be designed to minimize crew errors that could create additional hazards.

FAR Section: 23.1321(a)
Requirement: Each flight, navigation, and power plant instrument for use by any required pilot during takeoff, initial climb, final approach, and landing must be located so that any pilot seated at the controls can monitor the airplane’s flight path and these instruments with minimum head and eye movement. The power plant instruments for these flight conditions are those needed to set power within power plant limitations.
Part 25 Discussion: The applicant may wish to perform analyses of the visual angles to each of the identified instruments. Final assessments of the acceptability of the visibility of the instruments may require a simulator with a high degree of geometric fidelity and/or the airplane.

FAR Section: 23.1321(e)
Requirement: If a visual indicator is provided to indicate malfunction of an instrument, it must be effective under all probable cockpit lighting conditions.
Part 25 Discussion: Demonstrations and tests intended to show that these indications of instrument malfunctions, along with other indications and alerts, are visible under the expected lighting conditions will typically employ the use of production quality hardware and careful control of lighting conditions (e.g., dark, bright forward field, shafted sunlight). Simulators and aircraft are often used, although supporting data from laboratory testing may also be useful.

FAR Section: 23.1523
Requirement: The minimum flight crew must be established so that it is sufficient for safe operation considering: [(a) The workload on individual crewmembers and, in addition for commuter category airplanes, each crewmember workload determination must consider the following: (1) Flight path control, (2) Collision avoidance, (3) Navigation, (4) Communications, (5) Operation and monitoring of all essential airplane systems, (6) Command decisions, and (7) The accessibility and ease of operation of necessary controls by the appropriate crewmember during all normal and emergency operations when at the crewmember flight station.] (b) The accessibility and ease of operation of necessary controls by the appropriate crewmember; and (c) The kinds of operation authorized under Sec. 23.1525.
Part 25 Discussion: The applicant may choose to use workload analyses (such as time-line analysis) to evaluate certain workload issues. Other evaluations of workload typically involve trained pilots in either a high fidelity simulation or in actual airplanes. There are a number of possible workload assessment techniques that can be successfully employed. An efficient means for selecting test conditions is to focus on those operational and/or failure scenarios that are likely to result in the highest workload conditions. Dispatch under the Minimum Equipment List (MEL) also should be considered, in combination with other failures that are likely to result in significantly increased workload. Since no objective standard for workload is available, applicants may wish to compare the workload in the new/modified airplane with that in a well-understood, previously certified airplane.

FAR Section: 23.1543
Requirement: For each instrument: (a) When markings are on the cover glass of the instrument, there must be means to maintain the correct alignment of the glass cover with the face of the dial; and (b) Each arc and line must be wide enough and located to be clearly visible to the pilot. [(c) All related instruments must be calibrated in compatible units.]
Part 25 Discussion: The applicant may choose to use computer modeling to provide preliminary analysis showing that there are no visual obstructions between the pilot and the instrument markings. Where head movement is necessary, such analyses also can be used to measure its magnitude. Other analysis techniques can be used to establish appropriate font sizes, based on research-based requirements. Mock-ups also can be helpful in some cases. The data collected in these analyses and assessments can be used to support final verification in the flight deck, using subjects with vision that is representative of the pilot population, in representative lighting conditions.

System-Specific HF Requirements

FAR Section: 23.1381(b)
Requirement: The instrument lights must be installed so that their direct rays, and rays reflected from the windshield or other surface, are shielded from the pilot’s eyes.
Part 25 Discussion: The applicant may be able to develop analytical techniques that identify potential sources of glare and reflections, as a means for reducing the risk of problems identified after the major structural features have been committed. Mock-ups also may be a useful means for early assessments. However, analysis results typically must be verified in an environment with a high degree of geometric and optical fidelity. Both internal (e.g., area and instrument lighting) and external (e.g., shafting sunlight) sources of reflection should be considered.

Specific Crew Interface Requirements

FAR Section: 23.1322
Requirement: If warning, caution, or advisory lights are installed in the cockpit, they must, unless otherwise approved by the Administrator, be: (a) Red, for warning lights (lights indicating a hazard which may require immediate corrective action); (b) Amber, for caution lights (lights indicating the possible need for future corrective action); (c) Green, for safe operation lights; and (d) Any other color, including white, for lights not described in paragraphs (a) through (c) of this section, provided the color differs sufficiently from the colors prescribed in paragraphs (a) through (c) of this section to avoid possible confusion. [(e) Effective under all probable cockpit lighting conditions.]
Part 25 Discussion: Compliance with this requirement is typically shown by a description of each of the warning, caution, and advisory lights. Evaluations may also be useful to verify the chromaticity (e.g., red looks red, amber looks amber) and discriminability (i.e., colors can be distinguished reliably from each other) of the colors being used, under the expected lighting levels. These evaluations can be affected by the specific display technology being used, so final evaluation with flight quality hardware is sometimes needed. A description of a well-defined color-coding philosophy that is consistently applied across flight deck systems can be used to show how the design avoids “possible confusion.”

Table 1: Human Factors Requirements
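As a worked example of the visual-angle analysis suggested for 23.1321(a), the sketch below computes the angle between the pilot's forward line of sight and each instrument from a design eye reference point. The cockpit geometry and the 35 degree acceptance threshold are assumed values for illustration, not figures from the regulation.

import math

EYE_POINT = (0.0, 0.0, 0.0)   # design eye reference point (m), assumed
FORWARD = (1.0, 0.0, 0.0)     # out-the-window line of sight

INSTRUMENTS = {               # instrument face centers (m), hypothetical layout
    "primary flight display": (0.70, 0.00, -0.25),
    "engine instruments":     (0.70, 0.30, -0.30),
}

def visual_angle_deg(eye, forward, point):
    """Angle (deg) between the forward view vector and the eye-to-point vector."""
    v = tuple(p - e for p, e in zip(point, eye))
    dot = sum(a * b for a, b in zip(forward, v))
    norms = (math.sqrt(sum(a * a for a in forward))
             * math.sqrt(sum(a * a for a in v)))
    return math.degrees(math.acos(dot / norms))

if __name__ == "__main__":
    for name, position in INSTRUMENTS.items():
        angle = visual_angle_deg(EYE_POINT, FORWARD, position)
        flag = "ok" if angle <= 35.0 else "review"  # 35 deg is an assumed threshold
        print(f"{name}: {angle:.1f} deg ({flag})")

As the Part 25 discussion notes, an analysis of this kind supports preliminary layout decisions only; final acceptability still requires a geometrically faithful simulator or the airplane.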
4.2 Proposed Human Factors Evaluation Plan
Engineering Evaluations or Analyses
(These assessments can involve a number of techniques, including procedure evaluations
(complexity, number of steps, nomenclature, etc.); reach analysis via computer modeling;
time-line analysis for assessing task demands and workload; or other methods, depending on the
issue being considered.)
All requirements have been taken from Part 23 except basic functionality and identification
testing.
Regulation: Basic functionality and identification testing
Requirement: Information must be easily obtained from the display and accurately input.
Test: Poster boards, tree diagrams outlining functionality, natural mappings, etc.

Regulation: Equipment Location
Requirement: Flight, navigation and power plant instruments must be located so the pilot can monitor flight path and instruments with minimum head and eye movements.
Test: Ergonomic computer modeling.

Regulation: Instrument Reflections
Requirement: The pilot's eyes must be shielded from direct and reflected rays of the instruments.
Test: Computer modeling of direct and reflected rays.

Table 2: Engineering Evaluations and Analysis
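The computer modeling of direct and reflected rays listed in Table 2 can be illustrated as follows: a ray from an instrument light is specularly reflected off a flat windshield panel and tested for how close it passes to the pilot's eye point. The geometry, the flat-panel approximation, and the 0.10 m glare tolerance are all assumptions made for this sketch.

import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a): return math.sqrt(dot(a, a))
def unit(a):
    n = norm(a)
    return tuple(x / n for x in a)

def reflect(d, n):
    """Specular reflection of unit direction d about unit surface normal n."""
    return tuple(x - 2 * dot(d, n) * y for x, y in zip(d, n))

def miss_distance(origin, direction, point):
    """Perpendicular distance from a point to the ray (origin, direction)."""
    w = sub(point, origin)
    t = max(dot(w, direction), 0.0)  # clamp: only in front of the origin
    closest = tuple(o + t * d for o, d in zip(origin, direction))
    return norm(sub(point, closest))

if __name__ == "__main__":
    light = (0.6, 0.2, -0.3)          # instrument light position (m), hypothetical
    hit = (0.9, 0.0, 0.4)             # point where its ray meets the windshield
    normal = unit((-0.4, 0.0, -1.0))  # windshield surface normal, hypothetical
    eye = (0.0, 0.0, 0.0)             # design eye reference point
    outgoing = reflect(unit(sub(hit, light)), normal)
    d = miss_distance(hit, outgoing, eye)
    print(f"Reflected ray passes {d:.2f} m from the eye point"
          f" -> {'potential glare' if d < 0.10 else 'clear'}")

As with the visual-angle analysis, this kind of screening reduces risk early; Table 1 notes that results must still be verified in an environment with high geometric and optical fidelity.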
Mock-up Evaluations
These types of evaluations use physical mock-ups of the flight deck and/or components. They are
typically used for assessment of reach and clearance; thus, they demand a high degree of
geometric accuracy.
Regulation: Equipment Location
Requirement: Flight, navigation and power plant instruments must be located so the pilot can monitor flight path and instruments with minimum head and eye movements.
Test: Physical mock-up to assess reach and clearance, measure visual angles to each instrument, and measure head and eye movements required to read/input information into the instruments.

Table 3: Mock-up Evaluations
Part-Task Evaluations
These types of evaluations use devices that emulate (using flight hardware, simulated systems, or
combinations) the crew interfaces for a single system or a related group of systems. Typically
these evaluations are limited by the extent to which acceptability may be affected by other flight
deck tasks.
Regulation: Basic functionality and identification testing
Requirement: Information must be easily obtained from the display and accurately input.
Test: Correctness of information extraction, performance and user input (e.g. no. of errors).

Regulation: Warning Information
Requirement: Warning information must be able to alert crew to unsafe system operating conditions. Warning systems must lead to pilots taking corrective actions and minimize crew errors.
Test: Recognition of alerts, no. of correct actions and no. of errors.

Regulation: Workload
Requirement: Excessive workload associated with flight path control, collision avoidance, navigation, communications, operation and monitoring of all essential airplane systems, and command decisions must be avoided.
Test: Workload analyses for each of the separate systems.

Regulation: Alignment of cover glass
Requirement: Pilots must be able to maintain correct alignment of the cover glass (when markings are on the cover glass of an instrument) with the face of the dial.
Test: Ability to maintain alignment using the instrument; test for no. of correct responses.

Regulation: Visibility of instrument markings
Requirement: Pilots must be able to see clearly all markings on an instrument.
Test: Ability to see markings with varying lighting conditions, head positions, etc.; test for no. of correct responses.

Regulation: Visibility and color coding of warning, cautionary and advisory lights
Requirement: Ensure adherence of warning, cautionary and advisory lights to the following color-coding scheme: red (warning), amber (caution) and green (safe). Lights must be visible under all probable lighting conditions; verify chromaticity and discriminability.
Test: Ability to see, identify and discriminate lights under all probable lighting conditions. Test no. of correct responses.

Table 4: Part-Task Evaluations
Simulator Evaluations
These types of evaluations use devices that present an integrated emulation (using flight
hardware, simulated systems, or combinations) of the flight deck and the operational
environment. They also can be “flown” with response characteristics that replicate, to some
extent, the responses of the airplane. Typically, these evaluations are limited by the extent to
which the simulation is a realistic, high fidelity representation of the airplane, the flight deck, the
external environment, and crew operations. The types of pilots (test, instructor, airline) used in
the evaluations and the training they receive may significantly affect the results and their utility.
Regulation: Basic functionality and identification testing
Requirement: Information must be easily obtained from the display and accurately input.
Test: Correctness of information extraction, performance and user input (e.g. no. of errors) of all systems, tested during a real flight scenario.

Regulation: Warning Information
Requirement: Warning information must be able to alert crew to unsafe system operating conditions. Warning systems must lead to pilots taking corrective actions and minimize crew errors.
Test: Recognition of alerts, no. of correct actions and no. of errors, tested using all systems during a real flight scenario.

Regulation: Visual Malfunction
Requirement: Pilots must be able to see visual malfunction indicators under all probable cockpit lighting conditions.
Test: No. of correct identifications of the visual malfunction indicator with varying lighting conditions.

Regulation: Workload
Requirement: Excessive workload associated with flight path control, collision avoidance, navigation, communications, operation and monitoring of all essential airplane systems, and command decisions must be avoided.
Test: Workload analyses during a real flight scenario.

Regulation: Alignment of cover glass
Requirement: Pilots must be able to maintain correct alignment of the cover glass (when markings are on the cover glass of an instrument) with the face of the dial.
Test: No. of correct instrument readings during a real flight scenario.

Regulation: Visibility of instrument markings
Requirement: Pilots must be able to see clearly all markings on an instrument.
Test: Ability to see markings with varying lighting conditions and head positions during a real flight scenario.

Regulation: Visibility and color coding of warning, cautionary and advisory lights
Requirement: Ensure adherence of warning, cautionary and advisory lights to the following color-coding scheme: red (warning), amber (caution) and green (safe). Lights must be visible under all probable lighting conditions; verify chromaticity and discriminability.
Test: Ability to see, identify and discriminate lights under all probable lighting conditions, during a real flight scenario.

Table 5: Simulator Evaluations
In-Flight Evaluations
These types of evaluations use the actual airplane. Typically, these evaluations are limited by the
extent to which the flight conditions of particular interest (e.g., weather, failures, unusual
attitudes) can be located/generated and then safely evaluated in flight. The types of pilots (test,
instructor, airline) used in the evaluations and the training they receive may significantly affect
the results and their utility.
In-flight evaluations follow the same format as simulator evaluations and additionally include
tests to ensure shielding of pilots’ eyes from direct and reflected rays and suitability of equipment
location.
5. Other Issues/Comments
During the past five months, a series of human factors evaluations was performed using both
EFIS and HITS displays. As a result of these studies, the following issues/problems were
identified.
5.1 Limitation of Using Low Fidelity Simulators to Capture Accurate Performance Data
Low fidelity simulators can be constructed quickly and at low cost by using a computer monitor,
joystick, and off-the-shelf software packages, such as Microsoft Flight Simulator. However, the
simulator results may not be a good representation of the display/instrument under investigation.
The validity of results depends on the measures being taken during the experiment. Performance
data from a low fidelity simulator may provide a poor representation of the display due to
inaccurate flight dynamics or poor or slow-response flight controls. A high fidelity simulator or
in-flight testing is highly recommended for performance measures. However, other measures such
as situational awareness can be accurately obtained using a low fidelity simulator. The advantage
of using a simulator rather than in-flight studies is the ability to control the environment and limit
the number of variables.
5.2 Evaluation Measures
Human factors evaluations require a number of different measures in order to obtain a true
understanding of the display’s strengths and weaknesses. Performance data provides evidence of
the pilot’s ability to use the display (provided a high fidelity simulator is used). Subjective data
is another valuable method of data collection, as this gives the opinion of the user, which may
lead to a different conclusion than the performance data. Both types of data should be collected,
synchronized, and analyzed together.
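A minimal sketch of such a combined analysis, with invented data: per-segment error counts (performance) are joined with workload ratings (subjective) collected over the same flight segments, and segments where the two measures diverge are flagged for closer review.

# All segment names, error counts, and ratings below are invented examples.
performance = {  # flight segment -> input errors observed
    "departure": 1, "cruise": 0, "approach": 4,
}
subjective = {   # flight segment -> pilot workload rating (1 low .. 7 high)
    "departure": 3, "cruise": 2, "approach": 3,
}

for segment in performance:
    errors, rating = performance[segment], subjective[segment]
    # A segment with many errors but a low self-reported workload (or the
    # reverse) is exactly the kind of divergence worth investigating.
    divergent = errors >= 3 and rating <= 3
    print(f"{segment}: {errors} errors, workload {rating}"
          + (" <- performance and opinion diverge" if divergent else ""))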
6. References
[1] R. Hansman et al., Integrated Human Centered Systems Approach to the Development of Advanced Cockpit and Air Traffic Management Systems, 16th Digital Avionics Systems Conference, October 1997.
[2] K. M. Joseph and D. W. Jahns, Enhancing GPS Receiver Certification by Examining Relevant Pilot-Performance Databases, Office of Aviation Medicine, February 2000.
[3] S. A. Shappell and D. A. Wiegmann, The Human Factors Analysis and Classification System – HFACS, DOT/FAA/AM-00/7, Office of Aviation Medicine, Washington DC, February 2000.
[4] S. Vakil, Analysis of Complexity Evolution Management and Human Performance Issues in Commercial Aircraft Automation Systems, Thesis submitted to the Massachusetts Institute of Technology, May 2000.
[5] D. A. Wiegmann, A Human Error Analysis of Commercial Aviation Accidents Using the Human Factors Analysis and Classification System, Office of Aviation Medicine, February 2001.
[6] K. W. Williams, Comparing Text and Graphics in Navigation Display Design, Office of Aviation Medicine, February 2000.
[7] Application Guide for Obtaining a Supplemental Type Certificate, Advisory Circular AC 21-40, FAA, US Department of Transportation, May 1998.
[8] FAA Aircraft Certification Human Factors and Operations Checklist for Standalone GPS Receivers (TSO C-129 Class A), Research and Special Programs Administration, Volpe National Transportation Systems Center, April 1995.
[9] Guidance for Reviewing Certification Plans to Address Human Factors for Certification of Transport Airplane Flight Decks, Department of Transportation, FAA, Policy Statement Number ANM-99-2.
[10] Description of the FAA Avionics Certification Process, FAA, Aircraft Certification Service, Aircraft Engineering Division, Avionics Systems Branch, April 23, 1997.
Appendix A - Guidance for Reviewing Certification Plans to Address Human Factors for Certification of Transport Airplane Flight Decks (Policy Statement Number ANM-99-2)

Appendix B – Human Factors in Certification Presentation

Appendix C - Application Guide for Obtaining a Supplemental Type Certificate (AC 21-40)