
16.842
Fundamentals of Systems Engineering
Human-Systems Engineering
V-Model – Oct 23, 2009
[Course topic diagram: Stakeholder Analysis, Systems Engineering Overview, Requirements Definition, Commissioning, Operations, Cost and Schedule Management, System Architecture, Concept Generation, Human Factors, Lifecycle Management, Tradespace Exploration, Concept Selection, Verification and Validation, System Integration, Interface Management, Design Definition, Multidisciplinary Optimization, System Safety]
Traditional Systems Engineering Process Model*
16.842
[Lifecycle diagram: NEED → conceptual/preliminary design → detail design and development → production and/or construction → product use, phaseout, and disposal, spanning the acquisition and utilization phases]
• Operational requirements drive technical performance measures, which drive human factors requirements…
– Human considerations are often low priority
*Blanchard, B. S., & Fabrycky, W. J. (1998). Systems Engineering and Analysis (3rd ed.). Upper Saddle River, NJ: Prentice Hall.
Or stuck out on the periphery…
16.842
[Course topic diagram repeated; Human Factors sits at the periphery]
Results of “Classic” SE Methods
16.842
These images have been removed due to copyright restrictions.
Three Mile Island
16.842
• March 28th, 1979
• Main feedwater pump failure caused the reactor to shut down
• Relief valve opened to reduce pressure but became stuck in the open position
– No indication to controllers
– Valve failure led to a loss of reactor coolant water
• No instrument showed the coolant level in the reactor
• Operators thought the relief valve was closed & the water level was too high
– High stress
– Overrode emergency relief pump
Three Mile Island, cont.
16.842
• System worked as designed, automation worked
correctly
– Confirmation bias: people seek out information to confirm
a prior belief and discount information that does not
support this belief
– Operators selectively filtered out data from other gauges to support their hypothesis that coolant level was too high
This image has been removed due to copyright restrictions.
The Spiral Systems Engineering Process Model*
16.842
[Boehm spiral diagram: repeated cycles of (1) determine objectives, alternatives, constraints; (2) evaluate alternatives, identify and resolve risks (risk analyses, prototypes); (3) develop and verify the next-level product (concept of operation, requirements, design, code, test); (4) plan the next phases, with cumulative cost growing across cycles]
Image by MIT OpenCourseWare.
*Boehm, B. (1988). A Spiral Model of Software Development and Enhancement. Computer, 61-72.
Human Systems Engineering*
16.842
"Vision"
User survey, needs analysis, etc.
Artifact and like-system evaluation
Innovative concepts for next
version release
Installation
Finalize customer support
Conduct on-site assessment
Provide training materials
System/software Reqs.
Feasibility assessment
Performance and usability reqs.
HE labor/budget planning
System/Software
Acquisition Cycle
Needs and task analysis
Storyboards and demonstrations
UI design standards
Detailed Design
Integration & Test
Plan performance tests
Scenario-based assessment
Develop training materials
Preliminary Design
Unit Development
Design tradeoff and workflow analysis
Detailed UI designs and prototypes
On-line help and documentation
HE heuristic review
Define performance and effectiveness criteria
Usability evaluation of prototypes
Image by MIT OpenCourseWare.
*Aptima, Inc. rendition
A Simplified HSE Approach
16.842
[Process diagram: Planning (Mission & Scenario Analysis) → Analysis (Function Analysis, Function Allocation, Task Analysis) → Design (System Design) → Test & Evaluation]
Mission & Scenario Analysis
16.842
• Stakeholder analysis
• Do users always know what they want/need?
– Revolutionary vs. evolutionary systems
• Interviews & observations
• Work process flows
• This is a critical step and should be agreed upon before moving forward
– The most ill-defined but the most important step
A Simplified HSE Approach
16.842
[Simplified HSE process diagram repeated]
Determining function allocation
16.842
• More art than science
• Steps:
– Identify functions
• Use Cases
• Interviews
• Customer requirements
– Identify who is best suited for each function
• Human or automation or shared?
• Static vs. dynamic/adaptive
• Sounds easy!
16.842
Function Allocation via Fitts’ List?
Attribute | Machine | Human
Speed | Superior | Comparatively slow
Power output | Superior in level and consistency | Comparatively weak
Consistency | Ideal for consistent, repetitive action | Unreliable, learning & fatigue a factor
Information capacity | Multi-channel | Primarily single channel
Memory | Ideal for literal reproduction, access restricted and formal | Better for principles & strategies, access versatile & innovative
Reasoning/Computation | Deductive, tedious to program, fast & accurate, poor error correction | Inductive, easier to program, slow, accurate, good error correction
Sensing | Good at quantitative assessment, poor at pattern recognition | Wide ranges, multi-function, judgment
Perceiving | Copes with variation poorly, susceptible to noise | Copes with variation better, susceptible to noise
Hollnagel, 2000
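To make the allocation step concrete, here is a minimal Python sketch of how a Fitts'-list style comparison could be encoded and queried; the attribute keys, the "favored agent" labels, and the voting helper are illustrative assumptions, not part of Fitts' or Hollnagel's formulation.

```python
# Minimal sketch (assumed encoding) of a Fitts'-list lookup for static function
# allocation. Real allocation also weighs cost, context, and shared-control options.

FITTS_LIST = {
    # attribute: (nominally favored agent, rationale from the table above)
    "speed":                ("machine", "superior vs. comparatively slow human"),
    "power_output":         ("machine", "superior in level and consistency"),
    "consistency":          ("machine", "repetitive action; human learning & fatigue a factor"),
    "information_capacity": ("machine", "multi-channel vs. primarily single channel"),
    "memory":               ("either",  "literal reproduction (machine) vs. principles & strategies (human)"),
    "reasoning":            ("either",  "deductive (machine) vs. inductive (human)"),
    "sensing":              ("either",  "quantitative assessment (machine) vs. judgment (human)"),
    "perceiving":           ("human",   "copes with variation better"),
}

def suggest_allocation(dominant_attributes):
    """Tally which agent the list nominally favors for a function's dominant attributes."""
    votes = {"machine": 0, "human": 0, "either": 0}
    for attr in dominant_attributes:
        agent, _rationale = FITTS_LIST[attr]
        votes[agent] += 1
    return max(votes, key=votes.get)

# Hypothetical function: continuous monitoring of many fast sensor channels.
print(suggest_allocation(["speed", "information_capacity", "consistency"]))  # -> machine
```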
To automate or not to automate?
16.842
Function Allocation Criteria
16.842
[Decision chart (Price, 1985): machine performance plotted against human performance, each axis running from unsatisfactory to excellent and divided into six regions; a rough coding of the regions appears at the end of this slide]
1: No difference in the relative capabilities of human & machine.
2: Human performance > machine performance.
3: Machine performance > human performance.
4: Machine performance is so poor that the functions should be allocated to humans.
5: Human performance is so poor that the functions should be allocated to machine.
6: Unacceptable performance by both human and machine.
Three function allocation criteria:
• Balance of value
• Utilitarian & cost-based allocation
• Allocation for affective or cognitive support
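A rough way to encode the six Price regions in software terms is sketched below; the 0-to-1 rating scale and the numeric cut-offs are invented for illustration (Price's chart draws these boundaries graphically, not with fixed thresholds).

```python
# Sketch only: performance ratings assumed to run 0.0 (unsatisfactory) to 1.0 (excellent).
UNACCEPTABLE = 0.1  # assumed cut-off for "unacceptable performance"
SO_POOR = 0.2       # assumed cut-off for "performance is so poor"
TIE_BAND = 0.05     # assumed band treated as "no difference"

def price_region(human: float, machine: float) -> str:
    """Map human and machine performance ratings to one of Price's six regions (sketch)."""
    if human <= UNACCEPTABLE and machine <= UNACCEPTABLE:
        return "6: unacceptable performance by both human and machine"
    if human <= SO_POOR:
        return "5: human performance so poor; allocate to machine"
    if machine <= SO_POOR:
        return "4: machine performance so poor; allocate to human"
    if abs(human - machine) <= TIE_BAND:
        return "1: no difference in relative capabilities"
    return "2: human performance > machine" if human > machine else "3: machine performance > human"

print(price_region(human=0.8, machine=0.3))  # -> "2: human performance > machine"
```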
Sheridan and Verplank’s 10 Levels of Automation of Decision and Action Selection
16.842
Automation Level | Automation Description
1 | The computer offers no assistance: human must take all decisions and actions.
2 | The computer offers a complete set of decision/action alternatives, or
3 | narrows the selection down to a few, or
4 | suggests one alternative, and
5 | executes that suggestion if the human approves, or
6 | allows the human a restricted time to veto before automatic execution, or
7 | executes automatically, then necessarily informs humans, and
8 | informs the human only if asked, or
9 | informs the human only if it, the computer, decides to.
10 | The computer decides everything and acts autonomously, ignoring the human.
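To make the scale concrete in software terms, the sketch below encodes the ten levels as a Python enum; the member names and the approval-check helper are illustrative assumptions, not part of Sheridan and Verplank's formulation.

```python
from enum import IntEnum

class LOA(IntEnum):
    """Sheridan & Verplank levels of automation (1 = manual, 10 = fully autonomous)."""
    NO_ASSISTANCE = 1          # human takes all decisions and actions
    OFFERS_ALTERNATIVES = 2    # complete set of decision/action alternatives
    NARROWS_ALTERNATIVES = 3   # narrows the selection down to a few
    SUGGESTS_ONE = 4           # suggests one alternative
    EXECUTES_IF_APPROVED = 5   # executes the suggestion if the human approves
    VETO_WINDOW = 6            # restricted time for the human to veto before execution
    EXECUTES_THEN_INFORMS = 7  # executes automatically, then necessarily informs the human
    INFORMS_IF_ASKED = 8       # informs the human only if asked
    INFORMS_IF_IT_DECIDES = 9  # informs the human only if it decides to
    FULLY_AUTONOMOUS = 10      # decides everything, ignoring the human

def human_approval_required(level: LOA) -> bool:
    """Hypothetical check: at levels 1-5 the automation does not act without explicit consent."""
    return level <= LOA.EXECUTES_IF_APPROVED

print(human_approval_required(LOA.VETO_WINDOW))  # False: acts unless vetoed in time
```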
The Human-Automation Paradox
16.842
These images have been removed due to copyright restrictions.
Adaptive Automation
16.842
• Dynamic function allocation
• Mode Confusion
– A problem of intent
• Mixed initiative
• Flexible automation
This image of a crashed jet has been removed due to copyright restrictions.
Functional Requirements
16.842
[Simplified HSE process diagram, now with a Functional Requirements box added]
Task Analysis
16.842
[Simplified HSE process diagram repeated]
Task Analysis
16.842
• Taylor
• Determining what an operator must accomplish to meet a mission goal
– Interactions both on a local & system level are critical
– Will contain actions and/or cognitive processes
• Flow process charts, operational sequence diagrams, critical task analysis
– Attempt to understand how a particular task could
exceed human limitations, both physical and cognitive
• Cognitive task analysis
– Shift from system control to systems management.
Cognitive Task Analysis (CTA)
16.842
• Goal: To analyze and represent the knowledge and
cognitive activities needed in complex work domains
• CTA is generally a descriptive modeling technique of workers’ knowledge and cognition
– As opposed to Computational Cognitive Models (CCM)
– Knowledge Elicitation is a central feature
• Experts vs. novices
• Evolutionary systems vs. revolutionary systems
• Background Research
– Standards, procedures, manuals, organizational charts
• Field Studies
– In both real environments and high fidelity simulations
• Questionnaires/Surveys
CTA, Cont.
16.842
• Interviews
– Individuals vs. focus groups
– Critical Incident Technique/Critical Decision Method
• Observations
– Verbal protocols
• Design Reviews
– Usability, Expert, Heuristic
• Problems with CTA
– Labor intensive
– Generate much data that is difficult to analyze
– Gap between CTA and design
– Opportunistic
SRK* Taxonomy of Cognitive Control
16.842
• Skill-Based Behavior (SBB)
– Motor control/automaticity in perception-action cycle
• Rule-Based Behavior (RBB)
– Procedural, stored if-then-else rules that dictate action
• Knowledge-Based Behavior (KBB)
– Serial, analytical reasoning based on a mental model (an internal, symbolic representation of environmental constraints and relationships)
• Which of these should be automated?
• Analyst’s best guess for SRK assignment?
(*Rasmussen, 1976)
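One way an analyst might record SRK best guesses per task is sketched below; the task names and their assignments are hypothetical examples, not from the lecture.

```python
from enum import Enum

class SRK(Enum):
    """Rasmussen's taxonomy of cognitive control."""
    SKILL = "skill-based"          # motor control / automaticity in the perception-action cycle
    RULE = "rule-based"            # stored if-then-else procedures dictate action
    KNOWLEDGE = "knowledge-based"  # serial, analytical reasoning over a mental model

# Hypothetical analyst's best-guess assignments for a few tasks.
task_assignments = {
    "hold manual pitch during the landing flare": SRK.SKILL,
    "run the engine-fire checklist": SRK.RULE,
    "diagnose an unfamiliar sensor disagreement": SRK.KNOWLEDGE,
}

for task, level in task_assignments.items():
    print(f"{level.value:>15}: {task}")
```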
The Need for Iteration
16.842
[Diagram: the simplified HSE process (Planning, Mission & Scenario Analysis, Function Analysis, Function Allocation, Functional Requirements?, Task Analysis, System Design, Test & Evaluation) overlaid on the Boehm spiral model to emphasize iteration. Image by MIT OpenCourseWare.]
Information Requirements
16.842
[Simplified HSE process diagram, now with an Information Requirements box added]
Prototyping
16.842
[Simplified HSE process diagram repeated]
The Spiral Systems Engineering Process Model*
16.842
[Boehm spiral model diagram repeated]
Image by MIT OpenCourseWare.
*Boehm, B. (1988). A Spiral Model of Software Development and Enhancement. Computer, 61-72.
Prototyping & HSE
16.842
• Does design meet functional requirements?
– Will lead to design requirements
– Elegant usability is not the primary focus
• Low fidelity vs. medium/high fidelity
• Feedback & interactivity
– Participatory design
– Wizard of Oz
• Breadth vs. depth
– Front end vs. back end / Horizontal vs. vertical
• DANGER
– Research vs. product
– Decision support design – not cool interface design
Prototyping Fidelity
16.842
[Diagram: prototyping fidelity spectrum from low (paper, storyboarding, PowerPoint) to high (operational simulators, e.g., flight decks and command centers), with Visual Basic, Java, and commercial gaming (MFS) in between; hybrid approaches, participatory design, and human/system performance testing are also indicated]
T&E
16.842
[Simplified HSE process diagram repeated]
Test & Evaluation
16.842
• Concurrent testing
• Human vs. system performance testing
– Simulation
– Demonstrations
• Usability evaluations
– Subjective vs. objective
• Lab testing vs. field testing
• Cost-benefit analysis
– Human trials are expensive!
– Testing in the spiral stages
• Formal experimental design (16.470)
• COUHES
The Spiral Systems Engineering Process Model*
16.842
[Boehm spiral model diagram repeated]
Image by MIT OpenCourseWare.
*Boehm, B. (1988). A Spiral Model of Software Development and Enhancement. Computer, 61-72.
Resources
16.842
• Schraagen, J. M., Chipman, S. F., & Shalin, V. L. (Eds.). (2000). Cognitive Task Analysis. Mahwah, NJ: Lawrence Erlbaum Associates.
• ONR/Aptima Cognitive Task Analysis website
• A Survey of Cognitive Engineering Methods and
Uses
MIT OpenCourseWare
http://ocw.mit.edu
16.842 Fundamentals of Systems Engineering
Fall 2009
For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.