Class Notes - Nancy Leveson

Papers from Week 1
• Flying in Place
• Therac-25 accidents
• Role of Software in Spacecraft Accidents
• Augustine: Yes, but will it work in theory?
• Software and the challenge of flight control
• No Silver Bullet
• Software Lemmingineering
© Copyright Nancy Leveson, June 2012
Therac-25 Factors
• Overconfidence in software
• Inadequate software and system engineering practices
• Confusing reliability with safety
• Lack of defensive design
• Failure to eliminate “root causes”
• Complacency
• Unrealistic risk assessments
• Inadequate investigation or followup on accident reports
• Software reuse
• Safe vs. friendly user interfaces
• Lack of government oversight and standards
Spacecraft Accident Factors
• Culture/System Engineering Flaws
– Overconfidence, complacency, poor risk management for
software (and systems)
– Problems and warning signs unheeded
– Unhandled complexity, ignoring system interaction
problems (assume all failures are random)
• Management
– Diffusion of responsibility, authority, accountability
• Lack of oversight (“insight” vs. “oversight”) (contract
monitoring)
• Faster, better, cheaper
Spacecraft Accidents (2)
• Management (cont’d)
• Inadequate transition from development to operations
– Limited communication channels, poor info flow
• Technical deficiencies
– Inadequate system and software engineering
• Poor or missing specifications (note the Mars Climate Orbiter (MCO) units error)
• Unnecessary complexity and software functionality
• Software reuse and changes without appropriate analysis
• Violation of basic safety engineering practices in digital
components (and misunderstanding differences in failure
modes between software and hardware, e.g., Ariane 5)
Spacecraft Accidents (3)
– Inadequate review activities
– Ineffective system safety engineering
– Flaws in test and simulation environment
– Inadequate human factors design
Introduction to Systems Theory
Ways to cope with complexity
1. Analytic Reduction
2. Statistics
[Recommended reading: Peter Checkland, “Systems
Thinking, Systems Practice,” John Wiley, 1981]
Analytic Reduction
• Divide system into distinct parts for analysis
– Physical aspects → separate physical components
– Behavior → events over time
• Examine parts separately
• Assumes such separation possible:
1. The division into parts will not distort the phenomenon
– Each component or subsystem operates independently
– Analysis results are not distorted when components are considered separately
Analytic Reduction (2)
2. Components act the same when examined singly as when
playing their part in the whole
– or events not subject to feedback loops and non-linear
interactions
3. Principles governing the assembling of components into the
whole are themselves straightforward
– Interactions among subsystems are simple enough that they can be
considered separately from the behavior of the subsystems themselves
– Precise nature of interactions is known
– Interactions can be examined pairwise
Called Organized Simplicity
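To make the failure of these assumptions concrete, here is a minimal sketch (the components and gains are invented for illustration, not from the slides): two components that each look harmless when examined singly, but that produce unbounded growth once they are coupled in a feedback loop, violating assumption 2.

```python
# Hypothetical illustration of assumption 2 failing: each component is
# well-behaved in isolation, but a feedback loop between them produces
# behavior neither shows on its own.

def component_a(x: float) -> float:
    """A mild amplifier; looks harmless when examined singly."""
    return 1.1 * x

def component_b(x: float) -> float:
    """Also a mild amplifier when examined singly."""
    return 1.1 * x

# Examined separately (open loop), a unit input stays close to 1.
print(component_a(1.0), component_b(1.0))   # 1.1 1.1

# Connected in a feedback loop, the same two components diverge: the loop
# gain (1.1 * 1.1 = 1.21) exceeds 1, so the signal grows without bound,
# a whole-system property that separate analysis misses.
signal = 1.0
for _ in range(50):
    signal = component_b(component_a(signal))
print(signal)   # roughly 1.21**50, about 13,800
```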
Statistics
• Treat system as a structureless mass with
interchangeable parts
• Use Law of Large Numbers to describe behavior in
terms of averages
• Assumes components are sufficiently regular and
random in their behavior that they can be studied
statistically
Called Unorganized Complexity
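A minimal simulation sketch of what the statistical view buys you (the component count and failure probability below are hypothetical): with many interchangeable, independently failing parts, the Law of Large Numbers makes aggregate behavior predictable as an average even though each individual failure is random.

```python
import random

# Hypothetical example: 10,000 interchangeable components, each failing
# independently with probability 0.02 in a given period.  The structureless,
# statistical view ignores which component fails and looks only at the average.
N = 10_000
P_FAIL = 0.02

def observed_failure_rate(n_components: int, p_fail: float) -> float:
    """Simulate one period and return the fraction of components that failed."""
    failures = sum(1 for _ in range(n_components) if random.random() < p_fail)
    return failures / n_components

# By the Law of Large Numbers, the observed rate converges to P_FAIL as N
# grows, so aggregate behavior is well described by the average alone.
print(observed_failure_rate(N, P_FAIL))   # ~0.02
```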
Complex, Software-Intensive Systems
• Too complex for complete analysis
– Separation into (interacting) subsystems distorts the
results
– The most important properties are emergent
• Too organized for statistics
– Too much underlying structure that distorts the statistics
Called Organized Complexity
Systems Theory
• Developed for biology (von Bertalanffy) and engineering
(Norbert Wiener)
• Basis of system engineering
– ICBM systems of the 1950s
– Developed to handle systems with “organized complexity”
(Reading recommendations:
Peter Checkland, Systems Thinking, Systems Practice
Peter Senge, The Fifth Discipline)
Systems Theory (2)
• Focuses on systems taken as a whole, not on parts taken separately
– Some properties can only be treated adequately in their entirety,
taking into account all social and technical aspects
– These properties derive from relationships among the parts of the
system: how they interact and fit together
• Two pairs of ideas
1. Hierarchy and emergence
2. Communication and control
Hierarchy and Emergence
• Complex systems can be modeled as a hierarchy of
organizational levels
– Each level more complex than one below
– Levels characterized by emergent properties
• Irreducible
• Represent constraints on the degree of freedom of
components at lower level
– Hierarchy theory
• Differences between levels
• How levels interact
• What are some examples of emergent properties?
Communication and Control
• Hierarchies characterized by control processes working
at the interfaces between levels
• A control action imposes constraints upon the activity at
a lower level of the hierarchy
• Systems are viewed as interrelated components kept in
a state of dynamic equilibrium by feedback loops of
information and control
• Control in open systems implies need for communication
[Figure: Control processes operate between levels of control. A controller (with its goal condition and model condition) issues control actions through an actuator to the controlled process (action condition), and receives feedback through a sensor (observability condition).]
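As a rough illustration of the loop in the figure, the sketch below (a made-up thermostat-style example; the class and variable names are not from the slides) shows a controller that holds a process model, compares it to its goal condition, issues control actions that constrain the controlled process, and relies on sensor feedback to keep its model consistent with the actual process state.

```python
# Sketch of the generic control loop in the figure (the thermostat-style
# process and all names here are illustrative assumptions).

class ControlledProcess:
    """The process being controlled, e.g. a room whose temperature drifts."""
    def __init__(self, temperature: float):
        self.temperature = temperature

    def step(self, heat_input: float) -> None:
        self.temperature += heat_input - 0.5   # constant heat loss per step

class Controller:
    """Holds a process model, compares it to the goal, issues control actions."""
    def __init__(self, goal: float):
        self.goal = goal        # goal condition
        self.model = goal       # process model (belief, updated from feedback)

    def control_action(self) -> float:
        # The controller acts on its model of the process, not on the
        # process directly (model condition).
        return 1.0 if self.model < self.goal else 0.0

    def update_model(self, measurement: float) -> None:
        # Observability condition: feedback keeps the model consistent
        # with the actual process state.
        self.model = measurement

process = ControlledProcess(temperature=15.0)
controller = Controller(goal=20.0)

for _ in range(20):
    action = controller.control_action()           # control action
    process.step(heat_input=action)                # imposed via the "actuator"
    controller.update_model(process.temperature)   # "sensor" feedback

print(round(process.temperature, 1))   # settles near the 20.0 goal
```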
System Engineering
• A little history
• Systems theory is underlying scientific foundation
• Basic concepts:
– Some system properties can only be treated holistically
• i.e., in social and technical context
– Optimization of components will not result in system
optimum
– Cannot understand individual component behavior without
understanding role and interaction within whole system
• “System is more than the sum of its parts”
System Engineering Tasks
• Needs analysis
– Objectives
– Criteria to rank alternative designs
• Feasibility studies
– Identify system constraints and design criteria
– Generate plausible solutions
• Satisfy objectives and constraints
• Are physically and economically feasible
• Trade studies (to select one solution to be implemented; see the sketch below)
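As a rough sketch of the trade-study step (the criteria, weights, and scores below are invented purely for illustration), one common approach is a weighted sum over the ranking criteria identified during needs analysis:

```python
# Hypothetical trade-study sketch: rank alternative designs with a
# weighted sum over the criteria identified during needs analysis.

criteria_weights = {"performance": 0.4, "cost": 0.3, "schedule_risk": 0.3}

alternatives = {
    "Design A": {"performance": 8, "cost": 5, "schedule_risk": 7},
    "Design B": {"performance": 6, "cost": 9, "schedule_risk": 8},
    "Design C": {"performance": 9, "cost": 4, "schedule_risk": 5},
}

def weighted_score(scores: dict[str, float]) -> float:
    """Combine per-criterion scores using the agreed criterion weights."""
    return sum(criteria_weights[c] * scores[c] for c in criteria_weights)

# Select the single solution to carry forward into implementation.
best = max(alternatives, key=lambda name: weighted_score(alternatives[name]))
for name, scores in alternatives.items():
    print(name, round(weighted_score(scores), 2))
print("Selected:", best)   # Design B with these made-up numbers
```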
System Engineering Tasks (2)
• System architecture development and analysis
– Break down system into subsystems and functions and
define interfaces
– Analyze with respect to desired system performance
properties
• Interface Design and Analysis
– Optimize visibility and control
– Isolation so components can be implemented independently (modularity)
– Need to be able to integrate and test
• Implementation
• Manufacturing
• Operations
Considerations
• Process is highly iterative
• Specification is critical
– Large and long development projects
– Maintenance and evolution
– Impacts human problem solving
• Control is critical (including in management of large
projects)
• Top-down approach vs. bottom-up
What is a System?
• Definitions:
– System: Set of components that act together as a whole to
achieve some common goal, objective, or end
– Components are interrelated and either directly or
indirectly connected to each other
– Assumptions:
• System goals can be defined
• Systems are atomistic: can be separated into components
such that interactive behavior mechanisms can be described
Definitions (2)
• Systems have states: set of relevant properties describing the
system at any given time
• Environment: Set of components (and their properties) that
are not part of the system but whose behavior can affect the
system state
• Implies a boundary between system and environment
– Inputs and outputs cross boundary
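A minimal sketch of how these definitions fit together (the types and field names are invented here, not part of the course material): a system is a set of interrelated components whose relevant properties make up its state, and the environment is whatever lies outside the boundary but can still affect that state through inputs and outputs that cross the boundary.

```python
# Illustrative only: the classes and fields below are invented to mirror
# the definitions, not taken from the slides.
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    properties: dict = field(default_factory=dict)

@dataclass
class System:
    components: list[Component]   # interrelated, directly or indirectly connected
    goal: str                     # assumes system goals can be defined

    def state(self) -> dict:
        """The set of relevant properties describing the system at a given time."""
        return {c.name: dict(c.properties) for c in self.components}

@dataclass
class Environment:
    components: list[Component]   # outside the boundary, but can affect system state

# Inputs and outputs cross the boundary between system and environment.
plant = System(components=[Component("pump", {"on": True})], goal="maintain flow")
grid = Environment(components=[Component("power grid", {"voltage": 230})])
print(plant.state())
```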
Systems as Abstractions
• A system is always a model, i.e., an abstraction conceived by
viewer
– Observer may see different system purpose than designer
or focus on different relevant properties
– Specifications ensure consistency and enhance
communication
• System boundary
• Inputs and outputs
• Components
• Structure
• Relevant interactions among components and how behavior
of components affect overall system state
• Purpose or goals of system that make it reasonable to
consider it to be a coherent entity
Griffin: Two Cultures
• Engineering science vs. engineering design
(Reading recommendation: Samuel Florman, The Civilized Engineer, and
his other books)
• Software as art vs. engineering?
– Programmer vs. software engineer
• Role of failure in engineering
• Role of the system engineer
(Think about this as you read all the standards and the
other details of system and software engineering this
semester)
Griffin: How Do We Fix System
Engineering?
• Design Elegance
1. Does the design actually work?
2. Is it robust?
3. Is it efficient?
4. Does it accomplish its intended purposes while
minimizing unintended actions, side effects, and
consequences?
• These should be the core concerns of the system engineer
– Need to get beyond intuition and judgment
“System of Systems”
• Implications for
– Emergent properties
– Interface analysis and IMA (integrated modular avionics)
– “Interoperability”