Workshop on
Systems Engineering
for Robustness
June 2004
Accelerate capture and dissemination of key considerations and best practices
Broaden practitioner knowledge on current practices and emerging methods
Systems Engineering for robustness means developing systems/system-of-systems that are…
• Capable of adapting to changes in mission and requirements
• Expandable/scalable, and designed to accommodate growth in capability
• Able to reliably function given changes in threats and environment
• Effectively/affordably sustainable over their lifecycle
• Developed using products designed for use in various platforms and systems
• Easily modified to leverage new technologies
Morning Agenda
• Opening Remarks – Dr. Marvin Sambur
• Invited Speaker – Dr. Daniel Hastings
• Stage-Setting Presentations …
  – Air Force ACE: SE Revitalization Efforts in the AF –
    Mrs. Marty Evans, AF Acquisition Center of Excellence
  – Remarks on AF Guide and Leading Indicators –
    Mr. Mark Wilson, AF Center for SE
  – Focus Group Introduction – Focus Group Chairs and Student Assistants;
    Overview Plan and Expected Outcomes; Terminology & Ground Rules –
    Dr. Donna Rhodes, MIT/LAI
  – Perspectives on the Downstream Implications of SE –
    Dr. Dwight Holland, Human Factors Associates
Critical Issues in
Systems Engineering
Daniel Hastings
Massachusetts Institute of Technology
The solution lies in the direction of taking a systems view of things…
When you have the view from space,
you realize that the concept of fields within fields within fields,
systems of functioning within systems of functioning, is the only approach that will work.
-- Edgar D. Mitchell, Lunar Module Pilot, Apollo 14, 1971
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
1
Outline
• The need for robust Systems Engineering
• The growth of modern Systems Engineering and the growth of Engineering Systems
• Critical issues in Systems Engineering
  – Development of “system of systems” methodology
  – Moving system architecture from heuristics to quantification
  – Incorporating lifecycle uncertainty into systems engineering
  – Incorporating flexibility into systems engineering
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
2
Systems Engineering for “Robustness”
Systems Engineering for robustness means developing systems/system-of-systems that are:
– Capable of adapting to changes in mission and requirements
– Expandable/scalable
– Designed to accommodate growth in capability
– Able to reliably function given changes in threats and environment
– Effectively/affordably sustainable over their lifecycle
– Developed using products designed for use in various platforms and systems
– Easily modified to leverage new technologies
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
3
DSB/AFSAB Report on Acquisition of
National Security Space Programs
Review National Security Space programs, identify
systemic problems, recommend improvements
Findings:
– Cost has replaced mission success as the primary driver in
managing space development programs
– Unrealistic estimates lead to unrealistic budgets and
unexecutable programs
– Undisciplined definition and uncontrolled growth in system
requirements increase cost and schedule delays
– Government capabilities to lead and manage the acquisition
process have seriously eroded
– Industry has failed to implement proven practices on some
programs
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
4
DSB/AFSAB Example and Recommendation on Systems Engineering

[Figure: SBIRS High cost estimate history, 1996–2002, in billions of then-year dollars, tracing the program estimate from the 1996–97 President's Budget, Most Probable Cost (MPC), contractor bid, and contract award through the Oct 01 Estimate at Completion (EAC) and the Mar 2002 Program Office Estimate (POE); the growth is broken out into added requirements and cost growth against the original requirements.]
1. USecAF/DNRO should develop a robust systems engineering capability to support
program initiation and development. Specifically, USecAF/DNRO should
– Reestablish organic government systems engineering capability by selecting
appropriate people from within government, hiring to acquire needed capabilities, and
implementing training programs; and
– In the near term, ensure full utilization of the combined capabilities of government,
Federally Funded Research and Development Center (FFRDC), and systems
engineering and technical assistance (SETA) systems engineering resources.
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
5
Engineering Systems and Systems Engineering
What Is the Difference?
SYSTEMS ENGINEERING (Classical)
Systems engineering is the process of selecting and synthesizing the
application of the appropriate scientific and technical knowledge in
order to translate system requirements into system design. (Chase)
SYSTEMS ENGINEERING (Expanded)
Systems engineering is a branch of engineering that concentrates on
design and application of the whole as distinct from the parts…
looking at the problem in its entirety, taking into account all the facets
and variables and relating the social to the technical aspects. (Ramo)
ENGINEERING SYSTEMS (Context field for SE and other fields)
A field of study taking an integrative holistic view of large-scale,
complex, technologically-enabled systems with significant
enterprise level interactions and socio-technical interfaces.
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
6
Systems Engineering
Issues and Criticisms
Highly polarized view of scope from classical to expanded
Too focused on processes; not enough on system/system properties
Does not take adequate holistic perspective
Focuses too much on requirements; not enough on system behavior
Assumes system context as a constraint rather than variable
Value of SE is debated and not well quantified
Takes too much of a reductionist/top down approach
SE heritage is in aero/defense; commercial and public works programs
use different terminology and have expanded stakeholder set
Often applied at the subsystem and sometimes at the systems level –
but rarely at the system-of-systems/enterprise level
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
7
Systems Engineering Perspective versus Engineering Systems Perspective

Applied to
  SE perspective: small- to large-scale subsystems, systems, systems of systems
  ES perspective: very large-scale, complex open systems that are technologically enabled
Policy
  SE perspective: viewed as fixed and constraining the system solution
  ES perspective: viewed as variables – can be created or adapted to optimize the overall solution
Sociotechnical
  SE perspective: viewed as considerations in engineering
  ES perspective: viewed as primary in an overall system solution
Stakeholders
  SE perspective: primary focus on customer and end users, with secondary focus on other stakeholders
  ES perspective: balanced focus on all stakeholders impacted by the engineering system – product, enterprise, environment
Engineering Processes
  SE perspective: applied to the product system
  ES perspective: applied to both the product system and the enterprise system
Practitioners
  SE perspective: system architects, systems engineers, and related specialists performing the systems engineering process
  ES perspective: system architects, enterprise architects, engineers, operations analysts, project managers, policy makers, social scientists, and many others
Future Vision
  SE perspective: predictably develop systems with value to primary stakeholders
  ES perspective: predictably develop sustainable engineering systems with value to society as a whole
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
8
Systems Engineering within Engineering Systems Context
Issues and Research Questions
Issue 1
Classical systems engineering principles and practices
need to be adapted and expanded to fully support
the engineering of highly complex systems
Questions
What Systems Engineering principles and practices are too limited at
present to effectively deal with large-scale complex systems with
socio-technical interfaces?
How can these be adapted and expanded to take the more robust view?
What new methodologies and tools are needed to implement an
expanded and more rigorous set of systems practices?
What case studies can show positive/negative impacts of taking/not taking
the broader engineering systems perspective?
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
9
Systems Engineering within Engineering Systems Context
Issues and Research Questions
1982: 22 MA and PhD programs in systems studies in US universities
(Source: Gasparski, "Systems Education in American Syllabuses," General Systems, Vol. XXVII, 1982)
2003: 94 MA and PhD programs in US universities (40 SE-centric and 54 hybrid)
(Source: INCOSE Report on Systems Engineering Degree Programs, 2003)
Issue 2
For ES to become the context field for SE, there must be major
transitions in systems education strategies, policies, structures
Questions
How will universities need to evolve their structures and policies?
What new knowledge, skills, and abilities will systems practitioners need
for a more robust engineering systems perspective?
How will existing Systems Engineering curricula need to change to embrace
Engineering Systems as the context field?
What strategy can be used to transition current educational models to a
new model to support this vision?
Does the broader Engineering Systems context field enable development of
better systems leadership for 21st century engineering challenges?
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
10
Positioning the field of Systems Engineering as
a field of study within Engineering Systems
Classical SE not well suited to dealing with the global and
socio-technical aspects of 21st century systems, and does
not address the enterprise in the overall system
SE brings a rich set of directly applicable and/or adaptable
principles, practices, and methods to the field of
Engineering Systems
Placing SE within the broader context field of Engineering
Systems will enable the transformation of SE to more
effectively address engineering challenges of this century
A new education model is needed to realize this vision
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
11
Critical Issues in Systems Engineering
What are some of the critical issues we face?
• Development of “system of systems”
methodology
• Moving system architecture from heuristics to
quantification
• Incorporating lifecycle uncertainty into
systems engineering
• Incorporating flexibility into systems
engineering
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
12
System of Systems Context
Definition: A system of
systems is a combination
of platforms and
subsystems that work in
coordination to achieve
an objective
unachievable by them
individually
Current examples:
• Intelligence collection and
integration networks
• Kill chains
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
13
Motivation And Key Questions
Systems of systems (SoS) are increasingly central
to military tactics, strategy, and success
– May be emerging in real time—platforms and subsystems pulled
together to meet immediate operational objectives
– Currently there are few (if any) SoS “owners” or corresponding system
analysis tools
– Limited perspective on the evolutionary behavior of SoS,
particularly for developing investment and planning strategies
Can these emergent SoS be predictively and
robustly architected and engineered?
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
14
Terminology For System Architecture
Optimization
• The process of finding the absolute best solution to a problem
(mathematics)
• The process of finding families of good solutions (engineering)
Conceptual Design
• The process of generating, analyzing, and comparing a vast array
of architectures for a given mission during the early stages of a
program, working towards concept selection [NASA SEH]
System Architecture
• The structure, arrangement, or configuration of a system of
elements and the relationships required to satisfy both the
constraints and a set of requirements [NASA SEH]
Systems Architecting
• The process of creating a system architecture that satisfies the
mission requirements while concurrently optimizing the system
within a set of constraints [INCOSE]
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
15
What is a Space Architecture?
The space architecture is the set of elements (mapping form to function):
• Spacecraft, ground stations
• Communication links (RF and optical) in space, and ground to and from space
• Logical links (computer interconnections)
which are necessary to deliver a given function to users, e.g.:
• Deliver worldwide, on-demand, assured communications to warfighters in a nuclear crisis (MILSTAR)
• Provide worldwide warning of the launch of strategic ballistic missiles (DSP)
• Provide worldwide precision navigation and timing information continuously (GPS)
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
16
Architecture and Heuristics
“Before allocation (of system requirements) the system engineer must conceptualize
a system architecture… there is no direct way to arrive at a design… there is no
fixed method to define the system concept and architecture… thus systems
engineering is an art, not a science.”
(SMC Systems Engineering Primer and Handbook, 15 Jan 2004, p. 14)
– Much large scale system architecting occurs through heuristics and
legacy experience
Can Systems Engineering (including system
architecting) move to quantification of system
architecture?
– This is moving up the hierarchy of knowledge from description
to prediction
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
17
Sources of Uncertainty in Aerospace Systems
Development uncertainty: uncertainties in developing a product/service
• Political uncertainty — development funding stability
• Requirements uncertainty — requirements stability
• Development cost uncertainty — development within cost targets
• Development schedule uncertainty — development within schedule targets
• Development technology uncertainty — technology provides the expected performance
Operational uncertainty: uncertainties in contributing value once the product/service is developed
• Political uncertainty — operational funding stability
• Market uncertainty — meeting the demands of an uncertain market
• Lifetime uncertainty — performing to requirements for life
• Obsolescence uncertainty — performing to evolving expectations
• Integration uncertainty — operating within other systems
• Operations cost uncertainty — meeting operations cost targets
Model uncertainty: uncertainties in our system tools/models

Develop a life-cycle & holistic perspective on uncertainty
© 2004, D. Hastings, Massachusetts Institute of Technology
18
Cost-capping Policy Intervention
Cost-capping government program expenditures is the most frequently reported
government policy intervention
– Annual program budget capped by Congress
– Capping stretches out program duration and, as a result, increases total
program costs
Historical examples provide basis for relationship between
schedule extension and cost growth
[Figure: "Schedule and Cost Changes" scatter plot of cost change [%] versus schedule change [%] (0–50), annotated with technical, political, architectural, and operational domain regions; linear fit y = 0.2365x + 1.6987, R² = 0.5579.]
* Data adapted from Augustine, Norman R. Augustine’s Laws, New York: Viking Penguin Inc., 1986
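As a rough illustration (my own sketch, not part of the original briefing; only the slope and intercept come from the figure), the fitted relationship can be used to estimate expected cost growth for a given schedule slip:

```python
# Hypothetical sketch: apply the Augustine-derived linear fit from the figure,
# cost change % ~= 0.2365 * schedule change % + 1.6987, to a few schedule slips.

SLOPE = 0.2365      # fitted slope from the figure
INTERCEPT = 1.6987  # fitted intercept from the figure

def expected_cost_growth(schedule_change_pct: float) -> float:
    """Return the cost change (%) implied by the linear fit."""
    return SLOPE * schedule_change_pct + INTERCEPT

if __name__ == "__main__":
    for slip in (10, 25, 50):
        print(f"{slip:>3}% schedule slip -> ~{expected_cost_growth(slip):.1f}% cost growth")
```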
© 2004, D. Hastings, Massachusetts Institute of Technology
19
Uncertainty: Near-Term Cost of Program Instability

Cost growth (average annual)       Government PMs (N=101)   Contractor PMs (N=80)
- Budget changes                   2.3%                      1.8%
- Technical difficulties           2.4%                      2.7%
- Changes in user requirements     2.5%                      2.7%
- Other sources                    0.1%                      0.8%
- Total                            7.3%                      8.0%
Finding: The “average” program can expect 4.5-5% cost growth
resulting from budget and requirements changes, year after year
Impact: Research identified factors contributing to program risk and mitigating lean
practices, which were incorporated into DoD risk management guidance
(DoD 5000.2 and Deskbook)
SOURCE: 1996 LAI Government PM survey, 1996 LAI Contractor PM survey.
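An illustrative compounding calculation (mine, not from the survey): a steady 5% annual growth sustained over a ten-year development compounds to (1.05)^10 ≈ 1.63, i.e., roughly 63% cumulative cost growth from budget and requirements changes alone.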
© 2004, D. Hastings, Massachusetts Institute of Technology
20
A Framework for Discussion of
Uncertainty Impacts on Complex Systems
Uncertainties
• Lack of Knowledge
• Lack of Definition
• Statistically
Characterized
Variables
• Known Unknowns
• Unknown Unknowns
Risks/
Opportunities
• Disaster
• Failure
• Degradation
• Cost/Schedule (+/-)
• Market shifts (+/-)
• Need shifts (+/-)
• Extra Capacity
• Emergent
Capabilities
Mitigations/
Exploitations
• Margins
• Redundancy
• Design Choices
• Verification and Test
• Generality
• Upgradeability
• Modularity
• Tradespace Exploration
• Portfolios & Real Options
Outcomes
• Reliability
• Robustness
• Versatility
• Flexibility
• Evolvability
• Interoperability
<Uncertainty> causes <Risk> handled by
<Mitigation> resulting in <Outcome>
Can SE develop tools & understanding
to explore this framework?
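One minimal way to make the framework machine-readable for tool development (my own sketch; the specific chains below are illustrative pairings of the column entries, not assignments made on the slide) is to represent each <Uncertainty> causes <Risk> handled by <Mitigation> resulting in <Outcome> chain as a record that an analysis tool can enumerate and score:

```python
from dataclasses import dataclass

@dataclass
class UncertaintyChain:
    """One '<Uncertainty> causes <Risk> handled by <Mitigation> resulting in <Outcome>' chain."""
    uncertainty: str
    risk: str
    mitigation: str
    outcome: str

# Illustrative pairings assembled from the framework's column entries
chains = [
    UncertaintyChain("Statistically characterized variables", "Degradation",
                     "Margins", "Reliability"),
    UncertaintyChain("Lack of definition", "Need shifts (+/-)",
                     "Tradespace exploration", "Flexibility"),
    UncertaintyChain("Unknown unknowns", "Market shifts (+/-)",
                     "Portfolios & real options", "Evolvability"),
]

for c in chains:
    print(f"{c.uncertainty} -> {c.risk} -> {c.mitigation} -> {c.outcome}")
```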
© 2004, D. Hastings, Massachusetts Institute of Technology
21
Flexibility: A Word Rich with Ambiguity
1. Proliferation of pseudo-synonyms
2. The fundamental problem of Real Options

Selected literature review:
• Flexibility in manufacturing systems
• Managerial flexibility
• Flexible versus rigid plans of action (game theory)
• Type II robust design: developing flexible solutions [Chen and Lewis, 99]
  "The concept behind type II robust design for providing flexible solutions is represented below…"
  [Figure: performance versus a design variable, contrasting an optimized solution with a robust solution]

Comprehensive treatment of flexibility in system design:
1. What is flexibility?
2. Why or when is flexibility in system design needed?
3. How to embed flexibility in system design?
4. Trade-offs/penalties (value, cost, performance, risk, etc.)
© 2004, D. Hastings, Massachusetts Institute of Technology
22
Definition: Process versus Design Flexibility
Flexibility of a design: property of a system that allows it to respond to changes in its objectives and requirements occurring after the system has been fielded, in a timely and cost-effective way

[Figure: program timeline from conceptual design (T0) through preliminary design, detailed design, production (Tprod), system operations (Tops), and system retirement; "process flexibility" applies before fielding and "design flexibility" after]

Robustness: property of a system that allows it to satisfy a fixed set of requirements despite changes occurring in the environment or within the system itself after the product has been fielded
© 2004, D. Hastings, Massachusetts Institute of Technology
23
Definition and Examples
• Designing a spacecraft for 50–100 years
• Galileo & the Galileo Europa Mission
• B-52 versus B-58
• TechSat21: an instance of flexibility and its implications

[Figure: 2×2 matrix of the system's requirements after fielding (fixed vs. changing) against the environment (fixed/known vs. changing/unknown), with lifetime uncertainty increasing along the axes; quadrants labeled optimized design, robust design, flexible design, and poor design. Example of flexible design: Thuraya's beam pattern and energy level per beam are adjustable after launch.]
© 2004, D. Hastings, Massachusetts Institute of Technology
24
Case study:
Galileo Rescue Mission Software Upgrade
• Galileo's high-gain antenna failed to deploy in April 1991
• The mission was in danger of catastrophic failure
• An innovative combination of new, specially developed software for Galileo's
on-board computer enabled the spacecraft to use its low-gain antenna
Using the new software, the spacecraft accomplished roughly 70% of the original science mission goals!
© 2004, D. Hastings, Massachusetts Institute of Technology
25
Questions on Flexibility
Can SE develop a general understanding of flexibility so that choices can be made in the design and build phase?
– Example: a general understanding of flexibility for space systems
– When does one want to use:
  • Software changes at the spacecraft level, e.g., Galileo
  • Software changes at the ground station level, e.g., GPS
  • Changes in the communications link structure, e.g., DSP
  • Changes in constellation configuration, e.g., planned changes for Iridium
  • Additions to the constellation, e.g., smallsats which operate in close proximity to a larger satellite
  • Changes to the ground station hardware, e.g., SBIRS High
  • Changes to the space-based hardware, which fall in two classes:
    – Life extension
    – Mission upgrade
© 2004, D. Hastings, Massachusetts Institute of Technology
26
Conclusions
Systems Engineering needs to evolve within the
larger context of Engineering Systems
There are serious challenges for practitioners and
academics:
• Development of “system of systems” methodology
• Moving system architecture from heuristics to
quantification
• Incorporating lifecycle uncertainty into systems
engineering
• Incorporating flexibility into systems engineering
© 2004, D. Hastings & D. Rhodes, Massachusetts Institute of Technology
27
Headquarters U.S. Air Force
Integrity - Service - Excellence
Revitalizing Systems
Engineering - USAF
Approach
Mrs. Marty Evans
Director
Acquisition Center of Excellence
June 8, 2004
1
Revitalizing SE
• Assistant Secretary (Acquisition) directed development of a program to revitalize AF and industry SE – Spring 2003
• The plan directs a fundamental shift in AF Systems Engineering Policy and Practice:
  1. Elevate SE as a principal consideration in:
     – solicitations
     – award
     – contract execution
  2. Deliver products that exhibit “robustness”:
     – easily expandable/scalable
     – insensitive to variation
  3. Improve the SE workforce
Refocusing of AF SE Policy & Practice!
Integrity - Service - Excellence
2
Why Change?
• Evolutionary Acquisition is the “norm” for how the AF procures future systems
• Robust SE minimizes scrap/rework and build/test/fix cycles when changes occur
• SE capability (both Government & Industry) has declined in recent years
• Good SE can mitigate program “surprises”
• Systems are becoming more complex
  – System of Systems (SoS)
  – Family of Systems (FoS)
Integrity - Service - Excellence
3
Robust SE & Agile Acquisition

[Diagram: Robust SE enables Agile Acquisition]
• SPEED: systems easily adaptable to change (scalable/expandable); supports evolutionary acquisition
• CREDIBILITY: fewer surprises; leading indicators; insensitive to variability in manufacture & use
Integrity - Service - Excellence
4
Air Force Approach
Three “Increments” to revitalizing SE:
• Increment 1 – Immediate actions (Policy Memo – April 03)
  – PMs assess SE on each program and suggest performance incentives
  – Include SE status in program reviews
• Increment 2 – Standards for Robust SE (Policy Memo – Jan 04)
  – Define standards
  – Evaluation of SE in proposals; evaluation of SE performance; incentives
  – Demonstration; engage LAI; industry input
• Increment 3 – Fixing the workforce
• Future: policies institutionalized by incorporation into the AFFARS Source Selection Guide, AFI 63-101, and the SE Guide
Integrity - Service - Excellence
5
LAI Support to Systems Engineering

[Diagram: LAI initiatives on Systems Engineering]
• Fast-track SE research: SE learnings from Lean Now projects, Workshop on System Robustness, Robust SE Best Practices Study, pilot projects
• Outputs: recommended policy changes, Robust Engineering Guide (updates), best practices & research reports
• Knowledge gaps feed longer-term SE research and lean SE practices
• Workforce development & education – academia, government, industry
Integrity - Service - Excellence
6
Evolution of Government Role in SE

[Diagram: evolution across the 1980s, 1990s, and 2000s along a spectrum from hands-on government involvement, to no oversight (TSPR), to government oversight/insight, with a "move to here" marker at oversight/insight]
• Government role: integrate into the overall architecture; solid requirements foundation; insight into progress
• Contractor role: perform SE; accountable for results; incentivized for robustness
7
SE End-State
• Meet the warfighter's performance requirements
AND
• Deliver products/systems that exhibit attributes of spiral capability/robustness:
  – Are easily scalable/expandable to meet future capability needs
  – Are desensitized to expected variability in manufacture and use
Supports Evolutionary Acquisition Strategies!
Integrity - Service - Excellence
8
Supporting Warfighter
Capabilities
Integrity - Service - Excellence
9
Workshop on Systems Engineering
for Robustness
SE Education & Robust SE Guide
Mr. Mark K. Wilson
Director
8 June 04
Systems Engineering Education
Education Opportunities
• Graduate – MS
  – Second-year offering of revised curriculum
• Graduate Certificate
  – Second year
  – Distance learning – Univ. of New Mexico
• SYS 182: Introduction to Systems Engineering
  – Orientation briefing (working with Log Centers on a tailored version)
  – Converting to a 3-hour, web-based, virtual course
• SYS 282: Applied Systems Engineering
  – Advanced course (3.5 days)
  – Initial offerings underway – Ogden, Warner Robins, Eglin

Exploring collaborative opportunities – DAU, Univ of Florida, Mercer U/Georgia Tech, USC, Stevens Institute, others
Education Opportunities (cont’d)
• SYS 283: Introduction to Architecture
– Over a Dozen Offerings this Fiscal Year
• Architecture-Based Systems Engineering
– Developed by Dr. Alexander Levis, AF/ST – administered by
AFCEA
– Targeted at senior leadership (1- and 2-star generals and SESs) – selected by AFSLMO
– AF focus initially, to be expanded to other services and
Industry
– Beta offering mid March 04 – two to three offerings per year
AFIT SE Graduate Students
• Degree-Seeking Students
– 1 graduating Mar 04
– 19 IDE students started program in Fall 03
(Sep 04 graduation)
– 8 thesis students (6 mil, 2 civ) started program in Fall 03
(Mar 05 graduation)
– 5 Space Systems students pursuing SE degree
– 30+ new IDE students will start in Jun 04
• Certificate-Seeking Students
– 10 students completed in Mar 04
– 15+ students began program in Fall 03
• Includes 10 WPAFB, 5 Kirtland AFB part-time students
• Additional students pursuing certificate as minor concentration
towards other degree
Case Studies
• F-111 Aardvark
• C-5 Galaxy
• TBMCS (Theater Battle Management Core Systems)
• Hubble Space Telescope
Robust Engineering Guide
– Attributes of Robustness guidance
  • Deliver promised capabilities within budget and schedule
  • Are easily scalable/expandable to meet future capability needs
  • Are desensitized to expected variabilities in manufacture and use
– Contractual guidance for inserting SE into the solicitation and award process
  • Section M
  • Section L
  • Incentives
– Examples of leading indicators
  • Proactive measures of SE process effectiveness
  • Flag potential problem areas in time to fix
  • Minimize surprises
– Available on the CSE web site (http://cse.afit.edu)
Leading Indicator - Examples
• Status of Requirements Concurrence
– All levels against time
• Design Documentation Actual vs. Planned
– All levels against time
• Status of Specifications Approved vs. Planned
– All levels against time
• Software Work Products Tracked
– Complete vs. planned against time
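As one concrete illustration (a minimal sketch of my own, not taken from the guide), a planned-vs.-actual indicator such as design documentation completion can be tracked over time and flagged when actuals trail the plan by more than a chosen threshold:

```python
# Hypothetical leading-indicator sketch: planned vs. actual completion against time.
# The dates, counts, and threshold are illustrative, not from the Robust Engineering Guide.

planned = {"2004-03": 10, "2004-04": 18, "2004-05": 25, "2004-06": 30}  # cumulative items planned
actual  = {"2004-03":  9, "2004-04": 15, "2004-05": 19, "2004-06": 22}  # cumulative items approved

THRESHOLD = 0.85  # flag when actual completion falls below 85% of plan

for month in planned:
    ratio = actual[month] / planned[month]
    status = "OK" if ratio >= THRESHOLD else "FLAG: investigate before it becomes a surprise"
    print(f"{month}: {actual[month]}/{planned[month]} complete ({ratio:.0%}) -> {status}")
```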
A Brief Look at the Downstream Result of Systems
Engineering from a Person-Rated Human Systems Interface
Perspective
Dwight Holland, M.S., M.S.E., Ph.D.
Human Factors Associates, Inc.
Recent Reserve Military affiliations:
International Program Manager/USAF Office of Scientific Research/IO
Human Systems Center, Brooks City-Base
Instructor/Curriculum Development Crew Systems Analysis
US Navy Test Pilot School
Naval Air Test Center, Patuxent River, MD
Past Human Factors-Related Disasters (culture also!)
• Three Mile Island – and others
• Salyut (1971)
• Apollo 1 fire
• Culture – many – Challenger, Columbia
• HSI – many aircraft – civilian and military
• Fatigue – Cuban Missile Crisis
• On and on…
Spacecraft Systems/Human Factors Engineering

[Diagram: systems engineering with poor human factors design creates potential for accident occurrence; combined with chance, organizational, or systems issues, a mishap occurs, with medical, emotional, and systems consequences]
A Few Thoughts to Ponder…
• Dwight’s Quotes:
– “You may ‘fly’ with your
hands, but you get there
with your mind.”
– “Airspeed is life, so is
Situation Awareness.”
– Knowing who you are,
where you are, the Tactical
and Strategic Mission
State, and where you are
going
– Time/information
processing key elements
USAF Experience (overview for a ten-year window)
• USAF Class A rates for the fast movers (Holland and Freeman 1992, Holland 1995), per 100,000 hours:
  – F-5         12   4.76
  – B-1          2   3.13
  – F-16        59   2.86
  – F-4         71   2.25
  – A-10        42   2.01
  – F-15        27   1.50
  – F/FB-111    12   1.21
  – T-38        18   0.51/1.6
  – T-38 (USN)       3.86
• 70–85% USAF LSA/SD, etc.
• Cost to DoD of human-systems interface issues is about $1 billion/year
Design with Feedback from Testing

[Diagram: customer needs and regulatory requirements feed requirements definition; the design process follows, with proof-of-concept mini-evaluations; test and evaluation against defined test plans and success criteria leads to success, with feedback loops back to design. Credit: Newman and Greeley, 2002]
[Diagram: avionics/cockpit development flow from avionic systems and cockpit design, through a cockpit avionics rig and aircraft prototype, to the final aircraft design; new proposed design validation loops and tools (E, F, G, and H) add end-user participation, part-task simulation, full simulation, and new test pilot tools, along with training, procedures, limitations, and new detailed certification materials]
Crew Station Analysis Overview – TPS Program

[Diagram: course timeline from Week 0 through Weeks 3–5 to Week 28, covering INTRO, ANTHRO, ANALYSIS, SENSES, INFO PROC, WORKLOAD, DISPLAYS, TASKS, SA, COCKPIT EVAL, STRESS, DYN ANALYSIS, and DT-II]
Systems Theory Layers for Long-Duration Spaceflight
International Partners Pressures/Interfacing
National Political, Social, Economic Environment
Space Agency Goals/Rewards
Mission Control Decisions
Spacecraft Environment/Systems
Group Dynamics
Human-Machine Interface
Human Performance
Overview of MIR/Progress Collision: A
True Systems/Human Factors Mishap
• Previous near-miss with Linenger on board
• KURS radar caused the camera to fail (?); turned off for the 2nd attempt
• No reliable range/velocity information
• Mir Commander Vasili was unpracticed; Michael Foale was a new crew member for the 2nd attempt
• Progress made an out-of-sight approach
• Collision/decompression
• Cables in Spektr delayed isolation
Workshop on
Systems Engineering
for Robustness
June 2004
Accelerate capture and dissemination of key considerations and best practices
Broaden practitioner knowledge on current practices and emerging methods
Focus Groups
Day One – “Analysis”
Day Two – “Synthesis”
ARCHITECTING
  Chair: Dr. Rashmi Jain, Professor, Stevens Institute of Technology
  Assistant: Ryan Boas, MIT PhD Student
SYSTEMS DEVELOPMENT
  Chair: Dr. Dwight Holland, President, Human Factors Associates
  Assistant: Ryan Whitaker, MIT SM Student
SUSTAINMENT
  Chair: Dr. Annalisa Weigel, Professor, Massachusetts Institute of Technology
  Assistant: Matt Richards, MIT SM Student
ORGANIZATIONAL FACTORS & INCENTIVES
  Chair: Dr. Eric Rebentisch, Researcher, MIT/Lean Aerospace Initiative
  Assistant: Vic Tang, MIT PhD Student
EMERGING METHODS
  Chair: Dr. Hugh McManus, Researcher, MIT/Lean Aerospace Initiative
  Assistant: Adam Ross, MIT PhD Student
Assessing Space System Conceptual Design Trade Study Practices
A PhD research project at MIT-LAI is investigating methods to improve the rigor and completeness of Conceptual Design trade study analyses for aerospace systems.
• Goal: assess the current state of industrial practice of Conceptual Design-level trade study analyses for space systems
• Issues include identifying:
  – how need is captured
  – how tradespaces are explored and evaluated
  – how the analyses fit into the larger organization and system development processes
• Involvement in the study will include:
  – a pre-interview survey (approximately 15 minutes)
  – an on-site interview (approximately 1 hour)
  – a possible follow-up telephone interview
Participation in this study is completely voluntary. Individual identities and data will be protected, and only un-attributable data will be published.
Please contact Adam Ross (adamross@mit.edu) or (617) 452-2399 if you are interested and able to participate in this study.
web.mit.edu/lean
© 2004 Massachusetts Institute of Technology Adam Ross/6.8.04 - 3
Focus Groups -- Day One
• Meet your focus group members and provide individual perspectives
• Capture key challenges
• “SWOT” Analysis
  – Strengths – effective practices, what we do well today, methods/tools that we use successfully
  – Weaknesses – where we need to improve, gaps
  – Opportunities – strategies and approaches for addressing the weaknesses, learning from others, education and research possibilities
  – Threats – inhibitors to effective practice, barriers to pursuit of opportunities
Focus Group Day Two – Outcomes
• Recommendations for the AF Guide and any general recommendations
• New candidate leading indicators to assess ‘goodness’ of systems engineering and ‘robustness’ of the evolving solution
• Research agenda
  – Research questions
  – Recommendations for case studies
  – New methods/tools
  – “Key Questions Systems Leaders Need to Ask”
Little Yellow Book of Software
Management Questions
This set of questions has been
developed to provide a means for
program managers and review
authorities to assess the extent
to which project management
has an adequate understanding
of the key issues involved in
conducting a successful software
project and the extent to which
these practices are employed
Our Workshop will produce a similar product ... with
key questions systems leaders need to ask
What Questions Should You Be Asking If …
• You are developing an RFP
• You are leading a major systems review
• You are developing a strategy for subcontracting development of key products
• You are organizing a team for a complex program with multiple contractors
• You are making decisions for a system-of-systems program
• Your system has to adapt to meet a new mission
• Etc …
Architecture
• How will the architecture be evaluated for expandability?
• What measures will be used to monitor the architecture's integrity?
Systems Development
• What models will be used to assess the ability to respond to changes in mission and environment?
• What testing will be done to ensure effective integration of new technologies in a system-of-systems effort?
Sustainment/Operations
• What information will be required from subcontractors to ensure the product they deliver will be robust?
• What approach will be used to demonstrate a system can satisfy mission requirements when it is part of more than one system-of-systems?
Organizational Factors
• What incentives can motivate multiple system prime contractors to make decisions that ensure robustness of the overall system-of-systems?
• Who will be the authority for decisions that impact system-of-systems affordability where system-level funding will be negatively impacted?
Emerging Methods
• What methods will be used for multi-variant trade space analysis?
• What approach and/or facilities will be used for collaboration on the concept for a new system-of-systems?
Day One -- Afternoon
• Focus Groups begin after lunch
  – Meet your focus group members and provide individual perspectives
  – Capture key challenges
  – “SWOT” Analysis
• 4:15 Break/Reconfigure room
• Walk-around Review: opportunity for each participant to review the flip charts developed by each focus group in order to add additional comments and thoughts to the work in progress (we will use post-it notes to add comments and dots to add support for ideas thus far)
Space Technology Assessment Using
Real-Options Valuation
Dr. Robert Shishko
Jet Propulsion Laboratory
Robert.Shishko@jpl.nasa.gov
Presentation to
Workshop on Systems Engineering for Robustness
June 9, 2004
Valuing Technology Developments: The Environment
– Future mission sets are highly uncertain
  • Some roadmapping, but entire programs can be unexpectedly revamped
  • Individual medium- and far-term missions can be changed and/or rephased
– Development costs are uncertain
– R&D budgets are uncertain, even in the short term
– Many engineering relationships allowing improved technology performance to be evaluated are not well understood
– Technology Program Managers like to exercise discretion in funding decisions based on perceived technical progress
– The road to a new or improved technology can be very bumpy
Basic Concepts of Real-Options Valuation
“The future is uncertain . . . and in an uncertain environment,
having the flexibility to decide what to do after some of that
uncertainty is resolved definitely has value. Option-pricing
theory provides the means for assessing that value”
Robert C. Merton
Nobel Lecture, 1998
Would You Have Invested?
Basic Concepts of Real-Options Valuation
• The real-options approach captures the additional value of managerial flexibility inherent in any phased investment, which makes it more suitable than older methods.
  – Mathematical representation of the flexibility to alter course
  – Brings market information to bear when it's available
  – Powerful analytic techniques based on stochastic processes
NASA Technology Assessment Using Real-Options Valuation
• For a technology investment, option value is what one should be willing to pay for the right to undertake the technology development, incurring its uncertain cost and reaping its uncertain benefits beginning at time T in the future.
• Application to a technology investment requires:
  – Time-dependent representation of the uncertainty in the technology's development cost, probability of success, and readiness date (development risk)
  – Time-dependent representation of uncertainty in the technology's application (programmatic or "market" risk)
A Very Simple Model
• Suppose there is a technology that could lower the cost of a possible future NASA mission from I_T to I'_T
• The mission will be “green-lighted” at time T if its perceived value then, V_T, is greater than its cost
• The technology costs c to develop, and the probability of technology development success by time T is p
• This can be modeled as an option to exchange one European call option for another:

\nu_i(t,T) = \max\Big(0,\; -c + p\,e^{-r(T-t)}\,\mathrm{E}\big[\max(0, V_T - I_T') - \max(0, V_T - I_T)\big]\Big)
           = \max\Big(0,\; -c + p\,e^{-r(T-t)}\Big[\Delta I_T\,\mathrm{prob}\{V_T \ge I_T\} + \int_{I_T'}^{I_T} (V_T - I_T')\, dF(V_T)\Big]\Big)
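A minimal Monte Carlo sketch of this simple model (my own illustration; the lognormal distribution assumed for V_T and all parameter values are mine, not the presenter's):

```python
import math
import random

def option_value(c, p, r, t, T, I_T, I_T_prime, mu, sigma, n=100_000):
    """Estimate nu_i(t,T) for the option-to-exchange model by Monte Carlo.

    Assumes, purely for illustration, that mission value V_T is lognormal(mu, sigma);
    c is the technology development cost, p the probability of development success,
    r the riskless rate, and I_T / I_T_prime the mission cost without / with the technology.
    """
    discount = math.exp(-r * (T - t))
    total = 0.0
    for _ in range(n):
        V_T = random.lognormvariate(mu, sigma)
        # payoff difference between the two European calls being exchanged
        total += max(0.0, V_T - I_T_prime) - max(0.0, V_T - I_T)
    return max(0.0, -c + p * discount * total / n)

if __name__ == "__main__":
    # Illustrative numbers only (in $M): the technology cuts mission cost from 400 to 350
    value = option_value(c=10, p=0.7, r=0.03, t=0, T=5,
                         I_T=400, I_T_prime=350,
                         mu=math.log(380), sigma=0.3)
    print(f"estimated option value: ~${value:.0f}M")
```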
[Figure: technology option value plotted as a surface over volatility and current mission value (V_CM)]
General Model for Calculating Technology Option Value

v_i(t,T) = \max\left(0,\; -\mathrm{E}\left[\sum_{\tau=t}^{T} c_i(\tau)\, e^{-r(\tau-t)}\right] + p_{i,T}\,\mathrm{E}\left[\sum_{k}\max\left(0,\; \sum_{\tau=T}^{T^*} X_{i,k}(\tau)\, e^{-r(\tau-t)}\right)\right]\right)

X_{i,k}(\tau) = VMP_{i,k}(\tau) - MC_{i,k}(\tau)
vi(t,T) is the option value of technology i at time t for a technology readiness date of T
r is the riskless discount rate
pi,T is the probability that the technology i development program will be successful by the
technology readiness date, T
Xi,k(τ) is the year τ net marginal value of technology i in project k, given a successful
technology i development program
VMPi,k(τ) is the value of the marginal product of technology i in project k (This is the
mission or project value enhancement resulting from the technology.)
MCi,k(τ) is the marginal cost of “productizing” technology i in project k (This is the mission
or project cost reduction resulting from the technology.)
T* is the time horizon
Analysis Flow

[Flowchart, summarized as steps:]
1. Determine the technology's applicability by mission class
2. Determine the technology's payoff attribute(s)
3. Develop a stochastic model of the payoff per flight for each attribute for mission class i
4. Develop stochastic flight-rate model parameters for mission class i
5. Perform a Monte Carlo simulation for mission class i, calculating the discounted payoff for each trial j, until enough trials have been run
6. Repeat for all applicable mission classes
7. Develop a decision tree for technology development cost, schedule, and probability of success
8. Develop a technology-attribute stochastic "shadow price" model
9. Combine using Eq. (2) to compute the real-option value
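The simulation loop in steps 3–6 might look roughly like this sketch (entirely illustrative: the Poisson flight-rate model, lognormal payoff-per-flight model, and all parameter values are my assumptions, not the presenter's):

```python
import math
import random

def poisson(lam):
    """Draw a Poisson variate (Knuth's method); the stdlib random module has no Poisson sampler."""
    L, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= L:
            return k
        k += 1

def discounted_payoff_one_trial(years, rate, mean_flights_per_year, payoff_mu, payoff_sigma):
    """One Monte Carlo trial: discounted technology payoff over the horizon for one mission class."""
    total = 0.0
    for tau in range(years):
        for _ in range(poisson(mean_flights_per_year)):              # flights this year
            payoff = random.lognormvariate(payoff_mu, payoff_sigma)  # $M payoff per flight
            total += payoff * math.exp(-rate * tau)
    return total

if __name__ == "__main__":
    trials = [discounted_payoff_one_trial(10, 0.03, 0.4, math.log(30), 0.5)
              for _ in range(20_000)]
    print(f"mean discounted payoff ~ ${sum(trials) / len(trials):.1f}M")
```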
Simulation Results: Technology Payoff PDF
This multi-modal pdf for MSR-class missions results because orbital mechanics dictates a wait of about 26 months between opportunities to launch Mars missions on a minimum-energy trajectory (see Ref. (4)).

[Figure: histogram of benefit ($M, roughly 0–250) for Mars Sample Return, -1% flight rate, 10% volatility, best titanium tank w/diaphragm; fraction in bin up to about 0.07]
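For reference (a standard calculation, not on the original slide), the roughly 26-month spacing is the Earth–Mars synodic period:

T_{syn} = \left(\frac{1}{365.25\ \text{days}} - \frac{1}{687\ \text{days}}\right)^{-1} \approx 780\ \text{days} \approx 26\ \text{months}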
Quantifying the Value of Flexibility for Space Systems
• In a world of uncertainty, options valuation is needed to quantify the value of designing space systems with flexibility
• Example: the value of on-orbit servicing for
  – Life extension of commercial satellites
  – Maneuverability of a surveillance satellite constellation
• Developed by Lamassoure and Hastings (MIT) in Ref. (3)
Quantifying the Value of Flexibility for Space Systems
A combination of Real Options, Decision Tree Analysis (DTA), and space systems engineering models:
• Space system design (from engineering models)
• Matrix of costs (from engineering cost modeling)
• Utility metric (space-system specific)
• Time-dependent description of uncertainty in cost or in utility (from Real Options)
• Options and decision nodes (program engineering)

Real options math: optimize the option selection at each decision node and work backwards through the decision tree (from DTA):

EV^{(l \to n)}_{\ge k}(x) = x\, R^{(n)}_k - C^{(l \to n)}_k + e^{-r\tau_k} \sum_{m=1}^{N_M} \int_{I^{\,n \to m}_{k+1}} EV^{(n \to m)}_{\ge k+1}(y)\, p_{\tau_k}(y/x)\, dy

The result: the value of flexibility and decision maps.
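To make the backward-induction step concrete, here is a deliberately tiny sketch (a toy decision tree of my own with made-up costs, revenues, and probabilities, not the Lamassoure/Hastings model): at each decision node the best option is taken, and continuation values are discounted expectations over the chance branches.

```python
import math

# Toy decision tree (illustrative numbers only). Each decision node offers options;
# each option has a cost, an immediate revenue, and (probability, next_node) branches
# that resolve one period later.
RATE, PERIOD = 0.08, 1.0

tree = {
    "start": {
        "design_for_servicing": {"cost": 120, "revenue": 50,
                                 "branches": [(0.6, "high_demand"), (0.4, "low_demand")]},
        "baseline_design":      {"cost": 80,  "revenue": 50,
                                 "branches": [(0.6, "high_demand_rigid"), (0.4, "low_demand")]},
    },
    "high_demand":       {"service_and_extend": {"cost": 30, "revenue": 250, "branches": []},
                          "do_nothing":          {"cost": 0,  "revenue": 60,  "branches": []}},
    "high_demand_rigid": {"do_nothing":          {"cost": 0,  "revenue": 60,  "branches": []}},
    "low_demand":        {"do_nothing":          {"cost": 0,  "revenue": 30,  "branches": []}},
}

def node_value(node):
    """Backward induction: value of the best option at a decision node."""
    best = -math.inf
    for option in tree[node].values():
        continuation = sum(p * node_value(nxt) for p, nxt in option["branches"])
        best = max(best, option["revenue"] - option["cost"]
                   + math.exp(-RATE * PERIOD) * continuation)
    return best

if __name__ == "__main__":
    # Comparing this against the same tree without the servicing option would give
    # a (toy) value of flexibility.
    print(f"value of the flexible program (toy numbers): {node_value('start'):.1f}")
```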
Real-Options Approach: Implications for NASA
• Using the real-options framework, NASA can achieve greater strategic flexibility with its limited technology budget
  – The real-options approach captures the value arising from a technology's potential uses in scenarios other than the budgeting one
  – The real-options approach considers the uncertainty about what future scientific discoveries might be made in space and what missions might have sufficient public support
  – The option value captures payoffs that could potentially occur over the long term without committing NASA to any particular mission
• Once option values are calculated, emphasis can be placed on maximizing the value of NASA's technology portfolio subject to a budget constraint and the level of portfolio risk NASA management desires
The Real-Options Approach: Summary
• Deals with salient uncertainties, including mission sets, phasing, development costs, etc.
• Recognizes that uncertainty may provide opportunities, not just dangers
• Extends PV analysis to recognize the dynamic and flexible nature of investment decisions
• Treats uncertainties that are hard to deal with in traditional PV analysis
• Applied in the private sector for several years, especially in R&D-heavy firms
• A growing set of published peer-reviewed application examples
Backup Slides
Selected Articles on Real-Options Valuation
• Luehrman, T., "Investment Opportunities as Real Options," Harvard Business Review, pp. 51-67, July-August 1998
• Benaroch, M., "Option-Based Management of Technology Investment Risk," IEEE Transactions on Engineering Management, Vol. 48, No. 5, pp. 428-444, November 2001
• Lamassoure, E., Saleh, J., and Hastings, D., "Space Systems Flexibility Provided by On-Orbit Servicing, Parts 1 and 2," Journal of Spacecraft and Rockets, Vol. 39, No. 4, pp. 551-570, July-August 2002
• Shishko, R., Ebbeler, D., and Fox, G., "NASA Technology Assessment Using Real Options Valuation," Systems Engineering, Vol. 7, No. 1, pp. 1-12, February 2004
• Rouse, W. B., and Boff, K. R., "Value-Centered R&D Organizations," Systems Engineering, Vol. 7, No. 2, pp. 167-185, May 2004
Next Generation Systems
Engineering – An OSD
Perspective
June 2004
Robert J. Skalamera
Deputy Director, Systems Engineering
(Enterprise Development)
Office of the Under Secretary of Defense (AT&L)
USD(AT&L) Imperatives
• “Provide a context within which I can make
decisions about individual programs.”
• “Achieve credibility and effectiveness in the
acquisition and logistics support processes.”
• “Help drive good systems engineering practice
back into the way we do business.”
2
Defense Systems Organization

[Organization chart: Defense Systems (DS); Director: Dr. Glenn Lamartin; Principal Deputy: Mr. Mark Schaeffer. Directorates shown include Systems and Mission Integration (SMI), Systems Acquisition (SA), and Systems Engineering (SE), with directors noted on the chart as Mr. Schaeffer, Mr. Skalamera, Dr. Garber, and Dr. Lamartin. Offices shown: Developmental Test & Evaluation (Mr. Lockhart), Air Warfare, Land Warfare & Munitions, Naval Warfare, Missile Warfare, Enterprise Development, Operations, Assessments & Support, Plans and … (Mr. Durham), Joint Force Application (Mr. Castellano), Joint Force Integration (Ms. Quinlan), Joint Force Operation (Mr. Kistler), and Treaty Compliance.]
3
SE Education and Training Summit (October 2003)
• Brainstorming session
  – What's working
  – What needs to be fixed
  – Significant barriers
  – Required actions
• Participants
  – Services
  – Academia
  – Industry
  – Associations (NDIA, AIA, EIA, GEIA, INCOSE)
• Formed five working groups, assigned leads
  – Policy
  – Processes
  – Tools and guides
  – Resources
  – Education and training
4
What We Found:
Lack of Uniform Understanding of SE
at the Department Level
• Lack of coherent SE policy
• Lack of effective SE implementation - no “forcing
function” for PM or contractor SE activities
• Program teams incentivized by cost and schedule, not
execution of disciplined SE
• Products and processes not in balance (emphasis on
speed; fix it in the next spiral)
• Inconsistent focus across life-cycle, particularly prior to
Milestone B
• SE inadequately considered in program life cycle
decisions
5
What We Found:
Lack of Uniform Understanding of SE in the Community-at-Large
• No single definition or agreement on the scope of SE
• Lack of common understanding of how SE is implemented on programs
  – Is SE done by the systems engineer?
  – Does the systems engineer lead the SE effort?
• No uniform understanding of what makes a good systems engineer
• No consistent set of metrics/measures to quantify the value of SE
• Cost and schedule estimation and risk management processes inconsistently aligned with SE processes
• Resistance to harmonization of multiple standards and models
• Multiple practitioner communities not aligned: hardware, software, information technology, telecommunications, program management; aircraft vs. rocket developers; submarines: propulsion vs. ship designers
6
What We Found:
System Complexity
• System complexity is ever increasing – Moore’s
Law at the system scale – Family of
Systems/System of Systems interdependencies
• Integrated systems (software with embedded
hardware) vice platforms (hardware with
embedded software)
• Network centric, spiral development, extension
of system applications are driving higher levels
of integration
7
What We Found:
The Resource Picture
• Degreed workforce is a shrinking pool
– Many graduates are not US citizens
– Total engineering enrollments continue to decrease
• Ability to attract and retain young engineers in the aerospace industry is directly
associated with the commercial marketplace
– The aerospace and defense industry is seen by engineering students as overly bureaucratic and lacking in exciting technical challenges
– The "5-year itch"
• Existing university/industry partnerships are not having enough impact
– SE is not a standard discipline (unlike EE, ChemE, ME, etc.)
– More focus is needed at the undergraduate level
– Do we have critical mass in terms of SE graduate-level training in the U.S.?
• Need new ways to attract and develop system engineers
– Additional learning
– On-the-job experience
We need a better approach
8
Adapted from G. Shelton (Raytheon)
Systems Engineering
Revitalization
SE Education and Training
Summit Results/Actions
• Review and modify DoD SE Policy to include updating the
DoD 5000 series
• Establish a DoD Systems Engineering Forum to focus on
DoD SE efforts
• Update Education and Training
– Systems Engineering and related curriculum
– Review “enabling” domain treatment of SE
(Program Management, Contracts, Finance)
• Review/Update policy, procedures, education & training for
all SE components (R&M, Quality, CM, DM, T&E, Risk
Mgt, Value Engineering)
• Review/update relevant tools and guides
10
Elements of SE Revitalization

Policy/Guidance                     Training/Education                    Assessments
SE Framework & SEP Policy (memo)    SE-specific courses                   Assessment Guide
DODI 5000.2, Enclosure 10           Enabling courses (PM, ACQ, CONT…)     Pilots
Acquisition Guidebook               Continuous learning courses
[Status legend: Underway / Completed]
11
Systems Engineering Policy in DoD
Signed by the Honorable Mike Wynne, USD(AT&L) (Acting) Feb 20, 2004
• All programs, regardless of ACAT shall
• Apply an SE approach
• Develop a Systems Engineering Plan (SEP)
• Describe technical approach, including
processes, resources, and metrics
• Detail timing and conduct of SE technical
reviews
• Director, DS tasked to
• Provide SEP guidance for DODI 5000.2
• Recommend changes in Defense SE
• Establish a senior-level SE forum
• Assess SEP and program readiness to proceed
before each DAB and other USD(AT&L)-led
acquisition reviews
12
SEP Implementation Guidance
• Submitted to the MDA at each Milestone, the SEP describes a
program’s/system’s:
• Systems engineering approach
• Specific processes and their tailoring by phase
• Both PMO and Contractor processes
• Systems technical baseline approach
• Use as a control mechanism, including TPMs and metrics
• Technical review criteria and results
• Event driven
• Mechanism for assessing technical maturity and risk
• Integration of SE with IPTs
• Organization, SE tools, resources, staffing, management
metrics, integration mechanisms
• Integration of SE activities with integrated schedules
(e.g., IMP, IMS)
Systems Engineering Plan: Prescribed Contents not Format
13
SE in Defense Acquisition Guidebook
(formerly DoD 5000.2-R)
• New guidance to acquisition community
• Best practices for “applied” SE
• SE process
• Guide for each acquisition phase, concept
refinement through disposal
• Linkage of SE products and processes to
acquisition objectives and decision points
14
SE Education & Training
• Starting with DAU (DAWIA) courses, with plan to address broader
E&T community (e.g. undergraduate and graduate courses)
• Developed methodology for DAU course modification/ intervention to
revitalize SE training
• Courses for decision makers (i.e., PMs, PEOs, SESs)
• Core, certification courses before assignment specific
• Career fields with large populations (viz., SPRDE)
• Courses mandated for all Corps members (e.g., ACQ)
• Prioritized focused continuous learning and short courses
(System Safety, Technical Reviews, RAM, SEP)
• Courseware review:
• First tier – ACQ, PMT2XX/4XX, SAM 301, SYS, TST301
• Second tier – LOG, other PMT, other SAM, other TST, selected BCF, CON, PQM
15
Senior SE Forum Standup
• Initial mtg Apr 22, 2004
• Flag-level representation across Services and
components
– Army, Navy, AF, Space, SOCOM, DTRA, DIA,
NSA, NGA, NASA, NRO, DAU, DLA, DCMA,
DISA, MDA; OSD (NII), (DOT&E), (DDR&E),
(L&MR), (I&E), (ARA)
• Defense Systems presented the SE context
and challenge; solicited forum support for
collaborative solutions
16
SE Forum Next Steps
• “Each Component Acquisition Executive and defense
agency with acquisition responsibilities will, within 90
days provide Director, Defense Systems, its approach
and recommendations on how we can ensure that
application of sound systems engineering discipline is an
integral part of overall program planning, management,
and execution within both DoD and defense industry.”
USD(AT&L) memo of Feb 20, 2004
• Service Briefs rec’d from Air Force, Army, Navy, and Space Command
• Assess and comment on Draft DoD 5000.2 Enclosure 10
– Systems Engineering
• “Harmonized” E10 back out for review
• Begin to formulate an integrated and synergistic SE
strategy across our respective domains
• Next meeting – June 16, 2004
17
E10. ENCLOSURE 10
SYSTEMS ENGINEERING
E10.1 General. All programs responding to a capabilities or requirements document,
regardless of acquisition category, shall employ a robust systems engineering (SE)
approach that balances total system performance and total ownership costs within the
family-of-systems, system-of-systems context. Systems engineering provides the
integrating technical processes to define and balance system performance, cost,
schedule, and risk. It must be embedded in program planning at all life cycle phases.
The PM, in concert with the applicable systems engineering community, shall
coordinate a rigorous systems engineering approach that addresses each acquisition
phase and spans the total system life-cycle.
E10.2 Systems Engineering Strategy. Systems engineering shall be applied at the
initial stages of program formulation, and shall form the integrated technical basis for
program strategies; acquisition plans; acquisition decisions; management of
requirements, risk, and design trades; and integration of engineering, logistics, test,
and cost estimation efforts among all stakeholders. The overall systems engineering
strategy shall be addressed in, and integrated with, the program’s strategies for
acquisition, technology development, T&E, and sustainment. All system functions
(including those associated with human operators, maintainers, and support
personnel) are within the domain of SE and must be rationalized in system
requirements analysis, design, and verification. A comprehensive list of design
considerations that must be taken into account throughout the SE process is in the
Systems Engineering chapter of the Defense Acquisition Guidebook.
E10.3 Systems Engineering Leadership. Acquisition components shall maintain a
systems engineering technical authority to ensure proper training, qualification, and
oversight of systems engineering personnel support to cognizant programs. The
technical authority shall appoint a lead systems engineer at program inception. The
lead systems engineer shall be accountable to the program manager and to systems
engineering technical authority for the proper application of systems engineering to
meet program objectives.
E10.4 Systems Engineering Planning
E10.4.1. Systems Engineering Plan. Programs shall develop a Systems Engineering
Plan (SEP) for Milestone Decision Authority (MDA) approval in conjunction with each
Milestone review, and integrated with the Acquisition Strategy. This plan shall
describe the program’s overall technical approach, including processes, resources,
and metrics, and applicable performance incentives. It shall also detail the timing,
conduct, and success criteria of technical reviews.
E10.4.2 SEP Contents. The SEP shall be established early in the program definition
stages and updated continually as the program matures. Successful program
execution requires an effective partnership between the government program office
and its contractor(s). The SEP shall be an integrated document that defines the
systems engineering roles and responsibilities of all parties. The SEP shall serve as
a roadmap of how systems
engineering will support the translation of system capability needs into a cost-effective,
suitable, and sustainable product. The SEP is a living document and, as a minimum,
shall address the following:
E10.4.2.1. The systems engineering processes to be applied. The SEP shall describe
what systems engineering processes will be used, how these processes and controls are
to be tailored and applied in each acquisition phase, how the processes support the
technical and programmatic (cost, schedule, performance) metrics and products required
of the phase, how the application and integration of the processes are reflected in the
program’s integrated master plan and integrated master schedule, and how the
processes enable risk mitigation and support the successful completion of technical
reviews.
E10.4.2.2. The system’s technical baseline approach. The SEP shall describe how the
system technical baseline (i.e., functional, allocated and product baselines) will be
developed, managed, and used to control system design, integration, verification, and
validation.
E10.4.2.3. Systems engineering integration on program teams. The SEP shall describe
how systems engineering activities, information, and products will be integrated within,
and coordinated across, program teams (e.g., IPTs). Team resources, staffing,
organizational structure, management metrics, and integration mechanisms shall also be
addressed.
E10.4.2.4. The timing and conduct of systems engineering technical reviews.
(See Technical Reviews below.)
E10.5. Technical Reviews. Technical reviews shall be conducted to provide the
Program Manager with an integrated (program team and independent subject matter
experts) assessment of program technical risk and readiness to proceed to the next
technical phase of the effort. Technical Reviews shall be event driven (vice schedule
driven) and conducted when the system under development satisfies review entry
criteria. Accordingly, at a minimum, technical reviews shall be conducted at the transition
from one acquisition phase to the next (e.g., a Systems Requirements Review to support
transition from Technology Development to System Development and Demonstration)
and at major transition points of technical effort (e.g., a Preliminary Design Review to
support transition from preliminary design to detailed design). An independent technical
authority, commensurate with program scope and risk and approved by the MDA, shall
chair the technical reviews. The process and requirements for technical reviews shall be
addressed in, and required by, contractual documents. Results of the technical reviews
shall be documented in the SEP.
E10.6. Systems Engineering in Total Life Cycle Systems Management. Systems
engineering shall continue during operation and support of the system and continuously
assess fielded system technical health against documented performance requirements
and effectiveness, suitability, and risk measures. In-service systems engineering shall
provide the program manager with an integrated technical assessment of system trends
and sustainment alternatives and then oversee development and implementation of the
selected alternative.
Workshop on
Systems Engineering
for Robustness
DAY TWO: Focus Groups
June 2004
Accelerate capture and dissemination of key considerations and best practices
Broaden practitioner knowledge on current practices and emerging methods
Dr. Sambur’s
Challenge to the Workshop
„ Quantify Value of Systems Engineering
  – Growing set of historical data shows upfront SE expenditure pays off
  – Emerging methods allow quantifiable and predictive assessment of design and program options
„ What does good SE look like?
  – Key questions list gives indicators of good SE process – more work to be done
„ Guidelines for spiral-capable systems
  – Opportunities identified as part of SWOT
  – Emerging methods help quantify cost (real options, trade space analysis, etc.); see the sketch below
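The following is an illustrative sketch only, not workshop material: a small Python fragment valuing a hypothetical option to expand a system's capability in a later spiral with a standard binomial lattice, the kind of real-options calculation referenced above. The expansion cost, up/down factors, discount rate, and number of periods are invented placeholders.

```python
# Illustrative only: binomial-lattice valuation of a hypothetical option to
# expand a system's capability in a later spiral. All numbers are placeholders.

def expansion_option_value(v0, cost, up, down, rate, periods):
    """Value an American-style option to pay `cost` and capture the uncertain
    mission value `v` at any period, on a recombining binomial lattice."""
    p = (1 + rate - down) / (up - down)          # risk-neutral up probability
    # terminal mission values after `periods` steps (k = number of up moves)
    values = [v0 * (up ** k) * (down ** (periods - k)) for k in range(periods + 1)]
    option = [max(v - cost, 0.0) for v in values]
    # roll back through the lattice, allowing early exercise at each node
    for step in range(periods - 1, -1, -1):
        values = [v0 * (up ** k) * (down ** (step - k)) for k in range(step + 1)]
        option = [
            max(values[k] - cost,
                (p * option[k + 1] + (1 - p) * option[k]) / (1 + rate))
            for k in range(step + 1)
        ]
    return option[0]

if __name__ == "__main__":
    # Hypothetical inputs: expected mission value of the expanded capability,
    # cost to exercise the expansion, volatility expressed as up/down factors.
    print(expansion_option_value(v0=100.0, cost=90.0, up=1.3, down=0.8,
                                 rate=0.05, periods=4))
```

Read this as a sketch of the technique, not a validated valuation model; a real application would calibrate the lattice parameters to program-specific uncertainty.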
Architecting
Focus Group
Chair: Rashmi Jain, Ph.D.
Assistant: Ryan Boas, MIT
Robust Systems Architecting
(SWOT)
„ Identify 'Family'/'Class' of architectures to provide options/trade-offs – W, O
„ Systems Architecting for a 'Class' of threats, scenarios (including future scenarios), and environments, not a specific environment – W, O
„ Isolate things that change from things that don't in order to insulate against dramatic change through layering, modularity, partitioning – S, O
„ Identify attributes of Systems Architecture that would allow supporting variable requirements – W
Robust Systems Architecting
(SWOT)
„ Predicting probability and impact of change and making explicit assumptions. Involve customers in prioritizing dimensions and levels of robustness (range, fuel efficiency, performance, etc.) – W, T
  – Threat of not getting agreement from customers on what's important.
„ Architecting for unprecedented systems. Use of domain-specific knowledge, data collection, prototyping, modeling; develop concepts for bounding uncertainties. – O
„ Need for good upfront Systems Engineering and investment is important for robustness – W, T
  – Threat of unavailability and inadequacy of funding
Robust Systems Architecting
(SWOT)
„ Need for Systems Architecting for manufacturability and sustainment (lifecycle considerations) – W, O
„ Open architecture for integrating components between multiple vendors – W, T
  – Threat of government policy and politics, and commercial market pressure.
„ Systems Architecting for 'Planned Evolutionary' systems – W, O
„ Lack of multi-disciplinary (high-level) perspective for Systems Architecting – W, O, T
  – Threat of resistance to bringing in people from other, non-traditional disciplines.
Robust Systems Architecting
Key Questions
„ What are your key (domain-specific) parameters of robustness? (or "What changes are expected?") (robustness)
  – Examples: threat changes, technology changes, CONOPS changes, environmental changes, op tempo changes, scalability
„ What assumptions is your architecture dependent on? What happens to the architecture if these assumptions change? (robustness)
„ Describe how the system architecture will be audited/assessed. (enabler)
„ What SA metrics will you use? How do you define them? (robustness)
„ What is the range of variation on the top three parameters of your requirements? (robustness)
Robust Systems Architecting
Research Questions
„ What are the domain-specific characteristics of robustness?
„ Application of real-options analysis to historical data?
„ How can you use mathematical techniques to quantify and evaluate the robustness of the system architecture? (one illustrative approach is sketched below)
„ How can the SA metric set be characterized?
„ What is the impact of system robustness on cost and mission effectiveness of SoS?
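One possible, purely illustrative answer to the "mathematical techniques" question above: score each candidate architecture across a scenario set and summarize robustness as the worst-case fraction of nominal performance retained. The architectures, scenarios, and scores below are hypothetical placeholders, not workshop data.

```python
# Illustrative sketch: one simple way to quantify architecture robustness.
# Each architecture is scored in every scenario; robustness is summarized as
# the mean and worst-case fraction of nominal performance retained.

performance = {
    # architecture: {scenario: performance score}, all values hypothetical
    "arch_A": {"nominal": 0.90, "new_threat": 0.70, "new_CONOPS": 0.80},
    "arch_B": {"nominal": 0.80, "new_threat": 0.78, "new_CONOPS": 0.76},
}

def robustness_summary(scores, nominal="nominal"):
    off_nominal = {s: v for s, v in scores.items() if s != nominal}
    retained = [v / scores[nominal] for v in off_nominal.values()]
    return {
        "nominal": scores[nominal],
        "mean_retained": sum(retained) / len(retained),
        "worst_retained": min(retained),
    }

for arch, scores in performance.items():
    print(arch, robustness_summary(scores))
```

Under these placeholder numbers, arch_B gives up nominal performance but degrades less in the off-nominal scenarios, which is exactly the kind of trade the key questions above are probing.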
Systems Development
Focus Group
Chair: Dwight Holland, Ph.D.
Moderator: Donna Rhodes
Assistant: Ryan Whitaker, MIT
Systems Development
Challenges
„ Developing systems that have a reasonable ability to be adaptable and flexible over time, depending upon the context/missions the system operates in
„ Better assessment and anticipation of future states with regard to the future threat environment that the system may reasonably be expected to operate in
„ While systems design may need to be frozen at a given point in time, robust systems can adapt within reason to changing requirements – flexible, adaptable
„ Gaining a better understanding of long-term risks for collaborative systems in spiral development
„ Fostering a culture where people are more accepting that other real-world systems will change unilaterally; power structure should not impede information flow
„ Designing products with portable modularity that are relatively seamless across platforms (software is a good example) – very important for military applications
„ Fostering a joint SE-PM culture for spiral development/evolutionary acquisition
„ Understanding what the highest-value activities for systems engineering are, and adapting processes given the unique characteristics of the program
„ Better communication, both internally and externally, with regard to reviews of system requirements for the various system disciplines
Systems Development
Strengths & Weaknesses
„ Strengths
  – Existing SE processes & structures, properly applied, lead to good results
  – Significant levels of process definition and training exist within organizations
  – Growing recognition that SE is critical to success of programs
  – Effective use of IPTs in development of requirements
  – Generally good basic science/technology and engineering concepts
  – Technology allows for better integration and verification capabilities
  – Many tools have advanced the systems development capabilities
  – Some modeling & simulation tools exist for understanding alternative system scenarios
„ Weaknesses
  – Effective systems integration (including human-systems integration), lessons learned, and state-of-the-art knowledge not often enough applied at front end of systems design process
  – We don't fully account for the 'n' dimensionality of FoS & SoS complexities vs classical systems
  – We often do not consistently assess multiple problems and opportunities given changing environment, new technologies, likely threats, etc.
  – SE approaches are often incompatible for HW & SW systems
  – We lack good measures/leading indicators of systems engineering of robust systems
  – Not enough people trained in "contemporary" best systems engineering practices
  – Often insufficient upfront user needs and requirements definition
  – Results from operational research, test and development do not often effectively get fed back into the systems design and acquisition processes
  – More modeling & simulation tools need to be created & implemented more consistently
Systems Development
Opportunities
„ Improve dissemination of verified SE best practices and lessons learned from 'failures'
„ Modern technology allows for enhanced capabilities for more rapid/accurate assessments, prototyping, and feedback in evolutionary programs
„ Better justification can be made for funding engineering for robustness/systems engineering as an important component of the larger acquisition process
„ Technology and database capacity have opened the door for better FoS/SoS assessment tools that were not previously within technical reach
„ Technology now exists for better data fusion and visualization, and needs to be considered as a high priority when interacting with complex data sets
„ Online collaboration encourages data exchange/communication – could greatly help the up-front part of the process
„ Creation of a methodology for evaluating potential future states, thus improving adaptability & flexibility for robust systems
Systems Development
Threats
„ Poorly defined user needs/requirements management threatens performance outcomes
„ Current incentives drive unbalanced focus on cost/schedule versus robust system performance
„ Divergence of ideas on how and when to implement the SE process
„ Inability to quantify the value added of good systems engineering in the development process
„ Despite best intentions, too much ad hoc systems development (e.g., fielding prototypes) may lead to less robust SE processes
„ Over-emphasis on "exotic" technologies when they are not the optimal solution to the problem
„ Not being able to distinguish between the need for fixed versus flexible requirements
„ Accelerated development cycles can go only so far before unacceptably high risk is incurred
„ Impending loss of systems engineering subject matter expertise
„ Software engineering trying to operate independently of systems engineering
„ Lack of cross-training between systems & software
„ Current emphasis by senior leadership on systems engineering may disappear if not completely institutionalized
Systems Development
Research Questions
„ Robust SE process research questions:
  – What are the common attributes of the systems engineering process of some particularly robust systems?
  – What are the SE risks & consequences of an accelerated procurement?
  – What can be done to create 'smart' tools/techniques (esp. 'smart' w.r.t. software) for implementing, evaluating, & improving the SE process?
  – How do we extract the most information from past failures and successes? How did any of these affect policy? Were any of these policy changes still in place 5 years later? 10? Was there a 'pendulum' effect?
  – What is the value added by robust approaches to SE? How can this be quantified? Could this be done by analyzing the costs and benefits of having done the SE differently (with an eye for robustness) for well-known failures?
„ SE for robust systems research questions:
  – What can be done to develop better human-system interface testing procedures? (e.g., flight test and other complex person-rated systems)
  – What has worked in the past to tie user needs more directly to the developer? (i.e., how do we achieve direct communication with the user in order to develop the right product the first time?) What methodologies or tools can be created to help this? How can online collaboration help accomplish this?
  – How can we better communicate lessons learned from database evaluations of incidents, mishaps, & usability studies to the front end of the process for all future development of similar systems?
  – Can tools & methodologies be developed for determining effective system life and planning system upgrades?
  – When should one create a disposable system versus a robust one? What metrics help determine this?
Systems Development
Key Questions
„ How are you evaluating the scalability or expandability of the architecture of the system?
„ What is your approach for ensuring interoperability now and in the future? How will you decide which external systems not to interoperate with?
„ How are you evaluating compatibility of your spiral concepts and any impact on the baseline system?
„ How do you accommodate technology insertion into the system? What methodology are you using to assess the maturity of the technology being proposed for the solution?
„ How will you show how well you accommodate changes in the operational context?
„ How are you going to model the system early to ensure robustness? How are the models and the modeling process going to evolve through the life of the system?
„ How are you managing the operational risks that will occur after product completion?
„ What types of assessments and testing are being used to evaluate success in new concepts or technologies used in the development (e.g., are you running pilots)? (related to spiral)
„ How are you going to evaluate alternative requirements (capabilities) to determine a set that will lead to a viable set of requirements to which we can design? (pre-Milestone A)
Sustainability
Focus Group
Chair: Annalisa Weigel, Ph.D.
Assistant: Matt Richards, MIT
Sustainability
Challenges
„ Sustainability doesn't get a seat at the table early on
„ No cradle-to-grave mentality (different owners of system phases)
„ Misalignment of incentives across lifecycle of system
„ Total fiscal policy (funding stovepipes) deters making lifecycle cost-benefit trades and having total ownership cost as a metric
„ Too short-term a focus; project managers don't want to pay for options beyond what's written in the spec
„ Insufficient attention early on in acquisition process
„ Responsibility without authority
„ Poor or lacking metrics, data, and benchmarks for successfully addressing sustainability
„ Sustainment not uniformly a part of systems engineering historically
„ Maintenance of industrial base for low-rate production over long timeframes
„ Lack of education on importance of sustainability
„ Lack of overarching architecture for sustainability
Sustainability
Opportunities
„ Give sustainability a seat at the system requirements/design table from day 1
„ Long-term relationship with contractors and suppliers
„ Rate contractors at least partially based on sustainability of product
„ Incorporate lifecycle sustainability into selection process
„ Educate acquisition workforce and contractor workforce on assessment of supportability/sustainability of systems
„ Bridge legal/competitive/cultural hurdles to share sustainability lessons learned
„ Establish metrics and data collection for benchmarking, monitoring progress, and doing trade studies
„ Make a sustainability plan a requirement for milestone decisions
Sustainability
Threats
„ Uncertainty/credibility in scenarios used for assessing sustainability
„ Transition from requirements to capabilities doesn't happen (focus on function not form)
„ Not maintaining flexibility in requirements long enough in the design process to properly address sustainability issues
„ Focus on systems engineering and sustainability fades in favor of short-term cost and schedule concerns
„ Erosion of workforce, diminishing manufacturing capability, lack of people entering engineering, changing cultural attitudes towards instant gratification
„ Corporate response to systems engineering effort limited to responding to current USAF focus and doesn't become institutionalized
Sustainability
Key Questions
„ Is sustainability being considered at all phases of system acquisition and design?
„ Does the systems engineering plan reflect the importance of sustainability?
„ Is sustainability being given appropriate weight and consideration in trade studies?
„ Are contracts being written to emphasize/incentivize sustainability as a key lifecycle performance parameter (question for MDA to ask)?
„ How will you ensure adequate suppliers to develop, maintain, and operate the system throughout its lifetime?
„ Are all stakeholders educated on how to understand and appropriately evaluate sustainability considerations in system design?
„ What new hardware technology are you using in the system, and what is your plan for maintaining and sustaining it? What is your plan for sustaining software and system interfaces in the face of changing standards over the system life?
Organizational Factors
Focus Group
Chair: Eric Rebentisch, Ph.D.
Assistant: Victor Tang, MIT
Organizational Factors
Challenges
„ Commitment to robustness is fragmented without government champions at all levels of a program, SoS multi-stakeholders, specific requirements, or metrics.
„ Prevailing culture creates an environment that drives government and industry to compromise under political pressures, resulting in unrealistic schedules and costs.
„ Neither government nor industry has effective organizational structures or processes to address systems engineering robustness.
„ Without government incentives, the strategic context of robustness is viewed in isolation, without addressing technology roadmaps, investments, and manufacturing, potentially risking making robustness a tactical "flavor of the month."
„ There are insufficient SE personnel in the workforce, in the pipeline, or in training.
Organizational Factors
SWOT
STRENGTHS
„ Leadership
  – Sponsorship (DoD, AF, Navy)
  – Policy & Guidance
  – Independent Reviews
  – Contracts Innovation (Virginia Sub, AAAV)
„ Government/industry/academia teaming to improve acquisition excellence
  – Workshops/Working Groups (LAI, NDIA, INCOSE, IEEE)
  – CMMI
„ Education & training
  – DAU, AFIT, CSE
  – Universities (BS, MS, certificate programs)
  – Industry (internal, SEI, SPC)
  – Best practices
WEAKNESSES
„ Evolutionary acquisition (spiral, incremental), SoS, NCO, and collaborative systems stress current SE methods, practices, and tools
„ Lag between Government policy and implementation
„ Don't follow processes (shelfware)
„ False sense of security (ISO, CMM-SW, CMMI)
„ Over-reliance on process versus content/problem-solving
„ Dwindling knowledge base in Government and industry
OPPORTUNITIES
„ Being early in the process, we can affect government and industry direction
  – Define criteria for assessing robustness
  – Modify procurement processes
  – Develop needed personnel skills and culture
„ Given government, industry, and university interest, we can couple research and practice
  – Characterize industry successes and failures against proposed solutions
  – Create working groups to study issues and potential solutions (problem complexity requires more than a 2-day workshop)
THREATS
„ Inadequate or inappropriate measurement requirements
  – Premature galvanizing on an immature set of metrics
  – Excessive number of metrics
  – No one size fits all
  – Ambiguity and vagueness of "robustness" in RFPs and contracts
„ Inadequate resources for SE revitalization (people, funding)
  – Evolutionary Acquisition/Robustness budget risk
„ Inadequate buy-in vertically and horizontally
  – Viewed as a fad and not taken seriously
  – Lack of leadership communication/involvement
„ Inadequate evolution of environment and culture
  – Processes, methods, and tools
  – Stress disciplined SE in addition to cost & schedule
  – Incorporation (contracts, training, PMs, organization)
  – Standards, tools
  – Incentives
  – Acceptance (fad)
  – Poor execution
Organizational Factors
Recommendations
„ Define robustness in practical terms
  – How it is measured/criteria for assessing
  – Identify/publicize examples of current program successes
  – Identify/quantify/publicize consequences of not addressing
„ Establish working groups for both government and industry
  – Address needed changes in procurement procedures for use of robustness criteria within evolutionary development
  – Develop standard RFP language with expected activities and products
„ Develop government and industry personnel who understand how to achieve robust SE solutions
  – Establish government and industry SE career paths, and requisite training & mentoring
  – Make government personnel aware of SE training opportunities
  – Develop strategies for organizing programs to encourage robust SE
  – Develop strategies to accelerate maturation of SE engineers
„ Provide recognitions and rewards
  – Contractual incentives
  – Public awards
  – Recognize horizontal robustness (systems of systems involving multiple contractors)
Organizational Factors
Key Questions
„ What are your organizational structure and process to tell you what can go wrong?
„ What measures are in place to provide insight into the risk of inadequate SE performance?
„ If there are incentives for robustness, what are the criteria for success or failure?
„ Do you have formal SE development and mentoring for employees?
„ What is the SE approach used, and is it effective? Why or why not?
„ What are your limiting factors to flexibility, scalability, schedule, and cost?
„ What is the process to drive robustness throughout the supply chain?
„ How are the budgetary risks for evolutionary acquisition being addressed?
„ Do you have a best practices compendium, and is it shared throughout the supply chain?
Organizational Factors
Research Questions
„ Define proven measures and leading indicators for robustness (not one-size-fits-all)
„ Short- and long-term strategies for accelerating the development of competent systems engineers
„ Organizational governance models for systems of systems and families of systems
Emerging Methods
Focus Group
Chair: Hugh McManus, Ph.D.
Assistant: Adam Ross, MIT
Emerging Methods and Applications
„ Risk analysis
  – Portfolios, real options, utility analysis, flexibility
„ System and SoS Design
  – Optimization, design under uncertainty, emergent behavior
„ Modeling languages and tools for robustness
  – SysML, AP233, Sim-based Acquisition
„ Design for X/change
  – DFSix Sigma, DFManufacturing, DFProductivity
„ Tradespace Exploration (see the sketch below)
  – Requirements variation, markets
„ Product line/platform engineering
  – Modularity, spiral, cross-disciplinary, COTS
„ Technology insertion
  – Virtual test
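As a minimal illustration of the tradespace-exploration entry above (not a method endorsed by the focus group), the sketch below enumerates a small hypothetical design space, evaluates each design with placeholder cost and utility models, and retains the Pareto-efficient set. The design variables (payload, orbit altitude, redundancy) and model coefficients are invented.

```python
# Minimal tradespace-exploration sketch: enumerate a small hypothetical design
# space, evaluate cost and utility with placeholder models, and keep the
# Pareto-efficient designs (lowest cost for a given utility). Illustrative only.

from itertools import product

def evaluate(design):
    """Placeholder cost ($M) and utility models for a (payload, altitude_km, redundancy) design."""
    payload, altitude_km, redundancy = design
    cost = 50 + 10 * payload + 0.02 * altitude_km + 15 * redundancy
    utility = min(1.0, 0.1 * payload + 0.0004 * altitude_km + 0.1 * redundancy)
    return cost, utility

def pareto_front(evaluated):
    """Keep designs not dominated by another design with lower-or-equal cost and higher-or-equal utility."""
    front = []
    for d, (c, u) in evaluated:
        dominated = any(c2 <= c and u2 >= u and (c2, u2) != (c, u)
                        for _, (c2, u2) in evaluated)
        if not dominated:
            front.append((d, (c, u)))
    return front

designs = product(range(1, 4), (400, 800, 1200), (0, 1))   # payload, altitude, redundancy
evaluated = [(d, evaluate(d)) for d in designs]
for d, (cost, utility) in pareto_front(evaluated):
    print(f"design={d} cost={cost:.1f} utility={utility:.2f}")
```

Real tradespace studies replace these toy models with physics-based performance models and elicited utility functions, but the enumerate-evaluate-filter structure is the same.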
Emerging Methods
Challenges
„ Inserting mature robustness methods into existing and new programs
  – E.g., probabilistic methods, risk management
„ Maturing, validating, and inserting emerging methods and tools
  – E.g., tradespace exploration, real options, product lines, model-based X, etc.
„ Sustained funding for the above
  – Proving ROI
Emerging Methods
Strengths & Weaknesses
„ Strengths
  – Point solution and deterministic analysis tools
  – Monte Carlo and risk methods (see the sketch below)
  – Robustness analysis at component and information system level
„ Weaknesses
  – Inability to specify robustness requirements
  – Lack of standards and tools for system and SoS robustness analysis
  – Deterministic traditional engineering thinking, no emphasis on robustness
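To make the "Monte Carlo and risk methods" strength concrete, here is a minimal sketch of propagating uncertain element costs to a program-level cost-risk figure. The cost elements, distributions, and parameters are invented for illustration only.

```python
# Minimal Monte Carlo cost-risk sketch: propagate uncertain element costs to a
# program-level distribution and report the mean and an 80th-percentile figure.

import random

def sample_program_cost(rng):
    airframe    = rng.triangular(80, 150, 100)   # (low, high, mode), $M, notional
    software    = rng.lognormvariate(3.4, 0.4)   # right-skewed software cost, notional
    integration = rng.triangular(10, 40, 20)
    return airframe + software + integration

def summarize(n=20000, seed=1):
    rng = random.Random(seed)
    totals = sorted(sample_program_cost(rng) for _ in range(n))
    mean = sum(totals) / n
    p80 = totals[int(0.8 * n)]                   # cost at 80% confidence
    return mean, p80

if __name__ == "__main__":
    mean, p80 = summarize()
    print(f"mean ~ {mean:.0f} $M, 80th percentile ~ {p80:.0f} $M")
```

The gap between the mean and the 80th-percentile figure is one way to express the cost-risk reserve discussed elsewhere in the workshop.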
Emerging Methods
Opportunities & Threats
„ Opportunities
  – Rigorous, quantitative upfront SE – improved system lifecycle cost and performance
  – Robust design
  – Probabilistic tools at program/enterprise level
  – Extend/leverage methods from other domains
„ Threats
  – Valuation of emerging SE/robust methods
  – Premature/inappropriate tool implementation
  – No clear funding source
  – Required robustness exceeds method capability
Emerging Methods
Research Questions
„ Applications of real options in space missions
„ Maturing and validation of advanced methods
„ Development/extension of methods to handle Systems of Systems
„ Quantifying value of options/flexibility/upfront SE/good SE (see the sketch below)
„ Perform a quantitative assessment of historical program data (current and when SE processes were more prescriptive) to validate expert opinion for policy and training
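As a toy illustration of "quantifying value of options/flexibility," with all costs and probabilities hypothetical, the sketch below compares the expected lifecycle cost of a fixed, fully sized design against a flexible design that pays a premium now for the option to scale up only if demand materializes; the difference is one crude measure of the value of flexibility.

```python
# Illustrative sketch: expected-cost comparison of a fixed design versus a
# flexible design with a deferred upgrade option. All numbers are placeholders.

P_HIGH_DEMAND = 0.4           # probability the larger capability is ever needed

FIXED_DESIGN_COST = 150.0     # $M: build full capability up front
FLEX_INITIAL_COST = 100.0     # $M: smaller build plus flexibility premium
FLEX_UPGRADE_COST = 70.0      # $M: exercised only under high demand

def expected_cost_fixed():
    return FIXED_DESIGN_COST

def expected_cost_flexible():
    return FLEX_INITIAL_COST + P_HIGH_DEMAND * FLEX_UPGRADE_COST

if __name__ == "__main__":
    fixed, flexible = expected_cost_fixed(), expected_cost_flexible()
    print(f"fixed: {fixed:.0f} $M, flexible: {flexible:.0f} $M, "
          f"expected value of flexibility: {fixed - flexible:.0f} $M")
```

A fuller treatment would discount the deferred upgrade and treat demand as a distribution rather than a single probability, but even this simple comparison shows how the research question could be made quantitative.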
Emerging Methods
Key Questions
„ With what methods:
  – are requirements and their changes tracked and accommodated?
  – are technologies and their changes tracked and accommodated?
  – are design alternatives and evolution paths identified, evaluated, and chosen?
  – are SoS issues (e.g., emergent properties, interfaces) handled?
„ How are these methods integrated into current program practices?
„ How are your methods validated?
„ How are methods evolved or developed as needed?
„ What budget reserves and design margins are allocated to system evolution?
„ How are budgets and margins defended?
Next Steps…
Focus Group
Backup Slides
Sys Dev Case Study
recommendations
„ Network-centric warfare
„ B-52 vs B-58
„ Software-compliant architecture (SCA)
Sys Dev Recommendations for the
AF Guide
„ Emphasize technical leadership in addition to technical management skills; clearly define roles, responsibilities, & accountability
„ SE work products (specs, trade studies, analyses) & requirements over the product lifecycle need to be respected as much as the end product design, and they need to be created with this in mind (i.e., systems engineers really need to make sure that their products are usable & easily understandable to design engineers)
„ Emphasize the SE role in supplier management & sustainment
„ Use active voice when talking about SE activities – it puts more emphasis on the fact that systems engineers are leading the technical effort