
Track 2 – Session 1

Tuesday, 13 August 2013

(13:30 – 15:30)

Developing “Testable” Capability-Level Requirements

Session Co-Chair: Bryan Herdlick, Ph.D.

Session Co-Chair: Eileen Bjorkman, Ph.D.


Track 2, Session 1

Testable Capability-Level Requirements

• Tuesday, 13 August 2013 (13:30 – 15:30)

• Objectives:

– Establish the challenge / issue and the associated “body-of-knowledge” gap

• Identify documentation that substantiates the research challenge

• Establish a common foundation for discourse

– Discuss SoS-level requirements and testing

• Goal: Identify prospective paradigm shifts and potential solutions

– Establish a context for subsequent collaboration and/or future research

• Foundation: Relevant contemporary references

• Foundation: Panelist / audience experience and comments

• Flow:

– (10 min) Introduce Panel

– (10 min) Establish topic landscape and ‘boundaries’ for this session

– (80 min) Discuss key questions (15-20 minutes per question, nominal)

– (15 min) Consider prospective solution-space

– (5 min) “Take-aways” and “next steps”

Panelist Introduction

• Suzanne Beers, Ph.D.

– Panelist

– MITRE

• Systems Engineer

– Supporting: Office of the Secretary of Defense

• Space & Missile Defense

– Retired Military

• Engineering & Test positions in Space Systems and Space Warfare (numerous)

• NRO: Deputy Director, Systems Engineering

• USAF Operational T&E Center, Detachment 4: Commander

– sbeers@mitre.org


• Eileen Bjorkman, Ph.D.

– Panelist, and Session Co-Chair

– USAF Test Center

• Senior Executive & Technical Advisor

– Retired Military

• Flight test engineer

• Instructor

• Test squadron commander

• Director and staff positions associated with MS&A and Joint T&E (numerous)

• Chief of M&S Policy Division, Warfighter Integration & Deployment

– eileen.bjorkman@edwards.af.mil

Panelist Introduction

• David Gill

– Panelist

– Boeing

• System Test Engineer

– Activities

• Integration & Simulation

• Electromagnetic Test

• Test Ranges

– Experience

• F/A-18 Hornet

• F-15 Eagle

• CH-47 Chinook

• Space Shuttle & Space Station

• Various Proprietary Systems

– david.j.gill2@boeing.com

• Bryan Herdlick, Ph.D.

– Session Co-Chair & Facilitator

– JHU Applied Physics Lab

• Systems Engineer

– Supporting: NAVAIR

• SoS Systems Engineering

• Integration of M&S and T&E

– Retired Military

• F-14 Tomcat RIO and WTI

• Flight Test NFO (F-14, F/A-18)

• Navy T&E Policy / Oversight

• Management (AIM-9X)

– bryan.herdlick@jhuapl.edu


Panelist Introduction

• Katherine Morse, Ph.D.

– Panelist

– JHU Applied Physics Lab

• Principal Professional Staff

• Computer Scientist

– Activity: Technical research that enhances distributed simulation

– Experience & Specialization

• Simulation Interoperability Standards

• Live / Virtual / Constructive (LVC) Federation Engineering

– katherine.morse@jhuapl.edu

• Steve Scukanec

– Panelist

– Northrop Grumman Corp.

• Aerospace Engineer

• HALE Enterprise T&E Manager

– Activity: High Altitude Long Endurance Family of Systems

• Test Planning & Requirements

• Test definition (Lab and Flight)

– Experience / Programs

• B-2 Spirit

• F-35 Joint Strike Fighter

• NASA, DARPA, MDA

• Co-Chair NDIA DT&E Committee

– stephen.scukanec@ngc.com


Panelist Introduction

• Frank Serna

– Panelist

– Draper Laboratory

• Director: Systems Engineering Directorate

– Experience:

• Draper SE activities, including:

– Trident II guidance

– Guided munitions

– NASA manned space

– UAS autonomy…

• Member Missile Defense National Team, Systems

• Co-chair NDIA SE Division

• Chapter President, AUVSI New England

– fserna@draper.com

• George Wauer

– Panelist

– Developmental Evaluations, LLC; Touchstone POCs, LLC

• President & CEO

– Activity: OSD Consultant

– Experience / Specialties

• OSD SES (20 years)

– OT&E of interoperability and cyber aspects

– Command and Control T&E

• NRO / Space

– gw.touchstone@me.com


Topic Landscape

Literature Review

“Challenges” = Body-of-Knowledge Gaps

Terms of Reference

Discussion boundaries / guidelines


Literature Review Highlights

SoS-level ‘requirements’

– Absent

• SoS-based capability not documented / funded

• Conflicting priorities across constituent systems

– May require a new approach and/or lexicon?

“Capability Objective” (SoS-level)

– Human / Operator Performance not considered

• Human as a constituent system

• Unique metrics

• Unique testing requirements

Supporting references and excerpts are detailed in back-up slides

Literature Review Highlights

SoS-level ‘testing’

– May require a new approach

• “Capability Characterization” relative to intended utility

• Operational context is paramount

– “CONOPS” at the capability level

• Full-scale SoS required

– Asynchronous development & fielding are problematic

– “Testing” ≠ Test ≠ T&E

• Verify / Validate = Test + Analyze + Inspect + Demonstrate

– Balanced use of Live / Virtual / Constructive venues

Supporting references and excerpts are detailed in back-up slides


Terms of Reference

Capabilities & Requirements

• For purposes of this discussion…

Capability (CJCSI 3170.01C)

• “The ability to achieve a desired effect under specified standards and conditions through combinations of means and ways to perform a set of tasks. It is defined by an operational user and expressed in broad operational terms in the format of a joint or initial capabilities document…”

Requirement (Requirements Process; CJCSI 3170.01C)

• “The requirements process supports the acquisition process by providing validated capabilities and associated performance criteria to be used as a basis for acquiring the right weapon systems.”

Terms of Reference

Threshold vs. Objective

• For purposes of this discussion

Threshold (CJCSI 3170.01C)

• A minimum acceptable operational value below which the utility of the system becomes questionable.

Objective (CJCSI 3170.01C)

• The desired operational goal associated with a performance attribute beyond which any gain in utility does not warrant additional expenditure. The objective value is an operationally significant increment above the threshold. An objective value may be the same as the threshold when an operationally significant increment above the threshold is not significant or useful.

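A purely hypothetical illustration of the distinction (the values below are invented for discussion and are not drawn from any program): for a notional sensor, a detection-range threshold of 50 nm would mark the minimum acceptable operational value, while an objective of 75 nm would mark the point beyond which additional range no longer warrants further expenditure; if no operationally significant increment above 50 nm can be identified, the objective may simply equal the threshold.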

Capability Objectives & Metrics

National Defense Industrial Association (2011)

“Addressing SoS Acquisition Questions”

(November 2011 DRAFT Out-Brief)

– Capability Objectives (…for the SoS in its entirety)

• Measured in terms of utility to the user

• Best evaluated in field / exercise / operational environment(s)

– External factors and complexity become more apparent

– Metrics

• Must be traceable to Capability Objectives

• Remain in-force as the SoS-based capability matures

• Supports utility assessment

The DoD definition of “Objective” and the concept of a “Capability Objective” do not conflict, but there is a difference…

Terms of Reference

For purposes of this discussion…

– “Characterization”

• Progress toward a Capability Objective

– Implies a utility assessment in an Operational Context

• Even more dependent on thoughtful application of the full spectrum of:

– Test, Analyze, Inspect, Demonstrate

– Live / Virtual / Constructive


Terms of Reference

Complicated vs. Complex

• For purposes of this discussion…it won’t matter…

– Complicated (Webster’s On-Line Dictionary)

• Consisting of parts intricately combined.

• Difficult to understand, solve or explain.

– Complex (Webster’s On-Line Dictionary)

• A whole made up of complicated, interrelated parts.

• A group of obviously related units of which the degree and nature of the relationship is imperfectly known.

– Synonyms (Webster’s On-Line Dictionary)

Complex, complicated, intricate, involved…

Let’s agree to let this one “slide” during our session…(please!)


Topic Landscape & Session Focus

[Diagram: topic map of the SoS problem space, including “T&E” (Characterization?), M&S, “Requirements” (Measures / Metrics), Prototyping, “Architecture”, Trade Space Exploration, “Emergence”, Operators as Constituent Systems, Management Challenges, Terminology & Definitions, Systems Theory / Thinking, DoD Acquisition, and Security]

Keeping the Discussion Focused

[Diagram: the same topic map, annotated (“?”, “!”) to indicate the session focus — “Requirements” (Measures / Metrics), Operators as Constituent Systems, and “T&E” (Characterization?) — with the remaining topics (M&S, Prototyping, Trade Space Exploration, “Emergence”, Management Challenges, Terminology & Definitions, Systems Theory / Thinking, DoD Acquisition, “Architecture”, Security) at the periphery]

A question / premise to consider…

If our “requirements” and “test strategy” were viable at the SoS / capability level, the rest of the Complex System “problem space” might become more tractable…(?)

Questions & Discussion

Target Window: 13:50 – 15:10


Candidate Questions for Discussion

Should performance at the capability level be articulated differently from “traditional” platform- / system- / program-centric requirements?

Should “requirements language” be modified to better support mission-centric or capability-centric approaches to verification / test?

SYSTEM Requirements vs. SoS-LEVEL “Requirements”

13:50 – 14:20 (30 min)

Candidate Questions for Discussion

Are there “preferred” venues or methods for verifying complex systems performance, and should ‘requirements’ be tailored accordingly?

– T&E approach / context

• “Traditional” system-centric T&E

– Performance verification against threshold values and/or ‘specifications’

• Capability Centric / Mission Focused T&E

– Characterization of capability

WHAT we test vs. HOW we test

14:20 – 14:40

Candidate Questions for Discussion

Given that human operators constitute complex systems within the larger SoS, should human performance be reflected in capability-level requirements?

– What aspects of human performance should be considered at the SoS / capability level?

– What does this imply for T&E methods and resource dependencies?

WHAT we test vs. HOW we test

14:40 – 14:55

SoS Utility ≈ f (Human Performance)

Example: Measures of Effectiveness for Command & Control

Measures / Metrics (?):

• Cognitive Performance

• Decision Quality

• Decision Timeliness

• Planning Quality

• Planning Timeliness

Cooley and McKneely (2012), JHU/APL Technical Digest, 31(1)

Candidate Questions for Discussion

Does the role of an operational concept document (e.g., concept of operations, concept of employment, etc.) need to change to adequately support the development of SoS-based capabilities?

– If requirements at the capability level are different, must the format of CONOPS / CONEMPS / OCDs change to accommodate SoS designs?

Establishing CONTEXT for WHAT & HOW we test

14:55 – 15:10

Candidate Questions for Discussion

Is there additional complexity or concern associated with SoS-level safety and assurance?

– How does it differ from system-specific safety and assurance?

• Are new approaches to requirements and/or test methods required?


WHAT we test vs. HOW we test

→ Transition at 15:10 to “solution space”

Candidate Questions for Discussion

In what way is the incorporation of human operators in the design process different when developing a SoS-based capability?

– Is it necessary to insert human operators from the outset, or only when the hardware and software integration is complete and the SoS is available to “play with”?

This is an SE process question – consider skipping to stay on schedule

Human Operator as a System within the SoS

→ Transition at 15:10 to “solution space”

Solution Space (?)

Take-Aways / Wrap-Up

Target Window: 15:10 – 15:30


Solution Space (?)

National Defense Industrial Association (2011)

“Addressing SoS Acquisition Questions”

(November 2011 DRAFT Out-Brief)

– Capability Objectives

• Measured in terms of utility to the user

• Best evaluated in field / exercise / operational environment(s)

– External factors and complexity become more apparent

– Metrics

• Must be traceable to Capability Objectives

• Remain in-force as the SoS-based capability matures

• Supports utility assessment (capability characterization?)


“Capability Need Statement”

One possible format…

Focus: User Perspective & Utility

– The user needs to be able to ____________

• (insert: SoS-based capability objective)

– …in order to _____________

• (insert: utility statement; gap being addressed)

– …such that ____________.

• (insert: measure of effectiveness / metric)

• (address: what constitutes the first increment of utility)
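A purely hypothetical illustration of the format (the capability, gap, and metric below are invented for discussion and are not drawn from any program or reference):

– The user needs to be able to maintain continuous track custody of low-altitude contacts across a regional surveillance network…

– …in order to close the coverage gap that exists when individual sensor platforms operate independently…

– …such that a stated percentage of contacts entering the region is detected and handed off within a defined timeline, with the first measurable improvement over current performance constituting the initial increment of utility.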

SoS Utility ≈ f (composite system performance)

• Candidate utility statements and categories of metrics / measures:

– Increase service capacity

• New customers / markets

• New targets / nodes

– Improve service in adverse environments

• Weather

• Terrain

– “Over-the-horizon”

– Aquatic / Subterranean

• Electromagnetic

• Day / Night

– Increase extent of service

• Range (miles)

• Altitude

– Exo-atmospheric


– Improve persistence of service

• Availability

• Reliability

– Improve survivability / safety

• Reduce exposure

– Time

– Severity / level

• Reduce vulnerability

– Others?????

• Discussion…

HOW would such performance be “tested”?


SoS Utility ≈ f (human system performance)

• Candidate utility statements and categories of metrics / measures:

– Enhance decisionmaker effectiveness

• Accuracy / Error Rate

• Speed

• Consistency

– Within single role

– Across roles / positions

– Improve employment efficiency

• Relative to desired speed / timeline

HOW would such performance be “tested”?


“Litmus-Tests” for SoS-based Capabilities

ONLY successfully achieved and verified when constituent systems (or acceptable prototypes) are integrated and operational

• If human operators / decisionmakers are constituent systems within the SoS, has their performance been captured as a metric?

Uniquely delivered by a justifiable SoS-based design

• Do any of the constituent systems deliver the capability independently? If so, why is a SoS-based design required?

• Are multiple designs embedded within the SoS, and if so, why?

– i.e., Can the capability be achieved in more than one way within the SoS?

• Can the SoS design be simplified / reduced, and are there disadvantages (or advantages) to doing so?


Relevant Excerpts from the Literature Review

Systems Engineering & Requirements

• “…it is the responsibility of systems engineering to thoroughly analyze all requirements…vis-à-vis the basic needs that the system is intended to satisfy, and then to correct any ambiguities or inconsistencies in the definition of capabilities for the system…”

– Systems Engineering: Principles and Practice, Kossiakoff & Sweet

• “The output of Requirements Analysis is a technical description of characteristics the future system must have in order to meet Stakeholder Requirements – not a specific solution – which will be evolved in subsequent development processes.”

– INCOSE Systems Engineering Handbook (V 3.1)


Requirements → T&E

• “…it is a systems engineering responsibility to plan, lead and interpret the tests and their analysis in terms of what system design changes might best make the user most effective.”

– Systems Engineering: Principles and Practice, Kossiakoff & Sweet

• “The purpose of the Validation Process is to confirm that the realized system complies with the stakeholder requirements.”

– INCOSE Systems Engineering Handbook (V 3.1)

• The Validation Process “…is invoked during the Stakeholders Requirements Definition Process to confirm that the requirements properly reflect the stakeholder needs and to establish validation criteria…[and again] during the Transition Process to handle the acceptance activities.”

– INCOSE Systems Engineering Handbook (V 3.1)


Challenge / Gap (2013)

Trans-Atlantic Research & Education Agenda in SoS

SoSE Strategic Research Agenda

– Theme #5: Measurement and Metrics

• “…knowing what to measure and when to measure in order to determine SoS performance or likely future behaviours is inadequately understood.”

– Theme #6: Evaluation of SoS

• “…further work is needed to be able to evaluate SoS against their expected behaviour, desired outcomes, and in comparison with other possible configurations of the SoS…”

SoS-level Requirements and Testing are Problematic

Challenge / Gap

INCOSE Insight (July 2009)

INCOSE Research Plan: 2008-2020 (Ferris)

– Item 7: System-of-Systems & “Legacy” systems

• “Means to ensure assurance that a system of systems will deliver the capability intended.”

– i.e., How do we test at the SoS level?

The SoS topic development landscape is complex too…

Challenge / Gap (2013)

INCOSE International Symposium

System-of-Systems “Pain Points”

– Capabilities & Requirements (dedicated section)

• “The definition of SoS capabilities and translation to systems requirements is core to the application of [systems engineering] to SoS.”

• “…we often think about requirements differently when working on an SoS.”

– “In an SoS context, many people prefer to focus on capabilities and less on requirements…”

The relationship between Capabilities and Requirements

Challenge / Gap

INCOSE Insight (July 2009)

INCOSE Research Plan: 2008-2020 (Ferris)

– Item 7: System-of-Systems & “Legacy” systems

• “Methods to address issues arising from legacy systems in the design of new or updated systems.”

– e.g., Asynchronous development & fielding?

– e.g., Conflicting performance / capability requirements?

– Item 8: “Sundry other matters”

• “Means to predict and to design human-intensive systems.”

The SoS development process is complex too…

Challenge / Gap (2013)

Trans-Atlantic Research & Education Agenda in SoS

SoSE Strategic Research Agenda

– Theme #11: Human Aspects

• “For the technical community, human aspects are often the elephant in the room, recognised as a key aspect of the SoS, but either not included in the design or included retrospectively.”

• “…situational awareness [may be] compromised by the complexity of the SoS…”

• “…human error or unexpected behaviour features significantly in normal accidents…”

Human Operator = Constituent System within SoS

Challenge / Gap (2013)

Trans-Atlantic Research & Education Agenda in SoS

SoSE Strategic Research Agenda

– Theme #8: Prototyping SoS

• “…the evolutionary nature of SoS make(s) prototyping a troublesome proposition.”

• “…emergent behaviour may not be manifest until the system is fully deployed…”

• “…prototyping may need to take place within the operational SoS, rather than a test bed…”

Asynchronous Fielding → Prototyping & Testing Challenges

Challenge / Gap (2006)

USAF Scientific Advisory Board

Report on System-Level Experimentation

– Experiments: “The only way to explore the complexities of a system is through campaigns of

experiments, based on the proper venue, people and ideas. Combining these into a rigorous program of technology and CONOPS will create a deep understanding of what the future may be and how best to meet it.”

Experimentation → CONOPS → Technology → SoS → Capability

Bibliography & References


Bibliography

• The System of Systems Engineering Strategic Research Agenda (2013)

– Trans-Atlantic Research and Education Agenda in System of Systems (T-AREA-SoS) Project (www.tareasos.eu)

– Prof. Michael Henshaw, Loughborough University, UK (tareasos@lboro.ac.uk)

• INCOSE Research Agenda: 2008-2020

- Ferris, INCOSE Insight (July 2009)

• System-of-Systems “Pain Points”

- INCOSE International Symposium, 2013


Bibliography

• Report on System-Level Experimentation

- USAF Scientific Advisory Board, 2006

• Systems Engineering: Principles and Practice

- Kossiakoff & Sweet

• INCOSE Systems Engineering Handbook (V 3.1)

• Addressing SoS Acquisition Questions

- NDIA (November 2011 DRAFT Out-Brief)

Additional References

TESTABLE REQUIREMENTS: System-of-Systems / Complex Systems

• Simulation in Test & Evaluation, Allen, C.L., ITEA Journal, 2012

• Rapid Prototyping and Human Factors Engineering, Beevis & Denis, Applied Ergonomics, 23(3), 1992

• Combining Attributes for Systems of Systems, Chattopadhyay (et al.), INCOSE INSIGHT, 2010

• System of Systems Requirements, Keating, Engineering Management Journal, 2008

• Test as We Fight, O’Donoghue, ITEA Journal, 2011

• Systems Engineering and Test for Unprecedented Systems, Weiss (et al.), ITEA Journal, 2009

• Implications of Systems of Systems on System Design and Engineering, Proceedings of the 2011 6th International Conference on System of Systems Engineering, Albuquerque, New Mexico, June 27-30, 2011

• Requirements Engineering for Systems of Systems, Lewis (et al.), IEEE SysCon, 2009

Additional References

OPERATIONAL CONCEPT DOCUMENTS: CONOPS, CONEMPS, DRMs & OCDs

• American National Standards Institute / American Institute of Aeronautics and Astronautics [1992] Guide for the Preparation of Operational Concept Documents, ANSI/AIAA G-043-1992

• Chairman of the Joint Chiefs of Staff [2006] Joint Operations Concepts Development Process (JOpsC-DP), Instruction CJCSI 3010.02B

• Department of the Air Force, Secretary of the Air Force [2005] Air Force Concept of Operations Development, Instruction 10-2801, Pentagon, Washington, D.C. www.e-publishing.af.mil

• Department of the Navy, Fleet Forces Command (FFC) [2006] CONOPS TO DOCTRINE: Shaping the Force From Idea Through Implementation. Fleet CONOPS Guidance Brief.

• Department of the Navy, Fleet Forces Command (FFC) [2009] Fleet CONOPS Writer’s Guide, Version 1

• Department of the Navy, Office of the Assistant Secretary of the Navy (Research, Development & Acquisition) [2002] Technical Brief: Design Reference Mission Profile Development Guidelines, TB # ABM 1002-03, Pentagon, Washington, D.C. www.abm.rda.hq.navy.mil

• Department of the Navy, Office of the Chief of Naval Operations (OPNAV) [2010], Navy Concept Generation and Concept Development Program, Instruction 5401.9, Pentagon, Washington, D.C.

• Department of the Navy, Office of the Chief of Naval Operations (OPNAV) [2010], Developmental System Concept of Operations, Instruction 5401.xx (DRAFT), Pentagon, Washington, D.C.

• Department of the Navy, Naval Warfare Development Command (NWDC) [2010] Guide for Navy Concept Generation and Concept Development Program, Version 1.0

• Herdlick, B.E., [2011] Establishing an Operational Context for Early System-of-Systems Engineering Activities, Systems Research Forum, 5(2)

• Department of the Navy, Space Warfare Systems Command (SPAWAR) [2000] Data Item Description for Operational Concept Document, DI-IPSC-81430A

Additional References

HUMAN PERFORMANCE in a SoS / CxS context

• Command and Control Systems Engineering: Integrating Rapid Prototyping and Cognitive Engineering, Cooley & McKneely, JHU/APL Technical Digest, 31(1), 2012

• The Effects of Automation on Battle Manager Workload and Performance, Soller & Morrison, IDA Report (D3523) of January 2008

• Successfully Changing Conceptual System Design Using Human Performance Modeling, Mitchell, Naval Engineers Journal (peer review DRAFT; 2009)

• MORS Workshop of January 2012: “A Joint Framework for Measuring Command and Control Effectiveness” - see Group 4 Outbrief, in particular

• Optimizing Performance for Mine Warfare: A Case of Mission-Centered Design, Osga (et al.), Naval Engineers Journal (ASNE) (peer review DRAFT; 2009)

• An Obstacle Course Evaluation of AWACS 40/45 HMI Options, Donnelly & Vidulich, AFRL presentation, 28JUN2002

Additional References

INCOSE Compilation of 2013 (45 SoS ‘experts’ responded; 100 refs total)

References recommended by the most respondents were:

• Maier, Mark W. 1998. “Architecting Principles for System-of-systems.” Systems Engineering 1 (4): 267–284. [9 recommendations]

• Office of the Under Secretary of Defense (Acquisition, Technology and Logistics), Systems Engineering Guide for Systems of Systems [9 recommendations]

• Jamshidi, M. (Ed.), System of Systems Engineering: Principles for the 21st Century, Wiley, 2009 [7 recommendations]

Beyond this, there were eight references which were recommended by more than one respondent:

• Jamshidi, M. (ed). 2009. Systems of Systems Engineering - Principles and Applications. Boca Raton, FL, USA: CRC Press [4 recommendations]

• Dahmann, J.; Rebovich, G.; Lane, J.; Lowry, R. & Baldwin, K. An Implementer’s View of Systems Engineering for Systems of Systems, Proceedings of the 2011 IEEE International Systems Conference (SysCon), 2011, 212 – 217. [3 recommendations]

• DeLaurentis, D.A., “Understanding Transportation as a System-of-Systems Design Problem,” 43rd AIAA Aerospace Sciences Meeting & Exhibit, Reno, NV, 10-13 Jan. 2005. AIAA Paper No. 2005-123. [3 recommendations]

• BKCASE, Part 4, Systems of Systems [3 recommendations]

• Barot, V., Henson, S., Henshaw, M., Siemieniuch, C., Sinclair, M., Lim, S. L., Jamshidi, M., DeLaurentis, D., 2012. State of the Art Report, Trans-Atlantic Research and Education Agenda in Systems of Systems (T-AREA-SoS), Loughborough University. Available on-line. A comprehensive introduction to SoS, issues and research towards solutions. [2 recommendations]

• Boardman, J. and B. Sauser. (2006). System of Systems – the meaning of Of. IEEE International Conference on System of Systems Engineering. April 24-26, Los Angeles, CA [2 recommendations]

• Boardman, John, and Brian Sauser. 2008. Systems Thinking: Coping with 21st Century Problems. Boca Raton, FL: CRC Press. [2 recommendations]

• INCOSE Systems Engineering Handbook v. 3.2.2, INCOSE-TP-2003-002-03.2.2, October 2011 [2 recommendations]

All the recommended references from the recent INCOSE SoS working group survey / compilation are on the SoS WG area of INCOSE Connect at https://connect.incose.org/tb/soswg/

Additional References

TEXTS on COMPLEXITY & EMERGENT BEHAVIOR

• Emergence: From Chaos to Order (Holland, John H.; 1998)

• Hidden Order: How Adaptation Builds Complexity (Holland, John H.; 1995)

• Complexity: The Emerging Science at the Edge of Order and Chaos (Waldrop, M. Mitchell; 1992)

• Chaos: Making a New Science (Gleick, James; 1987)

• Complexity: Life at the Edge of Chaos (Lewin, Roger; 1992; 2nd Ed.)

• Exploring Complexity: an introduction (Nicolis and Prigogine; 1989)

• The Black Swan: The Impact of the Highly Improbable (Taleb, Nassim N.; 2007)

• Black Hawk Down (Bowden, Mark)

• Coping With the Bounds: Speculation on Nonlinearity in Military Affairs (Czerwinski, Tom)

• Engineering Systems: Meeting Human Needs in a Complex Technological World (Roos & de Weck)

• Causality (Pearl, Judea)
