Modeling and Simulation
Educating the DoD Communities and Services

Systems Acquisition Manager’s Guide
for the use of
Models and Simulations

February 2009
Copyright Notification
"Copyright 2008. This is a work for hire of the US Government and is subject to
copyright protection in the United States. All rights reserved."
“This instructional text was produced by the United States Government for further
distribution to other US academic institutions and is in public domain. Materials held by
other copyright owners have been identified in this text for possible inclusion by
subsequent users. Subsequent users are responsible for conformance with US
copyright law, including ‘fair use’ provisions, if they choose to incorporate these or other
copyrighted materials.”
Preface
In 2007, the Naval Postgraduate School, Monterey, CA, led a multi-university effort to develop an education and training curriculum to
enhance the acquisition workforce’s ability to apply Modeling and
Simulation (M&S) tools. This endeavor ultimately augments warfighting
capability while it reduces lifecycle development time and costs. This
guidebook was initiated as an effort parallel to this curriculum
development to expand the skill-base for the modeling and simulation
workforce; it represents a collaborative effort from members of the military,
civilian and academic M&S community. The information in this guidebook, which provides Acquisition Managers and the general acquisition workforce with current DoD directives, policies and technology applications upon which to base sound acquisition decisions involving M&S, comes from various sources, with contributions from the defense M&S community.
Acknowledgements
This effort would not have been possible without the collaborative
support of multiple individuals and organizations. Acknowledgement is
first given to the Defense Modeling and Simulation Coordination Office
(M&S CO), along with the Navy Modeling and Simulation Office (NMSO),
for their tasking and financial support to develop a unique program of
lifelong learning in Modeling and Simulation (M&S) for acquisition and
T&E workforce personnel. Additionally, the administrative support and
subject-matter expertise of Professor David Olwell and his staff in the
Systems Engineering Department of the Graduate School of Engineering
& Applied Sciences at the Naval Postgraduate School helped to keep this
guidebook on target and within scope. Likewise, this deliverable would
not have been possible without the Acquisition Research Program of the
Graduate School of Business & Public Policy at the Naval Postgraduate
School and the work of David J. Wood and Jeri M. Larsen in compiling
content and providing technical editorial support under the guidance of
John Dillard.
Brief Table of Contents

1. Introduction 1
2. M&S Background 9
3. M&S in the Acquisition Lifecycle 17
4. M&S Attributes and Hierarchy 43
5. M&S Policy and the Defense Acquisition Framework 61
6. Managing M&S within a Program 83
7. Verification, Validation and Accreditation (VV&A) or Certification (VV&C) 99
8. Common Issues and the Future of M&S 109
Selected Bibliography 117
Appendix A. List of Acronyms 131
Appendix B. DoD Resources 137
Table of Contents

1. Introduction 1
1.1. Scope and Purpose 2
1.2. Organization 3
1.3. Spectrum of M&S within the DoD 4
1.3.1. General Model Types 4
1.4. Acquisition Environment 4
1.4.1. M&S in Acquisition 4
1.5. Chapter Summary 6
1.6. Chapter References 6
2. M&S Background 9
2.1. A New M&S Management Approach 10
2.2.1. M&S Master Plan, DoD 5000.59-P 10
2.2.2. Joint Capabilities Integration and Development System (JCIDS) 11
2.2.3. M&S Steering Committee (MSSC) 11
2.2.4. Defense Modeling and Simulation Office (DMSO) Reform 12
2.3. Today’s Applications 13
2.3.1. Application in Functional Areas 13
2.4. Systems Acquisition Process 14
2.4.1. Decision Support Systems 15
2.5. Chapter Summary 15
2.6. Chapter References 16
3. M&S in the Acquisition Lifecycle 17
3.1. M&S to Support the Acquisition Community 17
3.2. A Model 18
3.3. A Simulation 18
3.4. Simulation-based Acquisition 19
3.4.1. Better, Faster, and Cheaper 19
3.5. M&S Applications in Support of Acquisition Processes 20
3.5.1. Requirements Definition 20
3.5.2. Program Management 22
3.5.3. Design and Engineering 25
3.5.4. Manufacturing 27
3.5.5. Test and Evaluation 30
3.5.6. Logistics Support 34
3.5.7. Training 35
3.6. Why use M&S? 38
3.7. Chapter Summary 39
3.8. Chapter References 39
4. M&S Attributes and Hierarchy 43
4.1. Definitions 43
4.1.1. Attributes: Validity, Resolution and Scale 43
4.1.2. M&S Categories 44
4.1.3. M&S Methods 47
4.2. M&S Classes 50
4.2.1. Constructive Models and Simulations 50
4.2.2. Virtual Simulation 51
4.2.3. Live Simulations 52
4.3. Hierarchy of Models and Simulations 53
4.3.1. Engineering-level Models and Simulations 55
4.3.2. Engagement-level Models and Simulations 55
4.3.3. Mission/Battle-level Models and Simulations 56
4.3.4. Theater/Campaign Models and Simulations 57
4.3.5. Hierarchy Summary 58
4.4. Chapter Summary 59
4.5. Chapter References 60
5. M&S Policy and the Defense Acquisition Framework 61
5.1. An Examination of the Evolving Defense Acquisition Framework 61
5.1.1. Lifecycle Systems Management Model 61
5.2. Systems Engineering within Development Projects 77
5.3. Important Acquisition References 79
5.3.1. DoD 79
5.3.2. Joint Chiefs 79
5.3.3. Services 79
5.3.4. M&S References 80
5.5. Chapter Summary 80
5.6. Chapter References 80
6. Managing M&S within a Program 83
6.1. Planning for the M&S Effort 83
6.1.1. Questions for the M&S Effort 83
6.1.2. M&S Plan Elements 85
6.1.3. The Simulation Support Plan 85
6.2. Contracting for M&S 88
6.2.1. Models and Simulations as Contract Deliverables 88
6.2.2. Selecting an M&S Contractor 89
6.3. Evaluating Contract Proposals 93
6.3.1. Modular Contracting and Contract Bundling 93
6.3.2. Major Contract(s) Planned 94
6.3.3. Multi-year Contracting 94
6.3.4. Contract Type 94
6.4. Affordability and Lifecycle Resource Estimates 95
6.4.1. Total Lifecycle and Ownership Costs 96
6.4.2. Lifecycle Cost Categories and Program Phases 96
6.5. Chapter Summary 97
6.6. Chapter References 97
7. Verification, Validation and Accreditation (VV&A) or Certification (VV&C) 99
7.1. VV&A Policy Background 99
7.1.1. DoDI 5000.61 100
7.2. The Role of Data 100
7.2.1. Questions for Consideration 101
7.3. Purpose and Definitions 101
7.3.1. Purpose of VV&A 101
7.3.2. VV&A Definitions 101
7.3.3. VV&C Definitions 102
7.4. VV&C Data 102
7.4.1. The VV&C Tiger Team 103
7.4.2. VV&C Tasks and Objectives 103
7.4.3. VV&C Process Definitions 104
7.4.4. Products 104
7.5. Important Distinctions 105
7.5.1. Gaps 105
7.5.2. Emerging Issues 105
7.5.3. Considerations 106
7.6. Best Practices 106
7.7. Chapter Summary 107
7.8. Chapter References 107
8. Common Issues and the Future of M&S 109
8.1. Intellectual Property 109
8.1.1. Intellectual Property Regulations and Practices 110
8.1.2. Commercial Software and Technical Data 111
8.1.3. DoD Acquisition Policy and Flexibility 111
8.1.4. Commercial and Noncommercial Technologies 112
8.1.5. Additional Intellectual Property Forms 112
8.1.6. Intellectual Property vs. Licensing 113
8.2. Acquisition Planning 113
8.2.1. Long-term Planning 114
8.2.2. Summary 114
8.3. The Evolution and Future of M&S 115
8.3.1. Getting to the Future State of M&S 115
8.4. Chapter References 116
Selected Bibliography 117
Appendix A. List of Acronyms 131
Appendix B. DoD Resources 137
1. Introduction
“Modeling and simulation is the wave of the future—and perhaps the
only way we can reduce the cost of testing.”
Dr. Marion Williams,
Technical Lead, Air Force Test Agency [1989]
Broadly defined, Modeling and Simulation (M&S) is simply an attempt to create an artificial representation of real-world processes, equipment, people, activities or environments in order to better study and understand how those elements interact with one another.
can be represented in myriad forms—from physical models to
mathematical models—and in any medium that provides a logical
representation of a system, entity, phenomenon, or process.
Although there is certainly a vast expanse of arenas in which M&S is
applicable, the focus of this guidebook is M&S within the Department of
Defense (DoD). The philosophy of M&S has evolved from three
overlapping areas: operational planning, acquisition and training.
Operational planning aids the DoD in utilizing equipment and forces to
best achieve national objectives and in identifying new requirements.
Acquisition provides the items, systems, and technology commanders
can use to support operational planning. Finally, M&S training teaches
commanders to employ forces, use systems and apply technology
provided through acquisition to support operational planning. Overall, the
use of M&S provides a comparatively inexpensive way for decision-makers to optimize planning, acquisition and training programs.
Throughout the DoD, there is an increasing interest in finding ways to
optimize and fully utilize M&S capabilities. The impetus for this project
occurred under the auspices of the Executive Council on M&S (EXCIMS),
when, in February 2005, the Acquisition Modeling and Simulation Working
Group was assigned to define and develop goals within the M&S
community—with specific attention to acquisition. This effort resulted in the publication of the Acquisition Modeling and Simulation Master Plan (AMSMP) on April 17, 2006. The AMSMP
emphasized the need to:
foster widely needed M&S capabilities beyond the reach of
individual programs; better enable acquisition of effective joint
capabilities and systems-of-systems; empower program and
capability managers by removing systemic M&S obstacles,
identifying new options for approaching tasks, and helping support
widely-shared needs; and promote coordination and interface with
M&S activities of the DoD Components. [OUSD(AT&L), 2006: 7]
Furthermore, the AMSMP identified and described five objectives.
These objectives illustrated a long-term plan to:

• Provide necessary policy and guidance,
• Enhance the technical framework for modeling and simulation,
• Improve model and simulation capabilities,
• Improve model and simulation use, and
• Shape the workforce.
The fifth objective, “shape the workforce,” prompted DoD funding of a
number of programs to develop modeling and simulation education for the
acquisition workforce, including the effort behind this Acquisition
Manager’s Guide to Modeling and Simulation [2006].
1.1. Scope and Purpose
A detailed description of the M&S community exceeds the scope of
any single text. To help narrow the scale of this project and to focus the
research effort into a useful resource, this guidebook is presented from the
Program Management perspective, focusing on the acquisition system
with limited discussion of basic acquisition knowledge—the assumption
being that acquisition professionals planning for M&S efforts will have
already acquired general education in the fundamentals of program
management.1
The objective of this guide is to provide a reference for Acquisition
Managers. It will describe M&S policies, types of models and simulations,
applications, and key technical and management issues. This guidebook
is intended for use by Program Management Offices (PMOs), acquisition
support agencies, policy-makers, military departments, government offices,
research centers, libraries, industry and academic institutions. It should
enable the manager to make better use of models and to better
understand their results. In addition, the guide highlights current trends and policies that give the Acquisition Manager a knowledge base for M&S decision-making, and it offers resources for finding additional information pertinent to specific applications.
1 Suggested readings in defense acquisition management include: Defense Acquisition University,
Introduction to Defense Acquisition Management, 7th Ed. (Fort Belvoir, VA: Defense Acquisition University
Press, September 2005). http://www.dau.mil/pubs/gdbks/Intro_2Def_Acq_Mgmt_7th_Ed.pdf
This guidebook is a part of a larger effort to extend the education
efforts of M&S, as indicated in the aforementioned AMSMP. In an attempt
to create a guidebook useful across Services and throughout the DoD,
solicitations for content went out to the Navy, Army, Air Force and
defense-related M&S communities. In order to be most helpful to
Acquisition Managers using M&S throughout the acquisition lifecycle, this
guidebook highlights key areas in M&S requiring focus and is the result of
consultation with stakeholders and educational partners, as well as of
informal peer review.
1.2. Organization
The guidebook is separated into eight chapters. Chapter 1 introduces M&S in acquisition, the spectrum of M&S within the DoD and the acquisition environment. Chapter 2 provides a brief background and history of M&S, to include today’s applications and the reform of M&S management. Chapter 3 discusses the use of M&S across the acquisition lifecycle, including its application to requirements definition, program management, design and engineering, manufacturing, test and evaluation, logistics support and training. Chapter 4 details the
attributes and hierarchy within M&S while also exploring various M&S
methods that may be applied during a project’s lifecycle. Chapter 5
covers M&S policy and also provides a detailed explanation of the Defense Acquisition Framework. Chapter 6 describes managing M&S within a
program and provides the Program Manager (PM) with an outline of a
Simulation Support Plan (SSP). Chapter 7 helps the Acquisition
Manager in determining credibility and confidence in the use of M&S
results as achieved through the implementation of Verification,
Validation, and Accreditation (VV&A) processes. Chapter 8 draws
attention to common issues in M&S as well as discusses some future
trends in the field. Following the conclusion is an extensive Bibliography
and reference list of M&S acquisition-related sources. This guidebook
also includes a List of Acronyms and additional DoD Resources as
appendices.
Throughout this guide, the authors have inserted sections from
previous guidebooks and quotes from subject-matter experts that are
directly applicable to the study of M&S. These sections are included as
text, as many are too long to be confined in text boxes or other
differentiating formats. Thus, they will flow into the text without pause.
At the end of each chapter, the reader will find the Chapter References
that were cited in that chapter.
1.3. Spectrum of M&S within the DoD
The number of communities using M&S is expanding, extending from
laboratory analysts all the way to weapon systems operators. For this
reason, a full comprehension of M&S terminology is difficult to obtain.
Thus, this document will not attempt to make the reader an expert, but it
will aid in the reader’s understanding of this very complex topic.
Furthermore, there have been several discussions about appropriate
definitions and the use of M&S terms throughout the DoD; this book is no
exception. However, after a review of this guidebook, readers will improve
their knowledge of M&S so as to have a solid foundation on which to base
their discussion.
1.3.1. General Model Types
There are many ways to characterize M&S. The spectrum of defense
M&S includes broad types, classes, hierarchy and applications (functional
areas). The three general types of models are as follows:

• Wargaming models range from single-engagement (one-on-one) to joint, theater-level campaign operations.
• Training models range from single-template instructional systems to complex virtual-reality simulations.
• Acquisition models range from physical-level phenomenon models through engineering component design tools to models of systems in the end-use environment.
1.4. Acquisition Environment
As a result of changing political environments and the Global War on
Terror, the DoD is faced with a new world-wide order of political, economic
and military affairs. National security has many new challenges. Although
the Government is committed to providing a strong force capable of
effectively deterring threats to the United States and its allies, the US DoD
is being faced with military downsizing and more limited resources. Thus,
it is of even greater importance that those resources are utilized as
effectively as possible. One way to accomplish this is through the
appropriate application of M&S throughout the acquisition lifecycle.
1.4.1. M&S in Acquisition
M&S is viewed as a potential answer to many of the DoD’s systems
acquisition process problems. Models are generally used to prove
concepts. Such models can be anything from mathematical calculations to
full-scale replicas that are subjected to controlled environments for testing.
It is important to keep in mind that an underlying reason for using M&S is
to reduce risk; risk reduction is the unifying concept throughout the entire
systems acquisition process. It is very logical to view M&S as a set of tools
with which to minimize risk to cost, schedule, performance and
supportability for the PM. Furthermore, the value added by M&S can be easily communicated through this framing.
An explanation of what risk means in the acquisition system is needed
at this point in the discussion. When a system is fielded (operational), it is
intended to meet a particular requirement based on a need; then, the
system’s ability to meet the mission requirements is continually evaluated.
As the system becomes outdated, the risk associated with the system’s
ability to accomplish the mission increases. A risk assessment is
conducted to determine if the mission can be accomplished by changing
the use(s) of the system (i.e., tactics), by modifying the system, or by
acquiring a new system.
Once the level of risk has increased such that a major modification or a
new system is needed, the operational risk—through requirements
documentation—is translated into programmatic risk and is shared
with the acquisition community. The acquisition community receives
direction to provide a system that satisfies requirements derived from the
mission need.
The cost, schedule, performance and supportability risks associated
with the acquisition process are inherent. The acquisition community finds
the best contractor, manages system development and reports progress
through the chain of command to aggressively provide the operational
community a system to meet its need. Since time and resources are
limited, this usually creates a situation in which trade-offs must be made
frequently, using information generated by models and simulations to get
the system operational at an acceptable performance level. This
completes the systems acquisition process cycle (illustrated in Figure 1.1).
Again, this example helps to clarify the need to use M&S to minimize risk
within the systems acquisition process.
Figure 1.1. Systems Acquisition Process Cycle [Piplani, Mercer and Roop, 1994]
1.5. Chapter Summary
This chapter provided a brief introduction to the broad world of M&S
before narrowing the scope to the Program Manager’s perspective of M&S within acquisition. From there, an explanation of the
guide’s organization was provided, as well as an introduction to the
spectrum of M&S within the DoD and an exploration of the acquisition
environment as it pertains to M&S.
The purpose of this guidebook is to assist program management
offices (PMOs), acquisition support agencies, policy-makers, military
departments, government offices, research centers, libraries, industry and
academic institutions in better understanding the appropriate application of
M&S throughout a program’s lifecycle.
1.6. Chapter References
Defense Acquisition University (DAU). Defense Acquisition Guidebook. Fort Belvoir, VA:
Defense Acquisition University Press, December 2008.
https://akss.dau.mil/dag/GuideBook/PDFs/GBNov2006.pdf
Introduction
7
Defense Acquisition University (DAU). Introduction to Defense Acquisition Management,
7th ed. Fort Belvoir, VA: Defense Acquisition University Press, September 2005.
http://www.dau.mil/pubs/gdbks/Intro_2Def_Acq_Mgmt_7th_Ed.pdf
Office of the Under Secretary of Defense (Acquisition, Technology & Logistics) Defense
Systems. Department of Defense Acquisition Modeling and Simulation Master
Plan. Washington, DC: Author, April 17, 2006.
Piplani, Lalit, Joseph Mercer, and Richard Roop. Systems Acquisition Manager’s Guide
for the Use of Models and Simulations. Fort Belvoir, VA: Defense Systems
Management College Press, September 1994.
Williams, Marion. “Simulation in Operational Test and Evaluation.” The ITEA Journal of
Test and Evaluation X, no. 3 (1989).
2. M&S Background
Each reader of this guidebook most likely has pre-conceptions of what
the term M&S means. The term model may evoke mental images ranging from a simple plastic model ship to an architect’s more complicated, three-dimensional foam-board rendering. Generally
speaking, there are many different descriptions and definitions of a model.
And certainly the issue is not simplified when a discussion is focused on
the DoD.
Since World War II, technology has advanced at an ever-increasing
rate, and the appropriate application of technology can help realize results
that would have been considered impossible only five years ago. This
phenomenon is a dream come true for many. Virtual reality (an
interactive, computer-generated or synthetic environment), for example, is
significantly changing our lives; entertainment, work, learning, travel and
communications are all incorporating virtual reality. M&S applications
have been linked to the development of many significant computational
devices, such as:

• Early digital approaches for arithmetic calculations,
• Analog computers (solution of differential equations—naval gunnery, 1930s; aircraft design and flight simulation, 1940s and ‘50s; missile on-board control, 1950s and ‘60s),
• Electronic digital computers emerging from WWII code-breaking (UK) and artillery table calculations (US) needs,
• Formation of the hybrid computer (linked analog-digital units), which permitted real-time solution of non-linear, time-dependent systems of equations with linked, continuous and discrete components and led to the simulation of large-scale physical systems. Real-time operation permitted incorporation of hardware—leading to hardware-in-the-loop simulation,
• Further digital computer development—providing computational capacity that exceeded analog computers, and
• Present-day status—in which digital computation is a pre-requisite for M&S applications.
Benefits are also coming from virtual prototypes: computer-based
simulation of systems with a degree of functional realism. For example,
virtual prototypes with properly modeled fluid dynamics can be used in
designing aircraft, ships and missiles to replace wind tunnel testing: a
costly and time-consuming process [US Department of Transportation,
1993].
This chapter will provide background information on the modern
evolution of M&S, as well as information about some uses of models and
simulations. However, it is important to first reiterate a standard definition
of Modeling and Simulation (M&S) as it is used throughout this guidebook
(from DoD Directive (DoDD) 5000.59, DoD Modeling and Simulation
Management [2007]). A model is a physical, mathematical or otherwise
logical representation of a system, entity, phenomenon or process.
Simulation is twofold: a method for implementing a model over time; and
a technique for testing, analyzing, or training in which real-world systems
are used, or in which real-world and conceptual systems are reproduced
by a model.
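This distinction can be made concrete in a few lines of code: the model is the set of equations or rules, while the simulation is the machinery that exercises the model over time. The following minimal Python sketch is an invented illustration (the projectile model and every constant in it are assumptions, not material from DoD guidance):

    DT = 0.01      # time step in seconds (assumed)
    G = 9.81       # gravitational acceleration (m/s^2)
    DRAG = 0.002   # drag coefficient per unit mass (assumed)

    def step(x, y, vx, vy):
        """The model: equations advancing the state one time step (Euler integration)."""
        speed = (vx ** 2 + vy ** 2) ** 0.5
        ax = -DRAG * speed * vx
        ay = -G - DRAG * speed * vy
        return x + vx * DT, y + vy * DT, vx + ax * DT, vy + ay * DT

    def simulate(v0x, v0y):
        """The simulation: implement the model over time until impact."""
        x = y = 0.0
        vx, vy = v0x, v0y
        while y >= 0.0:
            x, y, vx, vy = step(x, y, vx, vy)
        return x

    print(f"Predicted range: {simulate(300.0, 300.0):.0f} m")

Swapping in a different step function changes the model; the surrounding loop, the simulation, stays the same.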
2.1. A New M&S Management Approach
Since the late 1980s, technical progress in the M&S community has
spurred a management response within that community. Prior to 1988,
there were limited-scope simulations and little interoperability in M&S.
The Defense Advanced Research Projects Agency (DARPA) initiated the Simulator Network (SIMNET) program, which resulted in the establishment of important principles for simulation interaction and the creation of a network messaging protocol for community members to follow when exchanging essential data [Smith, 1998]. Furthermore, SIMNET signaled the origins of new simulation interoperability protocols.
The Distributed Interactive Simulation (DIS) attempted to incorporate
the SIMNET technology across wider applications in simulations. Another
emergent simulation interoperability protocol was the Aggregate-level
Simulation Protocol (ALSP). The success of SIMNET prompted
DARPA to seek guidance from the defense community to increase the
success rate of combat M&S.
2.2.1. M&S Master Plan, DoD 5000.59-P
In 1994, the DoD released its policy for managing M&S, DoD Directive (DoDD) 5000.59. The following year, the M&S Master Plan, DoD 5000.59-P, was released. The M&S Master Plan called for a viable, flexible, common framework for M&S reuse so DoD decision-makers could quickly employ past models when combining live, virtual and constructive joint capability forces to make acquisition, training, doctrine, testing and operational planning decisions [US DoD M&S CO, 2007]. In 1995, the Defense Modeling and Simulation Office (DMSO), working with teams from industry, government and academia, developed the High Level Architecture (HLA) to replace the faltering Distributed Interactive Simulation (DIS) and Aggregate-level Simulation Protocol (ALSP).
HLA continued to evolve until its acceptance as an Institute of Electrical and Electronics Engineers (IEEE) standard (IEEE 1516) in 2000.
2.2.2. Joint Capabilities Integration and Development System (JCIDS)
Within the larger acquisition community, a burgeoning reform movement culminated in the June 2003 release of the radically revised Chairman of the Joint Chiefs of Staff Instruction (CJCSI) 3170.01 and its companion manual, CJCSM 3170.01. These issuances promulgated the new Joint Capabilities Integration and Development System (JCIDS) and turned the legacy Requirements Generation System (RGS) upside down. The decades-old “threat-driven,” “bottom-up” process for developing warfare-materiel requirements was summarily replaced by a “revolutionary,” “capabilities-driven,” “top-down” process. With the imposition of Joint Requirements Oversight Council (JROC) and Commander in Chief (CINC) participation by the Defense Reorganization Act of 1986 (also known as the Goldwater-Nichols Act), the historically Service-unique requirements development processes and organizations had already been forced to change. Seventeen years later, they were transformed again by a new and rapidly evolving DoD/Joint Chiefs of Staff (JCS)-driven process.
2.2.3. M&S Steering Committee (MSSC)
In August 2007, Deputy Secretary of Defense Gordon England issued
a memorandum ordering the creation of an executive-level panel to
spearhead modeling and simulation (M&S) efforts [USD(AT&L), 2007].
Based on guidance set forth in the memorandum, the Under Secretary of
Defense for Acquisition, Technology and Logistics (USD(AT&L)) was to
lead the new M&S Steering Committee (MSSC). The guidance specified
that the MSSC shall:
oversee the development and implementation of policies, plans,
procedures, and DoD issuances to manage M&S and the
implementation of best practices of how models and simulations
are effectively acquired, developed, managed, and used by DoD
Components (e.g., verification, validation, and accreditation;
standards, and protocols. [Ibid., 3]
In addition, the memorandum entrusted the MSSC to:
develop plans, programs, procedures, issuances, and pursue
common and cross-cutting M&S tools, data, and Services to
achieve DoD’s goals by: promoting visibility and accessibility of
models and simulations; leading, guiding, and shepherding
investments in M&S; assisting collaborative research, development,
acquisition, and operation of models and simulations; maximizing
commonality, reuse, interoperability, efficiencies and effectiveness
of M&S, and supporting DoD Communities that are enabled by
M&S. [Ibid., 2]
2.2.4. Defense Modeling and Simulation Office (DMSO) Reform
In 1991, the DMSO was formed as a repository for DoD M&S. A
November 2007 news release from the Modeling and Simulation
Coordination Office (M&S CO) described the re-designation of the
Defense Modeling and Simulation Office (DMSO) as a “visible sea
change in the Department’s vision of the way DoD manages M&S” [US
DoD M&S CO, 2007].
The M&S CO performs those key corporate-level coordination
functions necessary to encourage cooperation, synergism, and
cost-effectiveness among the M&S activities of the DoD
Components. The M&S CO is the Executive Secretariat for DoD
M&S Management in fostering the interoperability, reuse, and
affordability of crosscutting M&S to provide improved capabilities
for DoD operations. [Ibid.]
These crosscutting M&S efforts are detailed in Figure 2.1.
Figure 2.1. Current DoD Modeling and Simulation Management [US DoD M&S CO, 2007]
In another August 2007 memo, members of the Modeling and Simulation Steering Committee (MSSC) articulated the strategic vision for DoD M&S. The memo described a proposed “end-state” in the following words:
[A] robust modeling and simulation (M&S) capability enables the
Department to more effectively meet its operational and support
objectives across the diverse activities of the Services, combatant
commands, and agencies […] [A] Defense-wide M&S management
process encourages collaboration and facilitates the sharing of data
across DoD components, while promoting interactions between
DoD and other government agencies, international partners,
industry, and academia. [Office of the Director of Defense
Research and Engineering, 2007]
2.3. Today’s Applications
The M&S user community is very broad, spanning not only those
involved in the employment of weapon systems, but also those involved in
all phases of systems acquisition. Primary developers of today’s models
are war colleges, industry, DoD laboratories and universities. There are
varied opinions about modeling techniques, the amount of detail required
within the model, and the value of analytical models, simulations, games
and field exercises. An examination of these yields a variety of models;
and even if multiple users are employing the same model, each of them
generally has a different application in mind for that model. This guide will
not cover all these various perspectives, nor provide guidance on the use
of one model over another. The decisions regarding the specific use of
models and simulations within a given program belong to the reader.
However, when making such decisions, readers should consider the
guidelines and information contained herein that pertain to the particular
activities within their programs and the policies regarding those programs.
The user community is divided into the following functional areas:
education, training and operations; research and development; test and
evaluation; analysis; and production and logistics.
2.3.1. Application in Functional Areas
Specific applications for each of the functional areas are broken out below.

• Education, training and operations—Re-creation of historical battles, doctrine and tactics development, command and unit training, operational planning and rehearsal, and wartime situation assessment.
• Research and development—Requirements definition, engineering design support and systems performance assessment.
• Test and evaluation—Early operational assessment, development and operational test design, and operational excursions and post-test analysis.
• Analysis—Campaign analysis, force-structure assessment, system-configuration determination, sensitivity analysis and cost analysis.
• Production and logistics—System producibility assessment, industrial base appraisal and logistics requirements determination.
Those functional areas are broad and not mutually exclusive; rather, they are representative of the many applications of M&S throughout the user community.
2.4. Systems Acquisition Process
The goal of the systems acquisition process is to deploy (in a timely
manner) and sustain an effective system that satisfies a specific user’s
need at an affordable cost. The 2008 model (Figure 2.2) is only briefly introduced here to familiarize the reader with the five phases of the acquisition process; greater detail will be given to this figure in Chapter 5. When the acquisition process is revisited, the reader will see how M&S tools can be applied in every phase of the process to reduce costs and improve efficiencies.
Figure 2.2. 2008 Defense Acquisition Management Framework [DoD, 2008]
2.4.1. Decision Support Systems
The Department of Defense Directive (DoDD) 5000.1 establishes
broad policies governing defense systems acquisition programs. It states
that the three decision-making support systems must interact and
interface with each other in order for the process to work effectively. The
three systems illustrated in Figure 2.3 are: 1) the Joint Capabilities
Integration and Development System (JCIDS), 2) the defense acquisition
system, and 3) the Planning, Programming, Budgeting and Execution
System (PPBES).
Figure 2.3. Three Major Decision-making Support Systems [DAU, 2008]
The first formal interface between JCIDS and the defense acquisition
management system occurs at the Materiel Development Decision (MDD),
which is supported by the Joint Requirements Oversight Council (JROC),
while PPBES provides the monetary means for programming the needed
capabilities within the acquisition management system. The acquisition
management system, the JCIDS and the PPBES all interface at major
milestones and during each Program Objective Memorandum (POM)
cycle.
2.5. Chapter Summary
This chapter has served to provide the reader with a brief background of
modern M&S development and applications. The key element to be taken
from this chapter is that the M&S field is constantly changing and evolving.
As new technologies emerge and different M&S techniques are explored and applied, one can expect M&S policy to continue to change as well.
2.6. Chapter References
Defense Acquisition University (DAU). Defense Acquisition Guidebook. Fort Belvoir, VA:
Defense Acquisition University Press, December 2008.
https://akss.dau.mil/dag/GuideBook/PDFs/GBNov2006.pdf
Department of Defense (DoD). Modeling and Simulation (M&S) Master Plan (DoD 5000.59-P). Washington, DC: Author, 2007, A-6.
http://www.dtic.mil/whs/directives/corres/pdf/500059p1.pdf
Department of Defense (DoD). Operation of the Defense Acquisition System (DoDI
5000.02). Washington, DC: Author, December 8, 2008.
Institute for Defense Analyses (IDA). A Review of Study Panel Recommendations for
Defense Modeling and Simulation. Washington, DC: Author, June 1992, Part 2,
Paragraph A.
Irwin, Sandra. “Pentagon Takes another Shot at Enforcing Joint Thinking.” National
Defense (August 2004): quoted in the Early Bird (July 28, 2004).
Locher, James R. Victory on the Potomac: The Goldwater-Nichols Act Unifies the
Pentagon. College Station, Texas: Texas A&M University Military Series, no. 70,
2002.
Matthews, David F. The New Joint Capabilities Integration Development System
(JCIDS) and Its Potential Impacts upon Defense Program Managers. Monterey,
CA: Naval Postgraduate School, December 30, 2004, 2-7.
Office of the Director of Defense Research and Engineering. “Strategic Vision for DoD
Modeling and Simulation.” Memorandum. Washington, DC: Author, August 24,
2007.
Smith, Roger D. “Essential Techniques for Military Modeling & Simulation.” Proceedings
of the Winter Simulation Conference, 1998.
http://www.modelbenders.com/papers/wsc98.html.
US Department of Transportation. The Road to 2012. Washington, DC: Author, 1993.
US DoD Modeling and Simulation Coordination Office (US DoD M&S CO). “DoD
Changes Approach to Managing Modeling and Simulation.” Alexandria, VA:
Author, November 23, 2007.
http://www.dmso.mil/files/MS_Mgmt_Structure_News_Release.pdf
3. M&S in the Acquisition Lifecycle
Because M&S is a fundamental and essential tool for acquisition
programs, planning for use of M&S throughout developmental test
and evaluation must be an early consideration in test planning.
Just as M&S planning should be integral to program acquisition
plans and systems engineering plans, it should also be integral to
the program Test and Evaluation Strategy and T&E Master Plan.
Important planning considerations include: the use and reuse of
M&S applications and data for T&E [Test and Evaluation] across
the program lifecycle, establishing credibility of M&S tools and data,
using M&S to predict live test results, and using live test results to
improve the credibility of M&S.
Chris DiPetto, Deputy Director
OUSD (AT&L) A&T/SSE/DT&E
March 26, 2007
In theory, M&S can be applied to processes, equipment, people,
activities and environments as a representation of the real world. The
intention of this chapter is to introduce the reader to some basic M&S
elements and to help the reader understand how these elements can play
a role throughout the acquisition lifecycle.
3.1. M&S to Support the Acquisition Community
This chapter begins to explore the application of these various M&S
elements so readers can understand how such elements can be applied to
support the acquisition community. The use of M&S within acquisition is a
multi-dimensional activity which:

• Supports the milestone decision process,
• Supports multiple communities (operator, developer, designer, manufacturer, supporter, tester and trainer), and
• Consists of various classes and types of M&S—each with a specific purpose.
In conducting this exploration, this chapter provides an overview of
how M&S may be used across the phases of acquisition and a discussion
of its application to specific acquisition-related activities.
3.2. A Model
A model is “a representation of an actual or conceptual system that
involves mathematics, logical expressions, or computer simulations that
can be used to predict how the system might perform or survive under
various conditions or in a range of hostile environments” [DAU, July 2005:
B-107]. There are three types of models: physical, mathematical, and
process. Respective definitions for these model types are as follows:
Physical Model—A physical model has “physical characteristics [that]
resemble the physical characteristics of the system being modeled” [DoD,
1998].
Mathematical Model—A mathematical model is a “symbolic model
whose properties are expressed in mathematical symbols and
relationships” [1998]. Examples of mathematical models are the use of
queuing theory and game theory to simulate situation outcomes.
Process Model—A process model “represents the procedural steps of a
task, event or activity performed by a system” [Ibid.].
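As an illustration of the mathematical model type, the sketch below applies the standard steady-state M/M/1 formulas from queuing theory (one of the examples named above) to a hypothetical repair queue. This is a minimal sketch in Python; the arrival and service rates are illustrative assumptions, not data from any program:

    def mm1(arrival_rate, service_rate):
        """Steady-state M/M/1 queue: utilization, mean number in system, mean time in system."""
        if arrival_rate >= service_rate:
            raise ValueError("unstable queue: arrival rate must be below service rate")
        rho = arrival_rate / service_rate                      # utilization
        number_in_system = rho / (1.0 - rho)                   # L = rho / (1 - rho)
        time_in_system = 1.0 / (service_rate - arrival_rate)   # W = 1 / (mu - lambda)
        return rho, number_in_system, time_in_system

    rho, length, wait = mm1(arrival_rate=4.0, service_rate=5.0)  # items per day (assumed)
    print(f"utilization {rho:.0%}, {length:.1f} items in system, {wait:.2f} days in system")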
3.3. A Simulation
Now that readers have a working definition of a model, it is also
important to review a working definition of a simulation. A simulation is:
a method for implementing a model. It is the process of conducting
experiments with a model for the purpose of understanding the
behavior of the system modeled under selected conditions or of
evaluating various strategies for the operation of the system within
the limits imposed by developmental or operational criteria.
Simulations may include the use of analog or digital devices,
laboratory models, or “test-bed” sites. Simulations are usually
programmed for solution on a computer; however, in the broadest
sense, military exercises, and wargames are also simulations.
[DAU, July 2005: B-147]
As there were three types of models, there are also three kinds of
simulations: live, virtual, and constructive. Respective definitions for these
simulation types are as follows:
Live Simulation—A live simulation involves “real people operating real
systems” [DoD, 1998]. Live simulations can involve individuals and groups and real equipment, and they can replicate the intended action. As a result, live simulations tend to be costly, and they can also present safety hazards.
Virtual Simulation—A virtual simulation involves “real people operating
simulated systems” [1998]. Furthermore, “virtual simulations inject
human-in-the-loop (HITL) [simulations] in a central role by exercising
motor control skills (e.g., flying an airplane), decision skills (e.g.,
committing fire control resources to action), or communication skills (e.g.,
as members of a C4I team)” [DoD, 2007].
Constructive Simulation—Constructive simulations involve “simulated
people operating simulated systems […]. Real people stimulate (make
inputs) to such simulations, but are not involved in determining outcomes”
[2007]. Constructive simulations are useful for simulating large
organizations and for generating data and statistics to be analyzed later.
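A minimal Python sketch, consistent with the definition above, of a constructive simulation: simulated shooters engage simulated targets, the analyst supplies inputs before the run, and no human influences the outcome during execution. The scenario, hit probability and firing doctrine are invented for illustration:

    import random

    P_HIT = 0.7            # assumed single-shot probability of hit
    SHOTS_PER_TARGET = 2   # assumed firing doctrine

    def engage(num_targets, seed=None):
        """One constructive run: outcomes are determined entirely by the model."""
        rng = random.Random(seed)
        kills = 0
        for _ in range(num_targets):
            # A target is killed if any shot hits; no human decisions occur here.
            if any(rng.random() < P_HIT for _ in range(SHOTS_PER_TARGET)):
                kills += 1
        return kills

    print("Targets killed:", engage(num_targets=10, seed=1))

Because no human is in the loop, such a run can be replicated thousands of times to generate the data and statistics mentioned above.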
3.4. Simulation-based Acquisition
In 1996, a new OSD “Simulation-based Acquisition (SBA)” initiative
was described by Annie Patenaude in the Study on the Effectiveness of
Modeling and Simulation in the Weapon System Acquisition Process:
The use of M&S tools has increased […]. This increase has not
been imposed by fiat; it is not the result of new guidance or
direction from top management. Rather it is the result of […]
powerful new emerging M&S tools to support existing processes
and to satisfy emerging requirements. [1996: 20]
Patenaude explained, “[I]t is clear that a revolution is underway and that the end result will be a new way of doing business. We will call this new approach to acquisition ‘Simulation Based Acquisition’” [1996: 20].
Essentially, SBA is the robust and interactive use of M&S throughout a
product’s lifecycle to reduce costs and risk.
3.4.1. Better, Faster, and Cheaper
The inspiration for SBA was to facilitate the “better, faster, and
cheaper” development of systems. SBA enables the warfighter to
participate in a system’s design stages and allows for the incorporation of
changes based on input from designers and users. SBA permits prompt feedback to a design team and allows for multiple iterations on hundreds of point designs [Johnson, McKeon and Szanto, 1998: 1-2]. SBA can reduce the initial cost of systems; such cost reduction can also extend to the costs associated with operating and sustaining systems up to and including disposal [Ibid.: 1-3].
The M&S community has produced a number of studies in the last decade that reinforced the importance of M&S in defense decision-making and planning [Myers and Hollenbach, 2005].2 These studies indicated a need for significant changes within the M&S community. Some of the reports recognized deficiencies in the M&S community by noting that acquisition community managers and staffs were mostly uninformed about M&S capabilities and limitations. In addition, acquisition personnel had a limited understanding of commercial M&S activities. Likewise, a limited number of career paths for M&S existed, leading to a shortage of formally educated M&S experts [Ibid.]. It is these deficiencies that the MSSC (M&S Steering Committee) has sought to address.
3.5. M&S Applications in Support of Acquisition Processes
M&S supports key acquisition functions as they span the phases of the
process. These functions are: requirements definition, program
management, design and engineering, manufacturing, test and evaluation,
logistics support and training. PMs should look for opportunities in two
areas:

• How the program can use the M&S tools across phases of the acquisition process, and
• How the program might make use of M&S to integrate activities across functional boundaries.
3.5.1. Requirements Definition
Models and simulations can be used in the process of developing
requirements documents (Initial Capabilities Document (ICD),
Capability Development Document (CDD), specifications). Figure 3.1
highlights these models. As with most other analyses conducted during
the acquisition process, a complementary suite of models and simulations
is likely to be used over the course of the program’s lifecycle, ranging from
engineering performance to theater/campaign levels.
2 Myers and Hollenbach detail a number of studies that have been conducted over the last decade. Please reference their research for a complete bibliography.
The input data to the analysis process includes ground rules, such as:

• Defense Intelligence Agency (DIA) threat estimates, along with scenarios and missions derived from the Defense Planning Guidance (DPG),
• Environmental data—including weather, terrain, ocean environment, countermeasures, etc.,
• A selection of operational concepts and tactics, which allow for evaluation of potential non-materiel solutions as required by DoD Instruction (DoDI) 5000.02, and
• System options—including existing, upgraded or new systems—and new technologies that may be available through the DoD’s science and technology programs, advanced technology demonstrations or industry.
These data address a variety of scenarios, systems and tactics and will
be used in analyses conducted at each level in the M&S hierarchy (which
will be described further in Chapter 4). Using the engineering level of
models, analyses provide performance estimates for existing and
improved capability systems, taking into account the emerging technology
opportunities. The performance and design trade-offs of system and
subsystem design concepts and technologies are evaluated at this level.
These system/subsystem performance capabilities are evaluated within
the engagement and mission/battle-level models and simulations to
determine system effectiveness (e.g., probability of kill, losses,
survivability, vulnerability) and mission effectiveness (e.g., loss exchange
ratios, probability of engagement) in a limited engagement or mission.
These capabilities support campaign-level models to examine effects of
force mix, tactics or new capabilities on outcomes—typically in terms of
force-exchange ratios, drawdowns or troop movements.
3.5.1.1. Initial Capabilities Document (ICD)
The analyses are repeated for a variety of operational concepts and for
each of the system options under consideration. The engagement,
mission and campaign models may be run iteratively to provide statistical
significance to the outcomes. Materiel capability needs are identified and documented in an Initial Capabilities Document (ICD). The engineering models—in conjunction with the engagement and mission/battle-level models—also provide the basis for the description of broad capabilities and technology developments that should be studied in Materiel Solution Analysis (MSA).
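The iterative runs mentioned above can be sketched in a few lines. In the hypothetical Python fragment below, a stand-in engagement model is replicated and a normal-approximation 95% confidence interval is attached to the estimated probability of kill; the per-engagement kill probability is an assumed placeholder, not output from a real model:

    import math
    import random

    def one_engagement(rng, p_kill=0.65):
        """Stand-in engagement model; p_kill is an assumed placeholder."""
        return 1 if rng.random() < p_kill else 0

    def estimate_pk(replications, seed=2):
        """Estimate probability of kill with a ~95% confidence half-width."""
        rng = random.Random(seed)
        kills = sum(one_engagement(rng) for _ in range(replications))
        pk = kills / replications
        half_width = 1.96 * math.sqrt(pk * (1.0 - pk) / replications)
        return pk, half_width

    for n in (30, 300, 3000):
        pk, hw = estimate_pk(n)
        print(f"n={n:5d}: Pk = {pk:.3f} +/- {hw:.3f}")

The interval half-width shrinks roughly as the square root of the number of replications, which is why engagement and campaign models are often run in large replication sets.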
3.5.1.2. Capability Development Document (CDD)
The Capability Development Document (CDD) is developed during
the Technology Development phase. The CDD defines thresholds and
objectives in terms of operational effectiveness measures, system
performance measures and critical system characteristics. The CDD is
evolved during System Development and Demonstration (SDD), with
refined and more detailed capabilities and characteristics that can be
produced. It is likely that mission/battle and engagement models, in
coordination with engineering models, will be used to develop the
effectiveness and performance measures for the CDD.
3.5.1.3. Technical Specifications
Technical specifications similarly evolve. A draft system-level
specification is developed during MSA; development specifications are
written during the Technology Development Phase; and product, process
and material specifications are crafted during Engineering and
Manufacturing Development (EMD). Engineering-level M&S (e.g.,
design, support, manufacturing and HW/SWIL) typically supports the
development of these requirements specifications.
There is not a simple one-to-one mapping between a particular level of
M&S and a particular requirements document. Rather, a combination of
M&S (levels and classes) will likely be needed to generate the various
measures and ensure consistency of those measures across the program
documents.
3.5.2. Program Management
The PM is faced with balancing cost, schedule and performance
objectives throughout the program. Much of the current emphasis in M&S
is on the performance or military utility arena, as has been the focus of
much of this guidebook. This next section will touch upon some of the
management tools that exist.
3.5.2.1. Cost Models
Program Managers develop two types of cost estimates during the
acquisition process:

• Program lifecycle cost estimates, and
• Cost estimates for alternatives evaluation in the Analysis of Alternatives (AoA).
Two separate cost estimates are required from the DoD component in
support of Milestone A and subsequent reviews. One of these estimates
will be prepared by the program office, and the other by a separate
organization that does not report through the acquisition chain [DoD,
2008]. Additionally, the Office of the Secretary of Defense (OSD) Cost Analysis
Improvement Group (CAIG) will develop an independent DoD estimate
and prepare a report for both the Under Secretary of Defense for
Acquisition and Technology (USD(A&T)) for ACAT ID programs, and for
the DoD Component Acquisition Executive for ACAT IC programs.
The second use of cost estimates is in the preparation of the AoA to
support milestone decisions, beginning with Milestone A. The AoA is
prepared by an independent activity within the component. It should aid
decision-makers in judging which of the proposed alternatives to the
current program, if any, offer sufficient military benefit to be worth the cost [Ibid.].
3.5.2.2. General Cost Model Features
Some general features of a cost model might include the following:

• Cost-estimating relationships,
• Statistics package,
• Ability to address various cost-estimating methodologies,
• Learning curve calculations (see the sketch at the end of this subsection),
• Risk analysis,
• Sensitivity analyses,
• System Work Breakdown Structure (WBS),
• Multiple appropriations (Research & Development (R&D), Operating & Support (O&S)),
• Time-phasing of costs,
• Overhead rates, and
• Inflation indices.
The above features are contained in the Automated Cost Estimating
Integrated Tools (ACEIT) [DAU, June 2005], which is a framework within
which the analyst can develop a cost model. ACEIT is designed to
support Program Managers and cost/financial analysts throughout a
program's lifecycle. ACEIT applications are a collection of the premier
tools for analyzing, developing, sharing, and reporting risk-adjusted cost
estimates, providing a framework to automate key analysis tasks and to
simplify/standardize the cost estimating process. The applications are
designed to do the following:

• Store and normalize cost and technical data,
• Conduct all the key cost analysis/statistical analysis functions,
• Provide a framework to systematically create, edit and run cost estimates for cases in which all the most labor-intensive processes have been automated,
• Permit users to create and share cost-estimating relationship (CER) libraries in a controlled environment,
• Automatically create and update standard and tailored cost reports,
• Provide a powerful interface within ACE and Excel to conduct detailed cost-risk analysis and generate reports,
• Automatically make use of both the latest data and up to 10 years of historical data from the OSD inflation database,
• Integrate virtually any other cost or engineering tool—including ModelCenter, PRICE, SEER, MS Project and Excel, and
• Fully document risk-adjusted, phased cost estimates. [Ibid.]
These features are shown only as an illustration of what might be
addressed in a cost model and are not necessarily all-inclusive, nor must
any particular model contain all those features.
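Two of the features listed earlier, learning curve calculations and inflation indices, lend themselves to a compact illustration. The following minimal Python sketch uses the standard unit-theory learning curve (cost of unit x equals the first-unit cost times x raised to the power b, where b is the base-2 logarithm of the curve slope) and a flat escalation rate; the first-unit cost, 90% slope and 3% inflation are assumed values, not guidance:

    import math

    FIRST_UNIT_COST = 10.0e6   # assumed first-unit cost, base-year dollars
    CURVE_SLOPE = 0.90         # assumed 90% curve: unit cost drops 10% per doubling
    INFLATION = 0.03           # assumed flat annual escalation rate
    B = math.log(CURVE_SLOPE, 2)

    def unit_cost(x):
        """Unit-theory learning curve: cost of unit x = T1 * x**b."""
        return FIRST_UNIT_COST * x ** B

    def lot_cost(first_unit, last_unit):
        """Sum unit costs over a production lot (inclusive)."""
        return sum(unit_cost(x) for x in range(first_unit, last_unit + 1))

    # Time-phase two notional lots and escalate each with the inflation index.
    for year, (lo, hi) in enumerate([(1, 20), (21, 60)], start=1):
        then_year = lot_cost(lo, hi) * (1.0 + INFLATION) ** year
        print(f"Year {year}: units {lo}-{hi}, then-year cost ${then_year / 1e6:,.1f}M")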
3.5.2.3. General Cost Model Guidelines
The Program Manager should consider the following guidelines regarding the characteristics of a good cost-estimating model; these, with tailoring, might be useful for any model application.

• Consistency in cost-element structure: The basic cost structure should not change as a system passes through the acquisition phases. However, the basic elements and their sub-elements should be expanded to capture greater levels of detail.
• Consistency in data elements: Data elements of the proposed system should be consistent with those of operational systems for which actual data exists. This allows the costs and cost-driving parameters of the reference and proposed systems to be compared.
• Flexibility in estimating techniques: The estimating techniques should be allowed to vary as a program progresses through the various acquisition phases.
• Simplicity: Complexity is not desirable in an O&S cost model. Models should be structured in a way that allows them to accommodate more detailed information as a program progresses through the lifecycle.
• Usefulness to the design process: While the ability to estimate costs for a CAIG review is an important function, a model’s applicability to day-to-day program office and contractor decision-making is equally important.
• Completeness: The model should capture all significant costs that will be incurred by the weapon system over its useful life.
• Validity: The model should provide sound, reproducible results for its intended application.
The PM should recognize that in actual practice, cost estimating is a
melding of art and science. There is no one model that fits all, but rather
typically a custom model for each program, relying on various cost
methodologies or historical databases to address different elements of the
system. As with any other M&S effort, an experienced analyst is key to
obtaining credible results. The Cost Analysis Requirements Document
(CARD) describes the system and salient features of the program that will
be used to develop lifecycle cost estimates [DoD, 2008]. It provides a
description of the system and its key characteristics (weight, size, payload,
speed, power, etc.) for each WBS element. The CARD addresses the
operational concept, risk, quantities, manpower, system usage rates,
schedules, Acquisition Strategy, development plans and facilities
requirements for the system. Since the CARD addresses all the key cost
elements of the system, it provides the basis for cost estimating and the
use of cost models.
3.5.3. Design and Engineering
The use of M&S is most prevalent in this functional discipline. An oft-cited example is the Boeing 777 aircraft, the first airplane designed solely by computer, largely via the CATIA (Computer-aided Three-dimensional Interactive Application)3 system. The results of this approach included more than a 50% reduction in change, error and rework in manufacturing [Proctor, 1994: 36].
3 This computer-aided design system was written by Dassault and licensed in the US by IBM.
Modeling Complex Systems by Designing the Boeing 777 in Cyberspace
Recent advances in M&S technologies have made significant changes in the way the
Boeing Aircraft Company designs, builds and tests airplanes. The Boeing 777 (B-777)
was the first aircraft created using M&S methodologies as the foundation of their [sic]
design and engineering processes. The changes wrought by M&S were very dramatic
and encompassed many areas, including technical, organizational and administrative
changes [Proctor, 1994: 48]. […] Although the technical innovations were impressive,
what made the B-777 project unique was the way that Boeing integrated state-of-the-art
M&S technologies throughout the design, production, and testing of the aircraft [49].
In the past, Boeing had always made thousands of engineering drawings and then
built a full-scale, non-flying mockup of the aircraft to check fit and interference problems.
In the case of the B-777, the cost of the mockup alone was estimated to have been at
least $22.5 million. Without M&S technologies and the computer design network, all the various aircraft systems would have had to be designed independently [53].
When dealing with complex systems, M&S technologies are more than three-dimensional design tools. CATIA was used as a component-level modeling method, as well as a digital pre-assembly tool. The DBTs [Design/Build teams] used this data in conjunction
with a computer network to produce a “paperless” design that also allowed engineers to
simulate the complete assembly of the B-777. By using the three-dimensional solid
images generated on a computer, the B-777 airplane could be pre-assembled in
cyberspace to position parts properly and to ensure a good fit. Additional software tools
allowed for “fly-through” analysis of various hardware configurations. Human factors and
maintenance access questions were answered by maneuvering a digital “virtual
mechanic” in three-dimensional space. Many complex systems issues, including some
human factors and maintenance accessibility studies, could be done from individual
workstations. With a three-dimensional database that everyone could use simultaneously,
design interference problems were greatly reduced before full-scale production [55].
Because the CATIA workstations were networked, it was also easy to coordinate
design changes between the teams. The same M&S techniques used during the design
phase also allowed a “virtual” B-777 airplane to be digitally “pre-assembled” in
cyberspace. M&S allowed the teams to study the effects of design changes on the
aircraft. Design changes could be coordinated digitally before building expensive
prototypes and then redesigning for unforeseen changes. Concurrent engineering
permitted a more mature and stable design to be reached sooner [62].
M&S pervades the various specialty disciplines involved with design—
ranging from finite element analysis for structural design, to computational
fluid dynamics for aerodynamics or hydrodynamics. For human factors,
anthropometric models can be used to examine the ability of a crew
member to operate controls, repair equipment or fit within crew
compartments. What these models and simulations offer is the ability to
modify designs, analyze the effects and refine the design repeatedly prior
to building a single hardware prototype.
Figure 3.2 depicts the process whereby all of the functional disciplines
might use the same virtual prototype to support activities across the
system lifecycle—from operational requirements generation through
engineering, construction, testing, training and operations and logistics
support [NAVSEA-03].
Figure 3.2. Simulation-based Design: Virtual Prototyping in the System Lifecycle [NAVSEA-03]
3.5.4. Manufacturing
Producibility is intimately linked with product design—shape, features,
materials, etc. The use of computer models to simulate manufacturing
processes such as metal forming, machining and casting allows one to
evaluate the ability to produce a design before actually bending metal.
The use of Computer-aided Design (CAD)/Computer-aided
Manufacturing (CAM) models allows the design and manufacturing
communities to converge on a producible design that meets the
requirement. Using the same models and simulations for design and
manufacturing—combined with the transfer of digital design databases
directly to the manufacturing floor—reduces errors, rework and, hence,
production risk. In addition to having a producible design, the program
office must be assured that the necessary capability/capacity is available
to meet planned production rates.
In the MSA phase, production planning begins with an industrial base
analysis. Considerations include the investments necessary for industrial
capabilities to provide and sustain production, tooling, and facilities
[NAVSEA-03]. During the Technology Development (TD) phase, an
initial manufacturing plan is developed to portray the facilities, tooling and
personnel resources required for production [Acker and Young, 1989: 3-9].
This plan is updated during the Engineering and Manufacturing
Development (EMD) phase based upon the planned detailed
manufacturing operations. In production readiness reviews, conducted
during EMD, the program management office (PMO) will evaluate the
capacity of the production facility to meet the required production rates.
The PMO will also evaluate the contractor’s production planning—
including manufacturing methods and processes, facilities, equipment and
tooling, and plant layout [DoD, 2008].
3.5.4.1. Factory Simulations
Factory simulations are used to aid in this cycle of production planning and can support the activities mentioned above. These simulation tools can address production processes, factory process flow, statistical variation in manufacturing operations, equipment, plant layout
and manpower requirements to meet production demands. Military and
commercial programs are turning to such tools to improve efficiency or to
determine facilitization requirements. These tools may be used for
planning a new production activity or to examine changes to an existing
program. An example follows showing the use of simulation to plan
changes in the periodic maintenance of C-141 aircraft [Schuppe et al.,
1993].
In this case, the periodic depot maintenance (PDM) of the C-141
aircraft fleet was impacted when two structural problems were discovered:
wing and center wing box cracks. Repair of the wing cracks and
replacement of the center wing box needed to be incorporated into the
ongoing PDM of the aircraft. Furthermore, replacement of the center wing
box was a new process for the depot—it had only been done once on a
prototype aircraft at a contractor’s facility. The SLAM II 4 simulation
language was used to simulate the ongoing PDM, along with the
introduction of the wing repair and center wing box replacement. A sample of the results of this simulation includes:
•  An achievable schedule for wing box replacement, but a shortfall for wing crack repair,
•  Bottleneck locations, and
•  The preference to reallocate rather than purchase additional inspection equipment.
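A study of this kind can be reproduced in miniature with any discrete-event toolkit. The sketch below, using the open-source Python library simpy rather than SLAM II, queues aircraft for a limited number of maintenance docks and reports flow times; all rates, durations, and resource levels are invented for illustration.

```python
import random
import simpy  # pip install simpy

RNG = random.Random(1)

def aircraft(env, docks, log):
    """One aircraft moving through periodic depot maintenance (PDM)."""
    arrive = env.now
    with docks.request() as dock:                       # queue for a dock
        yield dock
        yield env.timeout(RNG.triangular(20, 45, 30))   # baseline PDM, days
        yield env.timeout(RNG.triangular(10, 30, 15))   # wing crack repair
    log.append(env.now - arrive)

def arrivals(env, docks, log):
    for _ in range(60):                                 # 60 inductions
        env.process(aircraft(env, docks, log))
        yield env.timeout(RNG.expovariate(1 / 7))       # ~1 induction/week

env = simpy.Environment()
docks = simpy.Resource(env, capacity=4)                 # trial resource level
flow_days = []
env.process(arrivals(env, docks, flow_days))
env.run()
print(f"Mean flow time: {sum(flow_days)/len(flow_days):.1f} days; "
      f"max: {max(flow_days):.1f} days")
```

Rerunning the model with different dock counts or repair-time assumptions is how such a simulation exposes bottlenecks and equipment trade-offs like those listed above.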
3.5.4.2. Commercially Owned Factory Simulations
Commercially available and industry-owned factory simulations are in
use by many weapon system contractors or maintenance depots today.
Factory simulations such as "Witness" [2009] are now being regularly used to support aircraft, missiles, and electronics production, and depot activities. A listing of commercially available, manufacturing-related simulation programs can be found in Nwoke and Nelson [1993: 43].
4 SLAM II is a simulation language which allows a modeler to formulate a system description using process, event, or continuous world views, or any combination of the three. Since its initial release in 1981, SLAM II has undergone continual development and application.
Factory simulations can be used to accomplish the following:
•  Develop an assembly strategy,
•  Graphically model the assembly sequence,
•  Develop and validate work sequences,
•  Develop and validate manufacturing process plans,
•  Model the factory floor, including facilities and equipment,
•  Identify what is achievable in terms of cost and schedule,
•  Identify bottlenecks,
•  Compare different manufacturing strategies, and
•  Identify impacts of engineering changes, new materials, machines or processes. [Motley, 1994]
All of these factors are important in determining the robustness of
production planning in proposal evaluation, or eventually, readiness for
production. If contractors use them—beginning no later than the Technology Development (TD) phase—the program office can have greater confidence that production planning has been completed properly.
3.5.4.3. Virtual Manufacturing
The use of M&S in manufacturing is aiming toward a future “Virtual
Manufacturing” environment. In this approach, the operational
requirements identified in the synthetic battlefield environment are
translated into design concepts using three-dimensional virtual simulations
incorporating geometry and performance. These designs are passed
along to a network of distributed manufacturing simulations—which may
reside throughout a vendor base (i.e., prime contractor and its
subcontractors)—to identify a system’s manufacturing processes, facilities
and tooling requirements. This vendor base is closest to the
manufacturing processes and is in an optimal position to develop cost and
schedule estimates. These estimates may then be fed back up the chain
of command to provide better estimates of costs and schedules to support
trade-offs and system-level alternative evaluations in the Analysis of
Alternatives (AoA).
The virtual manufacturing initiative is intended to provide the ties
between new product design concepts and the processes necessary to
manufacture them. The initiative is to start in the earliest phases of
development to provide quick and improved cost and delivery estimates,
and to smooth the transition of new process technologies into production
facilities.
3.5.5. Test and Evaluation
The purpose of a test and evaluation program is to provide information
for risk assessment and decision-making, to verify attainment of technical
performance specifications and objectives, and to confirm that systems
are operationally effective and suitable for their intended use. Test
planning begins in MSA—resulting in the initial Test and Evaluation
Master Plan (TEMP) at Milestone B. Models and simulations supporting
the development test (DT) or operational test (OT) programs must be
discussed in the TEMP. For DT, the program must list all models and
simulations to be used and explain the rationale for their use [DoD, 2008].
For OT, the TEMP must identify planned sources of information (e.g.,
development testing, testing of related systems, modeling, simulation,
etc.) that may be used by the operational test agency to supplement this
phase of operational test and evaluation. Whenever models and
simulations are to be used, PMs must explain the rationale for their
credible use [Ibid.].
3.5.5.1. Developmental Test and Evaluation (DT&E)
Weapon systems being developed today are increasingly complex. Technology is advancing; the ability to process more information
is rapidly growing, and the performance of systems is increasing. As an
example, consider the illustration in Figure 3.3 of available test assets and
data requirements for missile development programs over the last 40
years. There has been a significant increase in missile complexity and
data requirements; however, this increase in missile complexity has not
been accompanied by a corresponding increase in missile launch assets
because of tighter program cost and schedule constraints.
Figure 3.3. Missile Data Requirements and Test Assets [Adapted from Eichblatt, 1994]
3.5.5.2. Simulation Use
Simulations, therefore, are used to "bridge the gap" between the ever-increasing data requirements and the relatively constant (or even decreasing) available test assets. Specifically, simulations can be used for:
•  Pre-test planning—Ensuring that the tests to be conducted are, indeed, the most critical ones, and verifying instrumentation plans. Simulations can be used to identify the critical test points on which to focus the live tests. Data from the simulations can be used prior to actual testing to check out and exercise the data-reduction processes.
•  Mission rehearsal—"Walking through" the test from initial launch conditions to give confidence that tests will be successful. One can use actual hardware in captive carry, stimulated with threat simulators, to check out the system and tactics prior to test.
•  Post-test analysis—Taking the raw test data and extracting the critical performance parameters.
•  Augmenting actual tests—Running large numbers of simulations over many conditions for which test assets are unavailable or when environmental, political, resource or safety constraints make testing infeasible.
•  Risk reduction—Conducting simulations to reduce program, political, and technical risks.
   o  Political risk reduction—Programs are increasingly under scrutiny from all levels, and managers can ill afford the risk of a live test failure. Simulations to conduct mission rehearsals and checkout of the actual test items can reduce this risk.
   o  Technical risk reduction—Simulations allow developers to evaluate far more design alternatives over more conditions in shorter time periods than with live tests. This allows identification and correction of technical problems early in a program—resulting in a design that better meets technical and operational requirements. An example of this latter case is the use of HW/SWIL simulations.
3.5.5.3. Operational Test and Evaluation (OT&E)
Operational Test and Evaluation (OT&E) is a comprehensive process which uses analytical studies, analysis, component tests and actual weapon system tests in a complementary manner. In accordance
with Title 10, US Code:
The term operational test and evaluation does not include an
operational assessment based exclusively on (a) computer
modeling; (b) simulation; or (c) an analysis of system requirements,
engineering proposals, design specifications, or any other
information contained in program documents. [US Code, 2007, Title
10, Section 2399].
However, this does not mean that models and simulations do not have
a role in OT&E. Constraints on testing such as cost, security, safety,
ability to portray threats, treaty constraints, limitations on test
instrumentation, number/maturity of test articles, test space and lack of
representative terrain or weather may preclude a comprehensive
evaluation based on field testing alone. M&S tools can augment or
complement the actual field tests to provide decision-makers with needed
information that would otherwise not be available.
Appropriate uses of M&S include test planning; test data analysis and
evaluation to augment, extend or enhance test results; tactics
development; and early operational assessments of expected capabilities.
Specifically, the user, developer, and tester should, ideally, agree on the
M&S needed for operationally oriented assessments for a system under
consideration no later than Milestone A. This policy also reiterates the
importance of describing plans in the TEMP for the use of models and
simulations in OT&E to augment, extend, or enhance field test results.
Credibility is a key part of successful use of the M&S in supporting
OT&E. This includes an acceptable M&S approach; confidence in the
models, users, methodology, and results; and a robust VV&A process.
The Service’s operational test agency is accountable for the OT results
it reports and, hence, results of any M&S it uses in support of OT&E.
Increasingly, the test community, using advanced distributed simulation (ADS), is able to conduct live tests that are networked to geographically dispersed human-in-the-loop simulations within a synthetic
environment. This provides for a realistic test/simulation in a war-like
environment with a variety of friendly and hostile combatants.
3.5.5.4. Live Fire Testing (LFT)
Title 10 of the US Code requires realistic survivability testing of
covered systems (or product-improvement programs) and lethality testing
for major munitions programs prior to proceeding beyond low-rate initial
production. Examples of M&S supporting Live Fire Testing (LFT) include:
aircraft and missile flight-path generation; detection, tracking, and shooting
performance of artillery; warhead-target fragment interactions; penetration
mechanics and failure-mode analysis.
Evaluations of materials, fuel system design, internal routing of lines
and cables, etc., are accomplished using models and simulations that can
facilitate “design for survivability” early in development before hardware is
produced and tested [US Code, 2007, Title 10, Section 2366]. The
Survivability/Vulnerability Information Analysis Center (SURVIAC) is
a centralized information resource for information on survivability and
lethality. The SURVIAC has an inventory of models and simulations and
can provide Program Managers with technical advice.
The Acquisition Manager should recognize that the use of M&S
complements the T&E activities. It has been recommended that an
integrated model-test-model approach be implemented in development
programs with three objectives in mind:
•  Ensure models and simulations still meet the developer's needs,
•  Use models and simulations to identify critical tests and data requirements, to analyze data and reduce the amount of actual testing, and
•  Ensure every test serves the dual purpose of evaluating system performance and validating the models and simulations. [Johnson and Crocker, 1999]
Such an approach has been common in electronic combat system
development programs. These development programs intensely employ
models and simulations prior to testing within integration laboratories,
simulation facilities and, finally, in the open-air. Testing is then followed by
further modeling to analyze test data and to extract the Measures of
Performance (MOP) and Measures of Effectiveness (MOE) [Deis, 1991].
This concept of model-test-model is applicable to all system development programs. An adaptation of the above two approaches is illustrated in Figure 3.4.
Figure 3.4. Model-test-model Approach to Development [Piplani et al., 1994]
3.5.6. Logistics Support
Models and simulations support logistics analyses across the system
lifecycle—from defining system-level supportability concepts to reliability,
availability and maintainability design requirements, to eventually
modeling actual operational capability during operations and support.
Early activities in the logistics community include building the baseline
comparison system. This can be used along with M&S to do a
comparative analysis for the proposed new system, to identify
supportability, cost, and readiness drivers, and to estimate the operations
and support portion of the lifecycle costs.
In Technology Development (TD), as the weapon system becomes
more defined at the subsystem level, level of repair analysis (LORA)
models are used to identify candidate areas for interface between logistics
and design. These analyses help define trade-offs in manpower, reliability,
availability, maintainability, and alternate maintenance concepts and their
effects on supportability for specific subassemblies. Using these models
to quantify the impacts on support, logisticians can interface with the
designers to produce designs that lead to reduced overall support costs.
The LORA models will then be used for the actual repair-level decision-making and to form the basis for the system maintenance plan.
In EMD, models will be used to analyze repair tasks and identify the
requirements in the Integrated Logistics Support (ILS) elements for
each component. The results of these analyses form a data repository, the
Logistics Support Analysis Records (LSARs), which can be used in the
detailed identification of logistics resource requirements for each element
of logistics, as well as for projected readiness modeling. Among the
models used are provisioning models to determine initial spares
requirements and the optimum spare parts and quantities necessary to
attain end-item operational availability at the least cost.
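As a simplified illustration of such a provisioning calculation, the sketch below sizes a spares stock against Poisson pipeline demand to meet a target fill rate. The failure rate, turnaround time, and fleet size are hypothetical; real provisioning models, such as multi-echelon optimizations, are considerably richer.

```python
import math

def poisson_cdf(k: int, mean: float) -> float:
    """P(demand <= k) for Poisson-distributed demand."""
    return sum(math.exp(-mean) * mean**i / math.factorial(i) for i in range(k + 1))

def spares_needed(failures_per_hr, op_hrs_per_month, turnaround_months,
                  fleet_size, target_fill=0.95):
    """Smallest stock level whose probability of covering pipeline
    demand meets the target fill rate."""
    pipeline = failures_per_hr * op_hrs_per_month * turnaround_months * fleet_size
    s = 0
    while poisson_cdf(s, pipeline) < target_fill:
        s += 1
    return s, pipeline

# Hypothetical component: 1 failure per 2,000 flight hours, 2-month
# repair turnaround, 40 aircraft flying 50 hours per month each.
s, mu = spares_needed(failures_per_hr=1/2000, op_hrs_per_month=50,
                      turnaround_months=2, fleet_size=40)
print(f"Pipeline demand {mu:.1f} units -> stock {s} spares for 95% fill")
```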
Early in development, engineering estimates of component failure
rates are used in the models. As the system matures and is eventually
fielded, test data and actual operational data become available. This data
replaces the initial estimates on failure and repair in the LSARs. During
O&S, this information can be used in models and simulations to evaluate
actual system readiness, adjust provisioning levels or support system
operational planning. Models and simulations are also useful in this phase
to evaluate the supportability impacts of proposed Engineering Change Proposals (ECPs) or modifications to the system.
The ILS elements and logistics support analysis (LSA) tasks are
supported by an assortment of models or simulations. One source of
information on these models and simulations is the Supportability
Investment Decision Information Analysis Center (SIDAC), which
maintains a small number of logistics models and can provide assistance
in preparing and running those models and using assorted logisticsrelated databases.
3.5.7. Training
Training is integral to achieving and maintaining force readiness.
Despite reductions in force structure and annual operating funds, the
Services are determined to maintain their “warfighting edge” with superior
training. Throughout the DoD, simulation in support of training spans all of
the classes of simulation.
Wargaming is used to train battle staff in planning and executing
tactics from individual system levels through combined assets applications.
This is often accomplished using constructive models representing
systems or groups of systems or may even be linked to live systems.
Facilities which support such simulations may allow multiple participants to
interact and provide recordings of events for subsequent data analysis
and debriefing of participants.
3.5.7.1. Virtual Simulators
Virtual simulators such as weapon-system simulators (aircraft, tank,
ship, etc.) are commonly used for training. These simulators immerse
operators in a realistic environment (visual, aural, motion), allowing them
to perform a mission as if they were in the actual vehicle—thereby
receiving combat-realistic training. Another example of an operator being
immersed in a virtual environment might be an air defense simulator,
which allows operators at multiple consoles to track, identify, allocate and
control weapons using command-and-control formats obtained from other
simulated platforms. Weapon characteristics might be provided via
computer-generated weapon simulations.
3.5.7.2. Live Simulations
Live simulations in support of training include the Army National
Training Center at Ft. Irwin, the Navy “Strike University” at Fallon Naval Air
Station, the Air Force "Red Flag" at Nellis AFB, and the Marine Corps Air-Ground Combat Center at Twentynine Palms. These simulations allow
participants to operate systems under environmental conditions which
realistically mimic life in combat. Data gathered during instrumented
exercises can be used to debrief participants and can provide the system
acquisition community valuable information on weapon systems’
performance and human interaction during close-to-real combat conditions.
3.5.7.3. Future Applications
The future application of simulation to training will involve a
combination of live and virtual participants within synthetic environments
and will allow for training with individual participants geographically
distributed.
3.5.7.4. Maximizing Simulation Use
The PM should aim to maximize the use of simulations between
weapon system and training system development. For instance, the B-2
aircraft program developed weapon system trainers that served an
additional function for the acquisition program. As part of the Operational
Flight Program (OFP) software development process, the B-2 aircraft
program used the weapon system trainer as a systems integration lab to
compile and check run the software in conjunction with other real and
synthetic data. After any debugging, the OFP was returned to the Flight
Test Center to be certified for flight.
The above discussion provides the Acquisition Manager insight into
both the present and planned applications of models and simulation to
training. In many cases, the models and simulations that support the
development of the weapon system can be used to support the training
systems—be they system simulators or distributed training systems
combining live, virtual and constructive simulations. Currently, models and
simulations for training purposes are often developed separately by
another software development activity. The PM should not have to pay for
these simulations twice—an integrated M&S plan during the MSA phase
can help the PM transition between the system’s developmental simulator
and its training simulator.
Modeling the Integration of Open Systems and Evolutionary Acquisition
Open Systems and Evolutionary Acquisition are two recent innovations designed to
improve program performance with flexibility. The full potential of these approaches has
not been captured, partially because of integration challenges during implementation.
Dillard and Ford’s June 2008 study investigated the impacts of open systems and
evolutionary acquisition on DoD development programs. The researchers used changes
required to use both Open Systems and Evolutionary Acquisition to identify and describe
impacts of implementation on program process and management. They then used a
dynamic simulation model of a program using both Evolutionary Acquisition and Open
Systems to map these impacts.
This model’s structure reflected the arrangement of development work moving
through the separate development blocks of an acquisition project. In the model, four
types of work flowed through each block of an acquisition project. Within a development
block, each type of work flowed through a development phase that completes a critical
aspect of the project. This conceptual model was used to build a formal computer
simulation model of an acquisition program that can reflect evolutionary acquisition and
the use of open systems. The simulation model was a system of nonlinear differential
equations. Each phase was represented by a generic structure, which was parameterized
to reflect a specific phase of development.
Simulation reinforced the potential for open systems to accelerate acquisition and
revealed a potentially important distinction between design and integration errors in
explaining the impacts of required changes. Implications for practice included shifts in the
type and timing of risks due to open systems use and the possibility of trading design
obsolescence for integration obsolescence.
Dillard and Ford’s research contributed to the understanding of open systems and
evolutionary acquisition in several ways. The work improved the description and
specification of impacts of acquisition policy on acquisition practice. It also used dynamic
computer simulation to model and investigate open systems and to model evolutionary
acquisition and open systems together, both for the first time. The results of the
simulation reinforced several suggested impacts of open systems and provided additional causal rationale behind why suggested impacts may occur. These rationales were the
basis of potential implications for the evolutionary acquisition practice with open systems.
The reasoning provided, based on the computer simulation, can now be used to extend
and deepen decision-makers’ understanding of open systems and evolutionary
acquisition and the design of program processes and management. [Taken directly from
Ford and Dillard, 2008: i, 1, 13, 15, 31]
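The flavor of such a differential-equation model can be suggested with a toy rework cycle, which is not Ford and Dillard's actual formulation: work is completed at a finite rate, a fraction of it contains errors that surface later and return as rework, and the time to finish stretches accordingly. All parameters below are invented.

```python
# Toy rework cycle: work flows from "to do" to "done"; a fraction
# contains errors that are later discovered and returned as rework.
dt, t_end = 0.25, 100.0                              # weeks
work_to_do, work_done, undiscovered = 100.0, 0.0, 0.0
rate, error_frac, discover_rate = 4.0, 0.25, 0.10    # assumed parameters

t = 0.0
while t < t_end and work_to_do + undiscovered > 0.01:
    completion = min(rate, work_to_do / dt)          # tasks completed/week
    discovery = discover_rate * undiscovered         # errors found/week
    work_to_do += (discovery - completion) * dt
    undiscovered += (error_frac * completion - discovery) * dt
    work_done += (1 - error_frac) * completion * dt
    t += dt

print(f"Stopped at t = {t:.1f} weeks; {work_done:.1f} tasks done correctly")
```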
3.6. Why use M&S?
By stepping back from the specific steps of the program lifecycle and
taking a more generalized view of this topic, readers can see how M&S
can help reduce costs that impact both total lifecycle costs, as well as
design and development costs. In addition, they can understand how
M&S reduces the cost of requirements planning, research and
development and training because simulations offer a low-cost way to
experiment with new concepts and to test systems on a large scale. If a
model is designed properly, a PM can use its results with confidence to
predict the performance of the modeled system. Decision-makers can also
use M&S as a way to anticipate and resolve issues that otherwise cannot be examined. For example, certain training scenarios simulate dangerous
battlefield conditions such as a nuclear or biological attack. M&S can be
used extensively to mitigate risk; however, many Acquisition Managers are not aware of what M&S can do for them.
3.7. Chapter Summary
This Chapter provided an overview of the use of models and
simulations across the acquisition lifecycle and in specific acquisition
activities. The challenge for PMs in using these models and simulations
efficiently is to:
•  Integrate the use of M&S within program planning activities and documentation,
•  Plan for lifecycle application, support and reuse of models and simulations, and
•  Integrate M&S across the functional disciplines.
3.8. Chapter References
Acker, D., and S. Young. Defense Manufacturing Management Guide for Program
Managers, 3rd ed. Fort Belvoir, VA: Defense Systems Management College,
April 1989, 3-9.
Defense Acquisition University (DAU). ACEIT—Automated Cost Estimating Integrated Tools. Fort Belvoir, VA: Author, June 2005. https://akss.dau.mil/Lists/Software%20Tools/DispForm.aspx?ID=14
Defense Acquisition University (DAU). Glossary of Defense Acquisition Acronyms and
Terms, 12th ed. Fort Belvoir, VA: Defense Acquisition University Press, July 2005,
B-107.
Deis, M. R. The Air Force Electronic Combat Development Test Process. Eglin AFB,
FL: Air Force Developmental Test Center (AFDTC/XRE), May 1991.
Department of Defense (DoD). Cost Analysis Guidance and Procedures (DoD 5000.4-M-1). Washington, DC: Author, 2007.
Department of Defense (DoD). DoD Modeling and Simulation (M&S) Glossary.
Washington, DC: Author, 1998.
Department of Defense (DoD). Modeling and Simulation (M&S) Master Plan (DoDD 5000.59-P). Washington, DC: Author, 1995, A-6. http://www.dtic.mil/whs/directives/corres/pdf/500059p1.pdf
Department of Defense (DoD). Operation of the Defense Acquisition System (DoDI
5000.02). Washington, DC: Author, December 8, 2008.
DiPetto, Chris. Deputy Director, OUSD (AT&L) A&T/SSE/DT&E, March 26, 2007.
Eichblatt, Emil J. Naval Air Warfare Center (Weapons Division). Notes. Point Mugu NAS,
CA, February 10, 1994.
Ford, D.N., and J.T. Dillard. Modeling the Integration of Open Systems and Evolutionary
Acquisition in DoD Programs (NPS-AM-08-108). Monterey, CA: Naval
Postgraduate School, June 2008.
http://www.acquisitionresearch.net/_files/FY2008/NPS-AM-08-108.pdf
Johnson, L.H., and C.M. Crocker, Jr. Cost Effective Weapon System Development
through Integrated Modeling and Hardware Testing. Redstone Arsenal, AL: US
Army Test and Evaluation Command, Redstone Technical Test Center, 1999.
Johnson, Michael V.R., Mark F. McKeon, and Terence R. Szanto. Simulations-based
Acquisition: A New Approach. Fort Belvoir, VA: Defense Systems Management
College, December 1998, 1-2.
Miller, D.E. (Maj.). Modeling and simulation technology: A new vector for flight-test.
School of Advanced Airpower Studies, Air University, June 1998.
https://www.afresearch.org/skins/rims/q_mod_be0e99f3-fc56-4ccb-8dfe670c0822a153/q_act_downloadpaper/q_obj_a5f44be7-c9e5-46b5-a31a4b96637a0760/display.aspx?rs=enginespage
Motley, William, FDMM, Defense Systems Management College, Fort Belvoir, VA.
Correspondence, May 17, 1994.
Myers, Fred, and Jim Hollenbach. “Improving M&S Support to Acquisition: A Progress
Report on Development of the Acquisition M&S Master Plan.” NDIA Systems
Engineering Conference, October 26, 2005.
Nwoke, B.U., and D.R. Nelson. An Overview of Computer Simulation in Manufacturing.
Industrial Engineering. July 1993, 43.
Naval Sea Systems Command (SEA-03)
Patenaude, Annie. Study on the Effectiveness of Modeling and Simulation in the
Weapon System Acquisition Process. Washington, DC: Office of the Secretary of
Defense, October 1996.
Piplani, Lalit, Joseph Mercer, and Richard Roop. Systems Acquisition Manager's Guide
for the Use of Models and Simulations. Fort Belvoir, VA: Defense Systems
Management College Press, September 1994.
Proctor, P. “Boeing Rolls out 777 to Tentative Market.” Aviation Week & Space
Technology, April 11, 1994, 36.
Reed, F. "You Can Look but You Can't Touch." Air & Space, April/May 1994, 54.
Schuppe, T. F., D.V. McElveen, P.H. Miyares, and R.G. Harvey. “C-141 Depot
Maintenance: Using Simulation to Define Resource Requirements.” Air Force
Journal of Logistics (Winter-Spring 1993): 1145-1152.
USAOPTEC. Memorandum Number 73-21, December 9, 1993.
Under Secretary of Defense (Acquisition, Technology & Logistics) (USD (AT&L)). DoD
Modeling and Simulation (M&S) Management (DoDD 5000.59). Washington, DC:
Author, August 8, 2007.
United States Code, Title 10, Section 2366. Major Systems and Munitions Programs:
Survivability Testing and Lethality Testing Required before Full-scale Production.
Washington, DC: US Printing Office, January 4, 2004.
United States Code, Title 10, Section 2399. Operational Test and Evaluation of Defense
Acquisition Programs. Washington, DC: US Printing Office, January 3, 2007.
US Army Materiel Command. Logistic Support Analysis Techniques Guide (AMC
Pamphlet AMC-P 700-4). Headquarters, Redstone Arsenal: Author, February 20,
1991.
US Army Test and Evaluation Command (TECOM). TECOM Modeling and Simulation
Master Plan (Final Draft), October 1993.
"WITNESS.” AT&T Istel Visual Interactive Systems, Inc., 2008.
4. M&S Attributes and Hierarchy
Now that the reader has a basic background in M&S concepts, history and applicability, this chapter will establish some standard M&S definitions and analyze the M&S hierarchy.
4.1. Definitions
As a matter of review, the M&S field uses models and simulations,
either statically or over time, to develop data as a basis for making
managerial or technical decisions. This includes, but is not limited to,
emulators, prototypes, and simulators. Furthermore, the subject of the
model or simulation is referred to as the Simuland, which is the real-world
(or notional) system of interest, object, phenomenon, or process to be
simulated. In order to do this, the Referent—or the body of knowledge
available to a modeler regarding a simuland—will be utilized in developing
the simulation. This information may either be quantitative and formal
(e.g., physics equations for aircraft flight dynamics), or it may be
qualitative and informal (e.g., pilot’s expectation of buffet before stall).
4.1.1. Attributes: Validity, Resolution and Scale
An attribute is a significant or defining property or characteristic of a
model or simulation. The three most important attributes—validity,
resolution, and scale—are defined below:
Validity (or what is sometimes referred to as fidelity) is the measurement of the accuracy of a model's representation or a simulation's results. Validity
is not only relative to how well the model results match reality but also to
how well the model has represented the various aspects of the simuland.
It also is relative to the requirements of the model, as different applications
may require different levels of fidelity.
Resolution (or what may be called granularity) refers to the degree of detail with which the real world is simulated—more detail equates to
higher resolution. For example, the simulation of a squadron attack as a
whole would be a low-resolution simulation, while the simulation of a
sensor in the weapon systems or one aircraft within the squadron would
be a high-resolution simulation due to the greater detail included within the
simulation.
Scale (or level) is the final attribute of interest in this discussion. Scale
refers to the size of the overall scenario or event the simulation represents.
Typical scales for military simulations include the following categories:
•  Engineering, Component: system or subsystem of a single entity
•  Engagement, Platform: 1-vs-1 to many-vs-many
•  Mission, Battle: 10s to 1,000s of entities
•  Theater, Campaign: 10,000s of entities
While these three attributes are all important, they are not all directly
related to one another. For example, it is commonly assumed that validity
and resolution are correlated. This assumption is false: one may have an
extremely accurate model that captures the simuland well in terms of
validity, even if the simulation is at a low resolution. Likewise, scale and
validity are also assumed to correlate, but for similar reasoning as above,
one can still have a very valid model even if the scale is large. It does generally follow, however, that more resolution equates with less scale.
4.1.2. M&S Categories
In addition to considering the attributes of a model or simulation, one
must also consider the categorical type of model or simulation, as certain
categories are more applicable to specific scenarios than others. The
modeling method is the basis on which a model represents its subject. It
has inherent advantages and disadvantages, depending on the specific
situation. Following are the various categories commonly found.
4.1.2.1. Training
Simulation training is a form of simulation used to produce learning in a
user or patient. Advantages of simulation training include a safer, more
forgiving environment in which the user can develop skills and a
knowledge base that can be applied outside of the simulated environment.
Simulated training applications include the following examples:
•  Aircraft control skills
   ▪  Work with high-fidelity flight simulator (virtual)
   ▪  Develop psycho-motor skills
•  Team tactics training
   ▪  Work with Close Combat Tactical Trainer (CCTT) (virtual/constructive)
   ▪  Develop cognitive skills, more reactive
•  Command staff training
   ▪  Work with the Joint Simulation System (JSIMS) (constructive)
   ▪  Develop cognitive skills, more deliberate
4.1.2.2. Analysis
In analysis, simulation is a valuable tool and can be used to predict,
design, test, or evaluate a real or notional system or idea. While Program
Managers (PMs) may conduct analysis to answer a specific question, the
benefit of utilizing this simulation category is that the simulation may be
repeated in multiple trials. PMs can take care in the experimental design
to plan trials in advance that cover myriad cases and can run multiple
trials to achieve higher statistical significance. Following is a sample of
simulation examples and their purposes:
•  Course of action analysis
   ▪  Predict outcome of alternative plans
•  Force structure analysis
   ▪  Compare different unit organizations
•  Doctrine analysis
   ▪  Evaluate new tactical doctrines
•  Operational test and evaluation
   ▪  Simulate weapon system performance trials
4.1.2.3. Experimentation
Within experimentation, simulation is used to explore design or
solution spaces, or to gain insight into an incompletely understood
situation [Ceranowicz et al., 1999]. In other words, this use of simulation
may be seen as a way to “explore” possibilities and is less “controlled”
than analysis simulation in that later trials may be determined by earlier
ones instead of being planned in advance. Additionally, statistical
significance may not be relevant in experimentation. This form of
simulation would be applicable in the following situations to perform the
listed functions:
•  Long-range strategic assessment
   ▪  Simulate outcome of hypothetical conflicts
•  Broad effectiveness exploration
   ▪  Assess new doctrinal concepts
4.1.2.4. Acquisition
Acquisition can benefit from simulation, which is used to specify, design, develop, and acquire new systems. As defined earlier, this
category may also be referred to as Simulation-based Acquisition (SBA).
Within this category, simulation may be utilized to improve the design and
manufacturing (“Build the thing right”), as well as to support
effectiveness and selection (“Build the right thing”). An important goal of
the DoD as it pertains to SBA is that simulation can save time and money.
Following are some simulation examples:
•  System design
   ▪  Simulate alternative system designs to assess capability and reliability ("Engineering" level)
•  System selection
   ▪  Compare combat effectiveness of alternative notional weapons systems
4.1.2.5. Randomness
There are two basic ways of dealing with randomness within the
simulation environment. With deterministic simulation, a given set of
inputs will produce a determined, unique set of outputs. For example, this
form of simulation is commonly seen in engineering—in which case,
physics-based rules determine the outcome of specific inputs. In contrast,
stochastic simulation accepts random variables as inputs, leading to
random outputs. This type of simulation is most commonly found in
combat simulation and utilizes pseudo-random numbers. One of the
benefits of stochastic simulation is that it has the ability to combine
characteristics of randomness with the advantages of experiment
repeatability.
Another method of handling randomness is Monte Carlo simulation, which applies stochastic simulation to real-world systems modeled with probability distributions. Another way to think about this is that the physics of a system is not modeled explicitly, but rather through the use of probability distributions. Randomly generated parameter values are used to compute possible outcomes. Normally, Monte Carlo simulation utilizes multiple trials and statistical analysis. In addition, it is static—meaning the simulation represents a single time instant with no time advance.
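A minimal static Monte Carlo sketch follows: miss distance is drawn from assumed error distributions, and a kill probability is estimated over many trials. The fixed seed illustrates the repeatability noted above; the lethal radius and error standard deviations are invented.

```python
import random

rng = random.Random(42)                 # fixed seed -> repeatable experiment
TRIALS, LETHAL_RADIUS = 100_000, 8.0    # metres (assumed values)

hits = 0
for _ in range(TRIALS):
    # Draw each error source from its probability distribution; the
    # engagement physics is not modeled explicitly.
    aim_error = rng.gauss(0.0, 4.0)     # metres, cross-range
    range_error = rng.gauss(0.0, 5.0)   # metres, down-range
    miss = (aim_error**2 + range_error**2) ** 0.5
    hits += miss <= LETHAL_RADIUS

p = hits / TRIALS
se = (p * (1 - p) / TRIALS) ** 0.5      # standard error of the estimate
print(f"P(kill) ~ {p:.3f} +/- {1.96*se:.3f} (95% CI)")
```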
4.1.2.6. Time
As with randomness, there are a few different ways to account for time
in M&S. To begin, a discrete model is a model in which model-state
variables change only at a discrete set of points in time (events) [Banks,
Carson & Nelson, 1996]. Simulation using discrete models is most
commonly seen in simulated manufacturing processes or service queues.
In contrast, a continuous model is one in which state variables can change continuously over time [Ibid.]. Typically, when this type of
simulation is conducted on a computer, time advances in small, fixed-time
increments—necessarily resulting in a quasi-continuous simulation.
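The sketch below illustrates such a quasi-continuous simulation: a point-mass projectile whose state variables are advanced in small, fixed time steps by forward-Euler integration. The drag constant and initial conditions are assumed values.

```python
# Continuous model advanced in fixed time steps (quasi-continuous):
# point-mass projectile with simple drag, forward-Euler integration.
dt, g, drag = 0.01, 9.81, 0.002         # s, m/s^2, 1/m (assumed drag)
x, y, vx, vy = 0.0, 0.0, 250.0, 250.0   # initial position and velocity

while y >= 0.0:
    speed = (vx*vx + vy*vy) ** 0.5
    ax = -drag * speed * vx             # drag opposes velocity
    ay = -g - drag * speed * vy
    x, y = x + vx*dt, y + vy*dt         # state variables change each step
    vx, vy = vx + ax*dt, vy + ay*dt

print(f"Impact at x = {x:.0f} m")       # smaller dt -> closer to continuous
```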
4.1.3. M&S Methods
Just as there are various M&S categories, there are also various
methods that may be applied in developing a model or simulation. It is
valuable for the guidebook to briefly explore these methods so as to
familiarize the reader with some of the terminology that he/she may
encounter. The following methods do not represent an exhaustive list, but
were selected as representative of the types of available methods and are
categorized as non-executable and executable.
4.1.3.1. Non-executable Methods
4.1.3.1.1. Visual Model
A visual model is a model that takes the appearance of an object,
perhaps in different variations. This form is often based on polygons and
textures and may or may not be specific to a particular image generator.
Additionally, it should be noted that fidelity of the image is not directly
related to fidelity of underlying physics or behavior.
4.1.3.1.2. Conceptual Model
A conceptual model is a model of a simuland or scenario defined in
an informal or semiformal way. Commonly, a conceptual model is used to
represent structure and relationships; it exhibits intermediate detail, as the
focus of the model is on components and their interactions. Conceptual
models are used to communicate between users and modelers and to
compare model capabilities to requirements. In addition, a conceptual
model can also be used to lay the groundwork for a more detailed future
model.
4.1.3.1.3. Physical Model
A physical model, or surrogate, is a physical object which models
physical aspects of that which is being modeled. This form of model is
typically used in live simulation, and although it may not be a perfect
replica, it is "close enough" for the purpose of the exercise. Use of a physical model is often motivated because the simuland is dangerous or expensive to use, or is simply not available.
4.1.3.2. Executable Methods
Executable methods include physics-based models: mathematical models whose equations are based on the equations physics uses to describe the phenomenon being modeled.
4.1.3.2.1. First Principles Model
A first principles model is a model based on physics equations, which is why such models are sometimes said to be based on "first principles." It is important to note, however, that these "first principles" do not guarantee validity.
4.1.3.2.2. Finite Element Model
The Finite Element Model (FEM) is a method for modeling large or
complicated objects by decomposing them into a set of small “elements”
and by modeling the elements. The underlying concept of FEM is a "mesh" in which nodes represent the elements and edges connect neighboring elements. These nodes are then modeled with physics equations. Generally, increasing the node count and/or decreasing the duration of the simulation time step increases both fidelity and computational cost.
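A minimal one-dimensional FEM sketch follows, assembling identical element stiffness matrices into a global system for an axially loaded bar and solving for nodal displacements. Dimensions and loads are arbitrary; real FEM codes handle 2-D/3-D meshes and far more physics.

```python
import numpy as np

# 1-D bar fixed at the left end with an axial tip load: split into N
# elements and assemble the global stiffness matrix element by element.
N, L, E, A, P = 8, 2.0, 200e9, 1e-4, 1e4   # elements, m, Pa, m^2, N
k = E * A / (L / N)                        # stiffness of one element
K = np.zeros((N + 1, N + 1))
for e in range(N):                         # element connects nodes e, e+1
    K[e:e+2, e:e+2] += k * np.array([[1, -1], [-1, 1]])

F = np.zeros(N + 1)
F[-1] = P                                  # load applied at the free tip
u = np.zeros(N + 1)
u[1:] = np.linalg.solve(K[1:, 1:], F[1:])  # node 0 fixed (boundary condition)
print(f"Tip displacement: {u[-1]*1e6:.2f} micrometres")  # exact: PL/(EA)
```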
4.1.3.2.3. Data-based Model
As its name might indicate, a data-based model is one that is based on data—rather than on physics equations—describing the represented aspects of the simuland. This data is collected (or generated) in advance and may be sourced from a number of venues—including operational (field) experience, T&E results, or other simulations. Data-based models are commonly applicable in the following situations (see the sketch after this list):
•  Surrogate not available,
•  Physics of model subject not understood,
•  Computation costs of physics-based model too high,
•  Reliable data available, and/or
•  Subject of model is classified.
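The sketch referenced above interpolates a thrust-versus-time curve from tabulated (and here invented) test measurements, standing in for a physics model of the motor.

```python
# Data-based thrust model: linear interpolation over tabulated T&E data
# (hypothetical numbers) rather than physics equations.
import bisect

time_s = [0.0, 0.5, 1.0, 2.0, 4.0, 6.0]          # measured sample times
thrust_n = [0.0, 9500, 11000, 10500, 9800, 0.0]  # measured thrust, newtons

def thrust(t: float) -> float:
    """Piecewise-linear interpolation of the recorded thrust curve."""
    if t <= time_s[0] or t >= time_s[-1]:
        return 0.0
    i = bisect.bisect_right(time_s, t)
    frac = (t - time_s[i-1]) / (time_s[i] - time_s[i-1])
    return thrust_n[i-1] + frac * (thrust_n[i] - thrust_n[i-1])

print(f"Thrust at t=1.5 s: {thrust(1.5):.0f} N")
```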
4.1.3.2.4. Aggregate Model
An aggregate model represents a large number of small objects and actions in a combined, or aggregate, way. Aggregate models are often used in constructive simulation and are generally not directly physics-based because many of the physics-based interactions are abstracted.
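A classic aggregate formulation is the Lanchester square law, in which each force's strength declines in proportion to the opposing strength and no individual entity is represented. The sketch below steps the two coupled equations forward in time with invented strengths and kill rates.

```python
# Aggregate (Lanchester square-law) combat model: two forces attrit each
# other in proportion to opposing strength; no individual entities modeled.
dt = 0.01
blue, red = 120.0, 150.0        # starting force strengths (assumed)
b_eff, r_eff = 0.020, 0.012     # per-unit kill rates (assumed)

t = 0.0
while blue > 1 and red > 1:
    blue, red = blue - r_eff * red * dt, red - b_eff * blue * dt
    t += dt

winner = "Blue" if blue > red else "Red"
print(f"{winner} prevails at t = {t:.0f} with {max(blue, red):.0f} units left")
```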
4.1.3.2.5. Hybrid Model
The final model method reviewed is the hybrid model, which is a model combining more than one of the previously noted modeling methods. Perhaps more common than some of its composite methods on their own, this modeling method grants the advantages of multiple modeling methods.
Employing M&S
In December 2006, Hagan and Slack conducted an MBA research project through
the Naval Postgraduate School that emphasized the validity of organizational modeling.
The goal of this project was to determine how to decrease the F414 engine throughput
time at the Aircraft Intermediate Maintenance Division (AIMD) at Naval Air Station (NAS)
Lemoore, California. To achieve this goal, the researchers employed organizational
modeling to evaluate how changes to the organizational structure of the Lemoore AIMD
affected engine throughput time. Hagan and Slack acquired data to build the
organizational model via interviews with AIMD personnel. They developed a baseline
model of the AIMD organization in order to model the organization’s current structure and
performance. The actual, real-world duration required to conduct F414 maintenance was
compared to the duration predicted by the model and was determined to be within
3%. Once confidence was gained that the baseline model accurately depicted the
organization’s actual F414 maintenance performance, the researchers made
modifications or interventions to the model to evaluate how organizational changes would
affect F414 maintenance duration. Interventions included paralleling the tasks associated
with accomplishing administrative paperwork when initially receiving the F414 engine,
and tasks associated with on-engine maintenance, combining personnel positions, adding
personnel, and modifying the duration and frequency of meetings. The modeled results
of these modifications indicated that the paralleling effort significantly decreased the F414
maintenance duration; likewise, decreasing meeting frequency and slightly increasing
duration also facilitated a decreased maintenance duration. [Adapted directly from Hagan
and Slack, 2006: i]
4.2. M&S Classes
The previous section provided a fine-grained review of M&S definitions. The remaining sections of this chapter step back from that detailed view in order to capture a more cohesive, overarching view of M&S capabilities.
This discussion will continue by laying out a framework of model and
simulation classes. Before proceeding further, readers are cautioned not
to become too enamored with the terminology, nor should they try to fit
every model or simulation neatly into one of the classes, as the lines
across the various classes of models and simulations are becoming
blurred. In other words, technology allows linkage and interoperability
among the various classes of models and simulations; human interactions
can span across all these categories, as well. Therefore, as noted above
with regard to the hybrid model, one often is not simply talking about a
single model or simulation, but rather hybrids formed from among two or
more classes. The various modeling methods presented above will be
consolidated into three primary M&S classes—Constructive, Virtual and
Live. These classes are described in more detail below.
4.2.1. Constructive Models and Simulations
The models and simulations contained within this class currently
represent the predominant form of M&S tools used within or in support of
a program office. Constructive models and simulations consist of
computer models, wargames and analytical tools that are used across a
range of activities. At the lowest levels, they may be used for detailed
engineering design and costing, or for subsystem and system
performance calculations to support development of technical
specifications. Higher-level models and simulations provide information on
the outcomes of battles or major campaigns involving joint or combined
forces, identify mission needs and support operational effectiveness
analyses.
A variety of constructive models may be used to represent a system
and its employment at different levels of detail—from engineering physics
of piece parts to aggregated combat forces in a campaign analysis.
Many constructive simulations may be performed either with or without
human interaction. Without human interaction, they might be run in
multiple iterations to provide statistical confidence in the outcomes of the
simulation. With human interaction, they are often referred to as
wargaming simulations and are used for battle-staff training or tactics
development. The tactics developed in such interactive simulations may
then be used for establishing tactics within the non-interactive
simulations.
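The sketch below illustrates how repeated non-interactive runs yield statistical confidence: a stochastic model (here a dummy stand-in for a real constructive simulation) is replicated with different seeds and a confidence interval, using a normal approximation, is placed on the mean MOE.

```python
import random
import statistics

def one_replication(seed: int) -> float:
    """One non-interactive constructive run returning an MOE (here a
    dummy stochastic outcome, e.g., fraction of 40 targets killed)."""
    rng = random.Random(seed)
    return sum(rng.random() < 0.62 for _ in range(40)) / 40

moes = [one_replication(s) for s in range(30)]   # 30 independent runs
mean = statistics.mean(moes)
hw = 1.96 * statistics.stdev(moes) / len(moes) ** 0.5
print(f"MOE = {mean:.3f} +/- {hw:.3f} (95% confidence, 30 replications)")
```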
Within acquisition, the uses of constructive models and simulations
include design and engineering trade-offs, cost, supportability, operational
and technical requirements definition and operational effectiveness
assessments.
4.2.2. Virtual Simulation
4.2.2.1. Human-in-the-Loop
Virtual simulation brings the system (or subsystem) and its operator
together in a synthetic, or simulated environment. Although this document
uses the term human-in-the-loop (HITL) to represent these simulations,
other names include man-in-the-loop, warfighter-in-the-loop, or
person-in-the-loop.
In a virtual simulation, the system may include actual hardware that is
driven (stimulated) by the outputs of computer simulations. As an
example, a weapon system simulator may employ a near-real crew
compartment with the correct equipment, controls and display panels. A
computer-generated synthetic environment is then displayed on a screen
in front of the crew and reflected in the crew compartment instrumentation
and displays. Motion of the platform may be driven by the computer
simulation to represent the system dynamics. Sounds of the system and
equipment can also be duplicated.
The operators are thereby immersed in an environment driven by the
simulator that to them looks, feels, and behaves like the real thing. During
simulated missions, the crew must operate the equipment, receive
commands and control weapons just as in a real system.
Human-in-the-loop simulations provide a better understanding of
human reactions and decision processes and man-machine interfaces.
They can provide both a platform for crew training prior to live exercises
and tests, as well as realistic mission rehearsal in preparation for actual
combat operations.
By linking HITL simulations to other simulators, PMs can examine the
interaction of multiple weapon systems; such combinations can illustrate
the need for changes in tactics or engagement rules. These simulations
also provide powerful tools for evaluation of actual system hardware and
software within realistic environments for developmental programs.
Human-in-the-loop simulations run in real-time; hence, they may
require fewer iterations than non-interactive constructive simulations.
4.2.2.2. Virtual Prototypes
A more advanced concept for virtual simulation is virtual prototyping.
In this realm, a three-dimensional, electronic, virtual mockup of a system
or subsystem allows an individual to interface with a realistic computer
simulation within a synthetic environment.
The representation is solely a computer simulation. As it does not
employ actual system hardware, it may be applied in early prototyping
work to evaluate concepts, human-machine-interfaces, or to allow
designers, logistics engineers and manufacturing engineers to interface
with the same design. Such an approach supports concurrent engineering
(or Integrated Product and Process Development (IPPD)) by providing
a common platform from which all functional disciplines can work.
Current trends indicate that having the designer, operator, maintainer and manufacturer all interact with the same realistic, three-dimensional representation of the system will become more prevalent in future acquisition.
4.2.3. Live Simulations
Live simulations are live exercises in which troops use equipment under actual environmental conditions approaching real combat. They provide a testing ground with live data on actual hardware and software performance in an operational environment.
These data also can be used to validate the models and simulations
used in an acquisition program. This form of simulation provides the stress
and decision-making that is associated with human-in-the-loop simulation.
By introducing multiple types of platforms, PMs can evaluate actual
interaction and interoperability of multiple systems. However, assembling
the personnel and equipment and conducting a live simulation is a
resource-intensive enterprise requiring time, funds and people.
4.2.3.1. Constructive and Virtual Simulations
Constructive and virtual simulations may already have been
conducted prior to live simulations to plan the tests or exercises, identify
critical issues, rehearse the mission or train the participants. They may
also be used to analyze results after the test, or to augment tests to
address scenarios that may not be feasible due to safety or environmental
reasons. With the high cost of live simulations (tests), the use of other,
less resource-intensive forms of M&S is a good idea. For example, an airto-air missile in development might be valued at $1 million, and a training
torpedo firing could cost up to $50,000. As an integral part of test planning
and support, M&S will allow a Program Manager to use such valuable
assets more efficiently. For even greater benefits to their programs, managers must ensure that live simulations include adequate
instrumentation. The data thereby collected will serve two important
purposes: to further validate models and simulations and to provide
ground truth data to support post-exercise debriefs.
As previously noted, human interaction may be a part of any of the
classes of M&S. The acquisition Program Manager may choose to employ
human interaction in M&S for two reasons:
•  Determination of human decision-making or logic patterns and their impact on system performance and effectiveness. Simulations of any class requiring human input may serve this function.
•  Identification and refinement of human-machine interfaces. This results from simulations that allow the human to act as part of the system, such as in manned simulators or live exercises.
These three classes of models and simulations (constructive, virtual
and live) may be used in varying levels of detail to support a variety of
activities—ranging from detailed engineering design to the military utility of
a new system or technology on the battlefield. To describe the different
levels of models and simulations used to support these activities, the next
section will introduce a hierarchy of models and simulations.
4.3. Hierarchy of Models and Simulations
Models and simulations support acquisition program activities ranging
from design to operational effectiveness assessments. This assortment of
tasks requires a suite of models and simulations with differing levels of
detail suited to their particular application. These models and simulations
form what may be called a hierarchy of models and simulations.
Hierarchies of models and simulations are described in documented
form [HQ AFMC, 1993] and are also found in undocumented form
throughout the DoD. These hierarchies are similar in concept and vary
only in detail. Some extend to higher levels, including national policy and
force structure planning, while others extend down to include actual
testing. This document describes a hierarchy that is representative of
those that the reader may come across or use. This hierarchy is depicted
in Figure 4.1 alongside a force-level and system work breakdown
structure (WBS); the latter indicates the system level that corresponds
with the level of analysis to be performed.
Figure 4.1. Hierarchy of Models and Simulations [Douglas, 1999]
The levels within this hierarchy include the following:
• Engineering: for design, cost, manufacturing and supportability.
Provides measures of performance (MOP).
• Engagement: for evaluating system effectiveness against enemy
systems. Provides measures of effectiveness (MOE) at the
system-on-system level.
• Mission/Battle: for depicting the effectiveness of a force package
or of multiple platforms performing a specific mission. Provides
MOE at the force-on-force level.
• Theater/Campaign: for predicting the outcomes of joint/combined
forces in a theater/campaign-level conflict. Provides measures of
value added at the highest levels of conflict, sometimes called
measures of outcome (MOO).
Each of these hierarchical levels will be discussed in more detail in the
sections to follow.
4.3.1. Engineering-level Models and Simulations
Engineering-level models and simulations are concerned with the
performance; producibility; supportability; cost of components, subsystems
and systems; and the trade-offs associated therewith. At the engineering
level, there are literally thousands of models and simulations, including:
• Basic phenomenology—such as aerodynamics, fluid flow,
hydrodynamics, heat transfer, acoustics, fatigue, etc.
• Physics-based models of components, subsystems and systems
for design, performance, costing, manufacturing and supportability.
For acquisition, engineering-level models and simulations provide
three main benefits: 1) the basis for design trade-offs at the
component, subsystem and system levels; 2) support for the development
of technical design specifications; and 3) support for test and evaluation
of a system. Following are three types of models that the Program
Manager will encounter.
• Cost models provide development, production, and operations and
support costs (a learning-curve sketch follows this list).
• Support models can include reliability, availability and
maintainability; level of repair; and provisioning analyses.
• Manufacturing models and simulations can provide both
information on the producibility of a particular design and
simulation of work flow on the factory floor. They can also identify
facilitization requirements.
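As an illustration of the kind of relationship a production cost model may embody, the following sketch applies Wright's learning curve, under which each doubling of cumulative quantity reduces unit cost by a fixed percentage. The 85% slope and the $4.0M first-unit cost are illustrative assumptions only.

import math

def unit_cost(first_unit_cost, unit_number, slope=0.85):
    # Wright learning curve: each doubling of cumulative quantity
    # reduces unit cost to `slope` times its prior value
    b = math.log(slope) / math.log(2)
    return first_unit_cost * unit_number ** b

# Notional program: $4.0M first unit on an assumed 85% curve
for n in (1, 10, 100):
    print(f"unit {n:>3}: ${unit_cost(4.0e6, n):,.0f}")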
These engineering-level models indicate performance capabilities,
often termed measures of performance (MOP). Examples of these
measures include radar acquisition range, miss distance, range, payload
or speed. Such performance parameters might be used in the system and
development specifications.
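The following minimal sketch illustrates how an engineering-level simulation might estimate a miss-distance MOP by Monte Carlo sampling, assuming (purely for illustration) independent, zero-mean Gaussian guidance errors in each axis.

import math
import random
import statistics

def simulate_miss_distance(n_runs=10_000, sigma_x=2.0, sigma_y=2.0, seed=1):
    # Monte Carlo estimate of the mean miss distance (m), assuming
    # independent, zero-mean Gaussian impact errors in each axis
    rng = random.Random(seed)
    misses = [math.hypot(rng.gauss(0, sigma_x), rng.gauss(0, sigma_y))
              for _ in range(n_runs)]
    return statistics.mean(misses), statistics.stdev(misses)

mean_miss, sd_miss = simulate_miss_distance()
print(f"mean miss distance: {mean_miss:.2f} m (std dev {sd_miss:.2f} m)")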
The representations of the system in higher-level models and
simulations should have their basis in these engineering-level models. It
is in those higher-level models and simulations that the actual impacts of
weapon system performance on combat effectiveness are evaluated.
4.3.2. Engagement-level Models and Simulations
Engagement models and simulations represent the system in a
limited scenario, such as one-on-one, few-on-few or sometimes
many-on-many. This level of simulation evaluates the effectiveness of an
individual platform and its weapons systems against a specific target or enemy-
threat system. These models rely on the system performance, kinematics
and sensor performance provided by the engineering-level models and
simulations. They provide survivability, vulnerability and lethality results
for measures of system effectiveness or for use in higher-level models.
Detailed performance of subsystems—such as propulsion, combat
systems, sensors, and guidance and control—may be included and
evaluated.
The outputs of engagement-level models and simulations indicate the
effectiveness of systems and subsystems in an engagement scenario and
are termed measures of effectiveness (MOE). Examples include
probability of kill, losses or mission aborts.
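A notional one-on-one engagement can be sketched as a chain of conditional events (detect, hit, kill given a hit) whose product is the single-shot probability of kill. The probabilities below are illustrative assumptions, not representative data.

import random

def engage(rng, p_detect=0.9, p_hit=0.7, p_kill_given_hit=0.8):
    # A kill requires detecting the target, hitting it, and the hit being lethal
    return (rng.random() < p_detect
            and rng.random() < p_hit
            and rng.random() < p_kill_given_hit)

rng = random.Random(2)
trials = 10_000
kills = sum(engage(rng) for _ in range(trials))
print(f"estimated Pk = {kills / trials:.3f}")  # analytic value: 0.9 * 0.7 * 0.8 = 0.504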
Acquisition programs can use engagement-level models and
simulations to identify the following:
• System effectiveness and performance to support requirements
documents (Initial Capabilities Document (ICD) and Capability
Development Document (CDD)) and the Analysis of Alternatives (AoA),
• System-level performance trade-offs,
• Test and evaluation support, and
• Potential or necessary tactics changes and new weapon concepts.
4.3.3. Mission/Battle-level Models and Simulations
Mission/battle-level models and simulations reflect the ability of a
multi-platform force package to accomplish a specific mission objective—
such as air superiority, interdiction or strike—which might span a period of
hours. It might consist of an attacking force of fighter and electronic
warfare aircraft, of a combined arms group attack or defense, or of carrier
battle group operations consisting of aircraft, ships and combat systems
against an integrated air defense (e.g., Surface-to-Air Missiles, enemy air
assets).
In conjunction with human participation, mission/battle-level
simulations may be used for wargaming, training and tactics development.
The outputs of mission/battle-level models and simulations are MOE.
Examples of these MOEs might include loss-exchange ratios, probabilities
of engagement, or success in achieving a specific mission objective.
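For example, a loss-exchange ratio can be tallied directly from the casualty counts produced by a batch of mission-level runs, as in the following sketch (all counts are notional):

def loss_exchange_ratio(enemy_losses, friendly_losses):
    # Mission-level MOE: enemy losses per friendly loss
    return float("inf") if friendly_losses == 0 else enemy_losses / friendly_losses

# Notional (enemy_losses, friendly_losses) tallies from three simulation runs
runs = [(12, 3), (9, 4), (15, 2)]
for i, (enemy, friendly) in enumerate(runs, start=1):
    print(f"run {i}: LER = {loss_exchange_ratio(enemy, friendly):.1f}")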
The acquisition applications of such M&S include the following:
• Analysis in support of requirements,
• Operational effectiveness analyses for alternatives evaluation in the
Analysis of Alternatives (AoA),
• Examination of interoperability and compatibility issues, and
• Support of test and evaluation.
4.3.4. Theater/Campaign Models and Simulations
Theater/campaign models and simulations represent combined
force combat operations and are used to determine the long-term outcome
of a major theater- or campaign-level conflict. Forces are often
represented as aggregations of lower-level forces and systems. These
models and simulations can identify major deficiencies in capabilities of
force structures and employment alternatives.
Since these simulations usually encompass longer periods of warfare,
they are more likely to include sustainment representations within the
model. These models usually require the results of lower-level
(engineering, engagement or mission/battle) models and simulations as
inputs to generate the aggregated, force-level capabilities. Some may
even have the capability to directly incorporate more detailed models of
specific systems within their input architectures.
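The following sketch illustrates the aggregated character of this level using Lanchester square-law attrition, in which each side's loss rate is proportional to the size of the opposing force. In practice the attrition coefficients would be derived from engagement- and mission/battle-level results; the values here are purely illustrative.

def lanchester_square(blue, red, blue_rate, red_rate, days, dt=0.1):
    # Square-law attrition: each side's loss rate is proportional to the
    # opposing force size; the coefficients are notional stand-ins for
    # values that lower-level models would supply
    for _ in range(int(days / dt)):
        blue, red = (max(blue - red_rate * red * dt, 0.0),
                     max(red - blue_rate * blue * dt, 0.0))
    return blue, red

blue, red = lanchester_square(blue=1000, red=1200,
                              blue_rate=0.05, red_rate=0.03, days=10)
print(f"after 10 days: blue {blue:.0f}, red {red:.0f}")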
As with models and simulations within other levels of the hierarchy,
theater/campaign-level simulations might be run with human interaction. In
this interactive mode, they may be used as a wargaming tool for battle-staff
training or tactics development.
Whereas the engineering-level models are used to determine actual
performance values for the components, subsystems, or systems being
modeled, the higher-level models in the hierarchy are used to establish
trends, identify driving factors and obtain relative comparisons of military
utility among systems or groups of systems being analyzed.
The measures which result from theater/campaign-level models and
simulations are sometimes termed outcomes. Examples of outcomes
may include force drawdowns or battle group losses, air superiority and
ground-force movements.
Acquisition applications of theater/campaign-level models and
simulations include:
• Evaluation of force-level combat outcomes in Functional Area
Assessments (FAA)—leading to development of the Initial
Capabilities Document (ICD),
• Provision of support for Cost and Operational Effectiveness
Analyses (COEAs), and
• Evaluation of the impacts of new systems or operational concepts.
4.3.5. Hierarchy Summary
The hierarchy discussed above represents an integrated framework for
analysis of performance, effectiveness, tactics and doctrine, and conflict
outcomes. Each level in this integrated framework is aimed at addressing
specific issues and relies on information obtained in analyses conducted
at other levels.
Figure 4.3 summarizes the primary attributes of the various models
and simulations within each level of the hierarchy. It also includes
representative examples.
Figure 4.3. Attributes and Uses of Models and Simulations within the Hierarchy

Engineering
• Force: Single weapon systems, subsystems, components
• Level of Detail: Highly detailed—down to individual piece parts, their interaction and phenomenology
• Time Span: Months to subseconds
• Outputs: Measures of performance of system, subsystems and components (e.g., miss distance, target acquisition range)
• Uses: Design; subsystem and component performance and tradeoffs; specification requirements and compliance; cost, support, producibility; test support; facilitating IPPD

Engagement
• Force: One to a few friendly entities vs. one to a few enemy entities
• Level of Detail: Individual entities and detailed subsystems
• Time Span: Minutes to seconds
• Outputs: System effectiveness (e.g., probability of kill, losses, aborts, survivability, vulnerability)
• Uses: Alternative evaluation (COEA); requirements (MNS, ORD); system effectiveness; system tradeoffs; tactics, rules of engagement; test support

Mission/Battle
• Force: Multi-platform, multi-tasking force package
• Level of Detail: Some aggregation or individual entities
• Time Span: Hours to minutes
• Outputs: Mission effectiveness (e.g., loss-exchange ratios, probabilities of engagement)
• Uses: Alternative evaluation (COEA); requirements (MNS, ORD); deployment; weapons integration; interoperability; tactics and operations concepts; training and wargaming

Theater/Campaign
• Force: Joint/combined
• Level of Detail: Highly aggregated down to individual entities (tank, ship, aircraft)
• Time Span: Weeks to days
• Outputs: Campaign outcome (e.g., air superiority, force drawdowns, ground-force movements)
• Uses: Alternative evaluation (COEA); requirements (MNS, ORD); tactics/employment; wargaming; battle-staff training; sustainment issues

[Adapted from Piplani et al., 1994]
As shown in Figure 4.4, a program will likely employ a suite of models
and simulations. The engineering-level models will provide measures of
performance along with design, cost, producibility and supportability
information for components, subsystems or systems. The military utility of
the system is evaluated within engagement- and mission/battle-level
models that indicate MOE. At the highest level, the outcomes of major
conflicts involving combined forces are evaluated within theater/campaign-level
models. Human-in-the-loop/virtual simulators and virtual prototypes
may provide information at all levels of the hierarchy. As in any analysis,
the input data and assumptions are major drivers of the results of all
simulations.
Figure 4.4. Relationship of Models and Simulations [Piplani et al., 1994]
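To make the idea of a model suite concrete, the toy pipeline below chains the levels together: an engineering-level miss-distance MOP is converted to an engagement-level probability of kill, which in turn is aggregated into a campaign-level attrition coefficient. Each function is a crude stand-in for a full model, and every parameter is an illustrative assumption.

import math

def engagement_pk(mean_miss_m, lethal_radius_m=3.0, p_detect=0.9):
    # Engagement level: convert an engineering-level miss-distance MOP into
    # a single-shot probability of kill via a simple lethality stand-in
    return p_detect * math.exp(-(mean_miss_m / lethal_radius_m) ** 2)

def campaign_kill_rate(pk, shots_per_day=4.0):
    # Campaign level: aggregate the engagement MOE into a per-platform
    # daily attrition coefficient for a force-on-force model
    return pk * shots_per_day

pk = engagement_pk(mean_miss_m=2.5)
print(f"engagement Pk = {pk:.2f}; campaign kill rate = {campaign_kill_rate(pk):.2f} per day")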
4.4. Chapter Summary
After first outlining the most prevalent model and simulation attributes
and categories, this chapter then reviewed the more common M&S
methods. Next, it provided an overview of the classes of models and
simulations that may be used during the acquisition lifecycle. The PM
should remember that there is no single model or simulation that will suit
all of a program’s needs. Each model or simulation has a specific purpose
for which it is intended and will provide information at the requisite level of
detail to support specific activities during the program lifecycle.
4.5. Chapter References
Banks, J., J. S. Carson, and B. L. Nelson. Discrete-event System Simulation. Upper
Saddle River, NJ: Prentice-Hall, 1996.
Ceranowicz, A., M. Torpey, B. Helfinstine, D. Bakeman, J. McCarthy, L. Messerschmidt,
S. McGarry, and S. Moore. “J9901, Federation Development for Joint
Experimentation.” Proceedings of the Fall 1999 Simulation Interoperability
Workshop, Orlando, FL, September 12-17, 1999.
Douglas, M. A Hierarchy of Models and Simulations Used in Support of System
Acquisition (DMSTTIAC TA-99-02). Huntsville, AL: Defense Modeling, Simulation
and Tactical Technology Information Analysis Center, April 1999.
http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA363862&Location=U2&doc=GetTRDoc.pdf
Hagan, J.J., and W.G. Slack. Employing Organizational Modeling and Simulation to
Reduce F/A-18E/F F414 Engine Maintenance Time (NPS-AM-06-045).
Monterey, CA: Naval Postgraduate School, December 2006.
http://www.acquisitionresearch.org/_files/FY2006/NPS-AM-06-045.pdf
Headquarters, Air Force Materiel Command (HQ AFMC). AFMC Models and
Simulations (M&S) Guide (AFMCP 800-66). Wright-Patterson AFB, OH: Author,
July 1993.
Military Operations Research Society. Military Modeling for Decision Making, 3rd ed.
Edited by W.P. Hughes, Jr. Alexandria, VA: Author, 1997.
Piplani, Lalit, Joseph Mercer, and Richard Roop. Systems Acquisition Manager’s Guide
for the Use of Models and Simulations. Fort Belvoir, VA: Defense Systems
Management College Press, September 1994.
US Army. Simulation Operations Handbook, Ver. 1.0. Washington, DC: Author, October
30, 2003, 152.
5. M&S Policy and the Defense Acquisition Framework
5.1. An Examination of the Evolving Defense Acquisition Framework
Models of program structure are important to the Department of
Defense as it conveys its overall acquisition strategy. According to the
Acquisition Strategy Guide, the structure and schedule portion of the
acquisition strategy defines:
the relationship among acquisition phases, decision milestones,
solicitations, contract awards, system engineering design reviews,
contract deliveries, T&E periods, production releases, and
operational deployment objectives. It must describe the phase
transitions and the degree of concurrency entailed. It is a visual
overview and picture presentation of the acquisition strategy […]
depicted on an event-driven time line diagram. [Defense Systems
Management College, 1999: Appendix B]
In the last sixteen years, there have actually been two families of
defense acquisition lifecycle models or frameworks: Pre-2000-era and
Post-2000-era. The first of the Pre-2000-era models to consider here is
the Lifecycle Systems Management Model.
5.1.1. Lifecycle Systems Management Model
The current 2008 model (Figure 5.1) has five phases and seven
potential major decision reviews. Eight total distinct activity periods exist in
the model, including pre-acquisition activity.
Figure 5.1. 2008 Defense Acquisition Management Framework [DoD, 2008: 12]
The entrance and exit criteria for each phase and work effort now
incorporate the introduction of new requirements documents from the Joint
Capabilities Integration and Development System (which has been
evolving in parallel to the acquisition system): the Initial Capabilities
Document (ICD), the Capabilities Development Document (CDD), and the
Capabilities Production Document (CPD). Interestingly, this decision-support
system has been in a state of considerable flux, replete with
changes in terminology and decision models. The overarching series of
instructions governing that requirements-generation process has also
seen two major revisions in the past three years [Chairman of the Joint
Chiefs of Staff, 2003].
The current DoD 5000 Series also includes language on evolutionary
acquisition and incremental development taken from the National Defense
Authorization Act of 2003.
A new requirement for a Technology
Development Strategy has been introduced to satisfy Section 803, Public
Law 107-314, National Defense Authorization Act for fiscal year 2003.
The ICD is a requirement to enter the Materiel Solution Analysis (MSA)
Phase, and a Technology Development Strategy (TDS) is a principal
output from this phase.
Following is further detail regarding each phase of the acquisition
process and the appropriate M&S applications associated with each
phase. The diagram presented in Figure 5.2 provides an outline of each
of the phases that will be reviewed in the following sections.
5.1.1.1. Technology Opportunities and User Needs
The purpose of the process leading to the concept decision is to
confirm the concept detailed in the Initial Capabilities Document and to
develop an Analysis of Alternatives (AoA) plan to be executed following
the Concept Decision.
At the Concept Decision, the Milestone Decision Authority (MDA)
designates the lead DoD Component to refine the initial concept selected,
approves the AoA plan, and establishes a date for a Milestone A review.
It is also important to keep in mind that the decision to begin Concept
Refinement does not in itself signal the start of a new acquisition program.
The mission of the Functional Capabilities Board (FCB) is to support
the JROC by integrating stakeholder views in concept development,
capabilities planning and force development, to ensure that the military
can execute assigned missions. FCBs provide assessments and
recommendations that assist the Milestone Decision Authority (MDA) in
his or her decision to move forward, while Sponsors must complete a
Functional Area Assessment (FAA), Functional Needs Analysis (FNA),
Functional Solutions Analysis (FSA) and Post Independent Analysis (PIA)
prior to submitting a proposal into the JCIDS process. Sponsors
are encouraged to interact with FCBs throughout all stages of JCIDS
analysis.
FCBs provide a forum to review proposed capability needs and
solutions. In addition, the FCB working group assesses the programmatic
impact of the new capability proposal by:
• Examining the expected system/program costs.
• Assessing the period of system/program execution.
• Evaluating the impact to existing system(s)/program(s) if the
proposal is fielded. This includes reviewing the roadmap/timeline
for legacy retirement and new-system initial operating capability
(IOC).
• Identifying the ramifications of delaying the proposed
system/program.
One might expect to see the FCBs use the following M&S tools during
this phase:
• Architecture definition tools
• Effectiveness models
• Cost-estimating models
Successful Studies
One of the most important keys to success is clear problem definition. Eliminating
value judgments (e.g., good, poor, low, high) will help in establishing a definitive
problem definition. PMs must also evaluate the problem to make sure it is not simply a
symptom of a larger-scale issue. Below are eight simple guidelines to apply in evaluating a
study.
1. What is the problem?
2. What do we know?
3. What do we think the answer is?
4. What solution techniques should we consider?
5. How do we obtain and review the data?
6. How do we “crunch the numbers”?
7. What does the answer mean?
8. How do we package the conclusions?
The following sections are adapted from DoDI 5000.02 [DoD, 2008].
Some wording has been changed for clarity and applicability.
5.1.1.2. Materiel Solution Analysis (MSA) Phase
The purpose of this phase is to assess potential materiel solutions and
to satisfy the phase-specific entrance criteria for the next program
milestone designated by the MDA. Entrance into this phase depends
upon an approved Initial Capabilities Document (ICD) resulting from the
analysis of current mission performance and an analysis of potential
concepts across the DoD Components, international systems from allies,
and cooperative opportunities.
The Materiel Solution Analysis Phase begins with the Materiel
Development Decision review. The Materiel Development Decision review
is the formal entry point into the acquisition process and is mandatory for
all programs. Funding for this phase is normally limited to satisfaction of
the Materiel Solution Analysis Phase objectives.
At the Materiel Development Decision Review, the Joint Staff
presents the JROC recommendations, and the DoD Component presents
the ICD—including: the preliminary concept of operations, a description of
the needed capability, the operational risk, and the basis for determining
that non-materiel approaches will not sufficiently mitigate the capability
gap. The Director, Program Analysis & Evaluation (DPA&E), (or DoD
Component equivalent) proposes study guidance for the Analysis of
Alternatives (AoA).
The MDA approves the AoA study guidance, determines the
acquisition phase of entry, identifies the initial review milestone, and
designates the lead DoD Component(s). MDA decisions are documented
in an Acquisition Decision Memorandum (ADM). However, it is
important to note that the MDA’s decision to begin Materiel Solution
Analysis does not mean that a new acquisition program has been initiated.
Following approval of the study guidance, the lead DoD Component(s)
prepares an AoA study plan to assess preliminary materiel solutions,
identify key technologies, and estimate lifecycle costs. The purpose of the
AoA is to assess the potential materiel solutions to satisfy the capability
need documented in the approved ICD.
The ICD and the AoA study guidance determine the AoA and Materiel
Solution Analysis Phase activity. The AoA focuses on identification and
analysis of alternatives, measures of effectiveness, cost, schedule,
concepts of operations, and overall risk. The AoA assesses the critical
technology elements (CTEs) associated with each proposed materiel
solution—including technology maturity, integration risk, manufacturing
feasibility, and, where necessary, technology maturation and
demonstration needs. To achieve the best possible system solution, this
phase emphasizes innovation and competition. Decision-makers consider
both existing, commercial, off-the-shelf (COTS) functionality and
solutions drawn from a diversified range of large and small businesses. If
the MDA determines that the initial review milestone specified at the
Materiel Development Decision is inconsistent with the maturity of the
preferred materiel solution, an alternative review milestone is designated.
The Materiel Solution Analysis Phase ends when the following three
objectives have been met: the AoA has been completed; materiel solution
options for the capability need (identified in the approved ICD) have been
recommended by the lead DoD Component conducting the AoA; and the
phase-specific entrance criteria for the initial review milestone have been
satisfied.
5.1.1.3. Technology Development Phase
The purpose of this phase is to reduce technology risk, determine and
mature the appropriate set of technologies to be integrated into a full
system, and to demonstrate CTEs on prototypes. Technology
Development is a continuous technology discovery and development
process reflecting close collaboration between the Science and
Technology (S&T) community, the user, and the system developer. It is
an iterative process designed to assess the viability of technologies while
simultaneously refining user requirements. Entrance into this phase
depends on the completion of the AoA, a proposed materiel solution, and
full funding for planned Technology Development Phase activity.
At Milestone A, the MDA reviews the proposed materiel solution and
the draft Technology Development Strategy (TDS). The Technology
Development Phase begins when the MDA has approved a materiel
solution and the TDS, and has documented the decision in an ADM.
If, during Technology Development, the cost estimate upon which the
MDA based the Milestone A certification increases by 25% or more, the
PM notifies the MDA of the increase. The MDA then again consults with
the JROC on matters related to program requirements and the military
need(s) for the system. The MDA determines whether the level of
resources required to develop and procure the system remains consistent
with the priority level assigned by the JROC. If not, the MDA may rescind
the Milestone A approval if he/she determines that such action is in the
interest of national defense. This effort normally is funded only for the
advanced development work. Technology development for a major
defense acquisition program (MDAP) does not proceed without
Milestone A approval. For business area capabilities, commercially
available solutions are preferred. However, as with Materiel Solution
Analysis, a favorable Milestone A decision does not mean that a new
acquisition program has been initiated.
At Milestone A, the DoD Component submits a cost estimate for the
proposed solution(s) identified by the AoA. If requested by the MDA, the
cost analysis improvement group (CAIG) develops an independent cost
assessment. Final requests for proposals (RFPs) for the Technology
Development Phase are not released, nor is any action taken that would
commit the program to a particular contracting strategy for Technology
Development, until the MDA has approved the technology development
strategy (TDS). The TDS documents the following:
• The rationale for adopting an evolutionary strategy (the preferred
approach) or using a single-step-to-full-capability strategy (e.g., for
common supply items or COTS items). For an evolutionary
acquisition, the TDS should include both a preliminary description
of how the materiel solution will be divided into acquisition
increments based on mature technology and an appropriate
limitation on the number of prototype units or engineering
development models that may be produced in support of a
Technology Development Phase;
• A preliminary acquisition strategy—including overall cost, schedule,
and performance goals for the total research and development
program;
• Specific cost, schedule, and performance goals—including exit
criteria—for the Technology Development Phase;
• A description of the approach that will be used to ensure data
assets will be made visible, accessible, and understandable to any
potential user as early as possible;
• A list of known or probable 1) critical program information (CPI)
and potential countermeasures (such as anti-tamper) in the
preferred system concept and in the critical technologies, and 2)
competitive prototypes to inform program protection and design
integration during the Technology Development Phase;
• A time-phased workload assessment identifying the manpower and
functional competency requirements for successful program
execution and the associated staffing plan, including the roles of
government and non-government personnel;
• A data management strategy; and
• A summary of the CAIG-approved cost and software data
reporting (CSDR) plan(s) for the Technology Development Phase.
During Technology Development and succeeding acquisition phases,
the PM should give small business the maximum practical opportunity to
participate. Where feasible, the PM should leverage programs which
employ people with disabilities. The TDS and associated funding provide
for two or more competing teams producing prototypes of the system
and/or key system elements prior to, or through, Milestone B. Prototype
systems or appropriate component-level prototyping should be employed
to reduce technical risk, validate designs and cost estimates, evaluate
manufacturing processes, and refine requirements. Information technology
initiatives should prototype subsets of overall functionality using one or
more teams, with the intention of reducing enterprise architecture risks,
prioritizing functionality, and facilitating process redesign.
5.1.1.4. Engineering and Manufacturing Development (EMD) Phase
There are several purposes of the Engineering and Manufacturing
Development (EMD) Phase. They are as follows:
• To develop a system or an increment of capability,
• To complete full system integration (technology risk reduction
occurs during Technology Development),
• To develop an affordable and executable manufacturing process,
• To ensure operational supportability with particular attention to
minimizing the logistics footprint,
• To implement human systems integration (HSI),
• To design for producibility,
• To ensure affordability,
• To protect critical program information (CPI) by implementing
appropriate techniques such as anti-tamper, and
• To demonstrate system integration, interoperability, safety, and
utility.
The Capability Development Document (CDD), Acquisition Strategy,
Systems Engineering Plan (SEP), and Test and Evaluation Master Plan
(TEMP) shall guide this effort. Entrance into this phase depends on
technology maturity (including software), approved requirements, and full
funding. Unless some other factor is overriding in its impact, the maturity
of the technology determines the path to be followed.
Prior to beginning the engineering and manufacturing development
(EMD) phase, users must identify, and the requirements authority must
approve, a minimum set of key performance parameters (KPPs),
included in the CDD, that shall guide the efforts of this phase. These KPPs
may be refined, with the approval of the requirements authority, as
conditions warrant. The CDD defines the set of KPPs that will apply to
each increment of the EMD phase (or to the entire system in a single-step-to-full-capability situation). To maximize program trade space and focus
test and evaluation, the MDA, Program Executive Officer (PEO), and
PM must work closely with the requirements authority to minimize KPPs
and limit total identified program requirements. Performance requirements
that do not support the achievement of KPP thresholds need to be limited
and considered a part of the engineering trade space during development.
During OT&E, a clear distinction should be made between performance
values that do not meet threshold requirements in the user capabilities
document and performance values that should be improved to provide
enhanced operational capability in future upgrades.
The EMD phase begins at Milestone B, which is normally the initiation
of an acquisition program. There is only one Milestone B per program or
evolutionary increment. Each increment of an evolutionary acquisition
shall have its own Milestone B unless the MDA determines that the
increment will be initiated at Milestone C. At Milestone B, the MDA
approves the Acquisition Strategy and the Acquisition Program
Baseline (APB). The MDA decision is documented in an Acquisition
Decision Memorandum (ADM). (The tables in Enclosure 4 of the DoDI
5000.02 [DoD, 2008] identify the statutory and regulatory requirements
that must be met at Milestone B).
Final RFPs for the EMD Phase, or any succeeding acquisition phase,
are not released—nor is any action taken that would commit the program
to a particular contracting strategy—until the MDA has approved the
Acquisition Strategy. The PM’s language in the RFP should advise
offerors that: (1) the government will not award a contract to an offeror
whose proposal is based on critical technology elements (CTEs) that
have not been demonstrated in a relevant environment (relevant meaning
one similar to that on which their proposal is based) and, (2) vendors
should provide reports documenting how those CTEs have been
demonstrated in a relevant environment.
The EMD phase has two major efforts: Integrated System Design, and
System Capability and Manufacturing Process Demonstration.
Additionally, the MDA must conduct a Post-Preliminary Design Review
(PDR) Assessment when consistent with the Acquisition Strategy, and a
Post-Critical Design Review (CDR) Assessment to end Integrated System
Design.
5.1.1.4.1. Integrated System Design
This effort is intended to define system and system-of-systems
functionality and interfaces, complete hardware and software detailed
design, and reduce system-level risk. Integrated System Design includes
the establishment of the product baseline for all configuration items.
5.1.1.4.2. Post-PDR Assessment
If a Preliminary Design Review has not been conducted prior to
Milestone B, the PM should plan for a PDR as soon as is feasible after
program initiation. PDR planning is reflected in the Acquisition Strategy
and has to be conducted consistent with policy. Following the PDR, the
PM plans, and the MDA conducts a formal Post-PDR Assessment. The
PDR report is provided to the MDA prior to the assessment and reflects
any requirements trades based upon the PM’s assessment of cost,
schedule, and performance risk. The MDA considers the results of the
PDR and the PM’s assessment, and determines whether remedial action
is necessary to achieve Acquisition Program Baseline (APB) objectives.
The results of the MDA's Post-PDR Assessment are documented in an
ADM.
5.1.1.4.3. Post-CDR Assessment
The MDA conducts a formal program assessment following the system-level Critical Design Review (CDR). The system-level CDR provides the
PM an opportunity to assess design maturity. Such maturity is evidenced
by measures such as: successful completion of subsystem CDRs; the
percentage of hardware and software product build-to specifications and
drawings completed and under configuration management; planned
corrective actions to hardware/software deficiencies; adequate
developmental testing; an assessment of environment, safety and
occupational health risks; a completed failure-modes-and-effects analysis;
the identification of key system characteristics; the maturity of critical
manufacturing processes; and an estimate of system reliability based on
demonstrated reliability rates.
5.1.1.4.4. System Capability and Manufacturing Process Demonstration
This effort is intended to demonstrate both the ability of the system to
operate in a useful way consistent with the approved KPPs and that
system production can be supported by demonstrated manufacturing
processes. The program enters System Capability and Manufacturing
Process Demonstration upon completion of the Post-CDR Assessment
and establishment of an initial product baseline. This effort ends when the
following criteria are met:
• The system meets approved requirements and is demonstrated in its
intended environment using the selected production-representative
article,
• Manufacturing processes have been effectively demonstrated in a
pilot-line environment,
• Industrial capabilities are reasonably available, and
• The system meets or exceeds exit criteria and Milestone C
entrance requirements.
Several factors are critical to this effort:
• Successful developmental test and evaluation (DT&E) to assess
technical progress against critical technical parameters,
• Early operational assessments, and
• Where proven capabilities exist, the use of modeling and simulation
to demonstrate system/system-of-systems integration.
T&E should be used to assess improvements to mission capability and
operational support based on user needs and should be reported in terms
of operational significance to the user. The completion of this phase is
dependent on a decision by the MDA to either commit to the program at
Milestone C or to end this effort.
5.1.1.5. Production and Deployment Phase
The purpose of the Production and Deployment Phase is to achieve
an operational capability that satisfies mission needs. Operational test and
evaluation determines the effectiveness and suitability of the system. The
MDA makes the decision to commit the Department of Defense to
production at Milestone C and documents the decision in an ADM.
Milestone C authorizes entry into:
• Low-rate initial production (LRIP) (for MDAPs and major
systems),
• Production or procurement (for non-major systems that do not
require LRIP), or
• Limited deployment in support of operational testing for major
automated information system (MAIS) programs or software-intensive
systems with no production components.
Entrance into this phase depends on the following criteria:
• Acceptable performance in developmental test and evaluation and
operational assessment (OSD OT&E oversight programs),
• Mature software capability,
• No significant manufacturing risks,
• Manufacturing processes under control (if Milestone C is full-rate
production),
• An approved ICD (if Milestone C is program initiation),
• An approved Capability Production Document (CPD),
• A refined integrated architecture,
• Acceptable interoperability,
• Acceptable operational supportability, and
• Demonstration that the system is affordable throughout the lifecycle,
fully funded, and properly phased for rapid acquisition.
The CPD reflects the operational requirements, informed by EMD results,
and details the performance expected of the production system. If
Milestone C approves LRIP, a subsequent review and decision shall
authorize full-rate production.
For MDAPs and other programs on the OSD T&E Oversight List,
Production and Deployment has two major efforts—Low-rate Initial
Production and Full-rate Production and Deployment—and includes a
Full-rate Production Decision Review. For MAIS programs or software-intensive
systems with no production components, the Full-rate Production
Decision Review is referred to as the Full Deployment Decision Review.
5.1.1.5.1. LRIP
This effort is intended to result in the completion of manufacturing
development in order to ensure adequate and efficient manufacturing
capability. It should also:
• Produce the minimum quantity necessary to provide production or
production-representative articles for Initial Operational Test &
Evaluation (IOT&E),
• Establish an initial production base for the system, and
• Permit an orderly increase in the production rate for the system,
sufficient to lead to full-rate production upon successful completion
of operational (and live-fire, where applicable) testing.
Evaluations are conducted in the mission context expected at time of
fielding, as described in the user’s capability document. The MDA
considers any new validated threat environments that will alter operational
effectiveness. If the program has not demonstrated readiness to proceed
to full-rate production, the MDA assesses the cost and benefits of a break
in production versus continuing buys before approving an increase in the
LRIP quantity.
LRIP is not applicable to Automated Information Systems (AISs) or
software-intensive systems with no developmental hardware; however, a
limited deployment phase may be applicable. LRIP for ships and satellites
is production of items at the minimum quantity and rate that is feasible and
that preserves the mobilization production base for that system. Except
as specifically approved by the MDA, deficiencies identified in testing
should be resolved before the program proceeds beyond LRIP, and any
fixes shall be verified in Follow-on Operational Test & Evaluation
(FOT&E).
5.1.1.5.2. Full-rate Production Criteria
An MDAP may not proceed beyond LRIP without MDA approval. The
knowledge required to support this approval includes demonstrated
control of the manufacturing process and acceptable reliability, the
collection of statistical process control data, and the demonstrated control
and capability of other critical processes.
For programs on the OSD T&E Oversight List, the decision to continue
beyond low-rate to full-rate production—or beyond limited deployment of
AISs or software-intensive systems with no developmental hardware—
requires completion of IOT&E and receipt of the “Beyond LRIP Report” (or
equivalent report for MDAPs that are also AISs) by, and submission
(where applicable) of the LFT&E Report to, the congressional defense
committees, the Secretary of Defense, and the USD(AT&L).
If a decision is made to proceed to operational use or to make
procurement funds available for the program prior to a final decision to
proceed beyond low-rate initial production (or limited deployment for
MDAPs that are AISs), the DOT&E submits a report to the Secretary of
Defense, the USD(AT&L), and the congressional defense committees.
The DOT&E may decide to submit an interim or partial report if the
operational testing completed to date is inadequate to determine
operational effectiveness and suitability and survivability. If an interim or
partial report is submitted, the DOT&E prepares and submits the required
final report as soon as possible after completing adequate operational
testing to determine operational effectiveness and suitability and
survivability.
5.1.1.5.3. Full-rate Production and Deployment
Continuation into full-rate production results from a successful Full-rate
Production (or Full Deployment) Decision Review by the MDA.
The decision to proceed into Full-rate Production is documented in an
ADM. This effort delivers the fully funded quantity of systems and
supporting materiel and services for the program or increment to the users.
During this effort, units typically attain Initial Operational Capability (IOC).
As technology, software, and threats change, FOT&E is considered in
order to assess current mission performance and to inform operational
users during the development of new capability requirements.
5.1.1.5.4. Military Equipment Valuation
For Milestone C, the PM prepares a program description as part of the
Acquisition Strategy. Throughout Production and Deployment, the PM
ensures that the following objectives are met:
• All deliverable equipment requiring capitalization is serially
identified and valued at full cost,
• The full cost of each item of equipment is entered in the Item
Unique Identification (IUID) registry,
• All solicitations, proposals, contracts, and/or orders for deliverable
equipment are structured for proper segregation of each type of
equipment based on its respective financial treatment,
• Procedures are established to track all equipment items throughout
their lifecycle, and
• The status of items added, retired from operational use, or
transferred from one DoD Component to another is updated
quarterly throughout their life.
5.1.1.6. Operations and Support Phase
The purpose of the Operations and Support Phase is to execute a
support program that both meets materiel readiness and operational
support performance requirements and sustains the system in the
most cost-effective manner over its total lifecycle. Planning for this phase
begins prior to program initiation and is documented in the Lifecycle
Sustainment Plan (LCSP). Operations and Support has two major efforts:
Lifecycle Sustainment and Disposal. Entrance into the Operations and
Support Phase depends on meeting the following criteria: an approved
CPD, an approved LCSP, and a successful Full-rate Production (FRP)
Decision.
5.1.1.6.1. Lifecycle Sustainment
Lifecycle sustainment planning and execution seamlessly span a
system’s entire lifecycle, from Materiel Solution Analysis to disposal. It
translates force provider capability and performance requirements into
tailored product support to achieve specified and evolving lifecycle
product-support availability, reliability, and affordability parameters.
Lifecycle sustainment planning is considered during Materiel Solution
Analysis, and matures throughout Technology Development. An LCSP is
prepared for Milestone B. The planning should be flexible and
performance-oriented, reflect an evolutionary approach, and
accommodate modifications, upgrades, and reprocurement. The LCSP
shall be a part of the program’s Acquisition Strategy and integrated with
other key program planning documents. The LCSP should be updated and
executed during Production and Deployment and Operations and Support.
Lifecycle sustainment considerations include: supply; maintenance;
transportation; sustaining engineering; data management; configuration
management; HSI; environment, safety (including explosives safety), and
occupational health; protection of critical program information and
anti-tamper provisions; supportability; and interoperability.
Effective sustainment of systems results from the design and
development of reliable and maintainable systems through the continuous
application of a robust systems engineering methodology. Accordingly, the
PM should:
M&S Policy
75
• Design the maintenance program to minimize total lifecycle cost
while achieving readiness and sustainability objectives (an
operational-availability sketch follows this list),
• Optimize operational readiness via:
  o Human-factors engineering to design systems that meet the
following objectives: require minimal manpower; provide
effective training; can be operated and maintained by users;
and are suitable (habitable and safe with minimal
environmental and occupational health hazards) and
survivable (for both the crew and equipment).
  o Diagnostics, prognostics, and health-management
techniques in embedded and off-equipment applications
when feasible and cost-effective.
  o Embedded training and testing, with a preference for
approved DoD Automatic Test Systems (ATS) Families to
satisfy ATS requirements.
  o Serialized item-management techniques and the use of
automatic identification technology (AIT), radio-frequency
identification (RFID), and iterative technology refreshment.
PMs must ensure that data syntax and semantics for
high-capacity AIT devices conform to International
Organization for Standardization standards ISO 15418 and
ISO 15434.
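One readiness figure of merit commonly used in such maintenance trade studies is operational availability (Ao), the fraction of time a system is mission capable. The following sketch, with notional values, shows how reducing mean downtime improves Ao:

def operational_availability(mtbf_hours, mean_downtime_hours):
    # Ao = uptime / (uptime + downtime), approximated here as
    # MTBF / (MTBF + mean downtime per failure)
    return mtbf_hours / (mtbf_hours + mean_downtime_hours)

# Notional trade: halving mean downtime per failure from 12 h to 6 h
for mdt in (12.0, 6.0):
    print(f"MTBF 300 h, downtime {mdt:>4.1f} h -> Ao = {operational_availability(300.0, mdt):.3f}")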
The PM works with the user to document performance and
sustainment requirements in performance agreements specifying objective
outcomes, measures, resource commitments, and stakeholder
responsibilities. The PM employs effective Performance-based Lifecycle
Product Support (PBL) planning, development, implementation, and
management. Performance-based Lifecycle Product Support represents
the latest evolution of Performance-based Logistics. Both can be
referred to as “PBL.” PBL offers the best strategic approach for delivering
required lifecycle readiness, reliability, and ownership costs. Sources of
support may be organic, commercial, or a combination, with the primary
focus on optimizing customer support, weapon system availability, and
reducing ownership costs. The DoD Components must document
sustainment procedures that ensure integrated combat support.
The DoD Components, in conjunction with users, conduct continuing
reviews of sustainment strategies—comparing performance expectation,
as defined in performance agreements, to actual performance results.
PMs must continuously identify deficiencies in these strategies and adjust
the LCSP as necessary to meet performance requirements.
5.1.1.6.2. Disposal
At the end of its useful life, a system is demilitarized and disposed of in
accordance with all legal and regulatory requirements and policy relating
to safety (including explosives safety), security, and the environment.
During the design process, PMs must document hazardous materials
contained in the system in the Programmatic Environment, Safety, and
Occupational Health Evaluation (PESHE), and should estimate and plan
for the system’s demilitarization and safe disposal. The demilitarization of
conventional ammunition (including any item containing propellants,
explosives, or pyrotechnics) must be considered during system design.
5.1.1.7. Review Procedures
5.1.1.7.1. Review of ACAT ID and IAM Programs
The USD(AT&L) designates programs as ACAT ID or ACAT IAM when
a program has special interest based on one or more of the following
factors: technological complexity; Congressional interest; a large
commitment of resources; criticality to the achievement of a capability or
set of capabilities; or status as a joint program. However, even if a
program exhibits one or more of these characteristics, it is not
automatically designated as ACAT ID or IAM.
5.1.1.7.2. Defense Acquisition Board (DAB) Review
The Defense Acquisition Board (DAB) advises the USD(AT&L) on
critical acquisition decisions. The USD(AT&L) chairs the DAB. An ADM
documents the decision(s) resulting from the review.
5.1.1.7.3. Information Technology (IT) Acquisition Board (ITAB) Review
The ITAB advises the USD(AT&L) or his or her designee on critical IT
acquisition decisions, excluding defense business systems. These reviews
facilitate the accomplishment of the DoD CIO’s acquisition-related
responsibilities for IT. An ADM documents the decision(s) resulting from
the review.
5.1.1.7.4. Configuration Steering Boards (CSB)
The Acquisition Executive of each DoD Component establishes and
chairs a CSB with broad executive membership—including senior
representatives from the Office of the USD(AT&L) and the Joint Staff.
Additional executive members shall include representatives from the office
of the chief of staff of the Armed Force concerned, other Armed Forces
representatives where appropriate, and the Program Executive Officer
(PEO).
5.1.1.7.5. Overarching Integrated Product Team (OIPT)
An Overarching Integrated Product Team (OIPT) reviews program
planning, facilitates program communications and issue resolution, and
supports the MDA for ACAT ID and IAM programs. The Investment
Review Board (IRB) shall perform this function for MAIS business
systems.
5.1.1.7.6. Program Support Reviews (PSRs)
Program Support Reviews (PSRs) are a means to inform an MDA
and Program Office of the status of technical planning and management
processes by identifying cost, schedule, and performance risks and
recommending ways to mitigate those risks. PSRs should be conducted by
cross-functional and cross-organizational teams appropriate to the
program and situation. PSRs for ACAT ID and IAM programs are planned
by the Director, Systems and Software Engineering (SSE) to support OIPT
program reviews. At other times, they are conducted as directed by the
USD(AT&L) and in response to requests from PMs.
5.1.1.7.7. Independent Management Reviews (“Peer Reviews”)
Peer Reviews are conducted on all Supplies and Services contracts.
The reviews should be advisory in nature and conducted in a manner
which preserves the authority, judgment, and discretion of the contracting
officer and senior officials of the acquiring organization. Pre-Award
reviews are conducted on Supplies and Services contracts; Post-Award
reviews are conducted on Services contracts. The Director, Defense
Procurement, Acquisition Policy, and Strategic Sourcing (DPAP), in the
Office of the USD(AT&L), conducts Peer Reviews for contracts with an
estimated value of $1 billion or more (including options).
5.2. Systems Engineering within Development Projects
Although M&S is an important tool for analyzing and designing
complex systems, M&S is only meaningful if the underlying models are
adequately accurate and if the models are evaluated using the proper
simulation algorithms. Systems Engineering plays an important role in
accomplishing this result. This next section will define and discuss
Systems Engineering as a process. As defined by the International
Council on Systems Engineering (INCOSE):
Systems Engineering is an interdisciplinary approach and means
to enable the realization of successful systems. It focuses on
defining customer needs and required functionality early in the
development cycle, documenting requirements, then proceeding
with design synthesis and system validation while considering the
complete problem. Systems Engineering integrates all the
disciplines and specialty groups into a team effort, forming a
structured development process that proceeds from concept to
production to operation. Systems Engineering considers both the
business and the technical needs of all customers with the goal of
providing a quality product that meets the user needs. [INCOSE,
1999]
After years in the defense and aerospace industry, authors Forsberg
and Mooz saw developmental project management (in a systems
engineering sense) as a V-model, decomposing complexity and flowing
down requirements on the left side, then integrating technologies and
verifying attainment of customer requirements on the right side (see
Figure 5.1).
Figure 5.1. The V-model [Adapted from Forsberg, Mooz and Cotterman, 2000]
Key in this paradigm or framework is the relationship between sides of
the V-model, particularly with regard to requirements traceability, for
functional and physical linkage and accountability throughout development.
While most applicable to actual product or advanced development, the
model does allow for concurrent activities, thus exploiting both the
potential for schedule efficiency and the opportunity for development
iterations for requirements definition and user feedback.
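A minimal sketch of what requirements traceability across the V-model might look like in data terms follows; the classes, fields, and identifiers are hypothetical, intended only to show left-side requirements linked to the right-side verification events that close them.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Requirement:
    # One requirement flowed down on the left side of the V
    req_id: str
    text: str
    parent: Optional[str] = None  # upward trace to the parent requirement
    verified_by: List[str] = field(default_factory=list)  # right-side verification events

requirements = [
    Requirement("SYS-001", "System shall detect targets at 40 km"),
    Requirement("SUB-011", "Radar subsystem shall provide 45 km instrumented range",
                parent="SYS-001", verified_by=["DT&E test case RT-07"]),
]

# Simple audit: every flowed-down requirement needs a verification event
for req in requirements:
    if req.parent and not req.verified_by:
        print(f"{req.req_id}: no verification event assigned")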
Projects seem to be better visualized with graphic representations or
models. Models help us reduce complexity and, thereby, understand it.
They can be used, as we will see in the next group of models, to reduce
investment risk via investment exit points (which Mooz calls control
gates) and to prevent progression beyond the appropriate stage. They
help us to delineate and allocate our diverse project management efforts.
With some context provided by these previous models, we can now
briefly examine the evolution of the DoD 5000 Series framework with six
versions of the project lifecycle management model used by DoD over the
last sixteen years.
5.3. Important Acquisition References
As with any policy-related dialogue, it is important for the reader to
realize the nature of policy is to change over time. Thus, policy that was
contemporary to the publication of the guidebook may not necessarily be
accurate a year from now. For this reason, the following section provides
an outline of critical M&S resources for the reader’s consultation.
5.3.1. DoD
DoD Directive 5000.01, Defense Acquisition System, 20 November 2007
DoD Instruction 5000.02, Operation of the Defense Acquisition System, 8
December 2008
Defense Acquisition Guidebook, Version 1.0, 17 October 2004
5.3.2. Joint Chiefs
CJCSI 3170.01F, Joint Capabilities Integration and Development System,
11 May 2007
CJCSM 3170.01C, Operation of the Joint Capabilities Integration and
Development System, 1 May 2007
5.3.3. Services
AR 70-1, Army Acquisition Policy, 31 December 2003
DA Pam 70-3, Army Acquisition Procedures, 15 July 1999
SECNAVINST 5000.2C, Operation of the Defense Acquisition System, 19
November 2004
AFPD 63-1, Capabilities-based Acquisition System, 10 July 2003
AFI 63-101, Operations of Capabilities-based Acquisition System, 29 July
2005
5.3.4. M&S References
DoDD 5000.59, DoD Modeling and Simulation Management, 8 August
2007
AR 5-11, Management of Army Models and Simulations, 1 February 2005
SECNAVINST 5200.38A, Department of the Navy Modeling and
Simulation Program, 28 February 2002
OPNAVINST 5200.34, Navy Modeling and Simulation Management, 28
May 2002
AFPD 16-10, Modeling and Simulation Management, 1995
AFI 16-1002, Modeling and Simulation Support to Acquisition, 1 June
2000
5.5. Chapter Summary
As the previous sections have demonstrated, M&S tools can support
the acquisition lifecycle at every phase of the process. Because of its
potential to reduce costs and improve efficiencies, M&S is a valuable
resource to take advantage of in any system acquisition lifecycle.
5.6. Chapter References
Chairman of the Joint Chiefs of Staff. Operation of the Joint Capabilities Integration and
Development System (CJCSM 3170.01). Washington, DC: Author, June 24,
2003.
Costa, K.J. “5000.2 Changes Await Approval.” Inside the Pentagon, January 16, 2003.
Defense Systems Management College. Acquisition Strategy Guide, 4th ed. Fort
Belvoir, VA: Author, December 1999.
Defense Systems Management College. Defense Acquisition Framework. Fort Belvoir,
VA: Author, 2001.
Defense Systems Management College. Defense Systems Acquisition Management
Process. Fort Belvoir, VA: Author, January 1997.
Department of Defense (DoD). Operation of the Defense Acquisition System (DoDI
5000.02). Washington, DC: Author, December 8, 2008.
Dillard, J.T. Centralized Control of Defense Acquisition Programs: A Comparative
Review of the Framework from 1987-2003. Acquisition Research Sponsored
Report Series (NPS-AM-03-003). Monterey, CA: Naval Postgraduate School,
September 29, 2003, 1-39.
Forsberg, K., H. Mooz, and H. Cotterman. Visualizing Project Management, 2nd ed.
Hoboken, NJ: Wiley, 2000.
International Council on Systems Engineering (INCOSE). What is Systems
Engineering? http://www.incose.org/whatis.html, June 1999.
Wideman, R.M. Wideman Comparative Glossary of Project Management Terms, ver.
3.1. Vancouver, B.C., Canada: Author, 2002.
Secretary of Defense. Defense Acquisition (Interim Guidance 5000.1). The Defense
Acquisition System, Attachment 1. Memorandum. Washington, DC: Author,
October 30, 2002, p. 6.
Under Secretary of Defense (Acquisition) (USD(A)). The Defense Acquisition System
(DoDD 5000.1). Washington, DC: Author, February 23, 1991.
Under Secretary of Defense (Acquisition & Technology) (USD(A&T)). Defense
Acquisition (DoDD 5000.1). Washington, DC: Author, March 15, 1996.
Under Secretary of Defense (Acquisition & Technology) (USD(A&T)). DoD Integrated
Product and Process Development Handbook. Washington, DC: Author, August
1998.
Under Secretary of Defense (Acquisition, Technology & Logistics) (USD(AT&L)).
Operation of the Defense Acquisition System (DoDI 5000.2). Washington, DC:
Author, May 12, 2003.
Under Secretary of Defense (Acquisition, Technology & Logistics) (USD(AT&L)). The
Defense Acquisition System (DoDD 5000.1). Washington, DC: Author, May 12,
2003.
Wolfowitz, P. Cancellation of DoD 5000 Defense Acquisition Policy Documents.
Memorandum for Director, Washington Headquarters Services. Washington, DC:
Author, October 30, 2002.
6. Managing M&S within a Program
Like the findings of the ACAT I & II Program Management Office's
survey in 1993 (conducted by the Defense Modeling and Simulation Office
(DMSO) Acquisition Task Force on Modeling and Simulation, ATFM&S),
the 2007 SimSummit Survey on US DoD M&S
Management/Leadership results indicate that respondents appreciate the
potential value and cost-savings associated with the appropriate
application of M&S during a program’s lifecycle. One of the key factors in
this effort is the careful management of M&S in the program. The
intention of this chapter is to help the Program Manager establish a
context from which to approach M&S management. This context will
range from the initial planning stage of an M&S effort to the development
of a generic Simulation Support Plan used to track the M&S effort
throughout the lifecycle.
6.1. Planning for the M&S Effort
M&S should assist PMs in the critical thinking process and provide
insight for them as they evaluate potential program development and
outcomes. It should also help them establish a framework around which
quantitative decisions can be made [Seglie, 2002: 13]. When initiating the
M&S planning effort, the Program Manager must take a bottom-up
approach. In order to be effective, the modeling or simulation tool has to
fit the need; in other words, the M&S requirements should be based upon
what the program is trying to accomplish. The decision to employ models
and simulations—as well as the subsequent decision(s) to use or modify
an existing model or to create a new model—will depend on the
circumstances and specifics of a given program.
6.1.1. Questions for the M&S Effort
The questions which follow are not offered to trivialize the process of
managing M&S in an acquisition program with a textbook solution;
however, they help to provide an initial framework through which readers
can begin thinking about using M&S in a program.
•  What am I trying to achieve? What's the objective? What
   question(s) am I trying to answer?
•  What analyses will have to be conducted? When will the results be
   needed? How long will the analyses take? Who will conduct them?
•  What information is required to support the analyses? How
   accurate does the information have to be?
•  What's the most efficient way to get the information? Are several
   excursions (or iterations) going to be required? Is it a one-time
   requirement, or will this be an on-going requirement? Do I need a
   model to provide the information?
•  Can any existing models or simulations provide the information I
   need? What is the verification, validation and accreditation
   (VV&A) status (see Chapter 7)? Are these existing models
   accredited for my class of system? Will they need modification?
   What would be the extent of necessary modification? Can the
   model owner(s) do the modification in-house? Can I? Are there
   any proprietary issues that may lock me into a sole-source
   situation?
•  What data are required by these model(s)? Where and how can
   that data be obtained?
•  What resources (funds, people, time, test articles, hardware,
   software, range facilities, documentation) will I need to:
   o  Build or modify the model(s)?
   o  Conduct VV&A on the model(s) or modification?
   o  Implement configuration management (CM)?
   o  Obtain and validate the data?
   o  Run the model(s)?
   o  Analyze the output?
   o  Ensure that the model(s) are maintained to accurately represent
      my system or program?
   o  Transition the models and simulations to a supporting activity for
      maintenance upon dissolution of my PMO? Does "operation and
      support" funding provide for model(s) maintenance after
      transition?
•  Does the system design accommodate plans for
   hardware/software-in-the-loop (HW/SWIL) with regard to test and
   instrumentation ports, etc.? What do I need to populate my test
   bed(s) or simulation facility(ies)?
•  Are my models and simulations consistent and integrated with the
   rest of my program? Are they reflected as tools contributing to
   requirements verification in a requirements correlation matrix
   (RCM)? Are their characteristics consistent with the Analysis of
   Alternatives (AoA), test and evaluation master plan (TEMP),
   operational requirements document (ORD), and acquisition
   program baseline (APB) with respect to measures of outcome,
   effectiveness and performance (MOOs, MOEs, and MOPs)?
6.1.2. M&S Plan Elements
Once the Program Manager has completed the thought process
outlined in the previous section, all the elements of a plan are in place,
including:
•  The tasks, functions or decisions to be supported by M&S,
•  The specific M&S tools required,
•  The timeframe of when they are needed,
•  The plan of how the M&S are going to be acquired, and
•  The resources required to acquire and manage the M&S.
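To make the relationship among these elements concrete, the following
minimal sketch records them as a simple data structure. The field names
and the sample entry are illustrative assumptions only, not drawn from
any DoD schema.

from dataclasses import dataclass, field

@dataclass
class MSPlanElement:
    """One planned M&S application within a program (illustrative fields only)."""
    supported_decision: str       # task, function or decision M&S will support
    tools_required: list[str]     # specific models/simulations needed
    needed_by: str                # timeframe, e.g. a milestone
    acquisition_approach: str     # reuse, modify existing, or build new
    resources: dict[str, str] = field(default_factory=dict)  # funds, people, etc.

# Purely hypothetical example entry:
plan = [
    MSPlanElement(
        supported_decision="Verify tracking accuracy requirement",
        tools_required=["six-degree-of-freedom flight model"],
        needed_by="Milestone B",
        acquisition_approach="modify existing model",
        resources={"funding": "RDT&E", "staff": "2 analysts"},
    )
]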
6.1.3. The Simulation Support Plan
In an attempt to ensure early consideration and to plan the use of M&S
in major programs, Program Managers should develop a Simulation
Support Plan (SSP). By developing an SSP, Program Managers are
forced to view the entire program in the context of the decisions to be
made, timing, and impact (relative importance). It also forces Program
Managers to consider information needs in light of the decisions to be
supported, and to assess the applicability of models and simulation to the
information. Section 6.1.3.1 provides a sample SSP outline.
6.1.3.1. Outline for a Simulation Support Plan

I.    Purpose
      •  Brief statement as to why plan is required
II.   Executive Summary
      •  Summary narrative of Section V
III.  System Description
      •  Brief summary of weapon system
IV.   Program Acquisition Strategy
      •  Brief synopsis of system acquisition strategy
      •  Overview of M&S acquisition strategy
         -  Focusing on the use of M&S in the program
         -  Including role of weapon system M&S in the distributed environment
V.    Simulation Approach/Strategy and Rationale
      •  What M&S is being done, and why
      A. List of M&S used to date
         •  Discussion of all previous M&S used to support the program,
            including the following:
            -  Name/Type of M&S (Live, Constructive or Virtual)
            -  V&V performed on M&S—Accreditation status of M&S
            -  To what phase/milestone M&S was applied
            -  Issues addressed and results
            -  Areas M&S has supported:
               — Mission-area analyses
               — Operational analyses
               — Requirements trade-offs
               — Conceptual design studies
               — Systems engineering trade-offs
               — Cost and operation effectiveness analyses
               — Logistics analyses
               — Test and evaluation
               — Training
      B. Future Simulation
         •  Possibilities of on-going M&S
         •  A list of all planned M&S for future milestones
         •  A description of how planned M&S will support future milestones
         •  A discussion of how planned M&S will support the Service's vision for M&S
VI.   Related Simulation Activities
      •  Other M&S activities the system relies upon
      •  Other systems that rely upon this system's M&S tools
      •  All other related M&S that affect this system
VII.  Items Provided by Management
      •  Wiring diagram of PMO
      •  Inclusion of the simulation manager (if assigned) in diagram
      •  A description of how the simulation manager interacts with
         acquisition community
VIII. Facilities/Equipment Requirements
      •  A description of facility requirements for all M&S (all facilities,
         hardware, software, data, etc.)
         -  Listing those provided by PM, other Gov't activities, contractor(s)
         -  Identifying who will provide
         -  Identifying schedule requirements and availability of items to
            support schedule
      •  Assurance by management of government ownership of equipment
         (including simulators, hardware, software, data, etc.) critical for
         cost-effective government management of M&S
IX.   Funding
      •  An outline of all expected expenditures to support M&S program
         -  Including funded and unfunded
         -  Designating type of funding (by Program Elements (PE), project, etc.)
X.    Remarks/Supplemental Information
      •  Any comments or related information
XI.   Appendices
      •  Program Schedules
      •  M&S Schedules
      •  Acronyms and abbreviations
      •  Related standards
      •  Related government documents
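Because the SSP is meant to be refined through periodic review, a simple
completeness check against the outline above can flag sections that still
need content. The sketch below is a hypothetical illustration: the section
names follow the outline, but the draft format is an assumption.

# Minimal completeness check for an SSP draft (illustrative only).
REQUIRED_SECTIONS = [
    "Purpose", "Executive Summary", "System Description",
    "Program Acquisition Strategy", "Simulation Approach/Strategy and Rationale",
    "Related Simulation Activities", "Items Provided by Management",
    "Facilities/Equipment Requirements", "Funding",
    "Remarks/Supplemental Information", "Appendices",
]

def missing_sections(draft: dict[str, str]) -> list[str]:
    """Return outline sections that are absent or empty in the draft SSP."""
    return [s for s in REQUIRED_SECTIONS if not draft.get(s, "").strip()]

draft = {"Purpose": "Plan required by program acquisition strategy.", "Funding": ""}
# "Funding" is present but empty, so it is flagged along with the other
# sections that have no content yet.
print(missing_sections(draft))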
6.1.3.2. Applying the Simulation Support Plan
It is not the SSP document itself but the PM's journey through the
process of identifying the program's M&S needs that is most valuable.
Creating a bureaucracy that simply requires "another plan" would be
counterproductive.
The Program Manager must consider the resources required to build
his/her program, which includes the SSP. The SSP is considered an
evolutionary document and is intended to be refined, through periodic
review, as the program progresses. Like other components of an
acquisition program, the M&S requirements will coalesce and get more
detailed over time.
One of the initial challenges a manager will face is in trying to identify
existing resources that could be used (either as-is or with modifications) to
address his/her program’s M&S needs. A plethora of models and
simulations have already been developed and are in various stages of
accreditation for different purposes. However, the PM must know how to
get the information needed to make a decision as to whether one or more
of these could satisfy an M&S need.
6.2. Contracting for M&S
Once the Program Manager has determined how and when to best
apply M&S during the program’s lifecycle, it is then necessary for him/her
to begin reviewing the program’s contracting options and capabilities. The
following sections will begin to explore the best practices to obtain
contracted M&S resources.
6.2.1. Models and Simulations as Contract Deliverables
The PM must be aware that some models and simulations will be
developed by the prime contractor as a natural by-product of the system
design and development process. However, he/she must also be aware
of the capabilities and limitations of these models and simulations with
respect to the acquisition process. With this understanding, he/she must
make decisions regarding whether or not to require specific M&S as
deliverables on a case-by-case basis.
6.2.1.1. Assessing the Impact of Contractor Restrictions
Contractors may also be reluctant to share key algorithms included in
simulations specified for delivery. Based on the program’s acquisition
strategy, the PM must assess impacts of any restrictions the contractor
may include, and must determine whether (and how much) it would be
worth paying for their removal.
6.2.1.2. Lifecycle System Support
A PM must also recognize that when production is complete and a
contract ends, M&S support will still be required for the remainder of the
system’s lifecycle. Models and simulations that were constructed during
earlier phases of the acquisition process, and refined as the system
evolved, will play a major role in evaluating system modification
alternatives.
6.2.2. Selecting an M&S Contractor
This section will assist the Program Manager in identifying or selecting
a contractor to assist in the M&S effort. This M&S developer could be a
government agency, an independent contractor or the prime system
contractor. Obviously, the developer must understand M&S and the
Program Manager’s unique requirements.
6.2.2.1. Questions to Consider when Choosing an M&S Contractor
The following list of questions that may help the Program Manager
select a contractor was provided by Van B. Norman. This list is based
upon his twenty years of building simulation models and of hiring and
managing simulation consultants. In the DoD’s case, these questions can
form the basis for part(s) of a Request for Proposal (RFP), as well as
provide ingredients for establishing Source Selection Criteria. The
specific questions, and the level to which they will have to be pursued, will
depend upon certain factors within the program acquisition strategy. Some
of these factors are as follows: whether the main effort and the simulation
effort are under a common contract, who is integrating the simulation effort,
whether the system and simulation development efforts are
complementary (with each leveraging from the other), or whether they are
independent.
a.  What is the contractor's experience with this type of system?
    What is the contractor's track record?
A contractor’s experience with similar systems is important since it
normally generates efficiency, but it is not essential. If a specific contractor
has an unproven record, a software capability evaluation [Defense
Systems Management College, 1993] may provide some insight into that
vendor’s ability to take on a complex software modeling effort. This is
particularly true if the M&S development is a parallel effort to the system
development.
The evaluation augments the acquisition process by determining a
contractor’s strengths and weaknesses with respect to a maturity model. It
establishes a method for comparing a contractor’s software capability
against a standard set of criteria as a source-selection criterion.
b.  How will the contractor approach the construction of the
    model? What simulation software will be used?
Because of preexisting systems or specific software capabilities, there
may exist a requirement for a specific software tool. If possible, choose a
simulation tool that is widely used and will be around for a few years.
c.  Will the contractor produce a written specification describing
    the system to be modeled, including all assumptions and questions
    to be answered? Are you going to be providing the contractor a
    specification?
In any case, the specification is necessary to ensure that everyone is
working toward the same objective. It is important to ensure that everyone
at the table is working towards the same end-goal(s). What may be an
obvious objective or capability to the Program Manager may not
necessarily be seen as such by the contractor. By requiring written
specifications, discrepancies between the Program Manager's expectations
and the contractor’s expectations can be brought to light early and
resolved appropriately.
d.  What questions about the system cannot be answered through
    the use of the model?
Models need to be constructed with specific questions in mind pertaining
to the acquisition process. Additionally, an understanding of what the
model will not answer is crucial if a misunderstanding between the project
office and the developing contractor is to be prevented. This is yet another
reason why the model specification is so important.
e.  What is the development schedule?
The model or simulation supports certain information needs of the project
office. Unless this information is timely, it could be worthless. The
contractor’s prior record with respect to on-time delivery should be an
important criterion in selection.
f.  How did the contractor arrive at the cost estimate for the
    project?
Regardless of whether the contractor is working on a cost-plus or fixed-price basis, the contractor must understand the scope of work and
schedule to develop a credible cost estimate.
g.  How do I determine value for cost?
Norman likened simulation consulting to brain surgery: “if you want the
lowest priced surgeon opening your head, then good luck” [Defense
Systems Management College, 1993]. He emphasized the need to know
the experience, expertise, and record of each candidate contractor, and to
balance these against the price being charged to determine value.
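One way to make this balancing explicit is a simple weighted-score
comparison of candidates. The criteria, weights and scores below are
invented for illustration and are not a prescribed source-selection method.

# Illustrative weighted scoring of candidate M&S contractors; criteria,
# weights and scores are hypothetical, not a prescribed DoD method.
weights = {"experience": 0.4, "track_record": 0.3, "price": 0.3}

candidates = {
    "Vendor A": {"experience": 9, "track_record": 8, "price": 5},  # 1-10 scales
    "Vendor B": {"experience": 6, "track_record": 7, "price": 9},  # price: 10 = cheapest
}

for name, scores in candidates.items():
    # Weighted sum across the criteria gives one "value" figure per vendor.
    value = sum(weights[c] * scores[c] for c in weights)
    print(f"{name}: weighted value {value:.1f}")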
h.  What data are required for the model?
Norman contended that most contractors will not know a specific program
well enough to collect the required data. The government will often need
to provide it. Fortunately, sources of valid data exist within each Service
and the DoD. Each Service’s M&S office can provide authoritative sources
of data. Despite the unique needs of each Service, a common dictionary
of data is required among the Services.
i.  Who will collect the data? When will it be needed? In what
    format will it be needed? Who will certify the data?
These are all crucial questions if the government is going to be required to
provide these to the modeler. In fact, any potential disconnect between the
contractor’s requirements for data and the government’s ability to provide
it must be worked out early. The PM of the program must also understand
the resource implications of data collection.
j.  What parts of the system will be detailed, and what parts will
    be simplified?
This is another issue the contractor must address in the proposed
specification. On one level, the Program Manager is seeking contracted
support to avoid developing an in-house expertise on the subject; however,
it is equally important for the Program Manager to require a certain
amount of detail in order to be able to validate the contractor's efforts. A
compromise must be reached in determining the appropriate amount of
specification provided by the contractor to the PM.
k.  What types of model experiments will be run?
As the model is built, experimentation provides answers. If the modeler
knows what types of experiments are contemplated, the model can be
geared to make the experiments easier.
l.  How much time will be allowed for experimentation?
The user’s understanding of the system may change after the PM reviews
experiment results, and the scope of work may be impacted. A well-structured
contract management process will make this less painful, as all parties
involved will be working from the same set of requirements/expectations.
m.  How will you be assured the model is "correct"?
The government must be an active player in the V&V process. The PM
must also be mindful of the accreditation authority’s requirements from the
start; the requirements must be built into the contract.
n.  What is the schedule for periodic model review meetings?
This is a crucial management mechanism for ensuring that incremental
model development is on track from cost, schedule and performance
viewpoints.
o.  Can you use the model internally after the contractor is done
    working on a specific program?
The model has long-term value. The system will probably change over
time, and the model must be modified. This relates back to the need for
CM and the requirement for adequate documentation to allow the PM to
work either in-house or with a contractor to make necessary modifications
in the future. Even if the requirements have changed slightly, it will be
more cost effective to work from an existing framework than to start a new
M&S effort from scratch.
Reusability of software and the increasing move toward reconfigurable
simulators and simulations make this even more important. Also, the
government’s aversion to locking itself into a developer drives the need for
the government to identify in the contract all its M&S deliverable
requirements, and the timing involved with them. Inclusion of contractor
proprietary material or data, without adequate rights being released to
(procured by) the government, could lead to future problems in reusing the
material.
p.  What could go wrong with this part of the project?
A PM must monitor the model (or simulation) development effort as it
proceeds. Given the criteria for establishment of Work Breakdown
Structure (WBS) elements (from MIL-STD 881B) and Configuration Items
(from MIL-STD 973), PMs must:
•  Ensure visibility of the M&S efforts at the appropriate level for
   management, and
•  Incorporate the development efforts into their risk-management
   program.
q.  What kinds of analyses will the contractor perform, such as
    confidence-interval calculation and design of experiments?
Whether it is the contractor or someone else who is going to be
performing an analysis using the model’s output, the analyst’s
requirements have to be considered in designing the model.
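As a minimal illustration of the first kind of analysis, the sketch below
computes a t-based 95% confidence interval on the mean of replicated
simulation outputs; the replication values are invented.

import statistics
from math import sqrt

# Hypothetical output measure from 10 independent simulation replications.
runs = [12.1, 11.8, 12.6, 12.0, 11.5, 12.3, 12.2, 11.9, 12.4, 12.0]

n = len(runs)
mean = statistics.mean(runs)
sem = statistics.stdev(runs) / sqrt(n)   # standard error of the mean
t_crit = 2.262                           # t value for a 95% CI with 9 degrees of freedom

print(f"95% CI for mean output: {mean:.2f} +/- {t_crit * sem:.2f}")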
Norman also specified that the contractor will assist in gaining
management and team support for the model and for its use in support of
the analyses at hand. The contractor is, after all, part of the team.
r.  How will the contractor assist the PM in explaining the benefits
    and limitations of the model? Will the contractor assist in presenting
    the model results to management (decision-makers)? Does the
    contractor have the capability to provide a video of the model's
    animation?
The answer to all of these should be, “Yes.” A PM should ensure the M&S
vendor is aware that it is expected to assist in explaining and presenting
the model. All of these situations will go far in gaining and maintaining
support for the effort, since they help decision-makers better understand
the need for the model and the subsequent analyses of its results.
6.3. Evaluating Contract Proposals
In addition to the general questions outlined above, the following
sections will help to outline specific contract details for the Program
Manager to consider when evaluating an M&S contract proposal.
Furthermore, it is important for the Program Manager to consider not only
the implications of the contract during the period of performance, but also
the exit strategy associated with the contract. The following sections will
elaborate specific details which should be considered and included in the
acquisition strategy associated with an M&S effort.
6.3.1. Modular Contracting and Contract Bundling
As described in Federal Acquisition Regulation (FAR) Section
39.103, the program manager should use modular contracting for major IT
acquisitions to the extent practicable [General Services Administration et
al., 2005]. He/she should also consider modular contracting for other
acquisition programs. (See also section 7.8.3.10 of this guidebook). In
addition, FAR 7.103(s) requires that acquisition planners avoid, to the
maximum extent practicable, unnecessary and unjustified contract
bundling, which precludes small business participation as contractors.
6.3.2. Major Contract(s) Planned
For each major contract planned to execute a portion of the acquisition
strategy, the acquisition strategy should describe the following:
•  What the basic contract buys,
•  How major deliverable items are defined,
•  Options, if any, and prerequisites for exercising them, and
•  The events established in the contract to support appropriate exit
   criteria at each phase of the acquisition lifecycle.
6.3.3. Multi-year Contracting
The acquisition strategy should address both the Program Manager's
consideration of multi-year contracting for full-rate production as well as
the Program Manager's assessment of whether the production program is
suited to the use of multi-year contracting based on the requirements in
FAR Subpart 17.1 [2005].
6.3.4. Contract Type
For each major contract, the acquisition strategy must identify:
•  The type of contract planned (e.g., firm fixed-price (FFP); fixed-price
   incentive, firm target; cost-plus-incentive-fee; or cost-plus-award-fee), and
•  The reasons it is suitable—including considerations of risk
   assessment and reasonable risk-sharing by the Government and
   the contractor(s).
Additionally, the acquisition strategy should not include cost ceilings
that, in essence, convert cost-type research and development contracts
into fixed-price contracts or that unreasonably cap annual funding
increments on research and development contracts.
6.3.4.1. Special Contract Terms and Conditions
The Acquisition Strategy should identify any unusual contract terms
and conditions and all existing or contemplated deviations from the FAR or
Defense Federal Acquisition Regulation Supplement (DFARS).
6.3.4.2. Warranties
Warranties are one example of such special terms and conditions.
When structuring warranties, the Program Manager should consider doing
the following:
•  Examine the value of warranties on major systems and pursue
   them when appropriate and cost-effective.
•  Incorporate warranty requirements into major systems contracts in
   accordance with FAR Subpart 46.7 [2005].
6.3.4.3. Component Breakout
In many instances, it may be appropriate to structure a contract such
that the government is able to provide particular components required for
the final deliverable or to identify an alternative supplier for such
components. When considering component breakout, the Program
Manager should address the following points:
•  Component breakout on every program,
•  Component breakout when there are significant cost savings
   (inclusive of Government administrative costs),
•  The technical or schedule risk of furnishing Government items to
   the prime contractor—whether or not it is manageable, and
•  Any other overriding Government interests (e.g., industrial
   capability considerations or dependence on contractor logistics
   support).
Furthermore, the Acquisition Strategy should also address component
breakout and should briefly justify the component breakout strategy. It
should list all components considered for breakout and provide a brief
rationale (based on supporting analyses from a detailed component
breakout review—which shall not be provided to the Milestone Decision
Authority unless specifically requested) for those not selected. The
Program Manager should provide the rationale for a decision not to break
out any components.
6.4. Affordability and Lifecycle Resource Estimates
This section addresses acquisition program affordability and
resource estimation. In doing so, it provides explanations of the program
and pre-program activities and information required by DoD Instruction
5000.02 [DoD, 2008]. It will also discuss the support and documentation
provided by Office of the Secretary of Defense staff elements.
6.4.1. Total Lifecycle and Ownership Costs
Both DoD Directive 5000.01, The Defense Acquisition System [DoD,
2007], and DoD Instruction 5000.02, Operation of the Defense Acquisition
System [DoD, 2008], make reference to lifecycle cost and total ownership
cost. For a defense acquisition program, lifecycle cost consists of
research and development costs, investment costs, operating and support
costs, and disposal costs over the entire lifecycle of the program. These
costs include not only the direct costs of the acquisition program, but also
include indirect costs that would be logically attributed to the program.
The concept of total ownership cost is related to lifecycle cost but is
broader in scope. When programs are less mature (in pre-systems
acquisition or system development and demonstration), program cost
estimates that are supporting the acquisition system normally are focused
on lifecycle cost or elements of lifecycle cost. Total ownership costs, on
the other hand, encompass all costs—from development costs at the
beginning of the lifecycle to disposal costs at the end of the lifecycle, as
defined above. Examples of cases in which cost estimates support the
acquisition system at a macro level include affordability assessments,
analyses of alternatives, cost-performance trades, and establishment of
program cost goals.
More refined and discrete lifecycle cost estimates may be used within
the program office to support internal decision-making—such as
evaluations of design changes and assessment of produceability,
reliability, maintainability, and supportability considerations. However, as
programs mature (and, thus, transition from production and deployment to
sustainment), cost estimates that support the acquisition system or
program management in many cases may need to be expanded in scope
to embrace total ownership cost concepts.
6.4.2. Lifecycle Cost Categories and Program Phases
DoD 5000.4-M-1, DoD Cost Analysis Guidance and Procedures [2007],
provides standardized definitions of cost terms that, in total, comprise
system lifecycle costs. Lifecycle cost can be defined as the sum of four
major cost categories; each of those categories is associated with
sequential but overlapping phases of the program lifecycle.
Lifecycle cost consists of the following elements:
•  Research and development costs: associated with the Materiel
   Solution Analysis (MSA) phase, Technology Development phase,
   and the Engineering and Manufacturing Development (EMD) phase,
•  Investment costs: associated with the Production and Deployment
   phase,
•  Operating and support costs: associated with the Sustainment
   phase, and
•  Disposal costs: occurring after initiation of system phase-out or
   retirement—possibly including demilitarization, detoxification, or
   long-term waste storage.
M&S costs can be a component of all of the above cost categories.
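Numerically, lifecycle cost is simply the sum of these four categories. The
sketch below illustrates the arithmetic with invented figures.

# Lifecycle cost as the sum of the four categories above (figures invented).
costs = {                               # $M, hypothetical program
    "research_and_development": 850,    # MSA, TD and EMD phases
    "investment": 3200,                 # Production and Deployment phase
    "operating_and_support": 5400,      # Sustainment phase
    "disposal": 150,                    # phase-out/retirement
}
lifecycle_cost = sum(costs.values())
print(f"Lifecycle cost: ${lifecycle_cost} M")   # $9600 M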
6.5. Chapter Summary
A major component associated with the effective use of M&S in the
system acquisition lifecycle is the planning given to M&S application. This
chapter provided the PM with some guidelines and questions so he/she
can begin thinking about M&S and how it could potentially tie into the
overall acquisition process. By reviewing options early, the PM can be
sure to take full advantage of possible cost-saving M&S alternatives.
6.6. Chapter References
Defense Modeling and Simulation Office (DMSO): Acquisition Task Force on Modeling
and Simulation (ATFM&S). ACAT I & II Program Management Office’s Survey.
Fort Belvoir, VA: Author, 1993.
Defense Systems Management College. Mission Critical Computer Resources
Management. Fort Belvoir, VA: Author, 1993, Section 8.6.3.6.
Department of Defense (DoD). DoD Cost Analysis Guidance and Procedures (DoD
5000.4-M-1). Washington, DC: Author, April 18, 2007.
Department of Defense (DoD). Operation of the Defense Acquisition System (DoDI
5000.02). Washington, DC: Author, December 8, 2008.
Department of Defense (DoD). The Defense Acquisition System (DoDD 5000.1).
Washington, DC: Author, May 12, 2003.
General Services Administration, Department of Defense, National Aeronautics and
Space Administration. Federal Acquisition Regulation (FAR). Washington, DC:
Author, March 2005.
Norman, V.B. “Twenty Questions for Your Simulation Consultant.” Industrial
Engineering (May 1, 1993).
Office of the Secretary of Defense (OSD). Defense Federal Acquisition Regulation
Supplement (DFARS). Washington, DC: Author, January 15, 2009.
http://www.acq.osd.mil/dpap/dars/dfars/html/current/tochtml.htm
Seglie, E. “Modeling and Simulation: Is VV&A the Real Stumbling Block? Are we using
M&S Correctly?” Paper presented at the MORS Workshop (Military Operations
Research Society), The Energy Training Complex, Kirtland AFB, Albuquerque,
NM, October 15-17, 2002, 79-95, 13.
7. Verification, Validation and Accreditation (VV&A) or Certification (VV&C)
It is understood that we are to use models and simulations that are
properly verified, validated, and accredited to support the test
process […]. [V]alidation is an important word, but I think perhaps
poorly used nowadays. We started it; I am probably as much to
blame as anybody. But we do want to have some basis of belief
that our simulations have some creditability, that they are not pulled
out of some place in the air, that they can be measured one way or
another.
Walter Hollis, Former Deputy Under Secretary
of the Army for Operations Research [2002]
In recent years, general interest in data verification, validation, and
accreditation and/or certification has increased dramatically. The
relationship between good data and successful M&S projects is evident.
Recent studies indicate the necessity for project team members to give
considerable time, effort and resources to strengthening data-collection
integrity and processes. For example, a study conducted by Dr. Gary
Horne and Ted Meyer [2004] demonstrated the profound need to devote
more attention to this part of the modeling and simulation effort. Their
study discussed the concept of data farming. Data farming is defined as
the opportunity to grow more data in a particular area of interest. If a
modeler is interested in learning as much as he/she can about how certain
factors react within specific scenarios and environments, then the
individual will want to consider how data is generated and how many
permutations or combinations are possible for a specific planned objective.
Therefore, team members must have a solid knowledge of data “behavior”
if the modeling effort is to be truly successful. A team can only gain a keen
understanding of data integrity if its members are aware of the basic
foundation of how data is verified, validated and accredited/certified
[Horne and Meyer, 2004: 807-813].
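In the data-farming spirit described above, the number of runs to "grow"
multiplies with each factor varied. A minimal sketch, with invented
scenario factors:

from itertools import product

# Invented scenario factors; data farming grows one run per combination.
factors = {
    "terrain": ["urban", "desert", "forest"],
    "visibility_km": [1, 5, 10],
    "threat_density": ["low", "high"],
}

runs = list(product(*factors.values()))
print(f"{len(runs)} combinations to simulate")   # 3 * 3 * 2 = 18
for combo in runs[:3]:
    # Show the first few factor combinations as named settings.
    print(dict(zip(factors, combo)))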
7.1. VV&A Policy Background
Since 1996, the DoD has established policies and recommended
practices to provide a basis for practicing Verification, Validation, and
Accreditation (VV&A). VV&A is critical for ensuring M&S is appropriate,
used correctly and producing trusted results. VV&A stands out as a key
issue within the M&S community because of widely held misperceptions
regarding it. Among some of the misperceptions are the following:
•  That VV&A is too costly,
•  That VV&A is not properly integrated into the M&S process, and,
   thus, not adequately resourced,
•  That it is often conducted without proper scoping up-front, resulting
   in unnecessary work and cost, and
•  That reuse has not occurred as anticipated. [MORS, 2002: 7] (Reuse
   is defined as "the practice of using again, in whole or part, existing
   M&S tools, data, or services" [DoD, 2007: 7].)
One of the goals of the Acquisition Modeling and Simulation Master
Plan (AMSMP) in April 2006 was to foster cost-effective VV&A. The
Master Plan observed that, “the inability to clearly understand what VV&A
has accomplished has degraded the usefulness of much M&S […].
Documentation is difficult to find and to understand, and contacting
anyone with knowledge about prior VV&A activities is often difficult”
[OUSD(AT&L), 2006: 34]. Along these lines, the AMSMP proposed DoD-wide
standardized documentation of M&S VV&A: the DoD M&S VV&A
Documentation Templates (DVDTs). This standardization, in turn, was
to help facilitate reuse and to make the strengths and weaknesses of
M&S more visible [2006: 35].
7.1.1. DoDI 5000.61
DoD Instruction 5000.61 sets forth policies, roles and responsibilities
for DoD M&S VV&A. “Models and simulations used to support major DoD
decision-making organizations and processes […] shall be accredited for
that specific purpose by the DoD Component M&S Application Sponsor”
[DoD, 2003: 4.1, 2].
The directive further states that V&V shall “be
incorporated into the development and life-cycle [sic] management
processes of all M&S […and shall] be documented as part of the VV&A
documentation requirements” [2003: 7]. In theory, by implementing DoDwide VV&A documentation templates (or DVDTs), potential users save
time and money in identifying legacy M&S [Charlow, 2007].
7.2. The Role of Data
The data-collection processes within the M&S industry require great
attention to detail to ensure valid results at the end. Data collection is one
of the first elements in the VV&A process this chapter will consider. Proper
collection processes are vitally important if PMs are to ensure that all parts
of the process have integrity.
7.2.1. Questions for Consideration
Questions such as, “What is the intended model resolution?”, or
“Which technique will a modeler use in defining data parameters?” must
be answered during the initial stages of data collection. Once data have
been collected, they must be vigorously tested by the modeler for
verification, validation, and accreditation/certification. Without this level of
testing, data may prove to be faulty or useless in terms of the overall
purpose of the modeling and simulation project [Balci, 2004: 122-129].
7.3. Purpose and Definitions
The following section will provide Program Managers with some
standard definitions relating to VV&A.
7.3.1. Purpose of VV&A
The purpose of VV&A, according to the Defense Modeling and
Simulation Office, is as follows:
To determine whether a model or simulation or federation should
be used in a given situation, its credibility should be established by
evaluating fitness for the intended use. In simplest terms,
Verification, Validation and Accreditation are three interrelated but
distinct processes that gather and evaluate evidence to determine,
based on the simulation’s intended use, the simulation’s capabilities,
limitations, and performance relative to the real-world objects it
simulates. The purpose of VV&A is to assure development of
correct and valid simulations and to provide simulation users with
sufficient information to determine if the simulation can meet their
needs. [Defense Modeling and Simulation Office, 2006].
7.3.2. VV&A Definitions
As published in the DoD Instruction (DoDI) 5000.61 [DoD, 2003: 10-15],
VV&A are three discrete processes that are defined as:
Verification
The process of determining that a model implementation and
its associated data accurately represent the developer’s
conceptual description and specifications.
Validation
The process of determining the degree to which a model and
its associated data provide an accurate representation of
the real world from the perspective of the intended uses of
the model.
Accreditation
The official certification that a model, simulation, or
federation of models and simulations and their associated
data are acceptable for use for a specific purpose.
7.3.3. VV&C Definitions
The following definitions are established in the DoD Directive (DoDD)
5000.59 [DoD, 2007: 11]:
Data Verification
         Data-producer verification is the use of techniques and
         procedures to ensure that data meets constraints defined by
         data standards and business rules derived from process and
         data modeling. Data-user verification is the use of
         techniques and procedures to ensure that data meets user-specified
         constraints defined by data standards and
         business rules derived from process and data modeling, and
         that data are transformed and formatted properly.
Data Validation
         Data validation is the documented assessment of data by
         subject-matter experts and its comparison to known or best-estimate
         values. In other words, data-user validation is the
         documented assessment of data as appropriate for use in an
         intended model.
Data Certification
         Data certification is the determination that data have been
         verified and validated. Data-user certification is the
         determination by the application sponsor or designated
         agent that data have been verified and validated as
         appropriate for the specific M&S usage. Data-producer
         certification is the determination by the data producer that
         data have been verified and validated against documented
         standards or criteria.
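A minimal sketch of data-producer verification as defined above: records
are checked against constraints derived from a data standard. The
constraints, field names and record below are all invented for illustration.

# Data-producer verification as constraint checking (constraints invented).
def verify_record(rec: dict) -> list[str]:
    """Return violations of the (hypothetical) data-standard constraints."""
    errors = []
    if not 0.0 <= rec.get("detection_prob", -1.0) <= 1.0:
        errors.append("detection_prob must lie in [0, 1]")
    if rec.get("range_km", -1) <= 0:
        errors.append("range_km must be positive")
    if rec.get("units") != "metric":
        errors.append("units must be 'metric' per the data standard")
    return errors

print(verify_record({"detection_prob": 1.3, "range_km": 40, "units": "metric"}))
# ['detection_prob must lie in [0, 1]']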
7.4. VV&C Data
According to DoD Directive 5000.59 [2007], each DoD component
must establish VV&A policies, procedures, and guidelines for models,
simulations, and their associated data. The application of M&S requires
accurate and reliable data in order to define, for instance: a) doctrine, b)
environments, c) scenarios, and d) weapon and system performance. In
an environment that relies heavily on the credibility of M&S results, the
quality of data is as important as the performance of the models and
simulations themselves. However, unlike VV&A, which has been
addressed in detail in the DoDD 5000.59, Data Verification, Validation &
Certification (VV&C) is still neither generally understood nor practically
implemented [VV&A Technical Working Group, 1998: 3-12].
7.4.1. The VV&C Tiger Team
The VV&C Tiger Team (VVCTT) was founded in 1997 under the
leadership of the VV&A Technical Working Group (TWG). The Tiger
Team is a group of M&S practitioners composed of Modeling and
Simulation Executive Agents (MSEA), as well as representatives from the
military Services (Army, Navy, Air Force) and the Office of the Secretary of
Defense (OSD).
In 2005, a DoD-sponsored Tri-Service VV&A Templates Tiger Team
developed templates for four core VV&A documents: Accreditation Plan,
V&V Plan, V&V Report, and Accreditation Report [Charlow, 2007: 9]. Each
document is designed to stand alone in representing all the information
available at the particular time in the V&V processes.
7.4.2. VV&C Tasks and Objectives
The VVCTT was tasked to identify key issues and gaps that existed
within the data verification, validation, and certification process—
specifically as it related to the DoD modeling and simulation methodology.
Additionally, the VVCTT was charged with examining and reviewing
current processes, policies, and practices as they apply to the VV&C data
activities. These directions include:
•  Assess the current state of DoD VV&C products.
•  Leverage relevant VV&C activities of the M&S community at large.
•  Convert these activities into specific products:
   o  Generic user template for VV&C,
   o  Data-user integrated VV&A/VV&C model, and
   o  Suggested topics for inclusion in the rewrite of the VV&A
      Recommended Practices Guide.
•  Ascertain what remaining activities are needed to reach the desired
   technical end-state of the program and make appropriate
   recommendations. [VV&A Technical Working Group, 1998: 3-12]
7.4.3. VV&C Process Definitions
According to the VV&C Tiger Team report, four sub-groups were
formed to identify individual elements and objectives. The associated
tasks of the sub-groups were to:
Leverage
Exploit the current state of VV&C resources, information and
knowledge.
Template
Create a user-driven template of data quality information.
Model
Develop a data-user integrated VV&A, VV&C model.
RPG
Suggest topics for rewriting the VV&A Recommended
Practices Guide (RPG).
7.4.4. Products
After the VVCTT was divided into these sub-groups, each of the four
teams was challenged to develop a deliverable that either would: a)
improve existing product(s) or b) outline policies and/or guidelines that
assist data producers in providing useful information to data users.
VV&C Bibliography
         The leverage group produced a bibliography of existing
         literature as well as a compilation of existing V&V tools.
         Additionally, the publication includes references to pilot
         projects and their lessons-learned reports.
Data Quality Metadata Template
         The Data Quality Metadata Template (DQMT) is a data-user
         guide that provides methods and methodologies for identifying
         producer-generated Data Quality (DQ) information in support
         of VV&A activities.
M&S Lifecycle Process Model
         The team involved in this effort updated the existing M&S
         Lifecycle Process Model that was originally developed for
         supporting the development of the DoD VV&A RPG.
VV&C Content for the VV&A RPG
         The two recommendations that were updated included: a)
         VV&A policy guidance documents and b) the VV&A
         Recommended Practices Guide.
7.5. Important Distinctions
It is important to note that there are clear distinctions between the
manner in which the end-user and the producer of data employ data in
their work. Producer data is defined by a parameter called data quality—
as outlined by the DoD 8320 series [MORS, 2002]—that focuses on the
integrity of the original data and its application to future endeavors. Data
quality management is focused on the issues and problems pertaining to
the creation, management and future use of data. On the other hand, the
user data V&V activities are typically inculcated into the M&S accreditation
process. Further, the M&S lifecycle plays an integral role in how data is
defined. Data may have different meanings and levels of significance
depending on where the M&S project is in its lifecycle. Similarly, data can
be captured at various points along the lifecycle, or continuum, of the M&S
process.
7.5.1. Gaps
The VV&A Technical Working Group’s study of VV&C with the Tiger
Team [1998: 3-12] identified several gaps between current processes and
desired long-term results. The study pointed to several opportunities for
further research and/or improvement. They are:
•  Inconsistency within user application data V&V activities,
•  Inherent disconnect between producer and user requirements, and
•  No central resource data bank or library for user data V&V
   information.
7.5.2. Emerging Issues
There are numerous issues arising from current studies of data VV&C.
Since there is no real infrastructure in place to house any new knowledge
with respect to VV&C, the case can be made that the industry should
consider dedicating resources to developing a coordinated knowledge
management program in a joint environment. Further, there are many
business opportunities to study the role of intensive training and technical
assistance; these focus on providing the DoD with cutting-edge
technological tools that capture the true value of having accurate, verified,
and validated data at the right time. Additionally, the need for data that has
been properly calibrated, validated and verified will become increasingly
important if M&S professionals and DoD clients are to take full advantage
of other emerging technologies, e.g., wargaming. If data are not properly
verified, projects within the M&S industry may suffer disastrous results.
Therefore, the field will have to foster the growth and development of
specialists in this sub-field.
7.5.3. Considerations
Finally, when models or data are modified, the new system must
always be tested to prove its accurate representation of the real world.
However, PMs may need to realize that the reaccreditation and
recertification processes of common M&S models and their underlying
data sets could be further simplified—especially if different projects and/or
clients will be drawing upon the data for multiple uses. Moreover, PMs
must further consider methods and processes (M&P) that clearly outline
criteria for streamlining frequent reaccreditation and recertification
procedures throughout a system’s development.
7.6. Best Practices
An initial set of goals emerged from a 2002 Military Operations
Research Society (MORS) Workshop to address general concerns about
the capability and expectations for M&S in acquisition [Hollis, 2002: 79-95].
This initial set of goals proposed:
•  To understand the problem,
•  To determine the use of M&S,
•  To focus accreditation criteria, and
•  To contract to get what a program requires. [Ibid.]
When exploring a new program or reviewing an existing one, it is
invaluable for a PM to ask plenty of questions so he/she can develop
sound, focused and explicit criteria against which to evaluate possible
solutions for the specific program. The unfocused use of M&S,
unbounded VV&A, and unnecessary expenditure of time and money
without meaningful results are some of the leading reasons for program
failure [Ibid.].
The MORS Workshop further explained how PMs can properly apply
M&S. For instance, PMs should use M&S to help catalyze critical thinking,
generate hypotheses, perform sensitivity analyses to identify logical
consequences of assumptions, and generate imaginable scenarios. In
order to know up-front what M&S will be used for, PMs should ask, “What
part of the program’s problem can be answered by M&S, and what
requirements does the program have that M&S can address?”
Often, the PM should emphasize reducing program risk in order to
focus accreditation criteria. Accreditation criteria are best developed by a
collaboration of all involved stakeholders [Hollis, 2002: 90]. In order to
scope V&V, PMs must plan the V&V effort in advance to use test data to
validate the model. The data should be used correctly, and the conditions
under which the data was collected should be documented to ensure the
integrity of the process.
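A minimal sketch of the practice recommended here: compare model
predictions with test measurements taken under documented conditions
against a pre-agreed accuracy criterion. All values and the 5% threshold
are invented for illustration.

# Validating model output against test data (all values invented).
test_data = [105.0, 98.5, 110.2, 101.7]   # measured under documented conditions
model_out = [102.3, 99.1, 108.8, 104.0]   # model predictions, same conditions

threshold = 0.05                           # pre-agreed 5% relative-error criterion
errors = [abs(m - t) / t for m, t in zip(model_out, test_data)]

print("max relative error:", round(max(errors), 3))
print("PASS" if max(errors) <= threshold else "FAIL")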
7.7. Chapter Summary
In summary, data collection, verification, validation, and/or certification
techniques are fundamental to the long-term success of any modeling and
simulation project. Without proper data collection—in which specific
parameters are set—modeling and simulation projects are at risk for
failures throughout a project’s lifecycle. Essentially, data integrity through
VV&C is the basis upon which any successful project should be founded.
Without "good" data, the overall project may be compromised, causing a
significant loss of revenue for clients or of knowledge for the Department
of Defense. There is ample justification for further research in this field,
as well as for allocating resources to it; DoD modelers may benefit
from studying the field in greater depth.
7.8. Chapter References
Balci, O. Quality Assessment, Verification, and Validation of Modeling and Simulation
Applications. Blacksburg, VA: Virginia Tech, 2004.
Charlow, K. "Standardized Documentation for Verification, Validation, and
Accreditation—A Status Report to the Systems Engineering Community."
Presented by DoD M&S Project (DMSP) Project Management Team (PMT),
National Defense Industrial Association, 10th Annual Systems Engineering
Conference, October 24, 2007.
Defense Modeling and Simulation Office. VV&A Recommended Practice Guide (RPG
Build 3.0). Washington, DC: Author, 2006. http://vva.dmso.mil/.
Department of Defense. DoD Modeling and Simulation (M&S) Verification, Validation,
and Accreditation (VV&A) (DoDI 5000.61). Washington, DC: Author, May 13,
2003.
Department of Defense. DoD Modeling and Simulation (M&S) Management (DoDD
5000.59). Washington, DC: Author, 2007.
Hollis, W. “Test and Evaluation, Modeling and Simulation and VV&A: Quantifying the
Relationship between Testing and Simulation.” Paper presented at the MORS
Workshop (Military Operations Research Society), The Energy Training Complex,
Kirtland AFB, Albuquerque, NM, October 15-17, 2002, 79-95.
Horne, G.E., and Meyer, T.E. Data Farming: Discovering Surprise. Woodbridge, VA:
The MITRE Corporation, 2004.
Military Operations Research Society (MORS). Proceedings of the MORS Workshop:
Test & Evaluation, Modeling and Simulation and VV&A: Quantifying the
Relationship between Testing and Simulation. The Energy Training Complex,
Kirtland AFB, Albuquerque, NM, October 15-17, 2002.
Office of the Under Secretary of Defense (Acquisition, Technology & Logistics) Defense
Systems. Department of Defense Acquisition Modeling and Simulation Master
Plan. Washington, DC: Author, April 17, 2006.
VV&A Technical Working Group. Report of the Verification, Validation & Certification
(VV&C) Tiger Team. Washington, DC: Author, 1998.
8. Common Issues and the Future of M&S
8.1. Intellectual Property
In closing a review of Modeling and Simulation, it is only appropriate to
spend some time exploring intellectual property (IP). This term is often
vaguely defined, but it has potentially significant ramifications with respect
to a program’s outcome. The term intellectual property covers a variety of
areas, including:
•  patents,
•  copyrights,
•  trademarks, and
•  trade secrets.
In dealing with IP rights, the DoD has promulgated policies and
regulations on patents, copyrights, technical data, and computer software.
Because intellectual property is a varied topic, there are various
objectives a Program Manager must keep in mind in regards to IP. At the
forefront of this discussion is the fair treatment of intellectual property
owners. In order for the Department of Defense to encourage future
technology development, it is paramount that those developers feel they
are being treated fairly and that they will be able to appreciate gains from
their efforts. In addition, PMs should explore opportunities in which
commercially produced products and services can be used to benefit the
DoD; by expanding to a larger market, the DoD can spread R&D costs
and create more opportunities for both the developer and the DoD to
gain from the work. Overall, a PM is prudent to
encourage collaboration between the DoD and commercial developers on
more commercially friendly terms.
Intellectual property considerations have a critical impact on the cost
and affordability of technology and, as such, should not be treated as
separate from Modeling and Simulation. Decision-makers should carefully
plan and review contract documents at the start of the program to ensure
that both the PM and the contractor have clearly defined objectives and
deliverables. If intellectual property is not considered beforehand, it could
potentially lead to high unexpected costs later in the program lifecycle.
Additionally, intellectual property is integral to all types of DoD
requirements, including:
•  Production,
•  Acceptance testing,
•  Installation, operation,
•  Maintenance,
•  Upgrade/modification,
•  Interoperability with other systems, and
•  Transfer of technologies to other programs/systems/platforms.
8.1.1. Intellectual Property Regulations and Practices
Although the intellectual property topic is broad, there are specific
policies and guidelines which the Program Manager should note. The
standard FAR and DFARS clauses require that certain information be set
forth in the contract (e.g., the pre-award listing of proprietary IP). PMs
should take care in developing a clear and explicit intellectual property
clause, as incomplete or ambiguous clauses are ineffective and
potentially costly. However, standard FAR and DFARS clauses do not
always resolve critical IP issues. For example, there is no clause
establishing rights regarding commercial computer software, although the
DFARS establishes procedures for the early identification of restrictions on
noncommercial technical data and computer software.
Furthermore, the DoD may own the delivered physical medium on
which the intellectual property resides, but generally it will not own the
actual intellectual property rights. Generally the contractor will own the
intellectual property rights, unless the IP was a predetermined deliverable.
It may also be the case that although the DoD may not own the IP rights, it
will own licensing rights, which allow for the use, replication, modification
and release of the IP deliverable. If the terms for intellectual property,
rights and licensing rights are negotiated carefully and in a flexible manner,
there is the potential for both the DoD and the contractor to benefit from
the interaction.
In order to ensure successful program execution, PMs should take
steps early in the acquisition process to identify commercial software and
technical data that will be delivered.
8.1.2. Commercial Software and Technical Data
It is important to bear in mind that the DoD normally receives only
those deliverables and associated license rights that are customarily
provided to the public. If additional needs or requirements exist, the PM
must clearly identify and specify them. As mentioned above, there is no
standard clause for commercial computer software in the guidance
literature. Thus, it is left to the parties involved to incorporate the relevant
license agreement into the contract and to ensure that its provisions are
understood and meet the DoD’s needs.
Along this same line of reasoning, “one-size-fits-all” license
agreements are rarely effective because terms and conditions are likely
inapplicable or irrelevant across different projects. Additionally, a generic
license agreement can create inefficiencies that force adjustments to other
important contract terms (e.g., price) as the program progresses. Likewise,
when a PM is negotiating contractual terms, it is important for him/her to
distinguish "off-the-shelf" or non-developmental acquisitions from
development partnerships.
Commercial software should be acquired using the commercial license
terms whenever these terms are available. Changes in commercial
license terms should be negotiated only when there is a specific DoD
need and when the PM is willing to pay the associated cost.
8.1.3. DoD Acquisition Policy and Flexibility
In negotiating terms and mutually beneficial outcomes, Program
Managers must realize that DoD acquisition policy does provide flexibility
for IP issues. DoD policy requires delivery of only the technical data and
computer software necessary to satisfy agency needs.
Program Managers can help mitigate program costs if they avoid requiring
delivery of technical data and computer software “just in case” it is
needed in the future with little foundation for a quantifiable need.
Furthermore, PMs should explore contingency-based delivery
requirements for potential future needs for technical data and computer
software to allow for greater flexibility as the program matures. Similarly,
by separating delivery needs/requirements from the technical data and
computer software that is needed only for viewing, PMs can help defer
costs while obtaining the benefits of M&S.
As a general rule under DoD contracts, the contractor is allowed to
retain ownership of the technical data and computer software it developed.
In contrast, the DoD receives only a license to use that technical data and
computer software and, thus, does not “own” the technical data and
computer software included in deliverables.
The scope of the license may depend on the following points:
•  Nature of the technical data and computer software,
•  Relative source of funding for development, and
•  Negotiations between the parties.
DoD clauses related to intellectual property are currently built around
the following general framework:
•  Contractors own IP rights for technologies/information
   developed/delivered under DoD contracts, while
•  The DoD receives a nonexclusive license(s) based on the nature of
   the data, the relative source of funding for development, and
   negotiation between the parties.
DFARS Subparts 227.71 and 227.72 establish the DoD process for
acquiring IP license rights and specify the above framework: that the
contractor retains the intellectual property title, and the DoD receives a
nonexclusive license to use, reproduce, modify, release, perform, display,
or disclose the data or software. Furthermore, the specific license granted
depends on whether the technical data or computer software qualifies as
noncommercial or commercial technology.
8.1.4. Commercial and Noncommercial Technologies
When DoD acquisitions involve a mix of commercial and
noncommercial technologies, the contract should be segregated into
separate line-items, with each line-item being governed by the appropriate
clauses or attached license agreements. Additionally, the contract should
allow for provisions to cover both types of technologies with a statement
clarifying how they apply to the deliverables.
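To make this line-item bookkeeping concrete, the following is a minimal Python sketch, offered purely as an illustration: the LineItem record, the CLIN values, and the clause mapping are hypothetical and are not drawn from any DoD system. It assumes, consistent with the subparts cited above, that DFARS Subpart 227.71 frames rights in technical data and Subpart 227.72 frames rights in computer software.

    from dataclasses import dataclass

    # Hypothetical record for a segregated contract line item (CLIN).
    @dataclass
    class LineItem:
        clin: str          # contract line item number (invented for this sketch)
        deliverable: str   # "technical data" or "computer software"
        commercial: bool   # commercial vs. noncommercial technology
        license_ref: str   # governing clause or attached license agreement

    def governing_subpart(item: LineItem) -> str:
        """Map a line item to the DFARS subpart framing its IP rights."""
        subpart = "227.71" if item.deliverable == "technical data" else "227.72"
        status = "commercial" if item.commercial else "noncommercial"
        return f"DFARS Subpart {subpart} ({status} terms)"

    items = [
        LineItem("0001", "computer software", True, "vendor's commercial license"),
        LineItem("0002", "technical data", False, "standard DFARS rights clause"),
    ]
    for item in items:
        print(item.clin, governing_subpart(item), "-", item.license_ref)

Listing line items in this way makes it immediately visible which deliverables carry an attached commercial license agreement and which fall under the standard clauses.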
It is important for the PM to identify commercial deliverables so the
DoD can plan for maintenance and support. Additionally, the reduction in
intellectual property deliverables and license rights on commercial
data/software may significantly impact the acquisition plan. To help
identify and resolve potential issues early, the PM should consider
requiring a list of commercial data/software restrictions at the forefront of
the effort.
8.1.5. Additional Intellectual Property Forms
There are other forms of valuable intellectual property that may not be
covered by any of the previously mentioned lists, such as a trade secret or
copyrighted information that does not meet the definition of “technical
data” or “computer software.” Although these may not fall into a
previously mentioned category, they may qualify as “special works” or
“existing works.” For example, some other forms of company-proprietary
information might include financial, cost, business, or marketing
information. To prevent any future complications, the PM should consider
requiring the contractor to identify and assert any restrictions on the DoD’s
use of the IP.
8.1.6. Intellectual Property vs. Licensing
It is necessary to be able to distinguish between intellectual property
deliverables and licensing rights. IP deliverables refer to the contractual
obligation to deliver intellectual property, having a predetermined content
and format. As discussed above, the DoD may own the delivered physical
medium on which the IP resides, but it generally will not own the IP rights.
In contrast, license rights refer to the DoD’s ability to use, reproduce,
modify, and release the delivered intellectual property. Although distinct,
these two elements are integrally related.
A PM’s ability to use creative flexibility in both areas can result in a
mutually beneficial outcome for both the DoD and the contractor.
Furthermore, intellectual property deliverables should be established in
terms of Content, Format, and Delivery Medium. The contract should
require delivery of all information necessary to accomplish each element
of the acquisition strategy because, as mentioned previously, the standard
DFARS clauses that establish the rights in technical data or computer
software do not specify delivery requirements. Likewise, delivery requirements for
technical data/computer software should specify content (e.g., level of
detail or nature of information), recording/storage format, and
delivery/storage medium (e.g., paper, CD-ROM, or on-line access).
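As a purely illustrative aid, the record below sketches this structure in Python; every field name and value is invented for the example and is not taken from any DoD template. It simply shows the Content, Format, and Delivery Medium elements held alongside, but separate from, the license rights received.

    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical IP deliverable record: the deliverable itself is described
    # by content, format, and medium; the license rights are tracked separately.
    @dataclass
    class IPDeliverable:
        name: str
        content: str                 # level of detail / nature of the information
        format: str                  # recording/storage format
        medium: str                  # e.g., paper, CD-ROM, or on-line access
        license_rights: List[str] = field(default_factory=list)

    item = IPDeliverable(
        name="System performance model data",
        content="full detail design data",
        format="contractor-native electronic files",
        medium="on-line access",
        license_rights=["use", "reproduce", "modify", "release"],
    )

    # The program receives the delivered medium and the listed license rights;
    # ownership of the underlying intellectual property stays with the contractor.
    print(item)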
8.2. Acquisition Planning
Good acquisition planning, including market research, begins with a
review and complete understanding of a program’s requirements. A
firm understanding of the program’s intellectual property requirements
allows the Program Manager to anticipate the valid DoD interests in
intellectual property, thus shaping the procurement process. Early
planning and market research will best enable the PM to achieve the
following objectives:
•  Assess the environment and requirements,
•  Incorporate this knowledge into the acquisition strategy, and
•  Make the best business deal for the DoD.
8.2.1. Long-term Planning
It is imperative for the Program Manager to consider both the
immediate project requirements and any expected production, support,
or follow-on activity that may be required at a future stage. By
clearly defining and understanding these expectations, the PM is able to
reduce the need for superfluous technical data and other intellectual
property. Program cost savings may result, for example, when no future
buys are planned or when maintenance and support are to be conducted
through warranties.
Furthermore, a PM’s ability to embrace the concept of contractor
logistics support may alleviate the need for technical data and may
remove that intellectual property barrier from the procurement. In contrast,
if organic maintenance capability is required at some level, new
assumptions should be considered—thus focusing appropriate attention
on the intellectual property issues early in the acquisition process.
8.2.2. Summary
As has been outlined above, a successful acquisition program requires
early and continued communication among all members of the team—
including the program, contracting, logistics, and legal components. The
involvement of commercial industry in the planning process will provide
the necessary commercial input that can help shape the acquisition
strategy and program plan—especially through effective market research
and the potential sources found via draft solicitations.
Again, the Program Manager must ensure that the IP terms and
conditions negotiated are appropriate for the particular project; he/she
must also understand both the short- and long-term implications of those
conditions. Commercial firms may not necessarily know of or understand
the defense-related contractual clauses or recognize that they are
negotiable.
The simplest and yet most important aspect of acquiring intellectual
property is identifying the critical issues prior to contract award. The PM is
able to preserve the contractor’s valuable IP interests by requiring the assertion of
restrictions on trade secret information; he/she is also able to facilitate
source selection by identifying IP-based restrictions that impact the overall
lifecycle cost of competing technologies. In addition, the PM can both
facilitate structured negotiations by ensuring that the parties are fully
aware of the critical IP issues and provide convenient methods for
incorporating the results of IP negotiations into the contract.
8.3. The Evolution and Future of M&S
M&S has rapidly evolved toward a state that permits the increasingly
sophisticated implementation of integrated product and process teams;
however, it remains a challenge for the Acquisition Manager to evolve in
directions that will allow the program to take full advantage of this
integration.
Over the past several years, M&S has progressed from the
predominant use of live and constructive simulations to increased interest
in the use of virtual simulations. This shift is supported by rapid
improvements in the sophistication of information processing and display
technologies. However, today’s technical and managerial use of M&S in
support of systems acquisition is largely characterized by use of these
tools in stand-alone and system-specific modes.
Two of the most effective ways a PM can see additional benefits
resulting from the current M&S advancements are by:
•  Increasing communication among functional areas observed
   throughout the acquisition community, and
•  Coupling that communication with the continuing revolution in
   information processing technologies.
Looking towards the future state, M&S in acquisition will consist of
environments which seamlessly integrate simulations. Furthermore,
integration will occur among simulations of similar and different classes
(live, constructive and virtual) and across all levels of the M&S hierarchy
(engineering, engagement, mission/battle and theater/campaign). In
addition, it will provide information that will support planning and
decision-making in all functional areas and at the requisite level of
resolution for specific decisions.
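As a purely illustrative sketch of that integration idea (no real DoD framework, standard, or API is implied; the FederationBus class, topic names, and payload fields are all invented for this example), simulations at one level of the hierarchy might publish results that models at other levels consume through a shared environment:

    from collections import defaultdict

    class FederationBus:
        """Toy stand-in for a shared simulation environment."""
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, topic, handler):
            self._subscribers[topic].append(handler)

        def publish(self, topic, payload):
            for handler in self._subscribers[topic]:
                handler(payload)

    bus = FederationBus()

    # An engagement-level constructive model consumes engineering-level results.
    bus.subscribe("engineering/results",
                  lambda data: print("engagement model received:", data))

    # A virtual, engineering-level simulation publishes a result for reuse
    # at higher levels of the hierarchy.
    bus.publish("engineering/results",
                {"sim_class": "virtual", "miss_distance_m": 3.2})

In practice this role is played by distributed-simulation infrastructures such as the High-level Architecture (HLA) noted elsewhere in this guide; the sketch only conveys the publish-and-subscribe shape of the integration.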
8.3.1. Getting to the Future State of M&S
Many of the enabling technologies associated with emerging M&S
that have the potential to contribute to the acquisition process are
commercially driven. While this allows the DoD to leverage common
advances made in the commercial market for these technologies, others
are of specific interest to the military. Development of some of these latter
technologies will largely be determined by the DoD’s ability to marshal
industry innovation in the direction of its interests, as their specific nature
may make these technologies vulnerable to neglect. Much of the onus for
these advances lies with the Program Manager: he/she must make certain
that the appropriate planning and action is taken to ensure not only the
appropriate application of M&S within a program but also the capture of those
M&S elements that can be translated across projects and applications.
Hopefully this Guidebook has made it clear that Modeling and
Simulation will continue to play an integral role in potentially lowering
costs, providing pivotal program insight, allowing more time in program
planning phases and generally reducing cost overruns. PMs must take
care to plan for and apply these capabilities in a way appropriate for both
their specific programs as well as the DoD as a whole.
8.4. Chapter References
General Services Administration, Department of Defense, National Aeronautics and
Space Administration. Federal Acquisition Regulation (FAR). Washington, DC:
Author, March 2005.
Office of the Secretary of Defense (OSD). Defense Federal Acquisition Regulation
Supplement (DFARS). Washington, DC: Author, January 15, 2009.
http://www.acq.osd.mil/dpap/dars/dfars/html/current/tochtml.htm
Selected Bibliography
This bibliography represents the books and articles that proved useful
in the creation of the guidebook. It is by no means a complete record of all
of the works consulted, but gives a broad overview of the areas covered.
This bibliography is intended to be useful to readers interested in pursuing
further study in the area of Modeling and Simulation.
Acker, David D. “The Maturing of the DoD Acquisition Process.” Defense Systems
Management Review 3, no. 3 (Summer 1980): 7-77.
Adamy, David. Introduction to Electronic Warfare Modeling and Simulation. Boston, MA:
Artech House, 2003.
Air Force Research Laboratory. Success Stories: A Review of 2000. Wright-Patterson
Air Force Base, OH: Air Force Research Laboratory, August 27, 2001.
Air Force Research Laboratory Materials and Manufacturing Directorate. Toward More
Affordable Defense Technology: IPPD [Integrated Product and Process
Development] for S&T [Science and Technology] Quick Reference. James
Gregory Associates, Inc. http://www.JamesGregory.com.
Aitcheson, Leslie, and the Ballistic Missile Defense Organization Technology
Applications Program. “Technology Commercialization: How Is It Working for
BMDO?” The Update (Summer/Fall 1998): 1, 12, 13.
Aitcheson, Leslie. “Cashing in the Chips: BMDO Technology Is Making Big Payoffs for
Semiconductor Producers.” BMDO Update, no. 34 (Summer 2000): 1-3.
Aldridge, Edward C., Under Secretary of Defense for Acquisition, Technology, &
Logistics, and Delores M. Etter, Deputy Director, Defense Research and
Engineering. “Technological Superiority for National Security.” Statement before
the Senate Armed Services Committee, Emerging Threats and Capabilities
Subcommittee, Defense Wide Research and Development, June 5, 2001.
Aldridge, Edward C., Under Secretary of Defense for Acquisition, Technology, &
Logistics. “Intellectual Property.” Memorandum, December 21, 2001.
Allison, David K. “U.S. Navy Research and Development since World War II.” In Military
Enterprise and Technical Change: Perspectives on the American Experience,
edited by Merritt Roe Smith. Cambridge, MA: The MIT Press, 1985.
American Association for the Advancement of Science. “Guide to R&D Data—Historical
Trends in Federal R&D (1955-).” http://www.aaas.org/spp/dspp/rd/guihist.htm.
American Institute of Physics. “Recommendations of Hart-Rudman National Security
Report: R&D.” In FYI: The AIP Bulletin of Science Policy News, no. 22, February
28, 2001. http://www.aip.org/enews/fyi/2001/022.html.
Anderson, Warren M., John J. McGuiness, and John Spicer. From Chaos to Clarity:
How Current Cost-Based Strategies Are Undermining the Department of Defense.
Fort Belvoir, VA: National Defense University, September 2001.
Anderson, Warren M., John J. McGuiness, and John S. Spicer. “And the Survey
Says…The Effectiveness of DoD Outsourcing and Privatization Efforts.”
Acquisition Review Quarterly, (Spring 2002).
Ballistic Missile Defense Organization. The 2000 BMDO Technology Applications
Report: Technology, Working for You Now. Alexandria, VA: National Technology
Transfer Center—Washington Operations, June 2000.
Ballistic Missile Defense Organization. Ballistic Missile Defense Organization 1998
Technology Applications Report. Alexandria, VA: BMDO Office of Technology
Applications.
Ballistic Missile Defense Organization Technology Applications Program.
“Commercialization Is a Continuous Process.” BMDO Update, no. 40 (Winter
2001/2002).
Ballistic Missile Defense Organization Technology Applications Program. “White LEDs
Illuminate a New Lighting Path: Wide-Bandgap Semiconductors Are
Revolutionizing General Lighting.” BMDO Update, no. 36 (Winter 2000/2001): 1-3.
Banks, Jerry, ed. Handbook of Simulation: Principles, Methodology, Advances,
Applications and Practice. New York: Wiley, 1998.
Beck, Charles L., Nina Lynn Brokaw, and Brian A. Kelmar. A Model for Leading
Change: Making Acquisition Reform Work. Fort Belvoir, VA: Defense Systems
Management College, December 1997.
Benson, Lawrence R. Acquisition Management in the United States Air Force and its
Predecessors. Washington, DC: Air Force History and Museums Program, 1997.
Bhattacharyya, Shuvra S., Ed F. Deprettere, and Jürgen Teich. Domain-Specific
Processors: Systems, Architectures, Modeling, and Simulation. Signal
Processing and Communications. New York: 2003.
Board on Manufacturing and Engineering Design (BMED). Modeling and Simulation in
Manufacturing and Defense Acquisition: Pathways to Success. Washington, DC:
National Academy Press, 2002.
Borck, James R. “Disruptive Technologies: How to Snare an 800-Pound Gorilla: Stun
Him with a Disruptive Technology and Watch as He Stumbles to Adapt.”
InfoWorld 24, no. 1 (January 7, 2002).
Bureau of National Affairs, Inc. “Bush Signs Executive Order Creating Science,
Technology Advisory Council.” Federal Contracts Report 76, no. 13 (October 9,
2001).
Chairman of the Joint Chiefs of Staff (CJCS). Requirements Generation System (CJCSI
3170.01B), April 15, 2001. http://www.dtic.mil/doctrine/jel/cjcsd/cjcsi/3170_01b.pdf.
Chait, Richard, John Lyons, and Duncan Long. Critical Technology Events in the
Development of the Abrams Tank: Project Hindsight Revisited. Fort Belvoir, VA:
National Defense University, December 2005.
Cho, George, Hans Jerrell, and William Landay. Program Management 2000: Know the
Way: How Knowledge Management Can Improve DoD Acquisition. Fort Belvoir,
VA: Defense Systems Management College, January 2000.
Chong, K. P. Modeling and Simulation-Based Life Cycle Engineering. Spon's Structural
Engineering Mechanics and Design Series. London: Spon Press, 2002.
Cloud, David J., and Larry B. Rainey. Applied Modeling and Simulation: An Integrated
Approach to Development and Operation. Space Technology Series. New York:
McGraw-Hill, 1998.
Committee on Integration of Commercial and Military Manufacturing in 2010 and
Beyond, Board on Manufacturing and Engineering Design, Division on
Engineering and Physical Sciences, National Research Council. Equipping
Tomorrow’s Military Force: Integration of Commercial and Military Manufacturing
in 2010 and Beyond. Washington, DC: National Academy Press, 2002.
Coulam, Robert F. Illusions of Choice: The F-111 and the Problem of Weapons
Acquisition Reform. Princeton, NJ: Princeton University Press, 1977.
Coyle, Philip E., III. “Evolutionary Acquisition: Seven Ways to Know If You Are Placing
Your Program at Unnecessary Risk.” Program Manager, (November-December
2000).
Culver, C.M. Federal Government Procurement—An Uncharted Course Through
Turbulent Waters. McLean, VA: National Contract Management Association,
1984.
Davis, Paul K., Amy E. Henninger, National Defense Research Institute (U.S.), and
RAND Corporation. Analysis, Analysis Practices, and Implications for Modeling
and Simulation (Occasional paper OP-176-OSD). Santa Monica, CA: RAND
Corporation, 2007.
Davis, P.K., and R. Hillestad. Exploratory Analysis for Strategy Problems with Massive
Uncertainty. Santa Monica, CA: RAND, 2000.
Defense Acquisition University (DAU). Acquisition Strategy Guide. Fort Belvoir, VA:
Defense Acquisition University Press, June 2003.
Defense Acquisition University (DAU). A Guide to the Project Management Body of
Knowledge (PMBOK Guide). Fort Belvoir, VA: Defense Acquisition University
Press, June 2003.
Defense Acquisition University (DAU). Defense Acquisition Guidebook, Help/Print Page.
Fort Belvoir, VA: Defense Acquisition University Press, December 2008.
https://akss.dau.mil/DAG/help_welcome.asp.
Defense Acquisition University (DAU). Introduction to Defense Acquisition Management.
Fort Belvoir, VA: Defense Acquisition University Press, September 2005.
Defense Acquisition University (DAU). Risk Management Guide for DoD Acquisition.
Fort Belvoir, VA: Defense Acquisition University, June 2003.
Defense Acquisition University (DAU). Systems Engineering Fundamentals. Fort Belvoir,
VA: Defense Acquisition University Press, January 2001.
Defense Contract Audit Agency. Independent Research and Development and Bid and
Proposal Costs Incurred by Major Defense Contractors in the Years 1998 and
1999. Fort Belvoir, VA: Author, December 2000.
Defense Science Board (DSB). Report of the Defense Science Board Task Force on
Advanced Modeling and Simulation for Analyzing Combat Concepts in the 21st
Century. Washington, DC: Office of the Under Secretary of Defense (Acquisition
and Technology), 1999.
Defense Systems Management College (DSMC). Introduction to Defense Acquisition
Management, 5th ed. Fort Belvoir, VA: Defense Systems Management College
Press, January 5, 2001.
Defense Systems Management College (DSMC). Scheduling Guide for Program
Managers. Fort Belvoir, VA: Defense Systems Management College Press,
January 2000.
Department of Defense (DoD). Defense Acquisition Guidebook. Washington, DC:
Author, 2004. https://akss.dau.mil/dag/.
Department of Defense (DoD). The Defense Acquisition System (DoDD 5000.1).
Washington, DC: Author, October 23, 2000; with Change 1, January 4, 2001.
Department of Defense (DoD). Mandatory Procedures for Major Defense Acquisition
Programs (MDAPs) and Major Automated Information System (MAIS) Acquisition
Programs (DoD 5000.2-R). Washington, DC: Author, April 5, 2002.
Department of Defense (DoD). VV&A Recommended Practices Guide. Washington,
DC: Author, 2006. http://vva.dmso.mil/.
Deputy Under Secretary of Defense (Advanced Systems and Concepts). Fiscal Year
2003 Advanced Concept Technology Demonstration (ACTD) Proposals.
Washington, DC: Author, October 30, 2001.
Deputy Under Secretary of Defense (Science and Technology). Joint Warfighting
Science and Technology Plan. Washington, DC: Author, February 2000.
Deputy Under Secretary of Defense (Science and Technology), Office of Technology
Transition. Dual-Use Science and Technology Process: Why Should Your
Program Be Involved? What Strategies Do You Need to Be Successful?
Washington, DC: Author, July 2001. http://www.dtic.mil/dust.
Deputy Under Secretary of Defense (Science and Technology). Technology Transition
for Affordability: A Guide for S&T Program Managers. Washington, DC: Author,
April 2001.
Deputy Secretary of Defense. “Procedures and Schedule for Fiscal Year (FY) 2005-2009 Program, Budget, and Execution Review.” Memorandum. Washington, DC:
Author, May 21, 2003.
DeSimone, L.D., Chairman of the Board and Chief Executive Officer, 3M. Intellectual
Capital: The Keystone to Competitiveness. St. Paul, MN: 3M, 1999.
Digital System Resources, Inc. “Innovation in Defense Systems.” Statement of Mr.
Richard Carroll, Founder and CEO, Digital System Resources Inc., to House
Armed Services Committee, Military Research and Development Subcommittee,
March 22, 2001.
Digital System Resources, Inc. “Toward Greater Public-Private Collaboration in
Research & Development: How the Treatment of Intellectual Property Rights Is
Minimizing Innovation in the Federal Government.” Statement of Mr. Richard
Carroll, Founder and CEO, Digital System Resources, Inc., to House Committee
on Government Reform, Subcommittee on Technology and Procurement Policy,
US House of Representatives, July 17, 2001.
Director for Test, Systems Engineering and Evaluation (DTSE&E). Study on the
Effectiveness of Modeling and Simulation in the Weapon System Acquisition
Process. Washington, DC: Author, 1996.
Doebelin, Ernest O. System Dynamics: Modeling, Analysis, Simulation, Design. New
York: Marcel Dekker, 1998.
Etter, Paul C. Underwater Acoustic Modeling and Simulation, 3rd ed. London: Spon
Press, 2003.
Evans, Jimmy. “Navy Strategic Planning Process for Science and Technology
Demonstrations: Transitioning R&D Advanced Technology into the Fleet.”
Program Manager, (July-August 2000).
Federal Grant and Cooperative Agreement Act of 1977 (P.L. 95-224). Subsequently
recodified as Chapter 63 of P.L. 97-258 (31 U.S.C. 6301 et seq.).
Federal Register. “New Challenge Program.”
http://www.mailgate.org/gov/gov.us.fed.dod.announce/msg01106.html.
Fallows, James. “Councils of War.” The Atlantic Monthly, (February 2002).
Fiorino, Thomas D., Sr. Vice President, Andrulis Corporation. “Engineering
Manufacturing Readiness Levels: A White Paper.” White paper, October 30,
2001.
Fishwick, Paul A. Handbook of Dynamic System Modeling. Chapman & Hall/CRC
Computer and Information Science Series. Boca Raton: Chapman & Hall/CRC,
2007.
Forsberg, K., H. Cotterman, and H. Mooz. Visualizing Project Management: A Model for
Business and Technical Success. New York: Wiley, 2000.
Fox, Ronald J., and James L. Field. The Defense Management Challenge: Weapons
Acquisition. Boston, MA: Harvard Business School Press, 1988.
Fox, J. Ronald, Edward Hirsch, George Krikorian, and Mary Schumacher. Critical
Issues in the Defense Acquisition Culture: Government and Industry Views from
the Trenches. Fort Belvoir, VA: Defense Systems Management College—
Executive Institute, December 1994.
http://www.history.army.mil/acquisition/research/pdf_materials/crit_issues_def_ac
q_culture.pdf.
“From Beginning to End: The Life Cycle of Technology Products.” The Wall Street
Journal, October 15, 2001.
Gansler, Jacques S. Affording Defense. Cambridge, MA: The MIT Press, 1989.
General Accounting Office (GAO). Best Practices: Better Management of Technology
Development Can Improve Weapon System Outcomes (Report number
GAO/NSIAD-99-162). Washington, DC: Author, July 30, 1999.
General Accounting Office (GAO). Best Practices: DoD Can Help Suppliers Contribute
More to Weapons System Programs (Report number GAO/NSIAD-98-87).
Washington, DC: Author, March 17, 1998.
General Accounting Office (GAO). Best Practices: Successful Application to Weapon
Acquisitions Requires Changes in DoD’s Environment (Report number
GAO/NSIAD-98-56). Report to the Subcommittee on Acquisition and Technology,
Committee on Armed Services, US Senate. Washington, DC: Author, February
1998.
General Accounting Office (GAO). Defense Manufacturing Technology Program: More
Joint Projects and Tracking of Results Could Benefit Program (Report number
GAO-01-943). Report to Congressional Committees. Washington, DC: Author,
September 2001.
General Accounting Office (GAO). DoD Research—Acquiring Research by
Nontraditional Means (Report number NSIAD-96-11). Washington, DC: Author,
March 29, 1996.
General Accounting Office (GAO). Export Controls: Clarification of Jurisdiction for
Missile Technology Items Needed (Report number GAO-02-120). Report to the
Subcommittee on Readiness and Management Support, Committee on Armed
Services, US Senate. Washington, DC: Author, October 2001.
General Accounting Office (GAO). Intellectual Property: Industry and Agency Concerns
over Intellectual Property Rights (Report number GAO-02-723T). Testimony by
Jack L. Brock, Jr., Managing Director, Acquisition and Sourcing Management,
before the Subcommittee on Technology and Procurement Policy, Committee on
Government Reform, House of Representatives. Washington, DC: Author, May
10, 2002.
General Accounting Office (GAO). Joint Strike Fighter Acquisition: Mature Critical
Technologies Needed to Reduce Risks. Washington, DC: Author, October 19,
2001.
General Accounting Office (GAO). Military Operations: Status of DOD Efforts to Develop
Future Warfighting Capability. Washington, DC: Author, 1999.
General Accounting Office (GAO). NASA: Better Mechanisms Needed for Sharing
Lessons Learned (Report number GAO-02-195). Report to the Subcommittee on
Space and Aeronautics, Committee on Science, House of Representatives.
Washington, DC: Author, January 2002.
General Accounting Office (GAO). National Laboratories: Better Performance Reporting
Could Aid Oversight of Laboratory-Directed R&D Program (Report Number GAO-01-927). Report to Congressional Requesters. Washington, DC: Author,
September 2001.
General Motors Corporation. “Virtual Factory Enabled GM to Save Time and Costs in
Design of Lansing Grand River Assembly.” News Release, January 9, 2002.
Government Executive. “The Only Game in Town: Now government is America’s hottest
technology market,” December 2001.
Graham, Margaret B.W., and Alec T. Shuldiner. Corning and the Craft of Innovation.
New York: Oxford University Press, 2001.
Haines, Linda. “Technology Refreshment within DoD: Proactive Technology
Refreshment Plan Offers DoD Programs Significant Performance, Cost,
Schedule Benefits,” Program Manager, (March-April 2001).
Hanks, Christopher H., Elliot I. Axelband, Suna Lindsay, Mohammed Rehan Malik, and
Brett D. Steele. Reexamining Military Acquisition Reform: Are We There Yet?
Santa Monica, CA: RAND, 2005.
Hollenbach, J.W. “Department of the Navy (DON) Corporate Approach to Simulation
Based Acquisition.” Paper presented at the Fall 2000 Simulation Interoperability
Workshop, Orlando, FL, September 17-22, 2000.
Hollenbach, J.W. “Collaborative Achievement of Advanced Acquisition Environments.”
Paper presented at the Spring 2001 Simulation Interoperability Workshop,
Orlando, FL, March 25-30, 2001.
Hollis, W.W., and A. Patenaude. “Simulation Based Acquisition: Can We Stay the
Course.” Army RD&A, (May-June 1999): 11-14.
Hundley, Richard O. DARPA Technology Transitions: Problems and Opportunities
(Report number PM-935-DARPA). Project Memorandum prepared for DARPA,
National Defense Research Institute, June 1999.
IEEE Circuits and Systems Society. BMAS 2003: Proceedings of the 2003 IEEE
International Workshop on Behavioral Modeling and Simulation: San Jose,
California, October 7-8, 2003. Piscataway, NJ: IEEE, 2003.
Ince, A. Nejat, and Ercan Topuz. Modeling and Simulation Tools for Emerging
Telecommunication Networks: Needs, Trends, Challenges and Solutions. New
York: Springer, 2006.
“Independent Research and Development (IR&D), Information for DoD Personnel.”
Brochure. Fort Belvoir, VA: Defense Technical Information Center.
John, Vicki L. Department of Defense (DoD) and Industry—A Healthy Alliance. Master’s
thesis, Naval Postgraduate School, Monterey, CA, June 2001.
Johnson, Michael V.R., Mark F. McKeon and Terence R. Szanto. Simulation Based
Acquisition: A New Approach. Fort Belvoir, VA: Defense Systems Management
College Press, December 1998.
Joint Chiefs of Staff. Joint Vision 2010: Focused Logistics: A Joint Logistics Roadmap.
Washington, DC: Author.
Jones, Jennifer. “Moving into Real Time: Enterprises Can Now Do Business with Up-to-the-minute Data Feeds, but Getting All the Pieces in Place May Be Challenging.”
InfoWorld 24, no. 3. (January 21, 2002).
Jones, Wilbur D., Jr. Arming the Eagle: A History of U.S. Weapons Acquisition since
1775. Fort Belvoir, VA: Defense Systems Management College Press, 1999.
Kadish, Ronald, Gerald Abbott, Frank Cappuccio, Richard Hawley, Paul Kern, and
Donald Kozlowski. A Report by the Assessment Panel of the Defense Acquisition
Performance Assessment Project for the Deputy Secretary of Defense.
Washington, DC: Defense Acquisition Performance Assessment Project, January
2006. http://www.acq.osd.mil/dapaproject/documents/DAPA-Report-web/DAPA-Report-web-feb21.pdf.
Kang, Keebom, and R.J. Roland. “Military Simulation.” In Handbook of Simulation,
edited by J. Banks, 645-658. New York: Wiley, 1998.
Kuipers, Benjamin. Qualitative Reasoning: Modeling and Simulation with Incomplete
Knowledge. Cambridge, MA: MIT Press, 1994.
Ladner, Roy, and F. Petry. Net-Centric Approaches to Intelligence and National
Security. New York: Springer Science+Business Media, 2005.
Laguna, Manuel, and Johan Marklund. Business Process Modeling, Simulation, and
Design. Upper Saddle River, NJ: Pearson/Prentice Hall, 2005.
Laird, Robbin F. “Transformation and the Defense Industrial Base: A New Model.”
Defense Horizons, no. 26 (May 2003). Fort Belvoir, VA: Center for Technology
and National Security Policy, National Defense University.
Lorell, Mark, Michael Kennedy, Julia Lowell, and Hugh Levaux. Cheaper, Faster,
Better?: Commercial Approaches to Weapons Acquisition. Santa Monica, CA:
RAND, 2000.
Lucas, T.W. Credible Uses of Combat Simulation: A Framework for Validating and
Using Models. Santa Monica, CA: RAND, 1997.
Macgregor, Douglas A. Transforming under Fire: Revolutionizing How America Fights.
Westport, CT: Praeger, 2004.
Mayr, Herwig. Virtual Automation Environments: Design, Modeling, Visualization,
Simulation. New York: Marcel Dekker, 2002.
McNaugher, Thomas L. New Weapons, Old Politics: America’s Military Procurement
Muddle. Washington, DC: The Brookings Institution, 1989.
Melin, Patricia, and Oscar Castillo. Modeling, Simulation and Control of Non-Linear
Dynamical Systems: An Intelligent Approach Using Soft Computing and Fractal
Theory, 2nd ed. London: Taylor & Francis, 2002.
Mielke, Alexander. Analysis, Modeling and Simulation of Multi-scale Problems. Berlin:
Springer, 2006.
Morrow, Walter E., Jr. Summary of the Defense Science Board Recommendations on
DoD Science & Technology Funding. Washington, DC: Office of the Secretary of
Defense, June 1, 2000.
Military Operations Research Society (MORS). “Test & Evaluation, Modeling and
Simulation and VV&A: Quantifying the Relationship between Testing and
Simulation.” Paper presented at the MORS Workshop, The Energy Training
Complex, Kirtland Air Force Base, Albuquerque, New Mexico, October 15-17,
2002.
Motaghedi, Pejmun, Society of Photo-optical Instrumentation Engineers, Inc., Optech,
and Ball Aerospace & Technologies Corporation (USA). “Modeling, Simulation,
and Verification of Space-Based Systems II.” In Proceedings of SPIE—the
International Society for Optical Engineering. Vol. 5799. Bellingham, WA: SPIE,
2005.
National Center for Advanced Technologies. COSSI “Executive Roundtable”
Independent Assessment. Arlington, VA: Author, June 11, 2001.
National Center for Advanced Technologies. An Evaluation and Assessment of the DoD
Commercial Operations & Support Savings Initiative: Final Report of the DoD
COSSI Program Independent Assessment Executive Roundtable (Report
Number 01-CO1A). Arlington, VA: Author, September 2001.
National Center for Advanced Technologies. An Evaluation and Assessment of the DoD
Dual Use Science & Technology Program: Final Report of the DoD Dual Use
Science and Technology Program Independent Assessment Panel (Report
Number 01-1A). Arlington, VA: Author, June 2000.
National Center for Advanced Technologies. Toward More Affordable Avionics: An
Industry Perspective (Report No. 01-AAI-1). Final report of the Affordable
Avionics Initiative Working Group. Arlington, VA: Author, November 2001.
National Research Council, Board on Science, Technology, and Economic Policy. The
Small Business Innovation Research Program SBIR: An Assessment of the
Department of Defense Fast Track Initiative. Washington, DC: National Academy
Press, 2000.
National Research Council, Committee on Modeling and Simulation Enhancements for
21st Century Manufacturing and Acquisition. Modeling and Simulation in
Manufacturing and Defense Systems Acquisition: Pathways to Success. The
Compass Series. Washington, DC: Author, 2002.
National Research Council, Committee on Modeling and Simulation for Defense
Transformation. Defense Modeling, Simulation, and Analysis: Meeting the
Challenge. Washington, DC: The National Academies Press, 2006.
National Research Council. Equipping Tomorrow’s Military Force: Integration of
Commercial and Military Manufacturing in 2010 and Beyond. Report by the
Committee on Integration of Commercial and Military Manufacturing in 2010 and
Beyond, Board on Manufacturing and Engineering Design, Division on
Engineering and Physical Sciences. Washington, DC: National Academy Press,
2002.
National Science Foundation. National Patterns of R&D Resources: 1996—An SRS
Special Report, Division of Science Resources Studies, Directorate for Social,
Behavioral, and Economical Sciences. Washington, DC: Author, July 1996.
http://www.nsf.gov/statistics/nsf96333/nsf96333.pdf
National Technology Alliance. National Technology Alliance: Accomplishments &
Projects 1997 – 1998. Bethesda, MD: National Imagery and Mapping Agency,
January 1999.
Nelson, J.R., and Karen W. Tyson. A Perspective on the Defense Weapons and
Acquisition Process (IDA Paper P-2048). Alexandria, VA: Institute for Defense
Analyses, September 1987.
Nicol, David M., Christopher D. Carothers, Stephen J. Turner, Association for
Computing Machinery, Special Interest Group in Simulation, IEEE Computer
Society, Technical Committee on Simulation, and Society for Modeling and
Simulation International. Proceedings: Workshop on Principles of Advanced and
Distributed Simulation (PADS 2005), Monterey, California, June 1-3, 2005. Los
Alamitos, CA: IEEE Computer Society, 2005.
Noor, Ahmed Khairy, and Langley Research Center. Multiscale Modeling, Simulation
and Visualization and their Potential for Future Aerospace Systems (NASA CP.
2002-211741). Hampton, VA: National Aeronautics and Space Administration,
Langley Research Center, 2002.
Office of the Assistant Secretary of the Army. Constructing Successful Business
Relationships: Innovation in Contractual Incentives. San Diego, CA: Science
Applications International Corporation.
Office of the Deputy Under Secretary of Defense for Acquisition Reform. Commercial
Item Acquisition: Considerations and Lessons Learned. Washington, DC: Author,
July 14, 2000.
Office of the Deputy Under Secretary of Defense for Acquisition Reform. Incentive
Strategies for Defense Acquisitions. Washington, DC: Author, April 2001.
Office of the Inspector General. Army Transition of Advanced Technology Programs to
Military Applications (Acquisition report number D-2002-107). Washington, DC:
Author, June 14, 2002.
Office of the Secretary of Defense Cost Analysis Improvement Group. Operating and
Support Cost-Estimating Guide. Washington, DC: Author, May 1992.
Office of the Secretary of Defense DDR&E. Department of Defense Independent
Research and Development (IR&D) Program Action Plan. Washington, DC:
Author, November 2000.
Office of the Secretary of Defense (OSD). “New Challenge Program.” Federal Register
64, no. 71 (April 14, 1999): 19744.
http://www.mailgate.org/gov/gov.us.fed.dod.announce/msg01106.html.
Office of the Under Secretary of Defense for Acquisition and Technology
(OUSD(AT&L)). Report of the Defense Science Board Task Force on Acquisition
Reform Phase IV. Washington, DC: Author, July 1999.
Office of the Under Secretary of Defense for Acquisition & Technology. Report of the
Defense Science Board Task Force on Defense Science and Technology Base
for the 21st Century. Washington, DC: Author, June 1998.
Olwell, David H., Jean M. Johnson, Jarema M. Didoszak, and Joseph Cohn. “Systems
Engineering of Modeling and Simulation for Acquisition Curricula.” In
Proceedings, The Interservice/Industry Training, Simulation & Education
Conference (I/ITSEC), 2007.
Pace, D.K. “Issues Related to Quantifying Simulation Validation.” Paper presented at
the Spring 2002 Simulation Interoperability Workshop, Orlando, FL, March 10-15,
2002.
Pentland, Dr. Pat Allen, U.S. Commission on National Security/21st Century. “Creating
Defense Excellence: Defense Addendum to Road Map for National Security.”
Defense addendum to Hart-Rudman report, May 15, 2001.
www.nssg.gov/addendum/Creating_Defense_Excellence.pdf.
Proteus Group, LLC and Technology Strategies & Alliances. “Office of Naval Research
Technology Transition Wargame Series: Organic Mine Countermeasures Future
Naval Capabilities Wargame.” After Action Report, May 28, 2002.
Purdue, Thomas M. “The Transition of ACTDs—Getting Capability to the Warfighter:
Demonstrating Utility Is Only Part of the Job.” Program Manager (March-April
1997).
Purdy, E. “Simulation Based Acquisition Lessons Learned: SMART Collaboration for
Future Combat Systems.” SISO Simulation Technology, no. 75 (June 20, 2001).
Robinson, S. “Simulation Verification, Validation and Confidence: A Tutorial.”
Transactions of the Society for Computer Simulation International 16, no. 2
(1999): 63-69.
Roland, Alex. The Military-Industrial Complex. Washington, DC: American Historical
Association, 2001.
Rubinstein, R.Y., and B. Melamed. Modern Simulation and Modeling. New York: John
Wiley and Sons, 1998.
Sage, A.P., and S.R. Olson. “Modeling and Simulation in Systems Engineering: Whither
Simulation Based Acquisition?” Modeling and Simulation Magazine (March 2001).
Schrage, M. Serious Play: How the World’s Best Companies Simulate to Innovate.
Cambridge, MA: Harvard Business School Press, 1999.
Schum, William K., Alex F. Sisti, and Society of Photo-optical Instrumentation
Engineers. Proceedings: Modeling and Simulation for Military Applications, April
18-21, 2006, Kissimmee, Florida, USA. Bellingham, WA: SPIE, 2006.
Simon, H.A., and A. Newell. “Information Processing in Computer and Man,” American
Scientist 52 (September 1964): 281-300.
Smith, Giles, Jeffrey Drezner, and Irving Lachow. “Assessing the Use of ‘Other
Transactions’ Authority for Prototype Projects.” RAND-documented briefing
prepared by the National Defense Research Institute for the Office of the
Secretary of Defense, 2002.
Stevenson, James P. The $5 Billion Misunderstanding: The Collapse of the Navy’s A-12
Stealth Bomber Program. Annapolis, MD: Naval Institute Press, 2001.
Tewari, Ashish. Atmospheric and Space Flight Dynamics: Modeling and Simulation with
MATLAB and Simulink. Modeling and simulation in science, engineering and
technology. Boston: Birkhäuser, 2007.
Under Secretary of Defense (Acquisition, Technology, & Logistics) (USD(AT&L)).
“Evolutionary Acquisition and Spiral Development.” Memorandum, April 12, 2002.
Under Secretary of Defense (Acquisition, Technology, & Logistics) (USD(AT&L)). “Joint
Strike Fighter (JSF) Milestone I Acquisition Decision Memorandum (ADM).”
Memorandum, November 15, 1996.
United States Army (USA). Simulation Operations Handbook, Ver. 1.0. Washington,
DC: Author, October 30, 2003. www.FA-57.army.mil.
US House of Representatives. Small Business Innovation Research Program Act of
2000 (P.L. 106-554), Appendix 1—HR 5667, Title 1.
http://www.acq.osd.mil/sadbu/sbir/pl106-554.pdf (accessed August 1, 2002).
Ward, Dan, and Chris Quaid. “It’s About Time.” Defense AT&L (January-February 2006).
Weir, Gary E. Forged in War: The Naval-Industrial Complex and American Submarine
Construction, 1940-1961. Washington, DC: Naval Historical Center, 1993.
Wu, Benjamin, Stanley Fry, Richard Carroll, Gilman Louie, Tony Tether, and Stan
Soloway. “Intellectual Property and R&D for Homeland Security.” Testimonies
made at oversight hearing before the Subcommittee on Technology and
Procurement Policy, Committee on Government Reform, House of
Representatives, US Congress, May 10, 2002.
Zeigler, Bernard P., Herbert Praehofer, and Tag Gon Kim. Theory of Modeling and
Simulation: Integrating Discrete Event and Continuous Complex Dynamic
Systems, 2nd ed. San Diego: Academic Press, 2000.
Appendix A. List of Acronyms
Note: This list represents terms relevant to this guide. If you need any
further information regarding military terms and acronyms, please refer
to the DoD Dictionary of Military and Associated Terms at
http://www.dtic.mil/doctrine/jel/new_pubs/jp1_02.pdf.
ACAT            Acquisition Category
ACEIT           Automated Cost Estimating Integrated Tools
ADM             Acquisition Decision Memorandum
ADS             Advanced Distributed Simulation
AIS             Automated Information Systems
AIT             Automatic Identification Technology
AIMD            Aircraft Intermediate Maintenance Division
ALSP            Aggregate-level Simulation Protocol
AMSMP           Acquisition Modeling and Simulation Master Plan
AoA             Analysis of Alternatives
APB             Acquisition Program Baseline
ATS             Automatic Test System
C4I             Command, Control, Communications, Computers and Intelligence
CAD             Computer-aided Design
CAE             Computer-aided Engineering
CAIG            Cost Analysis Improvement Group
CAM             Computer-aided Manufacturing
CARD            Cost Analysis Requirements Document
CATIA           Computer-aided, Three-dimensional Interactive Application
CCTT            Close Combat Tactical Trainer
CDD             Capability Development Document
CER             Cost Estimating Relationship
CDR             Critical Design Review
CINC            Commander in Chief
CIO             Chief Information Officer
CJCSI           Chairman of the Joint Chiefs of Staff Instruction
CJCSM           Chairman of the Joint Chiefs of Staff Manual
CM              Configuration Management
COEA            Cost and Operational Effectiveness Analysis
COTS            Commercial Off-the-shelf
CPD             Capabilities Production Document
CPI             Critical Program Information
CSB             Configuration Steering Board
CSDR            Cost and Software Data Reporting
CTE             Critical Technology Element
DAB             Defense Acquisition Board
DARPA           Defense Advanced Research Projects Agency
DAU             Defense Acquisition University
DBT             Design/Build Team
DFARS           Defense Federal Acquisition Regulation Supplement
DIA             Defense Intelligence Agency
DIS             Distributed Interactive Simulation
DMSO            Defense Modeling and Simulation Office
DMSP            DoD M&S Project
DoD             Department of Defense
DoDD            Department of Defense Directive
DoDI            Department of Defense Instruction
DOT&E           Director, Operational Test & Evaluation
DPA&E           Director, Program Analysis & Evaluation
DPAP            Defense Procurement, Acquisition Policy and Strategic Sourcing
DPG             Defense Planning Guidance
DQ              Data Quality
DQMT            Data Quality Metadata Template
DT              Developmental Test
DT&E            Developmental Test and Evaluation
DVDT            DoD VV&A Documentation Tool
DVDTs           DoD M&S VV&A Documentation Templates
ECP             Engineering Change Proposal
EMD             Engineering and Manufacturing Development
EXCIMS          Executive Council on M&S
FAA             Functional Area Assessments
FAR             Federal Acquisition Regulation
FCB             Functional Capabilities Board
FEM             Finite Element Model
FFP             Firm Fixed-price (Contract)
FNA             Functional Needs Analysis
FOT&E           Follow-on Operational Test & Evaluation
FRP             Full-rate Production
FSA             Functional Solutions Analysis
GIG             Global Information Grid
HITL            Human-in-the-loop (Simulation)
HLA             High-level Architecture
HSI             Human Systems Integration
HWIL            Hardware-in-the-loop (Simulation)
ICD             Initial Capabilities Document
IEEE            Institute of Electrical and Electronics Engineers
ILS             Integrated Logistics Support
INCOSE          International Council on Systems Engineering
IOC             Initial Operational Capability
IOT&E           Initial Operational Test & Evaluation
IPPD            Integrated Product and Process Development
IPT             Integrated Product Team
IRB             Investment Review Board
ITAB            Information Technology Acquisition Board
IUID            Item-unique Identification
JCIDS           Joint Capabilities Integration and Development System
JCS             Joint Chiefs of Staff
JROC            Joint Requirements Oversight Council
JSIMS           Joint Simulation System
KPP             Key Performance Parameter
LCSP            Lifecycle Sustainment Plan
LFT             Live Fire Testing
LFT&E           Live Fire Test & Evaluation
LORA            Level of Repair Analysis
LRIP            Low-rate Initial Production
LSA             Logistics Support Analysis
LSAR            Logistics Support Analysis Record
LSI             Lead Systems Integrator
M&P             Methods & Processes
M&S             Modeling and Simulation; Model(s) and Simulation(s)
M&S CO          M&S Coordination Office
MAIS            Major Automated Information System
MDA             Milestone Decision Authority
MDAP            Major Defense Acquisition Program
MDD             Materiel Development Decision
MOE             Measures of Effectiveness
MOO             Measures of Outcome
MOP             Measures of Performance
MORS            Military Operations Research Society
MSA             Materiel Solution Analysis
MSEA            Modeling & Simulation Executive Agents
MSSC            M&S Steering Committee
NAS             Naval Air Station
O&S             Operating and Support
ODUSD (A&T)     Office of the Deputy Under Secretary of Defense (Acquisition & Technology)
OFP             Operational Flight Program
OIPT            Overarching Integrated Product Team
ORD             Operational Requirements Document
OSD             Office of the Secretary of Defense
OT              Operational Test
OT&E            Operational Test & Evaluation
PDR             Preliminary Design Review
PE              Program Elements
PEO             Program Executive Officer
PBL             Performance-based Lifecycle Product Support
PBL             Performance-based Logistics
PDM             Periodic Depot Maintenance
PESHE           Programmatic Environment, Safety, and Occupational Health Evaluation
PMO             Program Management Office
PMT             Project Management Team
POA&M           Plan of Action and Milestones
POM             Program Objective Memorandum
PPBES           Planning, Programming, Budgeting and Execution System
PSR             Program Support Review
R&D             Research & Development
RCM             Requirements Correlation Matrix
RFPs            Requests for Proposals
RGS             Requirements Generation System
RPG             Recommended Practice Guide
S&T             Science & Technology
SBA             Simulation-based Acquisition
SDD             System Development and Demonstration
SEP             Systems Engineering Plan
SIDAC           Supportability Investment Decision Information Analysis Center
SIMNET          Simulator Network
SSE             Systems & Software Engineering
SSP             Simulation Support Plan
SURVIAC         Survivability/Vulnerability Information Analysis Center
SWIL            Software-in-the-loop (Simulation)
T&E             Test and Evaluation
TD              Technology Development
TDS             Technology Development Strategy
TEMP            Test and Evaluation Master Plan
TWG             Technical Working Group
USD(AT&L)       Under Secretary of Defense (Acquisition, Technology & Logistics)
V&V             Verification and Validation
VV&A            Verification, Validation, and Accreditation
VV&C            Verification, Validation and Certification
VVCTT           VV&C Tiger Team
WBS             Work Breakdown Structure
XML             Extensible Markup Language
XSLT            Extensible Stylesheet Language Transformations
Appendix B. DoD Resources
The following are websites and web resources pertinent to the acquisition community.
The paragraphs describing their merits and potential benefits are drawn or adapted
from the websites in question.
Acquisition Community Connection (ACC)
https://acc.dau.mil/CommunityBrowser.aspx
This site highlights communities of practice for various acquisition career fields
and special interest groups.
Acquisition Streamlining and Standardization Information System (ASSIST)
http://assist.daps.dla.mil/online/start/
Users of this site can download Military and Federal Specifications Standards,
Commercial Item Descriptions, Qualified Manufacturers, and Qualified Products
Lists. There is no charge for the required registration.
AT&L Knowledge Sharing System (AKSS)
https://akss.dau.mil/default.aspx
Formerly Defense Acquisition Deskbook, this site provides the most current
acquisition policy and guidance for all DoD services and agencies. It includes
access to over 1300 mandatory and discretionary policy documents (laws,
directives and regulations).
Defense Acquisition University (DAU)
http://www.dau.mil/
This site provides training and other resources for the Defense Acquisition
Workforce.
Defense Acquisition Resource Center (from DAU)
https://akss.dau.mil/dapc/index.aspx
This site includes the latest changes to the DoD 5000 Series documents,
contains an interactive version of the 5000 Guidebook, a tutorial about the 5000
Series governing principles and framework, and a review of terminology.
Defense Advanced Research Projects Agency (DARPA)
http://www.darpa.mil/
DARPA is the central research and development organization for the DoD. It
manages and directs selected basic and applied research and development
projects.
Defense Contract Management Agency (DCMA)
http://www.dcma.mil/
The Defense Contract Management Agency (DCMA) is the Department of
Defense (DoD) component that works directly with Defense suppliers to help
ensure that DoD, Federal, and allied government supplies and services are
delivered on time, at projected cost, and meet all performance requirements. The
DCMA directly contributes to the military readiness of the United States and its
allies and helps preserve the nation's freedom.
Defense Information Systems Agency (DISA)
http://www.disa.mil/
This source describes the structure and mission of DISA and its core mission
areas, links to relevant DoD publications, and other pertinent information.
Defense Logistics Agency (DLA)
http://www.dla.mil/default.aspx
The Defense Logistics Agency supplies the nation’s military services and several
civilian agencies with the critical resources they need to accomplish their
worldwide missions. The DLA provides wide-ranging logistical support for
peacetime and wartime operations, as well as emergency preparedness and
humanitarian missions.
Defense Modeling and Simulation Office (DMSO)
https://www.dmso.mil/public/
The Defense Modeling and Simulation Office (DMSO) is the catalyst organization
for Department of Defense (DoD) modeling and simulation (M&S) and ensures
that M&S technology development is consistent with other related initiatives. The
DMSO performs those key corporate-level functions necessary to encourage
cooperation, synergism, and cost-effectiveness among the M&S activities of the
DoD Components. The DMSO supports the warfighter by leading a defense-wide
team in fostering the interoperability, reuse, and affordability of M&S and the
responsive application of these tools to provide revolutionary warfighting
capabilities and to improve aspects of DoD operations.
Defense Procurement and Acquisition Policy
http://www.acq.osd.mil/dpap/
DPAP is responsible for all acquisition and procurement policy matters in the
Department of Defense (DoD). The DPAP office serves as the principal advisor
to the Under Secretary of Defense for Acquisition, Technology and Logistics
(AT&L), Deputy Under Secretary of Defense for Acquisition and Technology
(A&T), and the Defense Acquisition Board on acquisition/procurement strategies
for all major weapon systems programs, major automated information systems
programs, and services acquisitions.
Defense Systems Management College (DSMC)
http://www.dau.mil/regions/dsmc_spm.asp
Co-located with DAU Headquarters at Fort Belvoir, Virginia, the Defense
Systems Management College—School of Program Managers (DSMC-SPM) is
chartered to provide executive-level and international acquisition management
training, consulting, and research.
DoD Single Stock Point for Specifications and Standards (DODSSP)
http://dodssp.daps.dla.mil/
The Department of Defense Single Stock Point was created to centralize the
control, distribution, and access to the extensive collection of Military
Specifications, Standards, and related standardization documents either
prepared by or adopted by the DoD. The DODSSP mission and responsibility
was assumed by DAPS Philadelphia Office in October 1990. The responsibilities
of the DODSSP include electronic document storage, indexing, cataloging,
maintenance, publishing-on-demand, distribution, and sale of Military
Specifications, Standards, and related standardization documents and
publications comprising the DODSSP Collection. The DODSSP also maintains
the Acquisition Streamlining and Standardization Information System (ASSIST)
management/research database.
DoD 5000 Series Documents
DoD Directive 5000.01: The Defense Acquisition System
https://akss.dau.mil/dag/DoD5000.asp?view=document&doc=1
DoD Instruction 5000.02: Operation of the Defense Acquisition System
https://akss.dau.mil/dag/DoD5000.asp?view=document&doc=2
Modeling and Simulation Resources
AT&L M&S Master Plan (AMSMP)
https://acc.dau.mil/CommunityBrowser.aspx?id=111019&lang=en-US
Defense Acquisition Guidebook
https://akss.dau.mil/dag/DoD5000.asp?view=document&rf=GuideBook\IG_c4.5.7.6.asp
DoD Directive 5000.59: DoD Modeling and Simulation (M&S) Management
http://www.dtic.mil/whs/directives/corres/pdf/500059p.pdf