Methodology for Technology Selection for Department of
Defense Research and Development Programs
by
Michael L. Nair
S. B., Mechanical Engineering (2003)
Massachusetts Institute of Technology
Submitted to the System Design and Management Program in Partial Fulfillment of
the Requirements for the Degree of
Master of Science in Engineering and Management
at the
Massachusetts Institute of Technology
January 10, 2011
© 2011 Michael Nair
All Rights Reserved
The author hereby grants to MIT permission to reproduce and to distribute publicly
paper and electronic copies of this thesis document in whole or in part in any
medium now known or hereafter created.
Signature of Author
Michael L. Nair
System Design and Management
January 04, 2011
Certified by
Ricardo Valerdi
Research Associate, Engineering Systems Division
Thesis Supervisor
Certified by
Patrick Hale
Director
System Design & Management Program
Methodology for Technology Selection for Department of Defense
Research and Development Programs
by
Michael L. Nair
Submitted to the System Design and Management Program on January 14, 2011 in
partial fulfillment of the requirements for the Degree of
Master of Science in Engineering and Management
Abstract
In recent years, many of the Department of Defense's major acquisition programs
have experienced significant budget overruns and schedule delays. Closer
examination of these programs reveals that in many cases, the technologies
selected for these programs did not meet expectations, preventing the overall
weapons system from achieving its intended goals.
A methodology is proposed to extend systems analysis techniques to individual
technologies to provide a rational basis for technology selection. An example of this
methodology is shown based on selecting technologies for the US Army's Active
Protection System. The example demonstrates that use of this methodology can
provide decision makers with a clear understanding of the effects of choosing
particular technologies.
Thesis Supervisor: Ricardo Valerdi
Title: Research Associate, Engineering Systems Division
Disclaimer
The views expressed in this thesis are solely the personal views of the author and in
no way reflect the position of the United States Government, Department of Defense,
or Department of the Army.
Table of Contents
Table of Figures
Table of Tables
Chapter 1- Introduction
Chapter 2- Background
    Current Practices
    Legacy Projects
        Future Combat System
        Joint Strike Fighter
        Airborne Laser (ABL)/Airborne Laser Test Bed (ALTB)
        Legacy Program Summary
Chapter 3- Proposed Technology Selection Approach
Chapter 4- Methodology
    Simulation Description
    Technology Cost Analysis Modeling
    Development Cost
Chapter 5- Results
Chapter 6- Discussion
Chapter 7- Conclusion
Bibliography
Table of Figures
Figure 1: US DoD budget - adjusted for inflation (Shanker & Drew, 2010)
Figure 2: Delays in Program Initial Operating Capability (GAO, 2009)
Figure 3: Cost Increases in Major DOD Acquisition Programs (GAO, 2009)
Figure 4: Weapons Systems Quality Problem Source (GAO, 2008)
Figure 5: US Army Future Combat System (FCS) Artist's Conception (GAO, 2009)
Figure 6: F-35A Joint Strike Fighter (PEO JSF)
Figure 7: F-35 Variants (PEO JSF)
Figure 8: General Dynamics F-111
Figure 9: Airborne Laser Aircraft (GAO, 2010)
Figure 10: DOD Schedule Delays as of December 2007 (GAO, 2009)
Figure 11: JLTV Prototypes (PM JLTV)
Figure 12: Average RDT&E Cost Growth in GAO Study (GAO, 2009)
Figure 13: Artist's Conception of Iron Curtain APS (Crane, 2009)
Figure 14: Tracking Time Development Cost Models
Figure 15: Launch Time Delay Development Cost Models
Figure 16: CM Minimum Range Development Cost Model
Figure 17: APS Model System Probability of Detection
Figure 18: Convergence of Monte Carlo Simulations
Figure 19: APS Effectiveness - Baseline Case
Figure 20: Return on Investment for Launch Time Delay
Figure 21: Return on Investment for CM Velocity
Figure 22: Return on Investment for CM Minimum Range
Figure 23: APS System Effectiveness
Figure 24: APS Effectiveness as a Function of Launch Time Delay
Figure 25: APS Effectiveness as a Function of CM Velocity
Figure 26: APS Effectiveness as a Function of CM Minimum Range
Figure 27: Tracking Time Cost Models
Figure 28: Launch Time Delay Cost Models
Figure 29: APS Effectiveness as a Function of Investment Cost
Figure 30: System Effectiveness as a Function of Tracking Time
Figure 31: System Effectiveness as a Function of Launch Time Delay
Figure 32: Cost-Effective Variable Combinations
Figure 33: Launch Time Delay Development Path
Figure 34: Tracking Time Development Path
Figure 35: CM Minimum Range Development Path
Figure 36: Cost-Effective Development Path
Figure 37: Optimal Development Path with Cost/Requirement Limits
Figure 38: Notional Technology Development Cost
Table of Tables
Table 1: 2003 FCS Cost Estimates (GAO, 2005)
Table 2: 2004 FCS Cost and Schedule Estimate (GAO, 2005), (GAO, 2009), (GAO, 2010)
Table 3: JSF Program Changes (Sullivan, 2010)
Table 4: JSF Schedule Changes (Sullivan, 2010)
Table 5: Changes in Cost and Acquisition Quantities (GAO, 2009)
Table 6: Threat Munition Specifications
Table 7: Baseline Case Results
Table 8: Baseline Simulated Variable Values
Table 9: Simulation Variable Values
Table 10: Simulation Cost Matrix
Chapter 1-Introduction
In an increasingly complex world, US Department of Defense (DoD)
Project Managers (PMs) frequently find themselves faced with decisions to select
among various technologies to develop capabilities for future weapons systems.
Oftentimes, these technologies are in their infancy and it is difficult for PMs to
clearly determine which technologies provide the optimal choice when years of
development remain prior to their inclusion in a fielded weapons system.
The current government-contractor model encourages PMs to compare
various contractor proposals and make award decisions based on which is likely to
deliver the required capability at the lowest cost. This decision-making process
implicitly decides whether a given technology is included in a weapons system or not,
even if no explicit decision is made specifically regarding the technology. For
example, in the Army's upcoming Manned Ground Vehicle (MGV) solicitation, BAE
Systems announced it would submit a proposal utilizing a hybrid-electric
powertrain (Clark, 2010). While the Army's decision will be based on the
capabilities of the whole MGV system, its selection of a prime contractor will
determine whether or not this hybrid technology will be developed for a military
application. Unfortunately, this approach has led to problems over the years as
many contractors have over-promised and under-delivered on performance, cost,
and schedule on numerous weapons systems such as the Marine Corps
Expeditionary Fighting Vehicle (GAO, 2008), the Army's Future Combat Systems
(GAO, 2009), and the Department of Defense's Joint Strike Fighter (Ackerman,
2010).
While this thesis is intended to address challenges faced by the US
Department of Defense, the challenge of technology selection is not unique to DOD.
Modern industry is constantly challenged by technology selection decisions that
could also benefit from improvements to the process. Just a handful of examples
include Boeing's selection of composite technologies for the 787 commercial
airplane, General Motors' hybrid technology for the Chevrolet Volt, and even the
computer industry's development of tablet devices. Each of these systems requires
the program manager to select different technologies to be included as the system is
developed.
In many cases, technologies have been selected that required significantly
more resources than planned in order to achieve the maturity needed for fielding.
These challenges have delayed and even led to the cancellation of weapons systems,
resulting in significant expenditures of funds that could have been better utilized if a
more systematic approach to technology selection had been undertaken during the
initial phases of systems engineering.
To address this shortfall, this thesis answers the following question:
How can a decision-maker determine which technology should be selected for
investment to increase the capabilities of a given weapons system?
Typically, programs can be viewed from a cost, performance, and schedule
perspective. As many weapons system programs have very long schedules (years or
decades), for the purposes of this research, it will be assumed that the development
schedule for different technologies is not materially different and that two
technologies could be developed in the same timeframe by applying additional
resources. Additionally, performance will be considered as a single variable even
though many variables (size, weight, durability, reliability, etc.) would need to be
considered in a real-world case. Therefore, for the purposes of this thesis,
technology evaluation will be based exclusively on a cost-performance comparison.
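As an illustration of such a comparison, the sketch below ranks candidate technologies by performance gain per development dollar, the simplest form of the cost-performance trade described above. The technologies, gains, and costs are hypothetical and chosen only to show the mechanics:

    def performance_per_dollar(perf_gain, dev_cost_millions):
        # Performance gain (treated as a single aggregate variable, per the
        # assumption above) per million dollars of development cost.
        return perf_gain / dev_cost_millions

    # Hypothetical candidates: (name, performance gain, development cost in $M)
    candidates = [
        ("Technology A", 0.15, 120.0),
        ("Technology B", 0.25, 300.0),
        ("Technology C", 0.10, 60.0),
    ]

    for name, gain, cost in sorted(candidates,
                                   key=lambda c: performance_per_dollar(c[1], c[2]),
                                   reverse=True):
        print(f"{name}: {performance_per_dollar(gain, cost):.4f} gain per $M")

In a real-world decision, the single gain variable would be expanded into the multiple performance attributes (size, weight, durability, reliability) noted above.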
This thesis presents the current methods for technology selection, examples
of how the current approach is failing, a methodology for technology selection, a
discussion of the merits and shortfalls of the proposed methodology, and
recommendations for future technology selection decisions. In this thesis, a case study of the US Army's Active Protection System will be utilized to apply the
proposed methodology and test its utility with numerical simulations. Because of
the narrow scope, the proposed methodology may not be universally applicable to
all technology selection decisions but should be considered as a tool that can be
utilized to provide a systematic approach to a multi-attribute decision.
Chapter 2- Background
Beginning in 2003, the United States government embarked on one of the
largest expansions in military spending since World War II. The increase in funding
has been justified as a response to the September 11th attacks of 2001 but has
included funding for virtually all aspects of the United States Department of Defense
(DoD) budget. The increase in defense spending is depicted in Figure 1 showing the
inflation adjusted military spending since 1940.
[Figure: "The War Chest" chart of inflation-adjusted US military spending, fiscal years 1940-2010, noting that the United States currently spends about two-thirds as much on the military as it did during the peak spending year of World War II; sources: Congressional Research Service and Center for Strategic and Budgetary Assessments]
Figure 1: US DoD budget - adjusted for inflation (Shanker & Drew, 2010)
Caryl describes the magnitude of the Fiscal Year 2011 defense budget as
follows:
In February the Pentagon requested $708.2 billion for fiscal year 2011 -- which would
make the coming year's defense budget, adjusted for inflation, the biggest since World
War II. As one analysis of the budget points out, that would mean that total defense
spending -- including the wars in Afghanistan and Iraq -- has grown 70 percent in real
terms since 2001. Defense spending now accounts for some 20 percent of federal
discretionary spending. That's even more than Social Security. (Caryl, 2010)
Since the worldwide economic downturn in 2008, a growing minority has
begun to question the wisdom of allocating such a large portion of the US
discretionary spending to the DoD. While few engaged in this discussion seriously
advocate slashing defense spending, comparisons to other countries certainly
suggest that cuts could be accomplished without jeopardizing American military
supremacy. Caryl continues:
As a consequence, every year the United States accounts for just under half of the
entire world's military spending. (By way of comparison, China spends about 8
percent; Russia, 5 percent.) As Benjamin Friedman, a research fellow at the
libertarian Cato Institute, recently noted in one report: "The closest thing the United
States has to state enemies -- North Korea, Iran, and Syria -- together spend about
$10 billion annually on their militaries -- less than one-sixtieth of what we do."
(Caryl, 2010)
With this perspective, some politicians have begun testing the waters of
defense cuts with their constituents as they begin tackling how to reduce the
growing US budget deficit. House Majority Leader Steny Hoyer (D-Md.) was quoted as
saying "Any conversation about the deficit that leaves out defense spending is
seriously flawed before it begins." (Sahadi, 2010)
In response to these challenges, the Secretary of Defense Robert Gates has
begun an effort to head off defense cuts while trying to increase the efficiency of the
DoD. "Military spending on things large and small can and should expect closer,
harsher scrutiny," Mr. Gates said. "The gusher has been turned off, and will stay off
for a good period of time." (Shanker & Drew, 2010) While Secretary Gates
challenges the need to cut the DoD budget, and in fact has called for slight increases
in the DoD budget each year, he has recognized that "the Department of Defense
cannot expect America's elected representatives to approve budget increases each
year unless we are doing a good job; indeed, everything possible to make every
dollar count." (Gates, 2010) Despite Secretary Gates's wishes, others disagree on the
future of the US Defense Budget: "After years of unfettered growth in military
budgets, Defense Department planners, top commanders and weapons
manufacturers now say they are almost certain that the financial meltdown will
have a serious impact on future Pentagon spending." (Shanker & Drew, 2008)
Among the areas that Secretary Gates has attempted to make more efficient
is DoD Research and Development (R&D) and Acquisition. This effort began in 2009
by "reforming the department's approach to military acquisition, curtailing or
canceling about 20 troubled or excess programs, programs that, if pursued to
completion, would have cost more than $300 billion. Additional program savings
have been recommended in the budget we submitted this year." (Gates, 2010) As
part of that effort, the US military has terminated the following programs because
they "were over cost, behind schedule, no longer suited to meet the warfighters'
current needs, or based on a single service, instead of a joint solution" - VH-71
Presidential Helicopter, Combat Search and Rescue Helicopter (CSAR), Next-Generation Bomber, Future Combat System - Manned Ground Vehicles,
Transformational Satellite, Ballistic Missile Defense- Multiple Kill Vehicle, and
recommended ending production of the C-17 Cargo Aircraft, DDG-1000 Destroyer,
and the F-22 Fighter. (GAO, 2010)
While these programs were terminated or curtailed, the requirement for the
capabilities offered by some of these systems still exist. For example, while the
CSAR program was cancelled, the Air Force still has a requirement to replace 112
HH-60J rescue helicopters. In the case of the DDG-1000 destroyer, the Navy is now
considering building new Arleigh Burke class destroyers as a cheaper alternative.
Efforts have already begun to start replacement programs for the Future Combat
Systems Manned Ground Vehicles, the VH-71 helicopter, the Multiple Kill Vehicle,
and the Next-Generation Bomber.
Upon examining the DOD's acquisition plan, the Government Accountability
Office stated that:
"DOD's portfolio of major defense acquisition programs grew to 102 programs in
2009-a net increase of 6 since December 2007. Eighteen programs with an estimated
cost of over $72 billion entered the portfolio, while 12 programs with an estimated cost
of $48 billion, including over $7 billion in cost growth, left the portfolio. When the
Future Combat System is added to the programs leaving the portfolio, the total cost of
these programs increases to $179 billion, including over $47 billion in cost growth."
(GAO, 2010)
No matter how DOD juggles the acquisition plan for these systems, a significant
amount of money will be spent on these weapons systems regardless of whether or
not a given system is terminated, curtailed, or replaced. As Secretary Gates stated in
August 2010, "the current and planned defense budgets, which project modest but
steady growth, represent the minimum level of spending necessary to sustain a
military at war and to protect our interests and future capabilities in a dangerous
and unstable world." (Gates, 2010) If these vast sums of money are going to be
spent on weapons system development and acquisition, the imperative is to make
such spending more efficient, providing the Warfighter with the capability needed at
the lowest possible cost.
Few in the defense industry would argue that weapons acquisition is an
efficient and well-run process. The US Congress has passed a number of acquisition
reform acts, including the latest Weapon Systems Acquisition Reform Act of 2009, in
an attempt to make the process more efficient and effective but to date these efforts
have appeared largely unsuccessful. Despite legislative acts, changing contractor
models, and recent threats of program terminations, the cost of developing and
acquiring weapons systems has grown tremendously over the last several years.
While the GAO has consistently recommended the use of systems
engineering practices to help control costs, these practices have not been widely
applied and do not include all practices that could reduce the never-ending cost
growth (GAO, 2008). This report will outline the current limited use of systems
engineering practices in weapons systems development and acquisition and present
a case study of how modern systems modeling practices, specifically Monte Carlo
simulations, can help program managers to control costs while delivering a capable
weapons system on time and on budget.
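To make the Monte Carlo idea concrete, the sketch below is a minimal illustration and not the thesis model: the engagement geometry, distances, and parameter distributions are all assumptions made for this example. It estimates a single system effectiveness metric by repeatedly sampling uncertain technology performance parameters, which is the basic mechanism applied to the Active Protection System case study in later chapters.

    import random

    def intercept_succeeds(launch_delay_s, cm_velocity_mps):
        # Toy engagement model (assumed, not from the thesis): the countermeasure
        # (CM) must cover a 10 m standoff before an incoming threat traveling
        # 300 m/s closes a 50 m gap.
        threat_closing_time = 50.0 / 300.0
        cm_flyout_time = launch_delay_s + 10.0 / cm_velocity_mps
        return cm_flyout_time < threat_closing_time

    def simulate(trials=100_000):
        successes = 0
        for _ in range(trials):
            # Sample the uncertain technology parameters from assumed distributions.
            launch_delay = max(0.0, random.gauss(0.05, 0.02))   # seconds
            cm_velocity = random.uniform(200.0, 400.0)          # meters per second
            if intercept_succeeds(launch_delay, cm_velocity):
                successes += 1
        return successes / trials   # estimated probability of a successful intercept

    print(f"Estimated system effectiveness: {simulate():.3f}")

Because each trial is independent, the estimate converges as the trial count grows, which is the convergence behavior examined in Chapter 4.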
Current Practices
In 2010, the US Government Accountability Office (GAO) conducted a review
of a number of DOD acquisition programs to study the effect of various acquisition
reforms. GAO reported a number of successes, primarily that more recently started
programs have performed better in terms of cost, schedule, and performance. GAO
concluded that "for 42 programs GAO assessed in depth in 2010, there has been
continued improvement in the technology, design, and manufacturing knowledge
programs had at key points in the acquisition process. However, most programs are
still proceeding with less knowledge than best practices suggest, putting them at
higher risk for cost growth and schedule delays" (GAO, 2010).
Despite these improvements, most major weapons systems are currently
exhibiting significant cost increases and schedule delays. GAO reports that of 72
programs only 20 (28%) are achieving Initial Operating Capability (IOC) within one
month of the intended IOC date as illustrated in Figure 2. GAO further reports that
14% of these programs have experienced IOC delays of more than 48 months (GAO,
2010).
[Figure: pie chart of major programs by IOC delay - 20 programs (28%) planning to achieve IOC on time or less than 1 month late; 17 programs 1 to 12 months late; 13 programs 13 to 24 months late; 12 programs 25 to 48 months late; 10 programs more than 48 months late. Note: initial operational capability (IOC) is generally achieved when some units or organizations scheduled to receive a system have received it and have the ability to employ and maintain it.]
Figure 2: Delays in Program Initial Operating Capability (GAO, 2009)
Further, GAO reported the cost increases for 10 major programs, shown in
Figure 3. Note that the Future Combat System, the F-22A Raptor, and the C-17
Globemaster III programs have been terminated or halted by Secretary Gates.
(fiscal year 2009 dollars in millions)

Program                                 Total cost            Total quantity       Acquisition unit
                                        First full  Current   First full  Current  cost % change
Joint Strike Fighter                    206,410     244,772   2,866       2,456     38
Future Combat System                     89,776     129,731      15          15     45
Virginia Class Submarine                 58,378      81,556      30          30     40
F-22A Raptor                             88,134      73,723     648         184    195
C-17 Globemaster III                     51,733      73,571     210         190     57
V-22 Joint Services Advanced
  Vertical Lift Aircraft                 38,726      55,544     913         458    186
F/A-18E/F Super Hornet                   78,925      51,787   1,000         493     33
Trident II Missile                       49,939      49,614     845         561     50
CVN 21 Nuclear Aircraft Carrier Class    34,360      29,914       3           3    -13
P-8A Poseidon Multi-mission
  Maritime Aircraft                      29,974      29,622     115         113      1

Figure 3: Cost Increases in Major DOD Acquisition Programs (GAO, 2009)
Of these 10 programs, 5 (F-22, F/A-18 E/F, Trident II, CVN 21, P-8A) experienced
cost decreases from the first full estimate to the current estimate as of the GAO
report. However, at the same time, 4 of these 5 programs had significant reductions
in the total acquisition quantity resulting in significantly higher unit costs for these
4 programs. For the four, the percentage change in unit acquisition cost ranged
from 1% to 195%. Of the other 5 programs, the total program costs increased from
19% to 45% and 3 of these 5 also had reduced quantity purchases resulting in unit
acquisition cost increases from 38% to 186%.
However, despite the general trend towards improvement, GAO also identified
numerous specific problems on a host of weapons systems such as "a laser jammer
that did not work as intended, peeling coating on ships, deficient welding, and
nonconforming parts" (GAO, 2008). In this review, GAO attributed a number of quality
and performance issues to "defense contractors' poor practices for systems engineering
activities as well as manufacturing and supplier quality problems" (GAO, 2008). GAO
identified 11 weapons systems that suffered quality problems and categorized the source
of the problems as Systems Engineering, Manufacturing, and/or Supplier Quality.
These quality problems, with their associated cost and schedule impacts, are identified in
Figure 4.
[Figure: table of 11 weapons systems with quality problems, indicating for each whether the problem originated in systems engineering, manufacturing, and/or supplier quality, with the associated cost (dollars in millions) and schedule impacts]
Figure 4: Weapons Systems Quality Problem Source (GAO, 2008)
While the table does not separate out the cost and schedule impact from each
of the three quality problem sources, two programs, the Expeditionary Fighting
Vehicle (EFV) and the V-22 Joint Services Advanced Vertical Lift Aircraft suffered
quality problems resulting from only Systems Engineering shortfalls. These two
programs alone had a net cost effect of $915 million, a four-year delay to the EFV,
and a 17-month halt of flight operations for the V-22. It can safely be
assumed that the other programs exhibited significant cost and schedule effects due
to poor implementation of systems engineering practices.
Over the years, the GAO has highlighted the benefits of early use of systems
engineering practices. In 2008, GAO stated that:
Systems engineering is a key practice that companies use to build quality into new products.
Companies translate customers' broad requirements into detailed requirements and designs,
including identifying requisite technological, software, engineering, and production
capabilities. Systems engineering also involves performing verification activities, including
testing, to confirm that the design satisfies requirements. Products borne out of a knowledge-based approach stand a significantly better chance to be delivered on time, within budget,
and with the promised capabilities. (GAO, 2008)
Additionally, GAO defined systems engineering in the same report as
A sequence of activities that translates customer needs into specific capabilities and
ultimately into a preferred design. These activities include requirements analysis,
design, and testing in order to ensure that the product's requirements are achievable
and designable given available resources, such as technologies. (GAO, 2008)
GAO further claims that poor systems engineering contributed to problems on the
specific DoD development and acquisition programs including the EFV, the Threat
Infrared Countermeasure/Common Missile Warning System, and the Joint Air-to-Surface Standoff Missile (JASSM) (GAO, 2008).
To further understand the cause of the numerous problems seen with DOD
acquisition programs, a detailed examination of three legacy programs was
conducted. The three systems selected are the Army's Future Combat System (FCS),
the Air Force, Navy, and Marine Corps F-35 Joint Strike Fighter (JSF), and the Missile
Defense Agency's Airborne Laser Test Bed (ALTB - formerly the Airborne Laser).
Legacy Projects
In this section, the Future Combat System, the F-35 Joint Strike Fighter, and
the Airborne Laser Test Bed programs are described, along with their cost, capability, and
schedule challenges, followed by a summary of the causes of these
challenges. Each of these three programs has suffered from poor systems
engineering practices, including poor technology selection, poor program
management, poor requirement definition, and poor systems architecture. Most
notably, many of the challenges encountered in these three programs relate to the
selection of immature technologies as critical components of the programs.
Future Combat System
In the mid-1990s, the US Army embarked on a major reorganization to build
a "networked" brigade-centered force structure. The concept was based on the
premise that timely information flow up and down the command structure would
allow for destruction of enemy targets prior to the enemy engaging American
troops. This assumption prompted the Army to develop a new generation of
armored vehicles that were much lighter than the current M-1A1 main battle tank
and the M-2/M-3 Bradley Infantry/Cavalry Fighting Vehicle.
An "integrated family of advanced, networked combat and sustainment
systems; unmanned ground and air vehicles; and unattended sensors and
munitions" formed the Future Combat Systems (FCS) program initiated in May of
2000 (GAO, 2009). An artist's conception of all of these subsystems is shown in
Figure 5.
Figure 5: US Army Future Combat System (FCS) Artists Conception (GAO, 2009)
When FCS development started in May of 2003, the GAO reports that
initial cost (in 2005 dollars) and schedule estimates were as shown in Table 1.
Table 1: 2003 FCS Cost Estimates (GAO, 2005)

Category                            Estimate (5/2003)
R&D Cost                            $18.6B
Procurement Cost                    $60.6B
Total Program Cost                  $79.8B
Program Unit Cost                   $5.32B
Total Quantities (brigades)         15
Acquisition Cycle Time (Months)     91
Up to 2009, surprisingly little progress was made on the FCS program
despite spending billions of dollars in R&D costs. In 2005, the GAO conducted a
review of the FCS program and reported that only "one of the FCS program's 54
critical technologies is currently mature. Overall, the program's current technology
maturity is slightly less than it was in May 2003 when the program began
development" (GAO, 2005). As development continued, cost estimates increased
significantly.
Table 2 shows three subsequent FCS program cost and schedule estimates.

Table 2: 2004 FCS Cost and Schedule Estimate (GAO, 2005; 2009; 2010)

Category                            Estimate (09/2004)  Estimate (12/2007)  Estimate (03/2010)
                                    2005 dollars        2009 dollars        2009 dollars
R&D Cost                            $20.9B              $28.8B              $29.0B
Procurement Cost                    $68.2B              $100.1B             $129.3B
Total Program Cost                  $89.8B              $129.7B             $159.3B
Program Unit Cost                   $5.99B              $8.65B              $10.62B
Total Quantities (brigades)         15                  15                  15
Acquisition Cycle Time (Months)     139                 147                 147
From the program start in 2003 until September of 2004, total program cost
increased 35% (approximately 18 billion dollars) and the expected acquisition time
increased by 48 months. Then in the following years up to 2009, the total program
cost increased further and the schedule slipped significantly. GAO reported that the
primary source of much of these problems was poor systems engineering practices,
described as follows:
The program is not appropriately applying best practices to maturing its critical
technologies. It considers technical risk acceptable as long as it can estimate that the
technologies will be demonstrated in a relevant environment before design review.
Also, it does not consistently include form or fit in technology maturation because it
views sizing the technology as an integration risk, not a technology risk. In addition,
the program could assess a technology as mature simply because it is part of another
program. (GAO, 2005)
Despite the fact that the GAO determined that some technologies were actually
less mature in 2005 than when the program started, the Army Project Manager still
expected that all technologies would be mature by 2008 (GAO, 2005). However, in
2009, the GAO reported that
Of the FCS program's 44 critical technologies, 3 are fully mature and 27 are nearing
maturity.... Since 2003, the Army has not advanced the maturity of 11 technologies.
Two others, which are central to the Army's plans to replace armor with superior
information, are now rated less mature than when the FCS program began. (GAO,
2009)
Based on these cost and schedule challenges, Secretary of Defense "Robert
Gates boldly slashed several high-profile, big-ticket weapons programs, including
the Army's $160 billion Future Combat Systems" (Caryl, 2010). Despite cancellation
of the FCS program, the Army reinitiated development of new manned ground
vehicles following the FCS cancellation to fulfill a standing requirement that the
Army has for new vehicles to replace the Bradley Infantry Fighting Vehicle.
However, in August of 2010, the US Army cancelled the solicitation for the
Ground Combat Vehicle. This decision was based on a "Red Team" analysis which
compared the current battlefield threats to the submitted vehicle proposals.
Unfortunately, the official announcement noted that "the Army determined that it
must revise the acquisition strategy to rely on mature technologies in order to
reduce significant developmental risk" (Grant, 2010). This description closely
echoes some of the challenges the Army faced in the cancelled FCS
program. While some credit must be given to the Army for realizing the problem
and cancelling the program prior to beginning development, the fact that this
problem is recurring suggests that the Army Acquisition Corps has not learned all of
the lessons from the failed FCS program.
In summary, the FCS program collapsed under never-ending technology
development programs. Systems engineering practices were never brought to bear
against the requirements to determine whether or not the current technology was
actually capable of providing the capability required. Sadly, the GAO had been
accurately reporting these shortfalls for years, but the Program Managers chose to
let the warnings fall by the wayside.
Joint Strike Fighter
The F-35 Joint Strike Fighter (JSF), shown in Figure 6, is a fifth generation
multi-role aircraft that is intended to replace legacy Air Force F-16 and A-10
aircraft, Navy F/A-18 aircraft, Marine Corps F/A-18 and AV-8B aircraft, as well as a
host of foreign aircraft, most notably the UK's Harrier fleet.
Figure 6: F-35A Joint Strike Fighter (PEO JSF)
Three different variants of the F-35 are planned, one for each service, shown in
Figure 7. The F-35A is the baseline aircraft designed for the Air Force. The F-35B is
a short take off vertical landing (STOVL) version that uses a complex series of ducts,
doors, and a lift fan to redirect engine thrust downwards to permit Marine Corps
operations on amphibious assault ships. The F-35C is a large wingspan version of
the F-35A with longer range and a strengthened structure to withstand operations
on Navy aircraft carriers.
F-35A - Conventional Take Off & Landing (CTOL): Span 35 ft; Length 50.5 ft; Wing Area 460 ft2; Internal Fuel 18,498 lb
F-35B - Short Take Off/Vertical Landing (STOVL): Span 35 ft; Length 50.5 ft; Wing Area 460 ft2; Internal Fuel 13,326 lb
F-35C - Carrier Variant (CV): Span 43 ft; Length 50.8 ft; Wing Area 620 ft2; Internal Fuel 19,624 lb

Figure 7: F-35 Variants (PEO JSF)
The Joint Strike Fighter had its origins in a number of research programs in the
1980s and 1990s. In 1997, the DoD selected Lockheed Martin and Boeing to
participate in a concept demonstration effort. Each manufacturer built several
flying demonstrators culminating in a selection of Lockheed Martin as the prime
contractor for the Joint Strike Fighter in 2001. Since then, the JSF program has faced
significant challenges as described by Michael Sullivan of the GAO:
The F-35 Lightning II, also known as the Joint Strike Fighter (JSF), is the Department of
Defense's (DOD) most costly and ambitious aircraft acquisition, seeking to simultaneously
develop and field three aircraft variants for the Air Force, Navy, Marine Corps, and eight
international partners. The JSF is critical for recapitalizing tactical air forces and will
require a long-term commitment to very large annual funding outlays. The current
estimated investment is $323 billion to develop and procure 2,457 aircraft. (Sullivan, 2010)
In the same report, the GAO laid out the schedule, cost, and acquisition quantity changes to the
JSF program, outlined in Table 3 and
Table 4. As shown in the tables, the JSF program has been replanned and
restructured several times resulting in significant cost increases, reduction in
aircraft purchases, and delays in aircraft deliveries.
Table 3: JSF Program Changes (Sullivan, 2010)

                                       Oct 2001 (system    Dec 2003       Mar 2007 (approved  FY 2011
                                       development start)  (2004 replan)  baseline)           budget
Expected quantities
  Development quantities               14                  14             15                  14
  Procurement quantities (U.S. only)   2,852               2,443          2,443               2,443
  Total quantities                     2,866               2,457          2,458               2,457
Cost estimates (then-year $ billions)
  Development                          $34.4               $44.8          $44.8               $49.3
  Procurement                          196.6               199.8          231.7               273.3
  Total program acquisition            $231.0              $244.6         $276.5              $322.6
Unit cost estimates (then-year $ millions)
  Program acquisition                  $81                 $100           $113                $131
  Average procurement                  69                  82             95                  112
Estimated delivery dates
  First operational aircraft delivery  2008                2009           2010                2010
  Initial operational capability       2010-2012           2012-2013      2012-2015           2012-2015
Table 4: JSF Schedule Changes (Sullivan, 2010)

Major milestone                                   Program of record  Program of record  February 2010
                                                  December 2006      December 2007      restructure
Development testing complete                      October 2012       October 2013       March 2015
Initial operational test and evaluation complete  October 2013       October 2014       January 2016
System development and demonstration
  phase complete                                  October 2013       October 2014       April 2016
Full-rate production decision                     October 2013       October 2014       April 2016
To date, the JSF program is still struggling to meet its new schedules for test
flights and test milestones. In September, Graham Warwick reported that
Thanks to the performance of the F-35A development jets, the JSF test program is
running well ahead of plan for the year... But that disguises the fact that STOVL testing
is well behind schedule, because of reliability issues with the F-35B test jets, with 122
flights by the end of August against a plan of 153. (Warwick, F-35's Unequal Progress,
2010)
Surprisingly, many of the reliability issues with the F-35B variant are with
seemingly simple components. Lockheed Martin CEO Bob Stevens states that "the
components that are failing are more of the things that would appear either smaller
or more ordinary like thermal cooling fans, door actuators, selected valves or
switches or components of the power system" (Wall, 2010). Further, Stevens notes
that "in some cases, we've had to remove the engine to get access to the component"
in order to conduct the repair or replacement (Wall, 2010).
Bill Sweetman, Editor in Chief of Defense Technology International and
frequent JSF critic, sarcastically notes:
You design a jet with seven medium-to-large doors that all have to open in a
combination of high airflow, vibration, noise and heat. If they don't close perfectly
after take-off, the aircraft is no longer stealthy. If one of them won't open for
transition, the jet can't recover to the carrier. Who could possibly have anticipated
problems with that? (Wall, 2010)
In fact, many of the programmatic challenges encountered in the JSF program
could have easily been foreseen by studying the history of the TFX fighter program
which resulted in the F-111 Fighter/Bomber shown in Figure 8. Similar to the JSF,
the F-111 program was initially intended to develop versions for both the Air Force
and Navy but the Navy withdrew from the program as it floundered.
In 1961, Secretary of Defense Robert S. McNamara
asked the Air Force to determine with the Army and Navy if the TFX could provide
close air support (CAS) to ground troops; air defense of the fleet; as well as interdiction of
enemy logistics - the Air Force's primary objective. Army and Navy CAS objections to
the TFX finally prevailed in May. Notwithstanding, Secretary McNamara remained
convinced that the TFX could satisfy other Navy and Air Force needs. In June he
instructed the Air Force to 'work closely' with the Navy in trying the two services'
requirements in a new, cost-effective TFX configuration. (Knaack, 1978)
Figure 8: General Dynamics F-111
As the TFX program progressed, the Air Force and Navy both ended up with
versions of aircraft that did not fully meet their requirements. In addition, the
designs continued to diverge as the Air Force and Navy versions of the aircraft
reached flight readiness. In 1968, the Navy's version, the F-111B was cancelled as
the aircraft weight did not meet requirements and the crew capsule did not have
sufficient visibility for carrier operations. (Knaack, 1978)
Similar challenges are seen in the JSF program. As the JSF has progressed,
each service is dealing with the fact that the aircraft is a combination of
compromises. Despite the fact that the F-35B STOVL version just took off on its first
flight, the rumor mill is already discussing whether or not the version may be
cancelled since the UK has now cancelled its order of STOVL aircraft. (Sweetman,
2010)
In summary, the JSF program is suffering from problems occasioned by poor
systems architecture and unrealistic project management. By attempting to merge
the requirements of the three services, the DoD has demanded the construction of
an aircraft that is overly complex whose technology has not matured sufficiently for
high reliability. Coupled with this, the contractor has exhibited poor management of
the program resulting in the significant cost overruns and schedule slips. Despite
PEO JSF's claim that "the F-35 Lightning II Program is the Department of Defense's
focal point for defining affordable next generation strike aircraft weapon systems"
(PEO JSF), the current program leaves a lot to be desired. Unfortunately, the
prognosis is not improving, as media reports in November 2010 state that "the
$382 billion stealth plane might get pushed back as much as three years, with an
added $5 billion price tag." (Ackerman, 2010)
Airborne Laser (ABL)/Airborne Laser Test Bed (ALTB)
The Airborne Laser is a modified Boeing 747 aircraft carrying several high
power lasers intended to shoot-down ballistic missiles in the boost-phase portion of
flight. The ABL "employs a battle management subsystem to plan and execute
engagements, a high-energy chemical laser to rupture the fuel tanks of enemy
missiles, and a beam control/fire control subsystem to focus the high-energy laser
beam on the target." (GAO, 2010)
Figure 9: Airborne Laser Aircraft (GAO, 2010)
In concept, the ABL is intended to orbit a given area to detect and then
destroy ballistic missiles during powered flight. Because of the short duration of a
ballistic missile's powered flight and the ABL's range of 50-100 miles, the ABL
would need to orbit within close proximity to a launch site to await the launch of
missiles in order to be in position to destroy them during powered flight
(Schachtman, Raygun 747 Botches Another Test, 2010). This requirement proved to
be the program's undoing.
The Missile Defense Agency describes the ABL functionality during a test as
follows:
Within seconds, the Airborne Laser Test Bed [ALTB] used on-board sensors to detect
the boosting missile and used a low-energy laser to track the target. The ALTB then
fired a second low-energy laser to measure and compensate for atmospheric
disturbance. Finally, the ALTB fired its megawatt-class High Energy Laser, heating
the boosting ballistic missile to critical structural failure. The entire engagement
occurred within two minutes of the target missile launch, while its rocket motors
were still thrusting. (Missile Defense Agency, 2010)
In 1996, the Air Force and Missile Defense Agency planned to build two
prototypes and then have the Air Force purchase 5 operational aircraft with a
development cost of $2.5B. (Duffy, 2006) However, by 2006, costs had risen to
$7.3B causing further scrutiny of the program. (Schachtman, Laser Jet's Toxic
Interior, 2006).
In 2009, Secretary of Defense Gates reviewed the MDA portfolio and made the
following statement, highlighting one of the shortfalls of the ABL concept.
"Idon't know anybody at the Department of Defense who thinks that this program
should, or would, ever be operationally deployed," Gates told Congress last year. "The
reality is that you would need a laser something like 20 to 30 times more powerful
than the chemical laser in the plane right now to be able to get any distance from the
launch site to fire.
"So, right now the [jet] would have to orbit inside the borders of Iran in order to be
30
able to try and use its laser to shoot down that missile in the boost phase. And if you
were to operationalize this you would be looking at 10 to 20 747s, at a billion-and-ahalf dollars apiece, and $100 million a year to operate. And there's nobody in uniform
that I know who believes that this is a workable concept." (Schachtman, Video: Laser
Jet Blasts Ballistic Missile in Landmark Test, 2010)
Similar to FCS and JSF, the ABL program was envisioned prior to the
maturation of the requisite technologies to make it operationally useful. The GAO
stated in 2010 that:
None of ABL's seven critical technologies are fully mature. Program officials assessed
one of ABL's seven critical technologies - managing the high-power beam - as fully
mature, but the technology has not yet been demonstrated in a flight environment.
The remaining six technologies - the six-module laser, missile tracking, atmospheric
compensation, transmissive optics, optical coatings, and jitter control - were assessed
as nearing maturity. (GAO, 2010)
In the same report, the GAO reported that "the program currently estimates that the
cost of the ABL through the first lethality demonstration is nearly $5.1 billion,
almost five times the approximate $1 billion estimated for the original contract in
1996." (GAO, 2010)
Based on the operational limitations, technology maturity levels, and
program challenges, the DoD decided to halt the procurement of further ABL aircraft
after the purchase of the initial YAL-1A prototype aircraft. Instead, the DoD has
decided to utilize the YAL-1A aircraft as a testbed for directed energy applications
pertaining to missile defense resulting in the new Airborne Laser Test Bed (ALTB)
designation. Since that time, the ALTB has had mixed success in its ability to track
and destroy boosting ballistic missiles in test events. Since the decision to make the
ABL a testbed system, the aircraft has successfully destroyed only one of three
ballistic missile targets. (Schachtman, Raygun 747 Botches Another Test, 2010).
As with both the FCS and JSF program, the ABL/ALTB program exhibited
poor program management with a high reliance on immature technology that did
not advance at a rate anywhere close to that anticipated at the beginning of the
program. Despite nearly 14 years of development, the GAO still reported that zero
of the seven critical technologies were mature by 2010. This suggests that the
original program management either did not have a sufficient understanding of the
required effort or severely mismanaged the program during those 14 years, resulting
in significant schedule delays and cost overruns.
Legacy Program Summary
In all three of these cases, a significant source of the programmatic
challenges is reliance on immature technologies that cost much more than
expected and take much longer than expected to develop. This challenge calls into
question the selection of these particular technologies as key parts in each program.
The GAO summarizes the status of DOD program delays as follows:
In addition to delivering fewer quantities than expected, DOD continues to experience
delays in delivering new or modified weapon systems to the warfighter as promised.
Acquisition delays can lead to loss of program credibility with stakeholders, increased
acquisition costs, new systems not being available to meet the needs of warfighters
during combat operations, and the continued use of less capable systems with
questionable reliability and high operating costs. The average delay in delivering initial
capabilities to the warfighter increased to 22 months for programs in DOD's 2008
portfolio, compared with 21 months for programs in the 2007 portfolio (see table 1).
Only 28 percent of DOD's major defense acquisition programs currently estimate that
they will deliver on time or ahead of schedule, while just under one-half report they will
have a delay of 1 year or more in delivery of an initial operational capability (see Figure
10). (GAO, 2009)
[Figure: pie chart of major programs by IOC delay - 20 programs (28%) planning to achieve IOC on time or less than 1 month late; 17 programs 1 to 12 months late; 13 programs 13 to 24 months late; 12 programs 25 to 48 months late; 10 programs more than 48 months late. Note: initial operational capability (IOC) is generally achieved when some units or organizations scheduled to receive a system have received it and have the ability to employ and maintain it.]
Figure 10: DOD Schedule Delays as of December 2007 (GAO, 2009)
As mentioned by the GAO, the cost overruns have had a significant effect on
the final purchase quantities of many of DODs major programs. These effects are
summarized in Table 5.
Table 5: Changes in Cost and Acquisition Quantities (GAO, 2009)

(fiscal year 2009 dollars in millions)

Program                                 Total cost            Total quantity       Acquisition unit
                                        First full  Current   First full  Current  cost % change
Joint Strike Fighter                    206,410     244,772   2,866       2,456     38
Future Combat System                     89,776     129,731      15          15     45
Virginia Class Submarine                 58,378      81,556      30          30     40
F-22A Raptor                             88,134      73,723     648         184    195
C-17 Globemaster III                     51,733      73,571     210         190     57
V-22 Joint Services Advanced
  Vertical Lift Aircraft                 38,726      55,544     913         458    186
F/A-18E/F Super Hornet                   78,925      51,787   1,000         493     33
Trident II Missile                       49,939      49,614     845         561     50
CVN 21 Nuclear Aircraft Carrier Class    34,360      29,914       3           3    -13
P-8A Poseidon Multi-mission
  Maritime Aircraft                      29,974      29,622     115         113      1
As shown, of the 10 programs, only one showed a reduction in per unit cost and
eight of the programs showed a unit cost increase of greater than 30%. Of perhaps
greater significance, the planned acquisition quantity declined in many of these
programs, most significantly in the case of the F-22A Raptor where the acquired
quantity declined by 70%. One must assume that originally the Air Force conducted
an analysis to determine that 648 Raptors were required to accomplish the air
superiority mission required by DOD. Instead, because of rising costs, the Air Force
was only able to purchase 184 of the aircraft.
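The unit cost arithmetic driving this outcome is simple; the short sketch below reproduces the F-22A figures from Table 5 (the function and its use here are illustrative, not part of the GAO analysis):

    def unit_cost(total_cost_millions, quantity):
        # Acquisition unit cost = total program cost / quantity acquired.
        return total_cost_millions / quantity

    planned = unit_cost(88_134, 648)   # first full estimate: ~$136M per aircraft
    actual = unit_cost(73_723, 184)    # current estimate: ~$401M per aircraft

    growth = (actual / planned - 1) * 100
    print(f"F-22A unit acquisition cost growth: {growth:.0f}%")   # ~195%, as in Table 5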
However, all is not doom and gloom in DOD acquisition. In several of the
more recent programs, the DOD appears to have taken steps to reduce the likelihood
of cost overruns and schedule slips. These changes are part of a DOD level plan to
alter several aspects of their acquisition strategy.
In December 2008, DOD revised its policy for major defense acquisition programs to
place more emphasis on acquiring knowledge about requirements, technology, and
design before programs start and maintaining discipline once they begin. The policy
recommends holding early systems engineering reviews; includes a requirement for
early prototyping; and establishes review boards to monitor requirements changes - all positive steps. (GAO, 2009)
One of the best examples of this new approach is the Joint Light Tactical
Vehicle (JLTV). This program is a joint Army/Marine Corps effort to design and
acquire a new tactical vehicle to replace the venerable High Mobility Multipurpose Wheeled
Vehicle (HMMWV). As part of this program, the DOD selected three vendor teams
for Technology Development (TD) contracts to build representative prototypes,
shown in Figure 11, that are currently undergoing evaluation at the Army's
Aberdeen Proving Ground. Upon completion of the TD phase, the Army intends to
contract with two vendor teams for the engineering, manufacturing, and
development (EMD) phase.
Figure 11: JLTV Prototypes (PM JLTV, 2010)
Even the GAO, normally known for its critical look at DOD acquisition programs,
had this to say about the JLTV program: "At this point, it is a well-structured
program with desirable features like a competitive technology development phase"
(GAO, 2011).
While this approach does reduce the risk to the program by developing
critical technologies prior to the EMD phase, the JLTV program is still suffering from
challenges. In October, the DOD Buzz reported "the Marines, who have voiced
concerns for some time about the program, appear ready to abandon or seriously
curtail their purchase of the Joint Light Tactical Vehicle (JLTV)... The Army has
already voiced concerns about the program's rising price and may substantially
scale down its buy to around 50,000 vehicles" (Clark, JLTV Sinking, EFV Wobbly,
2010). According to the Congressional Research Service, the DOD was initially
planning on replacing approximately 160,000 vehicles with JLTVs (Feickert, 2010).
The GAO's report on Tactical Wheeled Vehicles supports this perspective:
JLTV's affordability will be a key determination at the Milestone B decision point. The
services and DOD will have to balance the cost of the JLTV against other service needs.
The cost could determine whether to continue with the program as planned, look at other
ways of meeting the requirements, or buy fewer vehicles. (GAO, 2011)
While this new approach does show some promise, it is not perfect.
Additional steps are needed to improve the DOD's acquisition capabilities to select
technologies that will be developed on-time and on-budget to allow systems to
reach the Warfighter on schedule. Without improvement to the DOD acquisition
cycle, many programs are doomed to continue the death spiral where increasing
costs lead to reduced purchase quantities, which in turn lead to increasing unit
costs.
The next chapter describes a proposed methodology to better select
technologies for inclusion in research, development, and acquisition programs. This
methodology is intended to reduce the likelihood that a technology selected for a
program will later derail the program by proving incapable of meeting the
requirements that led to the technology's selection.
Chapter 3- Proposed Technology Selection Approach
The GAO has frequently recommended the use of systems engineering
processes in DOD acquisitions. As exemplified by the JLTV, the use of some of these
practices has helped improve the DOD acquisition process. However, as described
previously, even the JLTV program, described by GAO as "well-structured," is still
struggling with cost challenges that may result in the familiar death spiral.
GAO describes the benefits of systems engineering as follows:
Early system engineering has proven helpful to programs that have employed it. Early
systems engineering, ideally beginning before a program is initiated and a business case is
set, is critical to ensuring that a product's requirements are achievable and designable
given available resources. Before starting development, programs should hold systems
engineering events such as the system requirements review, system functional review, and
preliminary design review to ensure that requirements are defined and feasible and that the
proposed design can meet those requirements within cost, schedule, and other system
constraints. As evidence of the benefits of early systems engineering, we found that the
programs in our assessment that conducted these systems engineering events prior to
development start experienced, on average, over 20 percent less research and development
cost growth than programs that conducted these reviews after development start. These
programs also often experienced a shorter delay in delivery of initial operational
capability. On average, the programs that conducted a system requirements review or a
system functional review prior to development start experienced delays in the delivery of
initial operational capabilities that were, respectively, 8 and 9 months shorter than
programs that held these reviews after development start. (GAO, 2009)
In contrast, GAO sums up the problems occasioned by ignoring systems
engineering practices below:
For example, in March 2007 we reported that only 16 percent of the 62 DOD weapon
system programs we reviewed had mature technologies to meet requirements at the start of
development. The prime contractors on these programs ignored best systems engineering
practices and relied on immature technologies that carry significant unknowns about
whether they are ready for integration into a product. (GAO, 2008)
GAO quantifies the benefits of systems engineering as shown in Figure 12. GAO
defines the System Requirements Review, System Design Review, and Preliminary
Design Review events as follows:
System Requirements Review (SRR) - "ensure that the system under review can proceed
into system development and that all system and performance requirements are consistent
with cost, schedule, risk, and other system constraints" (GAO, 2009)
System Functional Review (SFR) - "ensure that the system can proceed into preliminary
design and that all system and functional performance requirements are defined and are
consistent with cost, schedule, risk, and other system constraints" (GAO, 2009)
Preliminary Design Review (PDR) - "ensure that the system under review can proceed
into detailed design, and can meet the stated performance requirements within cost,
schedule, risk, and other system constraints" (GAO, 2009).
Figure 12: Average RDT&E Cost Growth in GAO Study (GAO, 2009) (average RDT&E cost growth, in percent, for programs that held the SRR, SFR, or PDR review before development start versus after development start)
Note that the sample size for each of the three categories shown in Figure 12 is 31,
23, and 36 for SRR, SFR, and PDR respectively.
Even programs that have conducted some of the key systems engineering practices still suffer from significant cost overruns. For example, programs that conducted a system functional review (SFR) prior to program start still experienced more than 25% growth in RDT&E costs during the life of the program.
While GAO calls for the use of systems engineering to drive down system
development costs, GAO's definition of systems engineering is fairly limited. GAO
concentrates primarily on requirements definition and proper testing activities
described as translating "customers' broad requirements into detailed requirements
and designs, including identifying requisite technological, software, engineering, and
production capabilities" (GAO, 2008).
However, modern systems engineering includes many other techniques and
capabilities that can be brought to bear on the problems of program cost overruns
and schedule delays. The International Council on Systems Engineering (INCOSE)
defines modern systems engineering as
Activities involving the technologies, processes, and systems management approaches
needed for: definition of systems, including identification of user requirements and
technological specifications; development of systems, including conceptual
architectures, tradeoff of design concepts, configuration management during system
development, integration of new systems with legacy systems, and integrated product
and process development; and deployment of systems, including operational test and
evaluation, maintenance over an extended lifecycle, and reengineering. (International
Council on Systems Engineering, 2009)
The difference between the GAO's description of systems engineering and INCOSE's definition allows for significant steps in improving DOD acquisition by incorporating the practices described by INCOSE. Instead of simply using systems engineering for appropriate requirements analysis, systems engineering also promotes lowering acquisition costs by conducting design tradeoffs, systems architecting, and probabilistic simulations, among other methodologies. The use of these methodologies can enable further improvements over the basic systems engineering defined by the GAO.
As demonstrated in Chapter 2, one of the significant problems encountered
in DOD product development programs is the selection of technologies that do not
seem to bring the expected capabilities to the overall program. This is the case both in programs begun from scratch and in technology insertion programs, where a new technology is inserted into a legacy system such as in the Air Force's C-5 re-engining program. Utilizing the systems engineering methodologies mentioned
by INCOSE, a more rigorous approach can be applied to the technology selection
process.
In 1961, Robert McNamara and his aides introduced "systems analysis" to
the Department of Defense. "Systems analysis centered on intensive study of
problems and options, with examinations of costs, benefits, and risks of potential
decisions" (Chivers, 2010). One of the first examples of the use of systems analysis
in DOD was the selection of the M-16 rifle. Unfortunately for the reputation of systems analysis, the effort was poorly executed and heavily influenced by politics, resulting in the fielding of an inferior rifle. A properly executed systems analysis, however, would have clearly guided decision makers to the optimal decision based on the particular variables considered in the analysis. This approach has been widely utilized since the 1960s and has become part of the regular DOD analysis of weapons systems. Unfortunately, these analyses are often conducted at the macro level, resulting in missed opportunities at the micro level.
One recent example is in the ongoing Air Force KC-X competition for new
refueling tankers. As part of the criteria used in the 2008 evaluation, the Air Force utilized a modeling tool to support its decision-making process.
The Combined Mating and Ranging Planning System (Cmarps) was designed for the
Strategic Air Command in the 1980s and is now used by planners in Air Mobility
Command. It helps operators assess how many tankers are required for a variety of
missions, where they can be based and how many receivers -- fighters and intelligence
aircraft, for example -- can be serviced by the available refuelers. It is one of various
modeling systems used by the Air Force. (Butler, 2008)
These types of tools may be utilized to determine the required number of aircraft to fulfill a mission or how many submarines the Navy may need, but they do not serve to identify the particular technologies that could most cost-effectively improve the performance of a particular weapon system.
However, using some of the methodologies of systems engineering, this same
approach can be applied at the micro level and used to analyze the effects of
different technologies on a weapons system. Returning to the M-16 program, when
the rifle was first introduced it suffered from serious reliability problems caused by a number of defects that resulted in extraction failures jamming the weapon (Chivers, 2010). A micro-level analysis could have been used to determine the cost-benefit ratios of all the different possible solutions such as chrome-plated receivers,
heavier buffers, better corrosion protection, and so on.
A methodology can be devised to utilize the "systems analysis" approach at
the technology level for both new weapons systems and legacy systems that are
being upgraded. This capability will become increasingly important with the
growing number of programs that use spiral development approaches that push out
systems with the intention of upgrading them in the future.
To develop such an analysis methodology, a system level model of the
particular system can be created based on known or assumed performance metrics
of individual subcomponents. This model should determine the system
performance across the primary dimensions of performance. The inputs to the
model should correspond to the different components or technologies included in
the system. After creating a system model with all of the relevant parameters, other
systems engineering practices can be brought to bear to enhance the utility of the model.
Given the ready availability of cheap computing power, Monte Carlo techniques
can be utilized to conduct a probabilistic analysis to determine the likely outcomes
of each of the possible technologies considered for insertion. By combining cost
estimates of developing and introducing each technology into a system, one can
then determine the cost-benefit ratio for a given technology.
Using the M-16 jamming problems as an example, a model could be created
that determines the probability of a failure to extract as the output. The inputs
could include the amount of gas from each round fired traveling back to the bolt
carrier, the probability of a round having a misshapen cartridge, the probability of
pitting in the barrel, and so on. After conducting a baseline simulation, the model
could then be optimized to reflect the effects of reducing the fouling from the propellant gas, improving the cartridge manufacturing, or chrome-plating the barrels to reduce pitting. Accompanying these optimizations would be estimated costs for
these design improvements. By determining the change in the likelihood of a jam,
one could decide which, or which combination, of these optimization options is the
most cost effective.
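To make the mechanics concrete, a minimal sketch of such a micro-level Monte Carlo analysis follows, written here in Python. Every failure driver, probability, and dollar figure in it is a hypothetical placeholder invented for illustration; none of it is actual M-16 data.

import random

# Hypothetical per-round failure drivers (illustrative only, not actual M-16 data).
BASELINE = {"fouling": 0.04, "bad_cartridge": 0.02, "pitting": 0.03}

# Hypothetical fixes: the driver affected, its reduced probability, and an
# assumed development cost in dollars.
FIXES = {
    "cleaner_propellant":   ("fouling", 0.010, 2_000_000),
    "better_cartridges":    ("bad_cartridge", 0.005, 1_000_000),
    "chrome_plated_barrel": ("pitting", 0.005, 3_000_000),
}

def jam_probability(drivers, rounds=20_000):
    """Monte Carlo estimate of the per-round jam probability, treating the
    failure modes as independent causes of a failure to extract."""
    jams = sum(
        any(random.random() < p for p in drivers.values())
        for _ in range(rounds)
    )
    return jams / rounds

base = jam_probability(BASELINE)
for fix, (driver, new_p, cost) in FIXES.items():
    improvement = base - jam_probability({**BASELINE, driver: new_p})
    if improvement > 0:
        # Cost-benefit ratio: dollars per percentage point of jam-rate reduction.
        print(f"{fix}: ${cost / (improvement * 100):,.0f} per percentage point")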
While outside of the scope of this thesis, it bears mentioning that accurate
cost estimates are critical to any attempt to quantify the cost-benefit ratio of any
technology selection. Many volumes could be written on estimating development
costs and significant research has been done on why estimates are often inaccurate.
A prime example of this is the recent controversies regarding the estimated cost of
the Joint Strike Fighter. The JSF program has resulted in numerous cost studies and
estimates performed by Lockheed Martin (the prime contractor), the Program
Executive Office (PEO) that manages the program, and the Department of Defense's
Joint Estimating Team (JET). The JET was requested to create an independent cost
estimate because Pentagon leadership was losing confidence in both Lockheed
Martin's and the PEO's ability to accurately predict development costs (Warwick,
2009). While cost estimating is controversial and oftentimes wrong, for the
purposes of this simulation, it will be assumed that all cost information is accurate
and that the selected development costs are correct.
The next chapter of this thesis presents such an example applied to the US
Army's Active Protection System (APS). The APS is an outgrowth of one of the FCS
initiatives. In essence, APS is a suite of technologies able to detect, track, and then shoot down incoming munitions aimed at a tactical vehicle. A
graphical illustration of one company's system is shown in Figure 13.
Figure 13: Artist's Conception of Iron Curtain APS (Crane, 2009)
Several countries have developed various versions of APS with varying levels of
capability. Some systems are focused exclusively on rocket-propelled grenades (RPGs) while others attempt full-spectrum coverage against all types of threats such as RPGs, anti-tank guided missiles, and kinetic energy rounds. The
United States has begun efforts at developing its own version of APS managed by
Program Executive Office (PEO) Integration:
The FCS Active Protection System is being developed by Raytheon. Raytheon won the
contract from the FCS program after participating in an open competition that involved
other key competitors and competitor systems. A team of 21 technical experts from
various U.S. government agencies, the Army and private-sector industry evaluated
competing Active Protection Systems. According to the Government Accountability
Office, the team reached "a clear consensus... [that] Raytheon's Quick-Kill system was
the best alternative." Army officials said that one key advantage of the Raytheon APS is
its vertical launch system, which protects against top-attack rounds. They said this
gives Soldiers true 360-degree hemispherical protection. (Guardiano, 2008)
However, to date, Raytheon appears to be struggling to meet the required timelines
for fielding on schedule.
Despite APS' initial association with the Army's failed Future Combat
Systems, its development is still eagerly awaited by the Army to include in the new
Ground Combat Vehicle. "[GEN] Chiarelli said the new vehicle (GCV) would be able
to incorporate some kind of active protection - the ability to detect and shoot
down incoming rocket-propelled grenades or anti-tank guided missiles." (Hodge,
2010)
The following chapter describes the application of the proposed
methodology to the Active Protection System. The methodology provides a
framework to a decision maker for selecting technologies to be included in the
program. The selection is based on system analysis applied at the micro-level to
determine which technologies have the most benefit to the performance metrics
chosen by the decision maker.
Chapter 4- Methodology
Simulation Description
A technology assessment cost model of the Active Protection System (APS)
was created in MathWorks MATLAB R2009a. The model uses several cost models to determine the effects of each possible technology. As part of the model, the MATLAB Statistics Toolbox determines a launch range and flight velocity for the threat munition based on normal distributions, searches for the incoming round with an increasing probability of detection as the munition approaches the vehicle, and then determines whether or not to launch a counter-measure. After modeling the
baseline case, a sensitivity analysis was conducted to determine the effects on total
system effectiveness as a function of changing several of the input variables. In
combination with these variables, a development cost was associated with changes
to the variables. Instead of determining the system effectiveness of the selected
variables, the system effectiveness was determined as a function of development
cost. The following portions in this chapter detail construction of the model and
how the technology cost analysis model can be applied to a real world situation.
Technology Cost Analysis Modeling
The Technology Cost Analysis Model is formed by the following steps:
1- Determine measures of performance for technology comparison
2- Select technologies to be considered
3- Develop cost models for technologies to be analyzed
4- Develop weapons system or sub-system model
5- Conduct Monte Carlo analysis of weapons system performance given
technology cost models
6- Compare measures of performance across different technologies
1. Determine measures of performance
This step allows a decision maker to select the variables that are of importance when choosing one technology over another. In the APS case, the two measures of performance considered are system effectiveness, defined by the percentage of threat munitions destroyed, and development cost. Other parameters that could be considered are weight, size, and reliability, among others. Additionally, some programs will benefit from considering acquisition costs or life-cycle costs instead of just the development cost.
2. Select Technologies
For a given program, the decision maker must determine which technologies should
be included for analysis. In a real-world case, this step may require significant
research to select the most relevant technologies for consideration. Technologies
may be excluded for maturity, cost, or other reasons. For this thesis, three different
variables are considered: 1-the time it takes for the APS system to track an incoming
threat (tracking time), 2- the time it takes to launch a counter-measure munition
after the firing solution is determined (launch delay), and 3 - the minimum range at
which a counter-measure can function (CM min range). Initially, the counter-measure velocity was used instead of the tracking time. However, as the baseline analysis case was developed, it was discovered that the velocity had virtually no impact on overall system performance, and this variable was removed from consideration for this analysis. Similarly, it was discovered that the CM min range had limited effects on the overall system performance, so it was not considered for technology comparison. For the remaining two technologies, tracking time and launch delay, two notional technologies will be considered for each.
3. Develop Cost Models
Cost models must be developed for each technology under consideration. These
models should represent the best assessment of the required funding necessary to
achieve a given level of performance for the variable that they affect. For the APS
analysis, the two sets of technologies for tracking time and launch delay both impact
how fast the system is able to react so the cost models determine the tracking time
or launch delay as a function of development cost. The models for the tracking time are shown in Figure 14; the cost models for launch delay and CM Minimum Range are shown later.
Figure 14: Tracking Time Development Cost Models (development cost in dollars versus tracking time duration in seconds for cost models Cost 1 and Cost 2)
The two cost models are intended to represent two different technology cases. The first case, Cost 1, represents a technology whose capability becomes limited as its performance approaches the upper bound; achieving further gains requires stretching the technology to its limits. This case is similar to increasing the performance of a microprocessor: a marginal increase in performance can be made without substantive changes to the chip architecture, but as performance increases further, the current architecture reaches a limit. The second case, Cost 2, represents a technology that has a higher initial development cost but, once working, can be developed further without significantly stretching the technology. This again could be exemplified by the microprocessor example, where a vendor may decide to change the manufacturing process to increase performance. In such a case the initial development cost is higher, but once begun, the incremental cost of increasing performance is lower.
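As an illustration, the two notional cases might be encoded as simple cost functions of the tracking time, sketched below in Python. The curve shapes, scale factors, and crossover behavior are assumptions chosen to mimic the qualitative behavior of Figure 14; they are not the actual curves used in the model.

def cost_stretch(t, t_limit=0.1, t_base=1.0, scale=5e6):
    """Cost 1: an existing technology whose cost rises steeply as the
    tracking time t (seconds) is pushed toward its practical limit."""
    # Hyperbolic growth near t_limit (assumed shape); zero cost at the baseline.
    return scale * (1.0 / (t - t_limit + 0.05) - 1.0 / (t_base - t_limit + 0.05))

def cost_new(t, entry=40e6, slope=30e6, t_base=1.0):
    """Cost 2: a new technology with a large up-front investment followed
    by a gentler, roughly linear marginal cost per second of improvement."""
    return entry + slope * (t_base - t)

# The curves cross near the limit: stretching the existing technology is
# cheaper for modest gains, while the new technology wins for aggressive targets.
for t in (0.8, 0.5, 0.2, 0.12):
    print(f"tracking time {t:.2f} s: stretch ${cost_stretch(t):,.0f}, new ${cost_new(t):,.0f}")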
Similarly, the cost models for the launch time delay are shown in Figure 15.
Figure 15: Launch Time Delay Development Cost Models (development cost in dollars versus launch time delay in seconds for cost models Cost 1 and Cost 2)
Finally, the cost model for the CM Minimum Range is shown in Figure 16. Because the baseline case showed that changing the CM Minimum Range value had virtually no effect on overall system performance, only one cost model was used.
Figure 16: CM Minimum Range Development Cost Model (development cost in dollars versus CM minimum range in meters)
4. Develop Weapons System or Sub-System Model
The decision maker then needs to have a system or sub-system model created for
the weapons system to be analyzed. The complexity of the models can vary
significantly depending on the weapons system in question. For this analysis, a
simplified system model of the Active Protection System was constructed. The most significant simplification was that the probability of intercept, assuming a counter-measure was launched, was represented by a constant instead of being computed from all of the complexities of a live engagement between an incoming threat munition and the counter-munition. In this particular model, this simplification is appropriate given that the technologies under consideration do not directly impact the intercept engagement; instead, they impact the likelihood of counter-measure launch.
The system level model of the APS system was developed to simulate its
effectiveness against a variety of threat devices. In order to keep the simulation and
resulting data releasable to the public, all input parameters are either based on
documents in the public domain or are notional. The model assumes that there are
four classes of threat munitions and utilizes one example of each class. These four
classes are Anti-Tank Guided Missiles (ATGM), rocket propelled grenades (RPG),
armor-piercing fin-stabilized, discarding sabot (APFSDS) or kinetic energy
penetrator, and high-explosive anti-tank round (HEAT). The specific munitions
selected for this analysis are:
ATGM - AT-15
RPG - RPG-7
APFSDS - 3BM42
HEAT - 3BK14M
The input specifications for each of these munition types are shown in Table 6. Note that the maximum range and average velocity are taken from the sources listed; all other values are notional. Using a random number generator, each threat munition was assigned a given percentage of the likely threat mix. These percentages are listed in the second column of Table 6, such that out of 100 likely engagements, the AT-15 would be encountered 23 times, the RPG-7 40 times, and so on.
Table 6: Threat Munition Specifications

Threat | Percent | Min. Range (m) | Max. Range (m) | Avg. Launch Range (m) | Range Std Dev (m) | Avg. Velocity (m/s) | Vel. Std Dev (m/s)
AT-15 (Pike, 2006) | 23.05 | 250 | 6000 | 3600 | 900 | 400 | 20
RPG-7 (South African Army, 2008) | 40.33 | 50 | 500 | 300 | 75 | 300 | 15
3BM42 APFSDS (Jane's, 2010) | 22.63 | 400 | 3800 | 2280 | 580 | 1700 | 85
3BK14M HEAT (Jane's, 2010) | 13.99 | 350 | 1500 | 900 | 225 | 905 | 45
The simulation then uses the Statistics Toolbox to determine the likelihood of a
successful engagement of the APS system using Monte Carlo methods according to
the following steps.
I. Determine what type of threat is being launched at the host vehicle according to the percentages in Table 6.
II. Determine the launch range and the threat munition velocity via normal distributions with input values from Table 6.
III. Calculate the probability that the APS tracking radar detects the incoming round according to the likelihood shown in Figure 17. The radar is assumed to have a sampling rate of 5 Hz; every 0.2 seconds, the system is given the opportunity to detect the incoming round based on the probability below.
Figure 17: APS Model System Probability of Detection (probability of detection versus range to incoming munition in meters)
IV. Once the round is detected, the system determines a firing solution during
the tracking time. After this time has elapsed, the system decides whether or
not to launch a countermeasure based on the calculated intercept position
which accounts for the threat munition velocity and the counter-measure
munition velocity. If the calculated intercept position is less than the CM
minimum range or greater than the CM maximum range, the system will elect
not to fire a counter-measure munition.
V. Once the system elects to launch a counter-measure, there is a time delay
from the time the launch command is issued to when the counter-measure is
in flight towards the incoming threat round.
VI. For the purposes of this simulation, it is assumed that the Counter-Measure
will have a 70% probability of intercept of the incoming round. Therefore,
for this simulation, the highest system effectiveness possible is 70%.
A summary of the input variables in the model is listed below. Note that the variables that were optimized in the simulation are marked with an asterisk.

Probability of defeat (assuming CM launch): Pdefeat = 0.7
*CM minimum range: cmmin = 6 to 40 meters
CM maximum range: cmmax = 850 meters
*CM average velocity: cmvel = 1000 to 2500 meters/second
CM velocity standard deviation: cmvelstd = 10 meters/second
*Tracking time duration: tracktime = 1 to 0.1 seconds
*Launch time delay: launchdelay = 1.05 to 0.15 seconds
Radar sample rate: samplerate = 5 Hz
Probability of detection: Pdet = a function of the range to the incoming munition (Figure 17)
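A condensed sketch of this engagement loop is shown below. The original model was built in MATLAB with the Statistics Toolbox; this Python analogue is an assumption-laden stand-in: it uses the Table 6 threat data and the baseline input values above, substitutes a notional linear ramp for the Figure 17 detection curve, and assumes the system simply waits for a too-distant intercept point to close into the engagement envelope.

import random

# Threat data from Table 6: share of threat mix, average launch range (m),
# range std dev (m), average velocity (m/s), velocity std dev (m/s).
THREATS = {
    "AT-15":  (0.2305, 3600.0, 900.0, 400.0, 20.0),
    "RPG-7":  (0.4033, 300.0, 75.0, 300.0, 15.0),
    "APFSDS": (0.2263, 2280.0, 580.0, 1700.0, 85.0),
    "HEAT":   (0.1399, 900.0, 225.0, 905.0, 45.0),
}

P_DEFEAT = 0.7                # probability of intercept given a CM launch (step VI)
CM_MIN, CM_MAX = 40.0, 850.0  # counter-measure engagement envelope (m)
CM_VEL = 1000.0               # baseline CM average velocity (m/s)
TRACK_TIME = 1.0              # baseline tracking time (s)
LAUNCH_DELAY = 1.0            # baseline launch time delay (s)
SAMPLE_DT = 0.2               # radar sample interval for a 5 Hz rate (s)

def p_detect(range_m):
    """Stand-in for the Figure 17 detection curve: detection grows more
    likely as the round closes (assumed linear ramp inside 4000 m)."""
    return min(1.0, max(0.0, (4000.0 - range_m) / 4000.0))

def one_engagement():
    # Step I: draw a threat class according to its share of the threat mix.
    name = random.choices(list(THREATS), weights=[t[0] for t in THREATS.values()])[0]
    _, r_avg, r_std, v_avg, v_std = THREATS[name]
    # Step II: launch range and velocity drawn from normal distributions.
    rng = random.gauss(r_avg, r_std)
    vel = max(1.0, random.gauss(v_avg, v_std))
    # Step III: the radar samples every 0.2 s until detection or impact.
    while rng > 0:
        if random.random() < p_detect(rng):
            break
        rng -= vel * SAMPLE_DT
    else:
        return "no_launch"  # the round arrived undetected
    # Steps IV-V: distance flown during tracking and launch delay, then the
    # predicted intercept point accounting for both closing speeds.
    rng -= vel * (TRACK_TIME + LAUNCH_DELAY)
    intercept = rng * CM_VEL / (CM_VEL + vel)
    # A too-distant intercept can be waited out (assumed behavior), so only
    # an intercept point inside the minimum range forces a no-launch decision.
    if intercept < CM_MIN:
        return "no_launch"
    # Step VI: a launched counter-measure intercepts with 70% probability.
    return "intercept" if random.random() < P_DEFEAT else "miss"

results = [one_engagement() for _ in range(2000)]
print("system effectiveness:", results.count("intercept") / 2000)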
Initial trial runs were conducted to determine the number of iterations
needed for the results to converge. Several truncated simulations were conducted
with run numbers from 1000 to 2500 at 500 run increments for the first eight
combinations of variables. Each truncated simulation was conducted three times
and the standard deviation for the overall effectiveness was calculated. The
standard deviation is plotted as a function of the number of runs shown in Figure
18.
Figure 18: Convergence of Monte Carlo Simulations (standard deviation of overall effectiveness versus number of runs)
Based on the convergence simulations, 2000 runs are sufficient for convergence and
the following simulations utilize 2000 runs for each combination of variables.
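A sketch of that convergence check, reusing the hypothetical one_engagement function from the earlier sketch, might look like this:

import statistics

def effectiveness(runs):
    """Overall effectiveness from one truncated simulation, using the
    hypothetical one_engagement function sketched above."""
    results = [one_engagement() for _ in range(runs)]
    return results.count("intercept") / runs

for runs in (1000, 1500, 2000, 2500):
    # Three repetitions per run count, as in the convergence study, and the
    # standard deviation of the overall effectiveness across repetitions.
    reps = [effectiveness(runs) for _ in range(3)]
    print(runs, "runs -> std dev", round(statistics.stdev(reps), 4))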
Given these previously described inputs, a baseline case was run to ensure
that the simulation worked as intended. The baseline runs had the following
parameters as inputs:
Time Delay = 1 sec
CM Velocity = 1000 m/sec
CM Minimum Range = 40 m
Tracking Time = 1 sec
The results of the baseline case are presented in Figure 19.
Figure 19: APS Effectiveness - Baseline Case (percentages of no-launch, intercept, and miss outcomes for the AT-15, RPG-7, APFSDS, and HEAT threats and in total; cmvel = 1000, delay = 1, cmmin = 40, n = 2001)
Shown in Figure 19 are the calculated percentages of the attacks for each threat class that would result in no launch of the counter-munition, a successful intercept, or a miss of the counter-munition. Using this methodology, a successful engagement is scored only for those listed as intercepts. If the APS system did not launch a counter-measure or if the counter-measure missed, the engagement could result in casualties to the vehicle crew. A summary for all of the threats is presented at the right of the figure. The numeric percentages shown in
Figure 19 are listed in Table 7. Note that the routine calculates the required number
of runs for each threat class based on the percentages assigned in Table 6 and then
rounds up to the nearest integer. For the baseline case, this resulted in 2001 runs
instead of 2000 as intended.
Table 7: Baseline Case Results

Threat | Runs | % No Launch | % Intercepts | % Misses
AT-15 | 461 | 0.0% | 66.6% | 33.4%
RPG-7 | 807 | 90.1% | 6.6% | 3.3%
APFSDS | 280 | 85.7% | 10.4% | 3.9%
HEAT | 453 | 59.8% | 28.7% | 11.5%
Total | 2001 | 61.9% | 25.9% | 12.2%
Development Cost
As stated in the previous section, an initial case was run with the following
investment curves for Launch Time Delay (Figure 20), CM Velocity (Figure 21), and
CM Minimum Range (Figure 22).
Figure 20: Return on Investment for Launch Time Delay (development cost in dollars versus launch time delay in seconds)
Figure 21: Return on Investment for CM Velocity (development cost in dollars versus average CM velocity in m/s)
Figure 22: Return on Investment for CM Minimum Range (development cost in dollars versus CM minimum range in meters)
A series of simulations was conducted with each of the possible combinations listed in Table 8. For each simulation, the estimated development cost was then calculated based on the cost curves presented previously; baseline values are marked with an asterisk. The variable values produced a total of 2880 combinations.
Table 8: Baseline Simulated Variable Values

CM Min Range (m): 40*, 38, 36, 34, 32, 30, 28, 26, 24, 22, 20, 18, 16, 14, 12, 10
CM Velocity (m/s): 1000*, 1100, 1200, 1300, 1400, 1500, 1600, 1700, 1800, 1900, 2000, 2100, 2200, 2300, 2400, 2500
Delay (sec): 1*, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1
Based on the cost curves in Figures 20 through 22, the possible development cost could vary from zero (baseline case) to $249M.
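A sketch of this full-factorial sweep follows. The variable grids are taken from Table 8, but the dev_cost function is a hypothetical linear stand-in for the Figure 20 through 22 curves, so the combination count and cost ceiling it prints will not match the 2880 combinations and $249M reported here.

import itertools

# Variable grids from Table 8 (baseline value first in each list).
cm_min_vals = list(range(40, 8, -2))        # 40, 38, ..., 10 m
cm_vel_vals = list(range(1000, 2600, 100))  # 1000, 1100, ..., 2500 m/s
delay_vals = [round(1.0 - 0.1 * i, 1) for i in range(10)]  # 1.0, 0.9, ..., 0.1 s

def dev_cost(cm_min, cm_vel, delay):
    """Hypothetical linear stand-in for the Figure 20 through 22 cost
    curves: cost accrues for every step away from the baseline case."""
    return (1e6 * (40 - cm_min)          # CM minimum range reduction
            + 5e4 * (cm_vel - 1000)      # CM velocity increase
            + 8e7 * (1.0 - delay))       # launch time delay reduction

combos = list(itertools.product(cm_min_vals, cm_vel_vals, delay_vals))
costs = [dev_cost(*c) for c in combos]
print(f"{len(combos)} combinations, cost range $0 to ${max(costs):,.0f}")
# Each combination would then be run through the Monte Carlo engagement
# model sketched earlier to pair a development cost with an effectiveness.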
After completing the simulations, the total system performance was plotted as a
function of cost to determine if the changes to these variables had an effect on
overall APS system effectiveness. The system effectiveness as a function of cost is
shown in Figure 23.
Figure 23: APS System Effectiveness (system effectiveness versus development cost in dollars, Simulation 1)
The system effectiveness was then plotted as a function of each variable and is shown in the following figures. Figure 24 shows that as the launch time delay decreases from one second to 0.1 seconds, the overall system effectiveness increases from the 0.25 to 0.33 range to the 0.68 to 0.73 range.
Figure 24: APS Effectiveness as a Function of Launch Time Delay (system effectiveness versus launch time delay in seconds)
In contrast, Figure 25 and Figure 26 show the system effectiveness as a function of the CM average velocity and the CM minimum range. In both of these figures, the independent variable (CM average velocity or CM minimum range) shows no clear correlation with the overall system performance. Unlike in Figure 24, the overall system performance does not appear to be impacted by the independent variable; for each value of the independent variable, the system effectiveness can still vary from 0.27 to 0.73.
Figure 25: APS Effectiveness as a Function of CM Velocity (system effectiveness versus CM average velocity in m/s)
Figure 26: APS Effectiveness as a Function of CM Minimum Range (system effectiveness versus CM minimum range in meters)
Based on these results, neither the CM Minimum Range nor the CM Velocity variable had any meaningful impact on the system effectiveness.
Therefore, the simulation was modified to replace the CM Velocity variable with the tracking time duration variable. Four simulations were run with two cost models each for the Time Delay and the Tracking Time variables, shown in Figures 14 and 15. The cost model for CM Minimum Range was carried over from Figure 22 for all four simulation configurations. The values for the simulated variables are listed in Table 9.
Table 9: Simulation Variable Values

Launch Time Delay (s): 1.05, 0.95, 0.85, 0.75, 0.65, 0.55, 0.45, 0.35, 0.25, 0.15
Tracking Time (s): 1, 0.95, 0.9, 0.85, 0.8, 0.75, 0.7, 0.65, 0.6, 0.55, 0.5, 0.45, 0.4, 0.35, 0.3, 0.25, 0.2, 0.15
CM Minimum Range (m): 40, 38, 36, 34, 32, 30, 28, 26, 24, 22, 20, 18, 16, 14, 12, 10, 8, 6
For simplicity, the cost models for the Tracking Time and the Launch Time
Delay are repeated in Figure 27 and Figure 28. For each cost curve, Cost 1
represents the use of an existing technology that is stretched towards its theoretical
limit. Cost 2 represents a new technology that requires further upfront investment
to operate in a real-world environment.
Figure 27: Tracking Time Cost Models (development cost in dollars versus tracking time duration in seconds for Cost 1 and Cost 2)
Figure 28: Launch Time Delay Cost Models (development cost in dollars versus time delay in seconds for Cost 1 and Cost 2)
The cost models used for each simulation are shown in Table 10.
Table 10: Simulation Cost Matrix

Simulation | Tracking Time | Launch Time Delay
1 | Existing Technology | Existing Technology
2 | New Technology | Existing Technology
3 | Existing Technology | New Technology
4 | New Technology | New Technology
Chapter 5- Results
The results for the four simulations described above are shown in Figure 29.
Figure 29: APS Effectiveness as a Function of Investment Cost (four panels, Simulations 1 through 4; system effectiveness versus development cost in dollars)
Each of the four plots shown in Figure 29 varies in shape, showing that the system effectiveness varies with the cost models utilized for the tracking time and the launch delay time. If the cost models had no impact on the system effectiveness, the four plots would appear identical.
The system effectiveness as a function of the tracking time is presented in Figure 30.
Figure 30: System Effectiveness as a Function of Tracking Time (four panels, Simulations 1 through 4; system effectiveness versus tracking time in seconds)
Figure 30 shows that reducing the tracking time changes the achievable system effectiveness. While decreasing the tracking time does not by itself guarantee a significant increase in overall system effectiveness, the tracking time must be reduced from the starting value of one second for the system effectiveness to increase. Conversely, if the tracking time is not reduced from one second, the system effectiveness never rises above approximately 0.32.
Similarly, Figure 31 shows the overall system effectiveness for the four simulations as a function of Launch Time Delay. As is the case with Tracking Time, the Launch Time Delay must be reduced for the system effectiveness to rise above approximately 30%.
Figure 31: System Effectiveness as a Function of Launch Time Delay (four panels, Simulations 1 through 4; system effectiveness versus launch time delay in seconds)
To understand the cost effectiveness of each of the four cost development models, the results shown in Figure 29 were binned into 5% effectiveness increments, and the most cost-effective combination of the three variables in each bin was selected, representing the preferred investment option for achieving a given system effectiveness. These selected combinations of variables are plotted on the system effectiveness results as red "+" signs in Figure 32. In essence, each red "+" marks the cheapest combination of variables that results in a given level of system effectiveness.
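A minimal sketch of this binning-and-selection step follows, assuming the sweep results are available as (cost, effectiveness, combination) tuples; the demonstration data is hypothetical.

def cheapest_per_bin(results, bin_width=0.05):
    """Bin (cost, effectiveness, combo) results into 5% effectiveness
    increments and keep the cheapest entry in each bin, i.e. the preferred
    investment options marked with red '+' signs in Figure 32."""
    best = {}
    for cost, eff, combo in results:
        b = int(eff // bin_width)
        if b not in best or cost < best[b][0]:
            best[b] = (cost, eff, combo)
    return [best[b] for b in sorted(best)]

# Hypothetical (development cost, effectiveness, variable combination) tuples.
demo = [(35e6, 0.31, "combo A"), (20e6, 0.33, "combo B"), (80e6, 0.67, "combo C")]
for cost, eff, combo in cheapest_per_bin(demo):
    print(f"~{eff:.0%} bin: {combo} at ${cost:,.0f}")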
Figure 32: Cost-effective Variable Combinations (the Figure 29 results for Simulations 1 through 4 with the cheapest variable combination in each 5% effectiveness bin marked)
Each of the preferred investment options for the four simulations was
plotted as a function of system effectiveness. These points represent an ideal
investment plan for each of the four simulations. The following three figures
describe the optimal combinations of variables for each of the four simulations.
Figure 33 shows the development path for launch time delay, Figure 34 the tracking
time, and Figure 35 the CM minimum range.
Figure 33: Launch Time Delay Development Path (launch time delay in seconds versus system effectiveness for Simulations 1 through 4)
Figure 34: Tracking Time Development Path (tracking time in seconds versus system effectiveness for Simulations 1 through 4)
Figure 35: CM Minimum Range Development Path (CM minimum range in meters versus system effectiveness for Simulations 1 through 4)
Taken together, these three plots show a decision maker the level of performance needed from each of the three variables to achieve a given level of system effectiveness for the least investment. Utilizing these three plots, the decision maker could decide what level of system effectiveness is required and then select the corresponding performance for each of the three variables used in the analysis.
Finally, Figure 36 shows the development cost as a function of system
effectiveness for the four different simulations.
Figure 36: Cost-Effective Development Path (development cost versus system effectiveness for Simulations 1 through 4)
Combined, these curves show the development cost needed to achieve different levels of system effectiveness, providing an easy method for comparing the different technologies reflected by the cost curves for the simulations. This analysis shows that applying the proposed methodology to a specific example yields meaningful results that can form the basis for a decision maker to select technologies for a program. The following chapter further describes the implications of this methodology and how a decision maker can utilize it to improve technology selection.
Chapter 6 - Discussion
The APS system model illustrates that using a technology cost analysis model
can help guide investment decisions that result in the most cost-effective
development program. Most significantly, the simulation showed that changes in the estimated development costs alter the most cost-effective development strategy. If one assumes that the cost model and system model are accurate, then a
sensitivity analysis can be used to determine which technology investments should
be made to increase overall system effectiveness.
For most acquisition programs, there are two criteria for making a selection: cost and requirement. For a given system, a decision maker may intend to develop a particular capability where cost is the dependent variable, where capability is the dependent variable, or where a compromise must be found between the two.
The most notable feature of the four simulations conducted is that the majority of the combinations of the different investment options waste money. For the results shown in Figure 29, only those results along the left edge of each plot represent an
efficient use of investment dollars. For all other combinations of investments,
money is spent on improving technologies that do not improve the overall system
performance. This result suggests that the technology cost analysis model can
indicate which options are efficient uses of investment dollars and which are not.
Returning to the APS simulation, these different approaches are shown in Figure 37, where the two vertical lines marked as X1 and X2 represent two different
capability requirement thresholds, X1 at 30% effectiveness and X2 at 67%
effectiveness. Similarly, the two horizontal lines represent two different
development cost limits, Y1 at $30M and Y2 at $80M.
Figure 37: Optimal Development Path with Cost/Requirement Limits (the Figure 36 curves for Simulations 1 through 4 with capability thresholds X1 and X2 and cost limits Y1 and Y2 overlaid)
Using the results from the previously described simulation, a decision maker
focusing entirely on providing a given capability could utilize lines X1 and X2. To
provide the capability at 30% effectiveness, the cheapest technology selection
would be that represented by Simulation 1 at approximately $35M. In comparison,
to provide an effectiveness of 67%, the cheapest technologies are those represented
by Simulation 2 at approximately $80M.
The lines Y1 and Y2 can represent a cost-constrained approach. If the
decision maker can spend $30M to develop the capability, then Simulation 1
represents the highest capability for that level of funding at approximately 35%.
Similarly, Y2 represents a development budget of $80M which would lead the
decision maker to select Simulation 2 providing a system effectiveness of
approximately 67%.
However, as previously described, there are occasions where the decision-making criteria are not quite so clear-cut. In such a case, the decision maker would need to apply their own preferences in trading off cost against effectiveness. If desired, rule-based criteria could be created to help the decision maker evaluate and select among alternatives.
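A sketch of how the two clear-cut decision modes might be encoded over the optimal development paths is shown below; the selection rules mirror the X1/X2 and Y1/Y2 lines of Figure 37, and the demonstration points are hypothetical.

def pick_for_requirement(paths, required_eff):
    """Requirement-driven mode (lines X1/X2): the cheapest option that
    meets or exceeds the effectiveness threshold."""
    feasible = [(cost, eff, name) for cost, eff, name in paths if eff >= required_eff]
    return min(feasible) if feasible else None

def pick_for_budget(paths, budget):
    """Cost-constrained mode (lines Y1/Y2): the most effective option
    that fits within the development budget."""
    affordable = [(eff, cost, name) for cost, eff, name in paths if cost <= budget]
    return max(affordable) if affordable else None

# Hypothetical (development cost, effectiveness, label) points standing in
# for values read off the optimal development paths of Figure 37.
paths = [(35e6, 0.30, "Simulation 1"), (55e6, 0.50, "Simulation 1"),
         (80e6, 0.67, "Simulation 2")]
print(pick_for_requirement(paths, 0.30))  # cheapest way to reach 30% effectiveness
print(pick_for_budget(paths, 80e6))       # best capability available for $80M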
Lastly, if the program in question is utilizing a spiral-type development, the decision may be more complicated. In spiral development programs, the system is fielded
prior to meeting all requirements and as the system is improved, these improved
versions are introduced into the field. In such a case, it may be cost-effective to keep
the final requirements of a program in mind while making a decision. For example,
in the results of the simulation presented above, if Spiral 1 was required to have an
overall effectiveness of 50%, then the logical choice would be the technologies
represented by Simulation 1. However, if a later spiral is supposed to reach an
effectiveness of closer to 70%, then it may be better to pick the technologies
represented by Simulation 2. Such a selection would allow selection of technologies
that will meet both current and future requirements but may require more funding
early on in the program's life. While each case is different, in most cases it would be
prohibitively expensive to switch technologies in the midst of system development.
Given these different approaches, a sub-system model with probabilistic
analysis can help the decision maker make a rational decision about which set of
technologies can be the most cost-effective based on the attributes established when
the model was created. While there are still difficult decisions to make, the
modeling serves to provide the decision maker with data that is able to predict
possible outcomes of the decision.
An additional application of the simulation is in planning for system improvements. For example, if a particular program is budget-constrained and unable to achieve the desired level of performance, it may be preferable to pick a lower-capability design in order to plan for future upgrades when funding is available. Such a decision could be based on picking particular technologies even if they do not lead to the most capable system up front.
For example, the US Army is currently in the demonstration phase of the
Joint Air to Ground Missile (JAGM). It is expected that the complete program will total more than $5 billion in R&D and acquisition. Two teams are
currently testing their prototype designs for a down-select decision in the near
future. The JAGM missile utilizes three separate seeker modes, infrared (IR),
millimeter wave (MMW), and semi-active laser (SAL) for all-weather targeting.
Lockheed Martin's design utilizes a cooled IR seeker, which "provides 50% greater
visibility and range." In contrast, the Raytheon-Boeing team claims that their uncooled IR seeker is superior because of "lower costs, less weight, fewer parts, and
less chance of leaks." (Clark, 2010)
In such a case, the cheaper technology (uncooled IR) may provide the level of performance initially required but could in the long run prove to be a limiting factor if the Program Manager chooses to upgrade the system later on. Such
a situation could be represented as shown in Figure 38. Note that while using the
JAGM seeker as an example, the figure shown is purely notional. In the figure, the
probability of detection of the two technologies is represented as a function of the
development cost with the initial required capability shown.
Figure 38: Notional Technology Development Cost (probability of detection for the uncooled and cooled IR seeker technologies versus development cost, with the required capability and a future requirement marked)
If meeting the "Required Capability" shown in the figure is the only consideration, then the uncooled IR technology is the cheaper way to meet the requirements. However, the program manager could foresee a future requirement
for higher probability of detection for the IR seeker which could be illustrated as the
"Future Requirement" shown in Figure 38. In such a case, it may be in the best
interest of the program to pay the initial cost up front for the cooled IR technology
to allow for future upgrades without having to change seeker technologies to meet
the future requirement. To better understand the situation, the decision maker
could commission a system model with probabilistic analysis to better understand
the trade-offs in picking each option. In addition to just looking at the seeker head,
the model could include other contributing sub-systems. For example, perhaps a
higher-fidelity sensor on the aircraft platform could makeup for limitations in the
missile seeker and the model could take into account the cost-benefit comparison of
the cheaper missile-seeker combined with a more expensive aircraft sensor. This
model could then offer solid data to the decision maker who could make an
informed decision based on the cost and performance of the total system as opposed
to just focusing on the missile itself.
While the previously described system modeling and probabilistic analysis
does offer significant benefits to decision makers, it does have its limitations. The
adage "garbage in, garbage out" applies to this approach. If the cost estimates or the
system model itself are flawed, the outputs from the analysis will similarly be
flawed. For this reason, this tool will become more useful as a program advances
through the development process. This is because as the system is developed,
development test results of the sub-systems or system can be used to validate the
model. Similarly, as the program progresses, the initial cost estimates of the various technologies will gain fidelity as the challenges facing each technology become better understood through attempts to overcome them.
Improving the fidelity of cost estimating is worthy of significant effort, as this is a recurring problem in many programs; a full discussion, however, is beyond the scope of this thesis. It should nonetheless be noted that improving the fidelity of cost estimating will have a significant impact on the accuracy and utility of system modeling with probabilistic analysis.
The technology cost analysis model described previously can have a
beneficial effect in many R&D projects to better understand the results of selecting
various technologies for inclusion in the program. The simulations show that the
approach can be used to glean a number of different conclusions such as whether a
particular technology or technologies improve overall system performance. Similar
application to other R&D projects will likely result in improved decision making in
selecting technologies for a given system.
Chapter 7- Conclusion
This thesis has described in detail the current method of technology selection
in many DOD projects and has also described in detail the ill-effects of poor
technology selection in three legacy programs. The shortfall in technology selection
has cost the United States' taxpayers billions of dollars over the last several decades
and has delayed numerous programs by years. As shown in the included simulation,
technology cost analysis modeling can be used to determine the cost-benefit ratio of
a given technology. This same analysis can also be used to determine which
variables will have a greater effect on the overall system capabilities. By
implementing such an analysis, a decision maker can be given a more rational basis
for selecting technologies than previously employed. However, to better utilize such
a method, further efforts should be made to improve the fidelity of cost estimating
in order to make the outputs from the analysis as accurate as possible.
While this methodology does require additional effort and cost on the part of
program managers, it will have a significant benefit on the overall success of the
programs. However, application of this methodology does require the addition of a new skill set for those supporting the project manager. For this approach to work, such a capability likely needs to reside within the government program office, requiring that engineers, cost analysts, or others in the program management office develop the capability to conduct technology cost analysis modeling.
Based on the results of this simulation, it is recommended that decision
makers utilize this approach to selecting technologies. Once utilized in an R&D
program to select a technology, it is recommended that the model and inputs be
continually updated to reflect changes to the system model and cost-benefit inputs
to ensure that they reflect the most accurate information. It is further
recommended that this approach be utilized in a number of systems to compare the
effectiveness of this tool relative to the whole of DOD acquisition to enable
validation of its utility.
As mentioned in Chapter 1, challenges faced by DOD program managers in
selecting technologies for defense programs are similar to those faced by program
managers in the commercial world. While this approach was developed primarily
for DOD development programs it would likely work in non-DOD applications as
well. It is also recommended that commercial program managers consider the
application of this methodology to their programs to determine if this approach
could benefit the commercial sector as well.
Bibliography
Ackerman, S. (2010, Nov 1). Pentagon's Favorite Jet Delayed as Costs Rise Yet Again. Retrieved Nov 1, 2010 from Danger Room: http://www.wired.com/dangerroom/2010/11/pentagons-favorite-jet-delayed-as-costs-rise-yet-again/
Butler, A. (2008, Mar 19). USA On The KC-X Defensive A Year Ago. Aerospace Daily & Defense Report.
Caryl, C. (2010, July 7). Life by a Thousand Cuts. Retrieved July 22, 2010 from Foreign Policy: http://www.foreignpolicy.com/articles/2010/07/07/life-by-a-thousand-cuts?page=full
Chivers, C. (2010). The Gun. New York, NY: Simon & Schuster.
Clark, C. (2010, Jul 28). BAE's GCV Weighs 53 Tons, Hybrid. Retrieved Nov 22, 2010 from DOD Buzz: http://www.dodbuzz.com/2010/07/28/baes-gcv-weighs-53-tons-hybrid/
Clark, C. (2010, July 15). CSAR Copter Marks Buying Shift. Retrieved July 16, 2010 from DOD Buzz: http://www.dodbuzz.com/2010/07/15/csar-copter-marks-buying-shift/
Clark, C. (2010, Oct 7). JLTV Sinking, EFV Wobbly. Retrieved Nov 5, 2010 from DOD Buzz: http://www.dodbuzz.com/2010/10/07/jltv-sinking-efv-wobbly/
Clark, C. (2010, October 15). Questions Rise On JAGM Missile. Retrieved October 23, 2010 from DoD Buzz - Online Defense and Acquisition Journal: http://www.dodbuzz.com/2010/10/15/questions-rise-on-jagm-missile/
Crane, D. (2009, Aug 30). Artis Iron Curtain Active Protection System (APS). Retrieved September 12, 2010 from Defense Review: http://www.defensereview.com/artis-iron-curtain-active-protection-system-aps-shoot-down-ballistic-reactive-ground-vehicle-defense-system/
Drew, C. (2010, June 27). Military Costs Under Review in Bid to Trim Waste. Retrieved July 21, 2010 from New York Times: http://www.nytimes.com/2010/06/28/business/28contracts.html?ref=defensedepartment
Duffy, T. (2006, Feb 16). Pentagon Demotes Airborne Laser Program. Retrieved Nov 1, 2010 from Military.com: http://www.military.com/features/0,15240,88020,00.html
Feickert, A. (2010, September 17). Joint Light Tactical Vehicle (JLTV): Background and Issues for Congress. Congressional Research Service, p. 1.
GAO. (2007). Analysis of Processes Used to Evaluate Active Protection Systems GAO-07-759. Washington DC: US Government Accountability Office.
GAO. (2008). BEST PRACTICES - Increased Focus on Requirements and Oversight Needed to Improve DOD's Acquisition Environment and Weapon System Quality GAO-08-294. Washington DC: US Government Accountability Office.
GAO. (2005). DEFENSE ACQUISITIONS - Assessments of Selected Major Weapon Programs GAO-05-301. Washington DC: US Government Accountability Office.
GAO. (2010). DEFENSE ACQUISITIONS - Assessments of Selected Weapon Programs GAO-10-388SP. Washington DC: United States Government Accountability Office.
GAO. (2009). DEFENSE ACQUISITIONS - Assessments of Selected Weapon Programs GAO-09-326SP. Washington DC: US Government Accountability Office.
GAO. (2010). DEFENSE ACQUISITIONS - Observations on Weapon Program Performance and Acquisition Reforms GAO-10-706T. Washington DC: United States Government Accountability Office.
GAO. (2010). DEFENSE ACQUISITIONS - Opportunities Exist to Position Army's Ground Force Modernization Efforts for Success GAO-10-406. Washington DC: US Government Accountability Office.
GAO. (2011). Defense Acquisitions: Issues to Be Considered as DOD Modernizes Its Fleet of Tactical Wheeled Vehicles GAO-11-83. Washington DC: US Government Accountability Office.
Gates, R. M. (2010). DOD News Briefing with Secretary Gates from the Pentagon, August 09, 2010. Washington DC: Federal News Service, Inc.
Grant, G. (2010, August 25). Army Abruptly Cancels Ground Combat Vehicle Competition (Updated). Retrieved August 25, 2010 from DEFENSETECH: http://defensetech.org/2010/08/25/army-cancels-ground-combat-vehicle-gcv-competition/
Guardiano, J. R. (2008, November 17). FCS Active Protection System in 'Top 50' Inventions. Retrieved July 20, 2010 from US Army Home Page: http://www.army.mil/-news/2008/11/17/14274-fcs-active-protection-system-in-top-50-inventions/
Hand, E. (2010, July 28). Enlisting Investigators. Nature.
Hodge, N. (2010, February 26). Building a more Survivable 'Future' for the Army. Retrieved July 23, 2010 from Danger Room: http://www.wired.com/dangerroom/2010/02/building-a-more-survivable-future-for-the-army/#more-22901
International Council on Systems Engineering. (2009, Sep 29). Journal of Systems Engineering. Retrieved Oct 25, 2010 from International Council on Systems Engineering: http://www.incose.org/ProductsPubs/periodicals/journalofsystems.aspx
Jane's. (2010, September 06). 125 mm APFSDS Ammunition. Retrieved September 29, 2010 from Jane's Ammunition Handbook: http://www4.janes.com/subscribe/jah/doc-view.jsp?K2DocKey=/cont...NS%27+or+%27Russian+Federation%27%29+%3CIN%3E+body%29%29%29%29
Jane's. (2010, January 14). 125 mm HEAT-FS Ammunition. Retrieved September 29, 2010 from Jane's Ammunition Handbook: http://www4.janes.com/subscribe/jah/doc-view.jsp?K2DocKey=/cont...UNS%27+or+%27Russian+Federation%27%29+%3CIN%3E+body%29%29%29%29
Kennedy, M. e. (2006). Analysis of Alternatives (AoA) for KC-135 Recapitalization, Executive Summary. Rand Corporation.
Knaack, M. S. (1978). Encyclopedia of Air Force Aircraft and Missile Systems (Vol. I). Washington D.C.: Office of Air Force History.
Magnuson, S. (2010, August). New Truck To Show The Way for Acquisition Reforms. National Defense.
Missile Defense Agency. (2010, Feb 11). Laser Test Bed Successful in Lethal
Intercept Experiment.
PEO JSF. (n.d.). The F-35 Lightning II. Retrieved Oct 26, 2010 from The F-35 Lightning II: www.jsf.mil
Pike, J. (2006, 04 22). AT-15 Khrizantema. Retrieved 05 18, 2010 from GlobalSecurity.org: http://www.globalsecurity.org/military/world/russia/at-15.htm
PM JLTV. (2010). Product Manager Joint Light Tactical Vehicles. Retrieved Nov 5, 2010 from Program Executive Office: Combat Support & Combat Service Support: http://peocscss.tacom.army.mil/pmLTV.html
Sahadi, J. (2010, July 11). Defense spending: Slaying the sacred cow. Retrieved July 23, 2010 from money.cnn.com: http://money.cnn.com/2010/07/09/news/economy/defense-spending/index.htm?postversion=2010071113
Schachtman, N. (2006, April 12). Laser Jet's Toxic Interior. Retrieved Nov 1, 2010 from Noah Schachtman's Blog: http://www.noahshachtman.com/archives/002317.html
Schachtman, N. (2010, Oct 22). Raygun 747 Botches Another Test. Retrieved Nov 2, 2010 from Danger Room: http://www.wired.com/dangerroom/2010/10/flying-laser-cannon-botches-another-test-sigh/
Schachtman, N. (2010, Feb 12). Video: Laser Jet Blasts Ballistic Missile in Landmark Test. Retrieved Nov 1, 2010 from Danger Room: http://www.wired.com/dangerroom/2010/02/laser-jet-blasts-ballistic-missile-in-landmark-test/#more-22504
Shanker, T., & Drew, C. (2008, November 2). Pentagon Expects Cuts in Military Spending. Retrieved July 21, 2010 from New York Times: http://www.nytimes.com/2008/11/03/washington/03military.html?_r=1&ref=defense-department&pagewanted=all
Shanker, T., & Drew, C. (2010, July 22). Pentagon Faces Political and Economic Pressures to Cut Budget. Retrieved July 23, 2010 from New York Times: http://www.nytimes.com/2010/07/23/us/politics/23budget.html?hp=&pagewanted=print
South African Army. (2008, December 02). Weapons Systems: Infantry - Anti Tank Weapons. Retrieved May 18, 2010 from sa army: http://www.army.mil.za/equipment/weaponsystems/infantry/RPG7ATRL_106mm%C2%ADRecoillessRifleSyst.htm
Sullivan, M. (2010). Joint Strike Fighter - Significant Challenges and Decisions Ahead GAO-10-478T. Subcommittees on Air and Land Forces and Seapower and Expeditionary Forces, Committee on the Armed Services, House of Representatives. Washington DC: United States Government Accountability Office.
Sweetman, B. (2010, Oct 25). The Next JSF Debate. Retrieved Oct 29, 2010 from Ares: A Defense Technology Blog: http://www.aviationweek.com/aw/blogs/defense/
Wall, R. (2010, July 28). JSF's Reliability Rub. Retrieved Oct 27, 2010 from Ares: A Defense Technology Blog: http://www.aviationweek.com/aw/blogs/defense/
Warwick, G. (2010, Sep 12). F-35's Unequal Progress. Retrieved Oct 27, 2010 from Ares: A Defense Technology Blog: http://www.aviationweek.com/aw/blogs/defense/
Warwick, G. (2009, Aug 03). JSF Faces Showdown on F-35 Cost Estimates. Defense Technology International.