CMMI Transition Status Steering Group Meeting

Benefits of CMMI Within the Defense Industry
Software Engineering Institute
Carnegie Mellon University
Pittsburgh, PA 15213
May 2010
© 2010 Carnegie Mellon University
Outline
• Introduction
• Benefits of CMMI Implementation
  – Quantitative
  – Qualitative
• Looking Ahead
• Summary
This report was created with the cooperation of the Systems Engineering
Division (SED) of the National Defense Industrial Association (NDIA) and
their member companies and DoD organizations.
Purpose of Presentation
Present new evidence about effective implementations of CMMI
• Examples are provided by the defense industrial base and DoD organizations.
• New examples are based upon the measures that practicing organizations use
to track value to their businesses.
• Examples are provided by organizations that have tracked and measured
performance improvements from using CMMI over many years.
• Many of the organizations emphasize high maturity results and show that they
enabled superior performance.
• Their data indicate why CMMI is important to the DoD & its suppliers.
The new data presented in this report demonstrate that effective
implementation of good practices, aided by the use of CMMI, can improve
cost, schedule, and quality performance.
CMMI: Major Benefits to DoD
“Does CMMI work?” We asked our nation’s defense contractors, as well as
government agencies, to share results from their performance improvement efforts
using CMMI. The results spoke for themselves: “Yes, CMMI works!”
The following slides include information from six defense organizations that
responded.*
*Results reported in this presentation are not attributed to protect confidentiality.
Background on the Data for this Presentation
Organizational and project leaders decided which measures were most
useful to them when tracking the results of CMMI-based improvements.
A common thread was their interest in measuring the effect CMMI had on
schedule, effort and cost, and quality.
The summarized results demonstrate the wide scope of business values
and goals of the participating organizations.
The source studies in this presentation used current data as follows:
• 2010: Organizations 1, 2A, 3, & 6
• 2009: Organizations 5 & 7
• 2008: Organization 2B
Quantitative Measures: Schedule Performance Results Summary

• On-time deliverables (Organization 2a): increased by 4.9 percentage points (the organization went from 95% to 99.9% of projects delivered on time)
• Earlier defect detection and repair (Organization 1): 6.35 times fewer defect discovery and repair hours after the start of system testing; a potential savings of 5 to 6.5 months of schedule delay after system tests begin, for an average-sized project
• Schedule performance index (Organization 7): increased from 0.78 to 0.93 over three years (a 19.2% improvement in estimation and execution of schedule)
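The percentage claims above can be verified with a couple of lines of arithmetic; this sketch is our own check, using only the values quoted on the slide:

```python
# Our own sanity check of the figures quoted above (not from the source).
# SPI (schedule performance index) improvement, 0.78 -> 0.93:
old_spi, new_spi = 0.78, 0.93
improvement = (new_spi - old_spi) / old_spi
print(f"SPI improvement: {improvement:.1%}")            # -> 19.2%

# On-time delivery is a percentage-point change, not a relative one:
delta_points = 99.9 - 95.0
print(f"On-time delivery: +{delta_points:.1f} points")  # -> +4.9 points
```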
Quantitative Measures: Effort (Rework) and Cost Performance Results Summary

• Total hours for defect repair (Organization 1): 58% fewer hours needed to repair defects at ML5 versus ML3; a potential cost savings of $1.9M to $2.3M per average-sized project (defined as 233 KESLOC [kilo equivalent source lines of code])
• Hours per KLOC to find and fix defects, CMMI ML5 relative to the SW-CMM ML3 baseline (Organization 6): defect find and fix cost down 22%
• Effort hours needed to repair high-severity defects in integration and test phases (Organization 4): 24% reduction in effort hours per defect
• Cost performance index (Organization 4): increased from 0.88 to 0.96 over two years
• Overhead rates, CMMI ML5 relative to the SW-CMM ML3 baseline (Organization 6): reduced by 7.3%
• Software development cost, CMMI ML5 relative to the SW-CMM ML3 baseline (Organization 6): reduced by 28%
Selected Results: High Maturity Reduces Costs for Repair (Organization 1)

[Chart: average hours per defect, by phase (Req & Design, Code & UT, Sys & Acpt Test, Post Delivery, Total), for Maturity Level 3 versus Maturity Level 5 projects.]

• High maturity projects discover defects earlier, and early detection and repair lowers costs.
• ML5 projects expended 57.7% fewer hours overall to repair defects than ML3 projects.
• Saves 105.3 hours per defect in total; 88.6 of those hours are saved during testing alone, when the largest risk to schedule occurs.
Selected Results: Effort to Repair Defects by Phase (Organization 1)

[Chart: hours to repair defects by phase (Req & Design, Code & UT, Sys & Acpt Test, Post Delivery, Total) for a 233 KESLOC average project, Maturity Level 3 versus Maturity Level 5.]

• 57.7% fewer hours (24,527) expended for ML5 (17,992 total hours versus 42,519 for ML3)
• 6.35 times (20,641 hours) less risk of cost or schedule impact late in the program
• Potential cost savings from $1.9M to $2.3M per average-sized program
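The headline numbers on this slide are internally consistent; a short sketch (our own arithmetic, using only the totals quoted above) shows how they relate:

```python
# Our own check of how this slide's figures fit together (not from the source).
ml3_total, ml5_total = 42_519, 17_992      # total defect-repair hours
saved = ml3_total - ml5_total
print(saved, f"{saved / ml3_total:.1%}")   # -> 24527 57.7%

# Late-phase hours (system & acceptance test onward) implied by the chart:
ml3_late, ml5_late = 24_496, 3_855
print(ml3_late - ml5_late)                 # -> 20641 hours
print(round(ml3_late / ml5_late, 2))       # -> 6.35

# The quoted $1.9M-$2.3M savings implies a loaded labor rate of roughly:
print(round(1_900_000 / saved), round(2_300_000 / saved))  # -> 77 94  ($/hour)
```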
Quantitative Measures: Quality Performance Results Summary

• Defect density by severity, ML5 compared to ML3 (Organization 1): 62.5% fewer high-severity defects with ML5 projects
• Defect density in circuit board design (Organization 2a): 65% improvement
• Defect containment by phase (Organization 3): fixes of defects within the phase they were injected increased by 240%
• Defect containment, ML5 compared to ML3, by phase per KLOC (thousands of lines of code) (Organization 2b): defect containment improved 13%
• User acceptance test defects per KLOC (Organization 7): fewer than 0.15 defects per KLOC
• Percentage of defects removed prior to system test (Organization 7): more than 85%
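The "defect containment" measures above count the fraction of defects fixed in the same phase in which they were injected. A minimal sketch of that computation; the defect records here are invented for illustration, not from the source:

```python
# Illustration of phase containment; the defect records are invented.
defects = [
    # (phase_injected, phase_fixed)
    ("design", "design"),
    ("design", "test"),
    ("code", "code"),
    ("code", "code"),
    ("code", "test"),
]
contained = sum(1 for injected, fixed in defects if injected == fixed)
print(f"containment: {contained / len(defects):.0%}")   # -> 60%
```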
Selected Results: Quality Performance
(Organization 3)
Quantitative Measures: Productivity Results Summary

• Productivity gain with ML5 (Organization 1): 42% gain with ML5 organizational practices over 9 years
• Organizational productivity versus the Galorath SEER-SEM estimation model (Organization 1): production hours reduced 33.0% at ML3 and 37.4% at ML5
• Productivity for CMMI ML5 relative to the SW-CMM ML3 baseline (Organization 6): productivity up 25.2%
Selected Results: Software Productivity (Organization 1)

[Chart: productivity gain with long-term ML5; 42% gain with ML5 organizational practices for the 9-year ML5 organization versus other projects.]

• Average project size was 233 KESLOC (largest = 1,360 KESLOC; smallest = 29 KESLOC)
Average customer project savings due to increased productivity
• Equivalent of 406 work months per project (33.8 work years)
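The work-year conversion above is straightforward arithmetic; a one-line check (our own, assuming 12-month work years):

```python
# 406 work months expressed in 12-month work years (our own check).
work_months = 406
work_years = round(work_months / 12, 1)
print(work_years)   # -> 33.8
```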
Quantitative Measures: Customer Satisfaction Results Summary

• Award fee (used as an indicator of customer satisfaction), CMMI ML5 relative to the SW-CMM ML2 baseline (Organization 6): 50% of potential additional award fee achieved
• Cost savings to customer in a cost-plus contract (Organization 1): rose from $5.7M to $7.1M (25%)
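The 25% figure is the relative growth of the customer savings; a quick check (our own arithmetic):

```python
# Relative growth of customer cost savings, $5.7M -> $7.1M (our own check).
before, after = 5.7, 7.1   # millions of dollars
growth = (after - before) / before
print(f"{growth:.0%}")     # -> 25%
```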
Selected Results: Award Fee (Organization 6)

[Chart: percent of potential additional award fee achieved, across SW-CMM L2 through SW-CMM L5 and CMMI L5; 50% of the potential additional award fee available was achieved.]

Customer Satisfaction Continues to Improve
Quantitative Result: Return on Investment
(Organization 2a)
Organization 2a reported their quantified ROI from CMMI Maturity Level 5
activity to be 24 : 1.
Using the data in Performance Results of CMMI ® -Based
Process Improvement (CMU/SEI-2006-TR-004) they
were able to compare their ROI performance to others in
industry:
• Lowest ROI: 1.7 : 1
• Median ROI: 4 : 1
• Organization 2a: 24 : 1
• Highest ROI: 27.7 : 1
These results are a consequence of meaningful process improvement aligned with
the business and engineering objectives.
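To put the comparison in perspective, a small sketch (our own) using the benchmark ratios quoted above:

```python
# Organization 2a's ROI against the CMU/SEI-2006-TR-004 benchmarks
# (ratios taken from the slide; the comparison itself is our own).
benchmarks = {"lowest": 1.7, "median": 4.0, "highest": 27.7}
org_2a = 24.0
print(org_2a / benchmarks["median"])                    # -> 6.0x the median
print(benchmarks["lowest"] <= org_2a <= benchmarks["highest"])   # -> True
```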
CMMI Provides Many Qualitative Benefits as Well*

Organizations also gathered various qualitative measures to complement their
quantitative measurements. They found qualitative benefits such as:
• Reduced overtime and less intense pressure
• Improved program insight, control, and tracking
• Reduced training: process documentation enables knowledge transfer to a new generation of workers
• Process transformation (via consistency, integration, coordination)
• Personnel retention and job satisfaction
• Clear roles and responsibilities for business execution
• Common language (i.e., defined processes, measures) across business units
• Decrease in replanning
• Products with lower levels of defects and lower risk; one organization offers a lifetime warranty on products

*Based on published benefits from a wide variety of organizations
The Bottom Line
Why improve processes? - Because processes are the foundation
for all other business improvements, and critical for
• lasting improvements
• successful technology insertion
If a performance management system is not in use, leadership is
unaware of what is and is not working.
CMMI is a proven approach to performance management – with
more than a decade of results showing it does work.
Organizations have provided data showing that CMMI
• enables the delivery of lower-defect products, with predictable
cost, schedule, and quality
• improves business performance
• serves as a competitive discriminator
Results Depend on Implementation
Simply deciding to “do CMMI” is not enough to achieve benefits.
Defining good processes, using them, measuring the results, and
making improvements based on what you learn are all key to reaping
the benefits described in this presentation.
The CMMI models are a foundational part of a comprehensive approach
to process improvement that helps organizations understand
• why they should improve
• what frameworks and tools would best fit their needs
• how to implement them
Recent Research on CMMI: Just the Tip
of the Iceberg!
CMMI Research – References

Bibliographic information cited in this presentation:

Gibson, Diane; Goldenson, Dennis R.; & Kost, Keith. Performance Results of CMMI-Based Process Improvement (CMU/SEI-2006-TR-004). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, August 2006.

Journal issue: "Performance Results from Process Improvement." SoftwareTech News, Vol. 10, No. 1, March 2007.

Goldenson, Dennis R. & Gibson, Diane L. Demonstrating the Impact and Benefits of CMMI®: An Update and Preliminary Results (CMU/SEI-2003-SR-009). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, October 2003.

Journal issue: "CMMI: Getting a Handle on Process." CrossTalk, Vol. 23, No. 1, Jan/Feb 2010.

Herbsleb, James D.; Carleton, Anita; Rozum, James A.; Siegel, Jane; & Zubrow, David. Benefits of CMM-Based Software Process Improvement: Initial Results (CMU/SEI-94-TR-013). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, August 1994. (Also see SEI Special Report: Benefits of CMM-Based Software Process Improvement: Executive Summary of Initial Results, CMU/SEI-94-SR-013.)

Stoddard II, Robert W. & Goldenson, Dennis R. Approaches to Process Performance Modeling: A Summary from the SEI Series of Workshops on CMMI High Maturity Measurement and Analysis (CMU/SEI-2009-TR-021). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, January 2010.

Jones, Capers. Assessment and Control of Software Risks. Upper Saddle River, NJ: Prentice-Hall, 1994 (ISBN 0-13-741406-4).

Website about CMMI at the Software Engineering Institute:
<http://www.sei.cmu.edu/cmmi/index.cfm>
Looking Ahead
The road ahead for CMMI implementation:

• A continued focus on high maturity
  More and more organizations are striving for and achieving high maturity, and are collecting data demonstrating the benefits. Once at ML 4 or 5, organizations must maintain their focus on good implementation practices for continuous improvement.

• Implementation of CMMI for Services (CMMI-SVC)
  CMMI-SVC extends the benefits of CMMI to a new audience. Service providers can use the model concept that has proven useful in the development community to specifically address their interests and concerns.

• Implementation of CMMI for Acquisition (CMMI-ACQ)
  CMMI-ACQ helps organizations improve relationships with their suppliers and improve acquisition processes. The model can enable increased control of projects, better management of global sourcing of products and services, and more successful acquisition solutions.

• Integration with other improvement paradigms (e.g., TSP, ISO, Lean Six Sigma)
  Organizations are finding that integrated improvement initiatives can produce outstanding results. Choosing CMMI doesn't mean discontinuing improvement efforts already in place or avoiding new ones that show promise.
Summary
Many stakeholders are involved in the development and maintenance of CMMI
models, with participants from commercial industry, government, and the DoD.
Broad adoption has occurred worldwide. Adopters range from small and midsize
organizations (these are the majority) to large and very large organizations.
Organizations that provide products and services to the DoD use CMMI to improve
programs, systems, product and service management, systems and software
engineering, work processes, and training solutions.
Quantitative and qualitative results have been documented by defense contractors
and others, as shown in this report. There is a great deal of additional data showing
the benefits of CMMI from a broad range of industries, including banking and
finance, manufacturing, medical, and others.
CMMI enables performance improvement focused on business objectives, but the
level of success depends on the implementation.
Who Benefits from CMMI Today?
We all do!
Background Slides if Needed
Background: The Achievement of Excellence
CMMI leads the way to high performance through improved processes.
The management of the development and delivery of software
systems must be guided by quantitatively managed processes.
Performance comes from processes that are
predictable, repeatable, and continuously improving
in terms of product quality, cost and schedule performance,
process performance, and customer satisfaction.
[Chart: results across TSP fidelity, schedule, customer satisfaction, cost and effort, organizational coverage, project performance, functional completion, and quality.]
Results Overview – Quantitative Measures
We received data/information showing performance improvements in the
following categories:
• Schedule
• Effort/cost
• Quality
• Customer satisfaction
• Business growth
Background: Leadership, Stewardship, and
Evolution of Maturity Models
Many stakeholders have been involved in the
development and evolution of the maturity models
published by the SEI, with hundreds of people
contributing their time and expertise over the years.
Industry participants
AT&T
Automatic Data Processing
BAE Systems
Boeing
Comarco Systems
Computer Sciences Corporation
DNV IT Global Services
EER Systems
Ericsson
General Dynamics
General Motors
Harris Corporation
Hewlett-Packard Company
Honeywell Corporation
IBM
Integrated System Diagnostics, Inc.
JP Morgan Chase
KPMG Consulting
Motorola
National Defense Industrial Association
Norimatsu Process Engineering Lab
Northrop Grumman Corporation
Pacific Bell
Q-Labs, Inc
Raytheon
Rockwell Collins
SAIC
Siemens
Software Productivity Consortium
Spirula
SRA International
Systems and Software Consortium
Tata Consultancy Services
TeraQuest, Inc
THALES
TRW
University of Pittsburgh Medical Center
Government participants
Defense Logistics Agency
Department of Homeland Security
Federal Aviation Administration
Institute for Defense Analyses
National Reconnaissance Office
National Security Agency
Office of the Secretary of Defense
RD&E Command
US Air Force
US Army
US Navy
CMMI Adoption Knows No Borders
There are 33 countries with more than ten appraisals as of March 2010:

USA 1582; China 1229; India 524; Japan 306; Spain 180; France 168; Korea (ROK) 165; Brazil 144; Taiwan 134; U.K. 113; Mexico 86; Argentina 77; Germany 76; Malaysia 71; Canada 59; Egypt 43; Italy 43; Thailand 38; Chile 37; Australia 36
An estimated 1.8 million people work in organizations
that have had at least one SCAMPI A appraisal since
April 2002.
Also: Colombia, Pakistan, Philippines, Singapore, Israel, Hong Kong,
Vietnam, Turkey, Netherlands, Portugal, Sri Lanka, Ireland and Russia
CMMI Works for Organizations of All Sizes
[Pie chart: appraised organizations by number of employees]
• 25 or fewer: 13.6%
• 26 to 50: 15.6%
• 51 to 75: 12.8%
• 76 to 100: 8.9%
• 101 to 200: 19.8%
• 201 to 300: 9.0%
• 301 to 500: 7.4%
• 501 to 1000: 6.8%
• 1001 to 2000: 3.7%
• 2000+: 2.5%
(Aggregate labels from the chart: 1 to 100: 58.2%; 201 to 2000+: 23.8%)
Source for these statistical analyses: http://www.sei.cmu.edu/cmmi/casestudies/profiles/cmmi.cfm
CMMI Adoption Is Multi-Sector
[Pie chart: appraised organizations by industry sector]
• Services: 71.1%
  – Business Services: 35.0%
  – Engineering & Management Services: 24.2%
  – Other Services: 10.7%
  – Health Services: 1.3%
• Manufacturing: 16.3%
  – Electronic & Other Electric Equipment: 10.4%
  – Transportation Equipment: 2.4%
  – Other Manufacturing Industries: 1.2%
  – Instruments and Related Products: 1.0%
  – Industrial Machinery and Equipment: 0.7%
  – Primary Metal Industries: 0.3%
  – Fabricated Metal Products: 0.2%
• Finance, Insurance and Real Estate: 5.5%
• Transportation, Communication, Electric, Gas and Sanitary Services: 3.6%
• Public Administration (Including Defense): 3.2%
• Retail Trade: 0.3%
• Wholesale Trade: 0.1%
Source for these statistical analyses: http://www.sei.cmu.edu/cmmi/casestudies/profiles/cmmi.cfm
Why Care about Improving Software
Engineering Performance?
• To improve software engineering cost, schedule, and quality
performance
• To improve competitive economic and military advantage
CMMI: A Strong Partner for DoD and the
Defense Industrial Base
Large or small, organizations that provide
products and services to the DoD share
common challenges, from meeting defense
software specifications and requirements, to
securing networks, to developing and
retaining a talented workforce.
CMMI helps the defense industrial base
create better systems management,
improved software engineering, more
efficient processes, and tailored training
solutions.
CMMI’s worldwide growth even in tough
economic times indicates the value of the
framework.
What Happens When Effective
Processes are Applied in an Organization?
[Chart: effects of effective processes on defects, cost, time, risk, quality, time to market, customer satisfaction, and performance.]
The CMMI Mission & Vision at the SEI
Mission
• Improve the development and acquisition of software through research, and the transition
to practice, of new, breakthrough, but proven engineering management methods.
[by proven we mean having hard data and evidence]
Vision
• Systems and software engineering management are guided by facts, models, and data
that are shown to predictably improve performance and results well beyond the limits of
current practice.
• The practice of managing engineering work is recognized to be not just the responsibility of
management, but of professionals at all levels and in every related activity.
• Professionals that are developing or acquiring systems think and manage quantitatively.
Improving Performance Requires Knowledge and Expertise

The "What" – Quality Principles: CMMI-DEV, CMMI-ACQ, CMMI-SVC, People CMM

The "How" – Appraisal Methods, Operational Practices, Improvement Techniques, Measurement and Analysis Tools: SCAMPI, TSP and PSP, Goal-Driven Measurement and Six Sigma
What processes characterize high maturity organizations?
• Organizational Process Performance
• Quantitative Project Management
• Causal Analysis and Resolution
• Organizational Innovation and Deployment
Quality and Process Performance Objectives: the engine that drives project
performance, business performance, and high maturity

Example objectives: Productivity, Defect Density, Defect Containment, CPI, SPI, Requirements Volatility
Organizational Process Performance (OPP)
The purpose of Organizational Process Performance (OPP) is to
establish and maintain a quantitative understanding of the performance
of selected processes within the organization’s set of standard
processes in support of achieving quality and process-performance
objectives, and to provide process-performance data, baselines, and
models to quantitatively manage the organization’s projects.
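As an illustration of what a process-performance baseline of the kind OPP calls for can look like, here is a minimal sketch; the measure, data values, and 3-sigma limits are our own invention, not from the source:

```python
# Hypothetical process-performance baseline: historical defect-density
# samples summarized as a mean with 3-sigma control limits, against which
# a project's performance can be quantitatively compared. Data is invented.
from statistics import mean, stdev

samples = [0.42, 0.38, 0.45, 0.40, 0.36, 0.44, 0.41]   # defects/KLOC
mu, sigma = mean(samples), stdev(samples)
ucl = mu + 3 * sigma                    # upper control limit
lcl = max(0.0, mu - 3 * sigma)          # lower control limit (floored at 0)
print(f"baseline {mu:.2f} defects/KLOC, limits [{lcl:.2f}, {ucl:.2f}]")

# A new project result outside [lcl, ucl] would trigger causal analysis.
print(lcl <= 0.39 <= ucl)               # -> True
```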
Quantitative Project Management (QPM)
The purpose of Quantitative Project Management (QPM) is to
quantitatively manage the project’s defined process to achieve the
project’s established quality and process-performance objectives.
Causal Analysis and Resolution (CAR)
The purpose of Causal Analysis and Resolution (CAR) is to identify
causes of selected outcomes and take action to improve process
performance.
Organizational Innovation and Deployment
(OID)
The purpose of Organizational Innovation and Deployment (OID) is to
proactively seek, identify, select and deploy incremental and innovative
improvements that measurably improve the organization’s processes
and technologies. The improvements support the organization’s quality
and process-performance objectives as derived from the organization’s
business objectives.
CMMI Transition Status
Reported to the SEI as of April 30, 2010
Training
• Introduction to CMMI: 115,371
• Intermediate Concepts of CMMI: 3,049
• Understanding CMMI High Maturity Practices: 595
• Introduction to CMMI V1.2 Supplement for ACQ: 1,172
• Introduction to CMMI V1.2 Supplement for SVC: 1,774
• Introduction to CMMI for Services, V1.2: 217
• CMMI Level 2 for Practitioners: 71
• CMMI Level 3 for Practitioners: 48

Certifications
• Introduction to CMMI V1.2 Instructors: 444
• CMMI-ACQ V1.2 Supplement Instructors: 63
• CMMI-SVC V1.2 Supplement Instructors: 124
• CMMI Level 2 for Practitioners: 18
• CMMI Level 3 for Practitioners: 18
• SCAMPI V1.2 Lead Appraisers: 497
• SCAMPI V1.2 High Maturity Lead Appraisers: 166
• CMMI-ACQ V1.2 Lead Appraisers: 72
• CMMI-SVC V1.2 Lead Appraisers: 134
SCAMPI v1.1/v1.2 Class A Appraisals
Conducted by Quarter Reported as of 4-30-10
[Bar chart: number of SCAMPI Class A appraisals conducted per quarter, Q1/04 through Q2/10; y-axis 0 to 500.]
Countries where Appraisals have been
Performed and Reported to the SEI
Argentina, Australia, Austria, Bahrain, Bangladesh, Belarus, Belgium, Brazil, Bulgaria, Canada, Chile, China, Colombia, Costa Rica, Czech Republic, Denmark, Dominican Republic, Egypt, Finland, France, Germany, Greece, Guatemala, Hong Kong, Hungary, India, Indonesia, Ireland, Israel, Italy, Japan, Korea (Republic of), Latvia, Lithuania, Luxembourg, Malaysia, Mauritius, Mexico, Morocco, Nepal, Netherlands, New Zealand, Norway, Pakistan, Panama, Peru, Philippines, Poland, Portugal, Romania, Russia, Saudi Arabia, Singapore, Slovakia, South Africa, Spain, Sri Lanka, Sweden, Switzerland, Taiwan, Thailand, Tunisia, Turkey, Ukraine, United Arab Emirates, United Kingdom, United States, Uruguay, Viet Nam
Number of Appraisals and Maturity Levels Reported to the SEI by Country

Number of appraisals reported, by country (all other countries listed on the previous slide reported 10 or fewer):

Argentina 77; Australia 36; Brazil 144; Canada 59; Chile 37; China 1229; Colombia 34; Egypt 43; France 168; Germany 76; Hong Kong 18; India 524; Ireland 11; Israel 19; Italy 43; Japan 306; Korea (Republic of) 165; Malaysia 71; Mexico 86; Netherlands 14; Pakistan 28; Philippines 23; Portugal 14; Russia 11; Singapore 21; Spain 180; Sri Lanka 14; Taiwan 134; Thailand 38; Turkey 16; United Kingdom 113; United States 1582; Viet Nam 17

[Table: the original also reported each country's appraisal counts by maturity level (1 through 5); the per-level breakdown is omitted here.]
Why do we need improved performance
and better processes?
- Because there is STILL a management crisis in software!
A recent Standish report confirms that the number of troubled projects rises
each year
The result?
• Losses in the millions for the government agencies and companies affected
• Leadership can be unaware of what is and is not working
• Without a robust performance management system, management is
operating without needed data to make quality decisions
Source: http://www.projectsmart.co.uk/docs/chaos-report.pdf
Measurement Challenges
There are challenges in measuring return on investment in a traditional sense
when correlating the CMMI framework to predictable performance improvement in
a given organization:
• Companies that wanted to adopt CMMI had little data related to their
conditions before starting and generally took no data to record their
investments in applying the framework.
• They rarely had any data on their organization's performance until after
they were well along the path to adherence.
• As a result, the published ROI data was generally notional and unsupported.
The Cost of Quality - Before and After
CMMI Transition
Time History – Cost
Time History – Schedule
“Cost” to Implement CMM
[Chart: cost to implement CMM for Companies A, J, F, M, C, and I.]
Recurring Costs for PI
[Chart: recurring process improvement costs for Companies H, B, M, I, D, C, and O.]
CMM Cost Details
• Company A: $4.5M, 18 months, 350 people, Level 2
  – Complete outsourcing of CMM support
• Company C: NTE 5% of budget, 1 year, 30 people, Level 3 (SW)
  – Extensive capture of cost to implement
• Company B: 2% of budget, 1 year, 560 people, Level 2, then 3
  – Extensive outsourcing of CMM support
• Company O: 1.5–2.5%, 18 months, 180 people, Level 2 (SW)
• Company J: $900K, 2 years, 400 people, Level 3 (SW)
• Company D: 2% of annual budget, 150 people, no assessment
  – 5 years to best productivity; all costs not captured
• Company F: $1M, 150 people, Level 2, then 3
• Company M: staffing at 3–5%; up to 5% for metrics expense
• Company H: 2% of budget, 60% of company at Level 4
• Company I: implementation costs $2.5M, 2.5 years, Level 3 (SW/SE), not all costs included, 15% of workforce in initial pilot. Sustainment costs: 15.25 work-years government, 4 full- and part-time contractors. 3,600 employees.
  – 50% of costs devoted to training
• This material is considered SEI-Proprietary and is distributed by the Software Engineering Institute (SEI) to SEI Staff ONLY.
• This material SHALL NOT be reproduced or used for any other purpose without requesting formal permission from the SEI at permission@sei.cmu.edu.
• THE MATERIAL IS PROVIDED ON AN "AS IS" BASIS, AND CARNEGIE MELLON DISCLAIMS ANY AND ALL WARRANTIES, IMPLIED OR OTHERWISE (INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR A PARTICULAR PURPOSE, RESULTS OBTAINED FROM USE OF THE MATERIAL, MERCHANTABILITY, AND/OR NON-INFRINGEMENT).