4 June 2010
Rick Hefner
Director, Process Assurance
Northrop Grumman Corporation
• Many published results show improved cost and schedule performance from adopting CMMI ®
• Despite these results, there is still community debate over the value of CMMI ® , and whether CMMI ® ratings provide sufficient guarantees of program performance.
• This presentation explores three factors contributing to the confusion:
– Inaccurate CMMI ® ratings
– Over-estimating the benefits that CMMI ® provides a customer
– Contractors not living up to their CMMI ® rating
Underlying CMMI ® Principles
• CMMI ® relationship to productivity, predictability and speed
Does CMMI Benefit the Customer?
How Projects Fail
How to Get Contractors to Live Up to Their CMMI Ratings
SM SCAMPI, SCAMPI Lead Appraiser, and SEI are service marks of Carnegie Mellon University.
® Capability Maturity Model Integration and CMMI® are registered in the U.S. Patent & Trademark Office.
People-Related Mistakes
1. Undermined motivation
2. Weak personnel
3. Uncontrolled problem employees
4. Heroics
5. Adding people to a late project
6. Noisy, crowded offices
7. Friction between developers and customers
8. Unrealistic expectations
9. Lack of effective project sponsorship
10. Lack of stakeholder buy-in
11. Lack of user input
12. Politics placed over substance
13. Wishful thinking
Process-Related Mistakes
14. Overly optimistic schedules
15. Insufficient risk management
16. Contractor failure
17. Insufficient planning
18. Abandonment of planning under pressure
19. Wasted time during the fuzzy front end
20. Shortchanged upstream activities
21. Inadequate design
22. Shortchanged quality assurance
23. Insufficient management controls
24. Premature or too frequent convergence
25. Omitting necessary tasks from estimates
26. Planning to catch up later
27. Code-like-hell programming
Reference: Steve McConnell, Rapid Development
Product-Related Mistakes
28. Requirements gold-plating
29. Feature creep
30. Developer gold-plating
31. Push me, pull me negotiation
32. Research-oriented development
Technology-Related Mistakes
33. Silver-bullet syndrome
34. Overestimated savings from new tools or methods
35. Switching tools in the middle of a project
36. Lack of automated source-code control
Standish Group survey of 13,000 projects (2003)
• 34% successes
• 15% failures
• 51% overruns
• Which weaknesses are causing my problems?
• Which strengths may mitigate my problems?
• Which improvement investments offer the best return?
[Figure: the many dimensions of a project – People, Business Environment, Management, Structure, Tools, Process, Product, Methods, Technology – contrasted with the temptation to look for one solution. Two complementary improvement approaches: Data-Driven (e.g., Six Sigma, Lean) and Model-Driven (e.g., CMM ® , CMMI ® ).]
• Clarify what your customer wants (Voice of Customer)
– Critical to Quality (CTQs)
• Determine what your processes can do (Voice of Process)
– Statistical Process Control
• Identify and prioritize improvement opportunities
– Causal analysis of data
• Determine where your customers/competitors are going (Voice of Business)
– Design for Six Sigma
• Determine the industry best practice
– Benchmarking, models
• Compare your current practices to the model
– Appraisal, education
• Identify and prioritize improvement opportunities
– Implementation
– Institutionalization
• Look for ways to optimize the processes
A model is a simplified representation of the world. Capability Maturity Models (CMM ® s) contain the essential elements of effective processes for one or more bodies of knowledge. These elements are based on the concepts developed by Crosby, Deming, Juran, and Humphrey.
- Introduction, CMMI ®
• CMMI ® provides a model of industry best practices
• Following these practices has been shown, when properly applied, to produce software and systems faster, better, and cheaper
• The main benefits cited by CMMI ® users are:
– More predictable adherence to budgets and schedules
– Reduced re-work (which can reduce cost and schedule)
– Reduced risk
• Process maturity gets at one source of the problem, e.g.,
– Are we using proven industry practices?
– Does the staff have the resources needed to execute the process?
– Is the organization providing effective project support?
• The main benefits typically seen are:
– Improved predictability of project budgets and schedules
– Improved management awareness of problems
– Reduced re-work, which improves predictability, cost, and schedule
J. Herbsleb and D. Zubrow, “Software Process Improvement: An Analysis of Assessment Data and Outcomes”
– 13 organizations
– ROI of 4:1 to 9:1
– Improved quality, error rates, time to market, productivity
R. Dion, “Process Improvement and the Corporate Balance Sheet”
– ROI of 7.7:1: Reduced rework, improved quality
– Two-fold increase in productivity
Underlying CMMI ® Principles
Does CMMI Benefit the Customer?
• Cost of implementing CMMI-compliant processes
• Timelines for impacting program performance
• Practical tips and techniques for realizing the benefits
How Projects Fail
How to Get Contractors to Live Up to Their CMMI Ratings
[Figure: categories of CMMI ® benefits – Project Performance, Organizational Performance, Quality/Rework, and Institutionalization.]
Rick Hefner, “Achieving the Promised Benefits of CMMI,” CMMI Technology Conference & User Group, Denver, CO, 14-17 Nov 2005
• Project performance problems often arise because of incomplete or unrealistic planning
– Forgotten activities
– Unconscious decisions
– Overly-optimistic estimates
• When cost/schedule pressure arises, people abandon the plans, leading to more problems
– Individual judgment versus best use of resources
CMMI
• Identifies the elements of good planning
– Proven engineering processes
– Estimates based on historical data, using these processes
• When cost/schedule pressure arises, CMMI ® practices track and correct
– Reactive (L2)
– Proactive, risk management (L3)
– Quantitative management (L4)
• QA and management ensure that processes and plans are followed
Train project managers on how to use the tools (estimation, earned value, risk management)
Project managers (not organizational staff) must be responsible for implementing the improved processes
Demand realistic, data-driven estimates
• Each project’s processes are unique
– Personnel must re-learn with each project
– Difficulty moving people from project to project
– Historical data of little use in estimation
• No way to compare project-to-project
– Which process was best?
– What did we learn?
CMMI
• Standard organizational process, tailored to fit each project
– Can be documented, trained, supported by templates
– Over time, people learn the process
• Common processes/measures allow better use of historical data
– Calibrate cost estimation models
– Project to project comparisons
– Over time, the organization can optimize the process
Develop organizational processes that fit the full range of your projects (small/large, all life cycles and project types)
Capture and use historical data (measurement repository); see the sketch below for one way such data can be put to work
Capture and share project documents (process asset library)
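To make the historical-data point concrete, here is a minimal sketch (not from the briefing) of calibrating a simple power-law effort model from a measurement repository. The project sizes, efforts, and the model form are illustrative assumptions.

```python
import math

# Hypothetical history from a measurement repository:
# (size in KSLOC, actual effort in staff-months) for completed projects.
history = [(12, 40), (25, 95), (40, 170), (60, 260), (90, 420)]

# Fit log(effort) = log(a) + b*log(size) by ordinary least squares.
xs = [math.log(s) for s, _ in history]
ys = [math.log(e) for _, e in history]
n = len(history)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum((x - mean_x) ** 2 for x in xs)
a = math.exp(mean_y - b * mean_x)

def estimate_effort(ksloc):
    """Effort estimate (staff-months) from the calibrated power-law model."""
    return a * ksloc ** b

print(f"calibrated model: effort = {a:.2f} * size^{b:.2f}")
print(f"estimate for a 50 KSLOC project: {estimate_effort(50):.0f} staff-months")
```

Any curve form could be substituted; the point is that common measures across projects make such calibration possible at all.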
• Focus on “faster and cheaper” leads to skipping of essential steps
– Key steps are not obvious, often counterintuitive
• Fixing latent defects often accounts for 30-40% of project cost
– The cost of defects (rework) is seldom measured
CMMI
• A disciplined engineering and management process
– Do it right the first time
– CMMI identifies the essential steps
• Peer reviews find defects early, where it is cost effective to fix them
– Requirements, designs, code, plans, etc.
– Often more efficient and effective than testing
– Many types (Fagan inspections, walkthroughs, desk checks, etc.)
Focus on eliminating defects, not on faster and cheaper
Measure the cost of finding and fixing defects (one hypothetical sketch follows below)
Invest time in learning different methods of peer review and when each is effective
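As a rough illustration of measuring defect cost, the sketch below totals rework hours by detection phase. The per-defect fix costs and defect counts are assumed numbers, not data from the briefing; an organization would substitute its own.

```python
# Hypothetical per-defect fix costs (hours) by detection phase, and defect
# counts for one project; the totals show where rework cost concentrates.
fix_cost_hours = {"peer review": 1.5, "unit test": 4, "system test": 12, "field": 40}
defects_found = {"peer review": 120, "unit test": 45, "system test": 30, "field": 5}

total = 0.0
for phase, count in defects_found.items():
    cost = count * fix_cost_hours[phase]
    total += cost
    print(f"{phase:12s}: {count:4d} defects, {cost:7.1f} hours of rework")
print(f"total rework: {total:.1f} hours")
```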
• Some improvement efforts focus on quick fixes
– Driven by yearly budget cycles
– Expectation that results will be immediate
• It is tempting to reduce overhead to reduce cost
– Training
– Staff support to projects
– Use of outside process experts
CMMI
• Short-term investment for long-term gain
– Initial investment in the cost of change, learning curve, new overhead structures
– Long-term benefits in increased productivity
• Organizational infrastructure exists to support the policies and process
– Measurement repositories
Expect 18-24 months before benefits begin to be realized
Senior management must demand that everyone follow the new processes
QA can be the organization’s strongest tool – if they are focused!
• The typical benefits are:
– Reduced cost
– Faster schedules
– Greater productivity
– Higher quality
– Increased customer satisfaction
• Over 40 published studies on the benefits of SW-CMM ®
– DoD DACS website: http://www.thedacs.com/databases/roi/
• Similar results starting to be seen for CMMI ®
– “Demonstrating the Impact and Benefits of CMMI: An Update and Preliminary Results,” Software Engineering Institute, CMU/SEI-2003-SR-009, Oct 2003
– http://www.sei.cmu.edu/cmmi/results/results-by-category.html
• Reduced Costs
– 33% decrease in the average cost to fix a defect (Boeing)
– 20% reduction in unit software costs (Lockheed Martin)
– Reduced cost of poor quality from over 45 percent to under 30 percent over a three-year period (Siemens)
– 10% decrease in overall cost per maturity level (Northrop Grumman)
• Faster Schedules
– 50% reduction in release turnaround time (Boeing)
– 60% reduction in re-work following test (Boeing)
– Increase from 50% to 95% in the number of milestones met (General Motors)
• Greater Productivity
– 25-30% increase in productivity within 3 years (Lockheed Martin, Harris, Siemens)
• Higher Quality
– 50% reduction of software defects (Lockheed Martin)
• Customer Satisfaction
– 55% increase in award fees (Lockheed Martin)
• Both theoretical models and industry data suggest that CMMI-compliant projects achieve a cost reduction of roughly 10% per maturity level, i.e., a Level 3 project is about 20% cheaper than a Level 1 project (a rough sketch of this arithmetic follows the figure below)
– The key is reducing rework
• Knox Model – Theoretical Benefits
[Figure: Knox model of the total cost of software quality (TCoSQ) – prevention, appraisal, internal failure, and external failure costs – plotted against SEI CMM Levels 1-5, with the vertical axis running from 0 to 60.]
• COCOMO predicts similar benefits based on current industry data
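The "10% per level" claim is simple arithmetic; this illustrative sketch shows the projected relative cost both as a straight additive reduction, which matches the briefing's "Level 3 is 20% cheaper than Level 1," and as a compounding reduction for comparison. The numbers are not measured data.

```python
# Relative project cost by maturity level, with Level 1 = 100.
# The additive view matches "Level 3 is 20% cheaper than Level 1";
# the compounding view is shown only for comparison.
baseline = 100.0
for level in range(1, 6):
    additive = baseline * (1 - 0.10 * (level - 1))
    compounded = baseline * (0.90 ** (level - 1))
    print(f"Level {level}: additive {additive:5.1f}, compounded {compounded:5.1f}")
```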
• Some organizations are driven to achieve a maturity level only for its marketing value
Improvement goals are not set realistically (“Level 5 in ’05”)
Focus on passing the appraisal, not understanding and deciding among possible interpretations
Only some of the projects participate in the improvement effort
Practitioners/customers perceive CMMI as more expensive
Only some of the projects get appraised
The remaining projects don’t implement
Insufficient resources (e.g., training, QA, metrics, consultants)
People don’t learn or become proficient in the new behaviors
Management doesn’t enforce using processes on new programs
Benefits are not realized because projects do not start up effectively
Rick Hefner, “CMMI Horror Stories: When Good Projects Go Bad,” Software Engineering Process Group Conference, 6-9 March 2006
Decisions made on the basis of maturity level ratings are only valid if the ratings are based on known criteria.
SCAMPI A Method Description Document
• A CMMI appraisal indicates the organization’s capacity to perform the next project, but cannot guarantee that each new project will perform in that way
• The CMMI methodology assumes the organization will propagate its processes to every new project
– An organization that gets appraised solely to demonstrate a maturity level might not have that intent
– Organizations may not have developed the skills to roll out their processes effectively
• A CMMI appraisal judges the maturity of the organization’s processes – based upon the projects sampled
– New projects must embrace the new processes
Organizational process performance: more accurate estimates
Quantitative project management: problem behaviors are recognized faster, enabling quicker resolution
Organizational innovation and deployment: the project benefits from improvements found and proven on other projects
Causal analysis: the project fixes the source of defects to prevent future defects
Better Products and Services Produced Faster And Cheaper
Rick Hefner, “How Does High Maturity Benefit the Customer?,” Systems & Software Technology Conference, 18-22 April 2005
… Does its performance and quality meet my customer’s expectations?
… If not, how should I tailor the process?
Managing by Variation
[Figure: control chart of an example review measure over Observation Numbers 1-10, with Mean = 4.799, UCL = 12.30, and LCL = -2.705; the band between the limits is the expected variation, and out-of-limit points trigger corrective and preventative actions.]
• Useful in evaluating future reviews
– Was the review effective?
– Was the process different?
– Is the product different?
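As a minimal sketch of how such control limits can be derived, the following uses an individuals (XmR) chart calculation on hypothetical peer review data. The observations and the choice of chart type are assumptions, not the briefing's actual method.

```python
# Minimal XmR (individuals/moving-range) chart sketch on hypothetical data:
# control limits = mean +/- 2.66 * average moving range.
observations = [6, 3, 5, 7, 2, 4, 8, 5, 3, 6]  # e.g., defects found per peer review

mean = sum(observations) / len(observations)
moving_ranges = [abs(b - a) for a, b in zip(observations, observations[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

ucl = mean + 2.66 * avg_mr  # upper control limit
lcl = mean - 2.66 * avg_mr  # lower control limit (often floored at 0 for counts)

print(f"mean = {mean:.3f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
out_of_limits = [x for x in observations if not (lcl <= x <= ucl)]
print("observations needing causal analysis:", out_of_limits)
```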
• Most customers care about:
– Delivered defects
– Cost and schedule
• So organizations try to predict:
– Defects found throughout the lifecycle
– Effectiveness of peer reviews, testing
– Cost achieved/actual (Cost Performance Index – CPI)
– Schedule achieved/actual (Schedule Performance Index – SPI), as illustrated in the sketch below
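A brief, self-contained example of the two indices named above, using the standard earned-value definitions (CPI = earned value / actual cost, SPI = earned value / planned value). All figures are hypothetical.

```python
# Hypothetical earned-value figures for one reporting period.
planned_value = 950_000    # budgeted cost of work scheduled (BCWS)
earned_value = 900_000     # budgeted cost of work performed (BCWP)
actual_cost = 1_000_000    # actual cost of work performed (ACWP)

cpi = earned_value / actual_cost     # < 1.0: over cost
spi = earned_value / planned_value   # < 1.0: behind schedule

print(f"CPI = {cpi:.2f}, SPI = {spi:.2f}")
```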
[Figure: Defect Detection Profile – defects found (0 to 180) by phase (Req'mts, Design, Code, Unit Test, Integrate, Sys Test, Del, 90 Days), comparing All Projects with the New Process.]
• Determine whether processes are behaving consistently or have stable trends (i.e., are predictable)
• Identify processes where the performance is within natural bounds that are consistent across process implementation teams
• Establish criteria for identifying whether a process or process element should be statistically managed, and determine pertinent measures and analytic techniques to be used in such management
• Identify processes that show unusual (e.g., sporadic or unpredictable) behavior
• Identify any aspects of the processes that can be improved in the organization's set of standard processes
• Identify the implementation of a process which performs best
• Six Sigma is an enabler for higher maturity
– Focus on data, measurement systems, process improvement
– Tying improvements to business goals
– Tools and methods support the Level 4/5 analysis tasks
• Level 3 metrics, measurement processes, and goal setting are generally inadequate for Levels 4 and 5
– Better definitions of the measures
– Lower level metrics of lower level subprocesses
• Having all the tools at Level 5 gives you the insight to manage each project the way the customer needs it to be managed
Underlying CMMI ® Principles
Does CMMI Benefit the Customer?
How Projects Fail
• Start up problems
• Appraisal inaccuracies
How to Get Contractors to Live Up to Their CMMI Ratings
• The projects within the organization may not live up to the capability
– Start-up problems with planning, subcontractors, and infrastructure
– Problems with staffing, either as the prime or with subcontractors
– Differences in domain experience
– Back-sliding
• The appraisal results may not be an accurate reflection of the organization’s capability
– Sampling bias
– Appraisal inaccuracies
– Organization’s inability to immediately apply their appraised processes
• Note that all of these problems are equally possible with both the staged and continuous representations
• Many process-related problems arise in the first few months of a project
– New relationships are established
– Personnel changes and shortfalls
– Pressure to produce quickly
– Gaps between the planned processes and what was bid
• If a project is going to live up to the organization’s process capability, it is essential to fully implement the processes from the beginning
– Processes should be defined during the proposal, by tailoring the organization’s standard process
– Estimates should be based on historical data from the organization’s measurement repository
– Process assets (e.g., templates) should support detailed planning to ensure consistency with the organization’s best practices
– Evidence reviews should be used to ensure CMMI compliance
• The CMMI generic practices ensure that processes are institutionalized – sustained over time
• The approach for implementing the generic practices must reflect:
– Efficiency
– Effectiveness
– Applicability to ALL projects
• Frequent appraisals can be used to assess the effectiveness of the institutionalization
Geoff Draper and Rick Hefner, “Applying CMMI Generic Practices with Good Judgment,” SEPG Conference, 2004.
Commitment to Perform (policies and sponsorship)
– GP 2.1 Establish an Organizational Policy
Ability to Perform (project and/or organizational resources)
– GP 2.2 Plan the Process
– GP 2.3 Provide Resources
– GP 2.4 Assign Responsibility
– GP 2.5 Train People
– GP 3.1 Establish a Defined Process
Directing Implementation (managing performance of the process)
– GP 2.6 Manage Configurations
– GP 2.7 Identify/Involve Relevant Stakeholders
– GP 2.8 Monitor and Control the Process
– GP 3.2 Collect Improvement Info.
Verifying Implementation (management review, process conformance)
– GP 2.9 Objectively Evaluate Adherence
– GP 2.10 Review Status with Higher Level Management
The size and number of instantiations investigated should be selected to form a valid sample of the organizational unit to which the results will be attributed.
- SCAMPI A Method Description Document
• The Lead Appraiser is permitted to select sample projects as “representative” of the organization as a whole
– Little guidance in the MDD
– Wide variation among Lead Appraisers
Only some of the projects get appraised
The remaining projects don’t implement
• If an organization is interested only in a good appraisal result, it may appraise a large organization with a handful of sample projects, and/or exclude or hide inferior projects
• This potential abuse exists with both staged and continuous representations
• An organization with 50 projects at multiple sites may select 4-5 sample projects
• Are the appraisal results representative of the organization? (see the sketch after the figure below)
[Figure: notional organization chart – a Sector containing Division A (with Sites A1 and A2), Division B, and Division C, together holding roughly 15 projects, a handful of which are highlighted as the sampled projects.]
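To make the sampling concern concrete, here is a small hypothetical sketch: with 50 projects and a 5-project sample, only 10% of projects are examined. The division names and project counts are assumptions, and the stratified draw is just one way a sample could be spread across the organization.

```python
# Hypothetical sampling illustration: 50 projects across three divisions,
# with only 5 appraised.  A stratified random draw spreads the sample,
# but coverage is still just 10% of the organization.
import random

projects_by_division = {"Division A": 20, "Division B": 18, "Division C": 12}
sample_size = 5
total = sum(projects_by_division.values())
print(f"coverage: {sample_size}/{total} = {sample_size / total:.0%} of projects appraised")

# One stratified draw: at least one project per division, remainder at random.
random.seed(0)
pool = [f"{div} project {i + 1}" for div, n in projects_by_division.items() for i in range(n)]
sample = [random.choice([p for p in pool if p.startswith(div)]) for div in projects_by_division]
remaining = [p for p in pool if p not in sample]
sample += random.sample(remaining, sample_size - len(sample))
print("sampled:", sample)
```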
• Methodology
– SCAMPI A appraisals are the only approach that provides benchmark quality appraisal results
– SCAMPI B, C, and other appraisal methods may be useful, but they are not designed to provide the same accuracy
• Appraiser Skill
– There is wide variation in appraiser skill, experience and insight
– Although appraisal experience is a crucial contributor to accuracy, the appraisal methods do little to ensure sufficient experience
– There is also wide variation in how the model is interpreted
• Appraiser Independence
– Appraiser independence is needed to ensure unbiased results
– It is difficult to establish a completely independent situation
The ADS is a summary statement describing the appraisal results that includes the conditions and constraints under which the appraisal was performed. It contains information considered essential to adequately interpret the meaning of assigned maturity level or capability level ratings.
- SCAMPI A Method Description Document
• The Appraisal Disclosure Statement (ADS) provides keys to assessing an appraisal’s accuracy
– Organizational unit appraised (the unit to which the ratings are applicable and the domains examined)
– Appraisal team leader and appraisal team members and their organizational affiliations
– Process areas rated and process areas not rated
– Dates of on-site activity
• Not included - sampling approach or percentage of projects sampled
• SCAMPI A Appraisal Disclosure Statement
– Organizational unit appraised
– Appraisal team leader affiliation
– Process areas rated and not rated
– Dates of on-site activity
• Explanation of sampling approach used in appraisal
• Approach to be used to ensure proper project start-up
• Data to demonstrate the speed with which new projects adopt and execute the organization’s processes
• Approach to be used to prevent back-sliding
Underlying CMMI ® Principles
Does CMMI Benefit the Customer?
How Projects Fail
How to Get Contractors to Live Up to Their CMMI Ratings
• Contenders and Pretenders
• There is a marked difference between organizations that truly want to implement CMMI ® , and those who simply want a “certificate”
• Contenders invest time and energy on understanding the industry best practices in the model, fitting them to their projects and organization, and improving their effectiveness and efficiency
• Pretenders simply do enough to convince an appraiser to give them the maturity level -- along the way, they de-motivate their staff with bureaucratic processes, disappoint their customers with inconsistent performance, and generally give the model a bad name
Assuming the contractor’s CMMI ® rating is accurate, and applicable to the team doing the work, where could problems arise?
• Areas outside of the CMMI ®
• Start-up problems
• Back-sliding
[Figure: areas outside of the CMMI ® – beyond Process, program success also depends on People (domain knowledge, sufficient quantity, motivation) and Technology (domain-specific maturity, tools).]
1. Lack of awareness of the importance, value, timing, accountability, and organizational structure of SE on programs
2. Adequate, qualified resources are generally not available within Government and industry for allocation on major programs
3. Insufficient SE tools and environments to effectively execute SE on programs
4. Requirements definition, development and management is not applied consistently and effectively
5. Poor initial program formulation
“Top Five Systems Engineering Issues In Defense Industry”, NDIA Systems Engineering Division Task Group Report, Jan 2003
1. The impact of requirements upon software is not consistently quantified and managed in development or sustainment
2. Fundamental system engineering decisions are made without full participation of software engineering.
3. Software life-cycle planning and management by acquirers and suppliers is ineffective.
4. The quantity and quality of software engineering expertise is insufficient to meet the demands of government and the defense industry.
5. Traditional software verification techniques are costly and ineffective for dealing with the scale and complexity of modern systems.
6. There is a failure to assure correct, predictable, safe, secure execution of complex software in distributed environments.
7. Inadequate attention is given to total lifecycle issues for COTS/NDI impacts on lifecycle cost and risk.
“Top Software Engineering Issues In Defense Industry”, NDIA Systems Engineering Division and Software Committee, Sep 2006
• Project Planning starts in the proposal phase, is refreshed at contract start, and re-occurs throughout the project lifecycle
• Contenders extend their CMMI practices to proposal teams and re-planning efforts
• Pretenders focus on contract start
– Costs and schedules defined at proposal time may be immature and overly aggressive
– Re-planning may be ad hoc
• Mature estimates may also be overruled by business interests
SG 1 Establish Estimates
Estimates of project planning parameters are established and maintained.
SP 1.1 Estimate the Scope of the Project
Establish a top-level work breakdown structure (WBS) to estimate the scope of the project.
SP 1.2 Establish Estimates of Work Product and Task Attributes
Establish and maintain estimates of the attributes of the work products and tasks.
SP 1.3 Define Project Lifecycle
Define the project life-cycle phases upon which to scope the planning effort.
SP 1.4 Determine Estimates of Effort and Cost
Estimate the project effort and cost for the work products and tasks based on estimation rationale.
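As one hypothetical way the practices above can fit together (a sketch, not a CMMI-mandated method), the snippet below rolls up an effort estimate from WBS elements using an assumed historical productivity figure. The WBS names, sizes, productivity, and overhead factor are all illustrative.

```python
# Hypothetical bottom-up estimate: size each WBS element, convert to effort
# with a historical productivity figure, and add a rework/overhead allowance.
wbs_sizes_ksloc = {                 # assumed top-level WBS and sizes
    "requirements database": 3.0,
    "sensor interface": 8.5,
    "mission planner": 14.0,
    "operator displays": 6.5,
}
productivity_ksloc_per_sm = 0.35    # assumed historical productivity (KSLOC per staff-month)
overhead_factor = 1.25              # assumed allowance for integration, rework, management

effort_sm = {name: size / productivity_ksloc_per_sm for name, size in wbs_sizes_ksloc.items()}
total_sm = sum(effort_sm.values()) * overhead_factor

for name, sm in effort_sm.items():
    print(f"{name:22s}: {sm:5.1f} staff-months")
print(f"total (with overhead): {total_sm:.1f} staff-months")
```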
SG 2 Develop a Project Plan
A project plan is established and maintained as the basis for managing the project.
SP 2.1 Establish the Budget and Schedule
Establish and maintain the project’s budget and schedule.
SP 2.2 Identify Project Risks
Identify and analyze project risks.
SP 2.3 Plan for Data Management
Plan for the management of project data.
SP 2.4 Plan for Project Resources
Plan for necessary resources to perform the project.
SP 2.5 Plan for Needed Knowledge and Skills
Plan for knowledge and skills needed to perform the project.
SP 2.6 Plan Stakeholder Involvement
Plan the involvement of identified stakeholders.
SP 2.7 Establish the Project Plan
Establish and maintain the overall project plan content.
SG 3 Obtain Commitment to the Plan
Commitments to the project plan are established and maintained.
SP 3.1 Review Plans that Affect the Project
Review all plans that affect the project to understand project commitments.
SP 3.2 Reconcile Work and Resource Levels
Reconcile the project plan to reflect available and estimated resources.
SP 3.3 Obtain Plan Commitment
Obtain commitment from relevant stakeholders responsible for performing and supporting plan execution.
• Ask suppliers to show how they extend the CMMI practices to proposal activities
• Request planning documents with the proposal
• During re-planning, ask suppliers to show how they performed the CMMI practices
Institutionalization: The ingrained way of doing business that an organization follows routinely as part of its corporate culture.
- CMMI-DEV v1.2
When mentioned in the generic goal and generic practice descriptions, institutionalization implies that the process is ingrained in the way the work is performed and there is commitment and consistency to performing the process.
An institutionalized process is more likely to be retained during times of stress.
GG 2 Institutionalize a Managed Process
GP 2.1 Establish an Organizational Policy
GP 2.2 Plan the Process
GP 2.3 Provide Resources
GP 2.4 Assign Responsibility
GP 2.5 Train People
GP 2.6 Manage Configurations
GP 2.7 Identify and Involve Relevant Stakeholders
GP 2.8 Monitor and Control the Process
GP 2.9 Objectively Evaluate Adherence
GP 2.10 Review Status with Higher Level Management
GG 3 Institutionalize a Defined Process
GP 3.1 Establish a Defined Process
GP 3.2 Collect Improvement Information
Commitment to Perform
– GP 2.1 Establish an Organizational Policy
Ability to Perform
– GP 2.2 Plan the Process
– GP 2.3 Provide Resources
– GP 2.4 Assign Responsibility
– GP 2.5 Train People
– GP 3.1 Establish a Defined Process
Directing Implementation
– GP 2.6 Manage Configurations
– GP 2.7 Identify and Involve Relevant Stakeholders
– GP 2.8 Monitor and Control the Process
– GP 3.2 Collect Improvement Information
Verifying Implementation
– GP 2.9 Objectively Evaluate Adherence
– GP 2.10 Review Status with Higher Level Management
• Fully support the CMMI ® -based improvement program by providing training, templates, tools, process asset libraries, measurement repositories, and other work aids focused on improving the ability of practitioners to competently adopt the model
• Largely ignore organizational support, often to save money
• Where required by the model, they establish process asset libraries and measurement repositories, but they are largely shelfware
[Figure: elements of organizational process infrastructure – policies, processes, templates & tools; process group; training program; process improvement; measurement repositories; predictive modeling; best-practice libraries; audits & appraisals; communications.]
Developing and maintaining mature processes requires significant time and investment in infrastructure
A pattern of shared basic assumptions that the group learned as it solved its problems of external adaptation and internal integration, that has worked well enough to be considered valid and, therefore, to be taught to new members as the correct way to perceive, think, and feel in relation to those problems.
• Artifacts
– The practices that can be observed in such areas as dress code, leadership style, communication processes
• Espoused values
– The elements the organization says it believes in, the factors that it says influence the practices in which it engages
• Basic underlying assumptions
– Unstated beliefs the organization has come to accept and abide by
Organizational Culture & Leadership, Edgar H. Schein, used with permission
• Understands the key messages
• Is willing to take actions to reinforce them
• Provides resources to support/sustain process improvement efforts
• Sets expectations that essential project functions will be funded and processes will be followed
– Project planning, estimation, tailoring, CM, QA, etc.
• Supports process improvement and sustainment, rather than passing appraisals
• Rewards mature process development and sustainment rather than individual heroics
Rick Hefner, “Sustaining CMMI Compliance,” 2006 CMMI Technology Conference and User Group
• Ask suppliers to show how they perform the CMMI generic practices
• When problems occur, ask why the CMMI practices were not effective in sustaining the desired behavior, and what will be done to prevent future problems
• There is a marked difference between organizations that truly want to implement CMMI ® , and those who simply try to get a “certificate”
• By discussing the differences, we hope to help the CMMI ® community realize the true value of CMMI ®