Notes_010
SE 3730 / CS 5730 – Software Quality
1 Standards and Certifications
SEI CMMI http://www.sei.cmu.edu/cmm/
CMMI, 2nd Edition; Chrissis, Konrad, and Shrum; Addison-Wesley, 2007
1.1 Prior to CMMI there was CMM.
CMM (Capability Maturity Model) preceded CMMI (Capability Maturity Model Integration); the first release of CMMI (v1.02) appeared in 2000. It took a few years for the first CMMI Level 5 company to appear after the standard was published.
1.2 CMMI Certification
Why is it important?
 a technical qualification for government work
 inclusion on preferred-vendor lists
 hopefully, better quality software
Figure: CMU-SEI data showing improvements across 30 different organizations.
Figure: CMU-SEI data on Motorola Global Systems Group Russia, a Level 5 Maturity organization.
Figure: CMU-SEI data showing a drop in defects with increasing maturity.
Figure: CMU-SEI data on Warner Robins Logistics Center schedule performance.
1.3 CMMI Process Categories and Areas
Humphrey observed during his research that successful organizations share a common set of capabilities.
 He classified these capabilities into 25 process areas, which can be arranged into four Process Categories:
o Project Management,
o Process Management,
o Engineering, and
o Support.
 Each Process Area has Specific Goals (SGs) – these are characteristics that must be present to satisfy the Process Area.
 Each Specific Goal has Specific Practices (SPs) – these are activities that are expected to result in achievement of the SGs; in other words, they are the evidence that the SGs have been met.
 There can also be Generic Goals (GGs) and their Generic Practices (GPs) – these are like SGs and SPs except they are not unique to a single Process Area.
 Each Specific Practice can have Typical Work Products – these describe typical outputs from the Specific Practice.
 Each Specific Practice can also have Subpractices that provide guidelines to help interpret and implement goals and practices.
Eventually the 25 process areas were narrowed to 23, and then to the 22 (V1.2) Process Areas below, shown with the Process Category to which each belongs.
Maturity Levels and their Process Areas by Process Category (CMMI V1.2, staged):
Level 1: Initial – (no process areas)
Level 2: Managed
o Project Management: Project Planning (PP), Project Monitoring and Control (PMC), Supplier Agreement Management (SAM)
o Engineering: Requirements Management (REQM)
o Support: Configuration Management (CM), Process and Product Quality Assurance (PPQA), Measurement and Analysis (MA)
Level 3: Defined
o Process Management: Organizational Process Focus (OPF), Organizational Process Definition (OPD) + Integrated Product and Process Development (IPPD), Organizational Training (OT)
o Project Management: Integrated Project Management (IPM) + IPPD, Risk Management (RSKM)
o Engineering: Requirements Development (RD), Technical Solution (TS), Product Integration (PI), Verification (VER), Validation (VAL)
o Support: Decision Analysis and Resolution (DAR)
Level 4: Quantitatively Managed
o Process Management: Organizational Process Performance (OPP)
o Project Management: Quantitative Project Management (QPM)
Level 5: Optimizing
o Process Management: Organizational Innovation and Deployment (OID)
o Support: Causal Analysis and Resolution (CAR)
1.4 Maturity and Capability Levels
Maturity starts at level one and culminates at level five. CMMI has the following Maturity Levels:
 Level 5 – Optimizing
 Level 4 – Quantitatively Managed
 Level 3 – Defined
 Level 2 – Managed
 Level 1 – Initial
With increasing level, the risk of building the wrong product decreases, and quality and productivity improve. All maturity levels consist of specific and generic practices and goals for the process areas.
There are six Capability Levels:
 Level 5 – Optimizing
 Level 4 – Quantitatively Managed
 Level 3 – Defined
 Level 2 – Managed
 Level 1 – Performed
 Level 0 – Incomplete
The capability levels are applied to each process area.
 Each process area runs through the capability levels from zero to five.
 In contrast to the maturity levels, the capability level of a process area can progress independently of other process areas.
o Ex: A company can be at different capability levels for Project Management, Process Management, Engineering, and Support. The maturity level is based on the lowest capability level in the organization.
 At level zero an organization does not possess or apply any part of a process area; a process area in such an incomplete state is at capability level zero.
 There are two approaches to using CMMI, Continuous and Staged (a small sketch of the staged rating rule follows this list):
o Continuous: An organization can progress on each Process Category and Process Area in a way that optimizes the impact on the organization. The organization is described at a level for each Process Category; for instance, it might be Level 2 for Project Management and Level 3 for Engineering.
o Staged: An organization is rated on all Process Categories and Areas. It is rated at the lowest level that any Process Area has obtained.
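As a minimal sketch of the staged rating rule described above (the process-area names and levels here are hypothetical example data, and this is an illustration of the "lowest level" idea, not an official appraisal algorithm):

import statistics  # only the built-in min() is actually needed

# Hypothetical capability levels for a few Level 2 process areas.
capability = {
    "PP": 3, "PMC": 3, "SAM": 2,   # Project Management
    "REQM": 3,                     # Engineering
    "CM": 3, "PPQA": 2, "MA": 3,   # Support
}

def staged_maturity_level(capability_levels):
    # Staged rule per the notes: the organization is rated at the
    # lowest level any process area in scope has obtained.
    return min(capability_levels.values())

print(staged_maturity_level(capability))  # -> 2: SAM and PPQA hold the rating down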
2 Maturity Level Descriptions
The material below is from a chart published by SEI, CMMI-SE/SW/IPPD/SS V1.2, 2007.
2.1 Level 1: Initial
There is no formal process, and success can be attributed to the heroics of a few engineers.
For $10 I’ll certify anyone at Level 1.
The first level, the initial level, does not really involve improving the quality of software.
 Usually, at this level, conditions are chaotic and few, if any, processes are defined.
 There is no organized management for the project.
 Success of a project depends on individual heroes, and the project cannot be repeated in the same way again.
 Few general tools are used for projects at this level.
 At this level no measures are taken to control and plan projects and their success.
 At this level projects often run out of time and budget.
2.2 Level 2: Managed
At Level 2 (Managed) there is a minimal process, and the status of projects is visible to management at major milestones. The process varies from project to project.
Repeatable. Basic project management processes are established to
 track cost,
 track schedule, and
 track functionality.
The key process areas at Level 2 focus on the software project's concerns related to establishing basic project management controls.
The necessary process discipline is in place to repeat earlier successes on projects with similar applications.
The managed level focuses on organizing project management and project processes.
 The scope of the managed level is restricted to individual projects – not all projects within an organization.
 The processes which need to be used for a project are determined and applied.
 The success of these processes is measured and controlled during the entire project.
 These projects work with milestones to make the project’s results and success visible to management at given times (there is no real-time visibility of performance).
 Work results are regularly controlled.
 Products as well as services stay within determined standards and procedures.
Level 2: Managed
o Process Management: (none)
o Project Management: Project Planning (PP), Project Monitoring and Control (PMC), Supplier Agreement Management (SAM)
o Engineering: Requirements Management (REQM)
o Support: Configuration Management (CM), Process and Product Quality Assurance (PPQA), Measurement and Analysis (MA)
Note there is NO Process Management needed here. This is because each project does its own thing.
2.2.1 Level 2 Project Management
Project Planning (PP)
 SG 1: Establish Estimates
o SP 1.1 Estimate the Scope of a Project
o SP 1.2 Establish Estimates of Work Product and Task Attributes
o SP 1.3 Define Project Lifecycle
o SP 1.4 Determine Estimates of Effort and Cost
 SG 2: Develop a Project Plan
o SP 2.1 Establish the Budget and Schedule
o SP 2.2 Identify Project Risks
o SP 2.3 Plan for Data Management
o SP 2.4 Plan for Project Resources
o SP 2.5 Plan for Needed Knowledge and Skills
o SP 2.6 Plan Stakeholder Involvement
o SP 2.7 Establish the Project Plan
 SG 3: Obtain Commitment to the Plan
o SP 3.1 Review Plans That Affect the Project
o SP 3.2 Reconcile Work and Resources
o SP 3.3 Obtain Plan Commitment
 Estimates are documented for planning and tracking
 Project activities are planned and documented
Project Monitoring and Control (PMC)
 SG 1: Monitor Project Against Plan
o SP 1.1 Monitor Project Planning Parameters
o SP 1.2 Monitor Commitments
o SP 1.3 Monitor Project Risks
o SP 1.4 Monitor Data Management
o SP 1.5 Monitor Stakeholder Involvement
o SP 1.6 Conduct Progress Reviews
o SP 1.7 Conduct Milestone Reviews
 SG 2: Manage Corrective Actions to Closure
o SP 2.1 Analyze Issues
o SP 2.2 Take Corrective Actions
o SP 2.3 Manage Corrective Actions
 Results and performance are tracked against time.
 Changes to commitments are agreed to by affected groups.
 Corrective actions can be applied when the project deviates from plan.
Supplier Agreement Management (SAM)
 SG 1: Establish Supplier Agreements
o SP 1.1 Determine Acquisition Type
o SP 1.2 Select Suppliers
o SP 1.3 Establish Supplier Agreements
 SG 2: Satisfy Supplier Agreements
o SP 2.1 Execute the Supplier Agreement
o SP 2.2 Monitor Selected Supplier Processes
o SP 2.3 Evaluate Selected Supplier Work Products
o SP 2.4 Accept the Acquired Product
o SP 2.5 Transition Product
 Manage the acquisition and quality of components from outside the company.
2.2.2 Level 2 Engineering
Requirements Management (REQM)
 SG 1: Manage Requirements
o SP 1.1 Obtain an Understanding of Requirements
o SP 1.2 Obtain Commitment to Requirements
o SP 1.3 Manage Requirement Changes
o SP 1.4 Maintain Bidirectional Traceability of Requirements
o SP 1.5 Identify Inconsistencies Between Project Work and Requirements
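To make SP 1.4 and SP 1.5 concrete, here is a minimal sketch of a bidirectional traceability matrix with a consistency check. The requirement and work-product names are hypothetical; this is an illustration, not a prescribed REQM tool.

# Sketch of bidirectional requirements traceability (hypothetical IDs).
# Forward: requirement -> work products; the reverse map is derived from it.
forward = {
    "REQ-1": ["design.md", "parser.py", "test_parser.py"],
    "REQ-2": ["design.md", "report.py"],
    "REQ-3": [],  # not yet implemented -- an inconsistency to flag
}

# Derive reverse traceability: work product -> requirements it satisfies.
reverse = {}
for req, products in forward.items():
    for wp in products:
        reverse.setdefault(wp, []).append(req)

# SP 1.5-style check: flag requirements with no work products, and
# work products (from a hypothetical inventory) that trace to no requirement.
untraced_reqs = [r for r, wps in forward.items() if not wps]
orphan_products = [wp for wp in ["report.py", "legacy.py"] if wp not in reverse]

print("Requirements without work products:", untraced_reqs)   # ['REQ-3']
print("Work products without requirements:", orphan_products) # ['legacy.py']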
2.2.3 Level 2 Support
Configuration Management (CM)
 SG 1: Establish Baselines
o SP 1.1 Identify Configuration Items
o SP 1.2 Establish a Configuration Management System
o SP 1.3 Create or Release Baselines
 SG 2: Track and Control Changes
o SP 2.1 Track Change Requests
o SP 2.2 Control Configuration Items
 SG 3: Establish Integrity
o SP 3.1 Establish Configuration Management Records
o SP 3.2 Perform Configuration Audits
Process and Product Quality Assurance (PPQA)
 SG 1: Objectively Evaluate Process and Work Products
o SP 1.1 Objectively Evaluate Processes
o SP 1.2 Objectively Evaluate Work Products and Services
 SG 2: Provide Objective Insight
o SP 2.1 Communicate and Ensure Resolution of Noncompliance Issues
o SP 2.2 Establish Records
Measurement and Analysis (MA)
 SG 1: Align Measurement and Analysis Activities
o SP 1.1 Establish Measurement Objectives
o SP 1.2 Specify Measures
o SP 1.3 Specify Data Collection and Storage Procedures
o SP 1.4 Specify Analysis Procedures
 SG 2: Provide Measurement Results
o SP 2.1 Collect Measurement Data
o SP 2.2 Analyze Measurement Data
o SP 2.3 Store Data and Results
o SP 2.4 Communicate Results
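As a toy illustration of SG 1 and SG 2 (the measure, objective, and data below are hypothetical, and this is not an SEI-prescribed format), a measure can be specified once and then collected and analyzed against its objective:

import statistics

# Sketch: one measure specified per SP 1.2, with collection and analysis
# per SG 2 (hypothetical objective and data).
measure = {
    "name": "review_defects_per_kloc",
    "objective": "keep escaped defects low",    # SP 1.1 measurement objective
    "unit": "defects/KLOC",
    "collection": "logged at each peer review", # SP 1.3 collection procedure
}

data = [3.1, 2.7, 4.0, 3.4, 2.9]  # SP 2.1: collected measurement data

# SP 2.2: analyze; SP 2.4: communicate the result.
mean = statistics.mean(data)
stdev = statistics.stdev(data)
print(f"{measure['name']}: mean={mean:.2f} {measure['unit']}, stdev={stdev:.2f}")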
2.3 Level 3: Defined
At Level 3 (Defined) there are organization-wide standards, procedures, tools, and methods.
Level three, known as the defined level, requires maturity level two as a precondition.
It works on coordination across groups in the organization.
 The software process for both management and engineering activities is
o documented,
o standardized, and
o integrated.
 The processes are defined and applied organization-wide and can be tailored from the standard process to a project’s special requirements. This is an important distinction from level two.
 The project is not just managed; the processes are exactly defined and described in detail.
 The process is supported by tools and suitable methods.
 Processes are consistent across the entire organization, and only tailoring allows some differences between processes used in different projects.
 Goals of processes are derived from the organization’s standard processes, and their success is controlled, as are the process itself and its product.
Level 3: Defined
o Process Management: Organizational Process Focus (OPF), Organizational Process Definition (OPD) + Integrated Product and Process Development (IPPD), Organizational Training (OT)
o Project Management: Integrated Project Management (IPM) + IPPD, Risk Management (RSKM)
o Engineering: Requirements Development (RD), Technical Solution (TS), Product Integration (PI), Verification (VER), Validation (VAL)
o Support: Decision Analysis and Resolution (DAR)
2.3.1 Level 3 Process Management
Organization Process Focus (OPF)
 SG 1: Determine Process Improvement Opportunities
o SP 1.1 Establish Organizational Process Needs
o SP 1.2 Appraise the Organization’s Processes
o SP 1.3 Identify the Organization’s Process Improvements
 SG 2: Plan and Implement Process Improvement
o SP 2.1 Establish Process Action Plans
o SP 2.2 Implement Process Action Plans
 SG 3: Deploy Organizational Process Assets and Incorporate Lessons Learned
o SP 3.1 Deploy Organizational Process Assets
o SP 3.2 Deploy Standard Processes
o SP 3.3 Monitor Implementation
o SP 3.4 Incorporate Process-Related Experiences into the Organizational Process Assets
 Plan and implement organizational process improvements based on understood strengths and weaknesses of the organization and process assets.
Organizational Process Definition (OPD) + Integrated Product and Process Development (IPPD)
 SG 1: Establish Organizational Assets
o SP 1.1 Establish Standard Processes
o SP 1.2 Establish Lifecycle Model Descriptions
o SP 1.3 Establish Tailoring Criteria and Guidelines
o SP 1.4 Establish the Organization’s Measurement Repository
o SP 1.5 Establish the Organization’s Process Asset Library
o SP 1.6 Establish Work Environment Standards
 SG 2: Enable IPPD Management
o SP 2.1 Establish Empowerment Mechanisms
o SP 2.2 Establish Rules and Guidelines for Integrated Teams
o SP 2.3 Balance Team and Home Organization Responsibilities
 Develop and maintain a software process standard for the organization – these are the process assets.
 Create an environment that enables integrated teams to efficiently meet the project’s requirements and produce quality work.
Organizational Training (OT)
 SG 1: Establish an Organizational Training Capability
o SP 1.1 Establish the Strategic Training Needs
o SP 1.2 Determine Which Training Needs Are the Responsibility of the Organization
o SP 1.3 Establish an Organizational Training Tactical Plan
o SP 1.4 Establish Training Capability
 SG 2: Provide Necessary Training
o SP 2.1 Deliver Training
o SP 2.2 Establish Training Records
o SP 2.3 Assess Training Effectiveness
 People are trained in the process and needed technologies so that they can use them effectively.
2.3.2 Level 3 Project Management
Integrated Project Management (IPM) + Integrated Product and Process Development (IPPD)
 SG 1: Use the Project’s Defined Process
o SP 1.1 Establish the Project’s Defined Process
o SP 1.2 Use Organizational Process Assets for Planning Project Activities
o SP 1.3 Establish the Project’s Work Environment
o SP 1.4 Integrate Plans
o SP 1.5 Manage the Project Using the Integrated Plans
o SP 1.6 Contribute to the Organizational Process Assets
 SG 2: Coordinate and Collaborate with Relevant Stakeholders
o SP 2.1 Manage Stakeholder Involvement
o SP 2.2 Manage Dependencies
o SP 2.3 Resolve Coordination Issues
 SG 3: Apply IPPD Principles (Integrated Product and Process Development)
o SP 3.1 Establish the Project’s Shared Vision
o SP 3.2 Establish the Integrated Team Structure
o SP 3.3 Allocate Requirements to Integrated Teams
o SP 3.4 Establish Integrated Teams
o SP 3.5 Ensure Collaboration among Interfacing Teams
 Integrate and define processes that involve relevant stakeholders.
 Software process activities are coordinated – every group has a responsibility in a project.
Risk Management (RSKM)
 SG 1: Prepare for Risk Management
o SP 1.1 Determine Risk Sources and Categories
o SP 1.2 Define Risk Parameters
o SP 1.3 Establish a Risk Management Strategy
 SG 2: Identify and Analyze Risks
o SP 2.1 Identify Risks
o SP 2.2 Evaluate, Categorize, and Prioritize Risks
 SG 3: Mitigate Risks
o SP 3.1 Develop Risk Mitigation Plans
o SP 3.2 Implement Risk Mitigation Plans
 Identify potential problems before they occur.
 Risk-handling activities are planned throughout the product’s lifecycle to address potential risks.
2.3.3 Level 3 Engineering
Requirements Development (RD)
 SG 1: Develop Customer Requirements
o SP 1.1 Elicit Needs
o SP 1.2 Develop Customer Requirements
 SG 2: Develop Product Requirements
o SP 2.1 Establish Product and Product Component Requirements
o SP 2.2 Allocate Product Component Requirements
o SP 2.3 Identify Interface Requirements
 SG 3: Analyze and Validate Requirements
o SP 3.1 Establish Operational Concepts and Scenarios
o SP 3.2 Establish a Definition of Required Functionality
o SP 3.3 Analyze Requirements
o SP 3.4 Analyze Requirements to Achieve Balance
o SP 3.5 Validate Requirements
Technical Solution (TS)
 SG 1: Select Product Component Solutions
o SP 1.1 Develop Alternative Solutions and Selection Criteria
o SP 1.2 Select Product Component Solutions
 SG 2: Develop Design
o SP 2.1 Design the Product or Product Component
o SP 2.2 Establish a Technical Data Package
o SP 2.3 Design Interfaces Using Criteria
o SP 2.4 Perform Make, Buy, or Reuse Analysis
 SG 3: Implement the Product Design
o SP 3.1 Implement the Design
o SP 3.2 Develop Product Support Documentation
 Develop, design, and implement solutions to requirements.
Product Integration (PI)
 SG 1: Prepare for Product Integration
o SP 1.1 Determine Integration Sequence
o SP 1.2 Establish the Product Integration Environment
o SP 1.3 Establish Product Integration Procedures and Criteria
 SG 2: Ensure Interface Compatibility
o SP 2.1 Review Interface Descriptions for Completeness
o SP 2.2 Manage Interfaces
 SG 3: Assemble Product Components and Deliver the Product
o SP 3.1 Confirm Readiness of Product Components for Integration
o SP 3.2 Assemble Product Components
o SP 3.3 Evaluate Assembled Product Components
o SP 3.4 Package and Deliver the Product or Product Component
 Product components are assembled, the assembled product is checked to ensure it functions properly, and the product is delivered to the customer.
Verification (VER)
 SG 1: Prepare for Verification
o SP 1.1 Select Work Products for Verification
o SP 1.2 Establish the Verification Environment
o SP 1.3 Establish Verification Procedures and Criteria
 SG 2: Perform Peer Reviews
o SP 2.1 Prepare for Peer Reviews
o SP 2.2 Conduct Peer Reviews
o SP 2.3 Analyze Peer Review Data
 SG 3: Verify Selected Work Products
o SP 3.1 Perform Verification
o SP 3.2 Analyze Verification Results
Validation (VAL)
 SG 1: Prepare for Validation
o SP 1.1 Select Products for Validation
o SP 1.2 Establish the Validation Environment
o SP 1.3 Establish Validation Procedures and Criteria
 SG 2: Validate Product or Product Component
o SP 2.1 Perform Validation
o SP 2.2 Analyze Validation Results
2.3.4 Level 3 Support
Decision Analysis and Resolution (DAR)
 SG 1: Evaluate Alternatives
o SP 1.1 Establish Guidelines for Decision Analysis
o SP 1.2 Establish Evaluation Criteria
o SP 1.3 Identify Alternative Solutions
o SP 1.4 Select Evaluation Methods
o SP 1.5 Evaluate Alternatives
o SP 1.6 Select Solutions
 Analyze possible decisions using a formal evaluation process that takes into account the impact of alternative solutions (a small decision-matrix sketch follows).
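As a minimal sketch of such a formal evaluation (the criteria, weights, and alternatives below are hypothetical; DAR does not mandate any particular method, and a weighted decision matrix is just one common choice), each alternative is scored against weighted evaluation criteria:

# Sketch: weighted decision matrix for DAR (all names and numbers hypothetical).
criteria = {"cost": 0.4, "risk": 0.35, "time_to_market": 0.25}  # SP 1.2 weights

# SP 1.3 alternatives, scored 1-5 against each criterion (SP 1.5).
alternatives = {
    "build in-house":  {"cost": 2, "risk": 4, "time_to_market": 2},
    "buy COTS":        {"cost": 4, "risk": 3, "time_to_market": 5},
    "reuse component": {"cost": 5, "risk": 2, "time_to_market": 4},
}

def weighted_score(scores):
    # Sum of weight * score over all criteria.
    return sum(criteria[c] * s for c, s in scores.items())

# SP 1.6: select the highest-scoring solution.
best = max(alternatives, key=lambda a: weighted_score(alternatives[a]))
for name, scores in alternatives.items():
    print(f"{name}: {weighted_score(scores):.2f}")
print("Selected:", best)  # -> "buy COTS" with these hypothetical numbers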
2.4 Level 4: Quantitatively Managed
At Level 4 (Quantitatively Managed), all objectives from the lower levels have already been achieved. Additionally, subprocesses are introduced.
 The performance and quality of entire processes are controlled using quantitative techniques and statistics during the processes’ entire lifecycle.
 Processes at this level are constantly measured and statistically analyzed.
 The results of this control are used to better manage the processes and projects. This makes the process quantitatively predictable. In comparison, at level three a process’s quality can only be predicted, not actually known, during the project.
 Customer and user needs are the foundation for quantitative goals.
 At this level it is important to detect and correct differences, and the reasons for differences, between the given process and the process applied during the project.
 All this has to happen at the organization’s level, not just at the project level, to avoid variations in future projects and to support further decisions.
Level 4: Quantitatively Managed
o Process Management: Organizational Process Performance (OPP)
o Project Management: Quantitative Project Management (QPM)
o Engineering: (none)
o Support: (none)
2.4.1 Level 4 Process Management
Organizational Process Performance (OPP)
 SG 1: Establish Performance Baselines and Models
o SP 1.1 Select Processes
o SP 1.2 Establish Process Performance Measures
o SP 1.3 Establish Quality and Process Performance Objectives
o SP 1.4 Establish Process Performance Baselines
o SP 1.5 Establish Process Performance Models
 Establish and maintain a quantitative understanding of the performance of the organization’s standard process in support of quality and process performance objectives.
 Provide performance data, baselines, and models to quantitatively manage people and projects.
2.4.2 Level 4 Project Management
Quantitative Project Management (QPM)
 SG 1: Quantitatively Manage the Project
o SP 1.1 Establish the Project’s Objectives
o SP 1.2 Compose the Defined Process
o SP 1.3 Select the Subprocesses that will be Statistically Managed
o SP 1.4 Manage Project Performance
 SG 2: Statistically Manage Subprocess Performance
o SP 2.1 Select Measures and Analytic Techniques
o SP 2.2 Apply Statistical Methods to Understand Variation
o SP 2.3 Monitor Performance of the Selected Subprocesses
o SP 2.4 Record Statistical Management Data
 Quantitatively manage the project’s defined process to achieve the project’s quality and process objectives (a small statistical-control sketch follows).
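As a minimal sketch of SP 2.2 and SP 2.3 (the subprocess measure and numbers are hypothetical; real programs typically use control charts from a statistics package), new subprocess measurements can be compared against mean ± 3-sigma control limits computed from a baseline:

import statistics

# Sketch: statistical management of one subprocess measure
# (hypothetical data: peer-review preparation rate, pages/hour).
baseline = [9.8, 10.4, 10.1, 9.6, 10.3, 9.9, 10.2, 10.0]
mean = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma  # upper/lower control limits

# SP 2.3: monitor new observations against the limits.
for x in [10.1, 9.7, 12.9]:
    status = "in control" if lcl <= x <= ucl else "OUT OF CONTROL: investigate cause"
    print(f"{x:5.1f}  ({status})")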
2.5 Level 5: Optimizing
At level five, the optimizing level, all processes are already defined and managed. The goals for levels one through four have all been achieved.
 The purpose of this level is to continually improve the performance of all practices of all processes, based on quantitative improvement. Improvements can be incremental as well as innovative technologies.
 The organization establishes goals for improving processes. These goals are continually controlled and changed if necessary. This supports changes to the business goals used for improving the processes.
 After those improvements are introduced, they are continuously measured and their quantitative contribution to process improvement is assessed.
 At level four and level five there are still variations in processes. However, the important difference between them is that at level five the reasons for the variations are measured and analyzed in order to optimize the processes. Results from those measurements and analyses are used to change the process so that the given quantitative process goals can still be achieved.
Level 5: Optimizing
o Process Management: Organizational Innovation and Deployment (OID)
o Support: Causal Analysis and Resolution (CAR)
2.5.1 Level 5 Process Management
Organizational Innovation and Deployment (OID)
 SG 1: Select Improvements
o SP 1.1 Collect and Analyze Improvement Proposals
o SP 1.2 Identify and Analyze Innovations
o SP 1.3 Pilot Improvements
o SP 1.4 Select Improvements for Deployments
 SG 2: Deploy Improvements
o SP 2.1 Plan the Deployment
o SP 2.2 Manage the Deployment
o SP 2.3 Measure Improvement Effects
 Select and deploy incremental and innovative improvements that measurably improve the organization’s processes.
2.5.2 Level 5 Project Management
Causal Analysis and Resolution (CAR)
 SG 1: Determine Causes of Defects
o SP 1.1 Select Defect Data for Analysis
o SP 1.2 Analyze Causes
 SG 2: Address Causes of Defects
o SP 2.1 Implement the Action Proposals
o SP 2.2 Evaluate the Effect of Changes
o SP 2.3 Record Data
 Identify defects and their causes.
 Take measures to prevent the causes of the defects from occurring in the future.
2.6 Examples of Level 5 Organizations
 Lockheed Martin Federal Systems, Owego, has attained the highest rating of a company’s software development capability, Level 5, from the Carnegie Mellon Software Engineering Institute (SEI).
 In 2003 only three companies world-wide were rated CMM Level 5 (CMM is the predecessor to CMMI).
The following link provides a query by Level and Year of CMMI-certified companies:
http://sas.sei.cmu.edu/pars/pars.aspx
2.6.1 Esterline Control Systems – AVISTA
 AVISTA became one of 22 companies in the US to achieve CMMI Level 5, under the v1.1 standard, in July 2007.
 Every three years a company must be re-evaluated. AVISTA was re-evaluated in May 2010 and was recertified CMMI Level 5 under the v1.2 standard.
 If you are Level 5, the re-evaluation must indicate that you are continuing to Optimize your process and performance.
o If you only hold your ground, you are no longer considered Level 5.
 Below is the 2007 SCAMPI evaluation for AVISTA from the above link.
Organization
Organization Name: AVISTA, Incorporated
Appraisal Sponsor Name: Thomas Bragg, Tim Budden
Lead Appraiser Name: Johnny Childs
SEI Partner Name: cLear Improvement & Associates, LLC
Organizational Unit Description
Projects/Support Groups: **Sensitive: Platteville, WI United States (entry repeated five times)
Organizational Sample Size
% of people included: 33
% of projects included: 7
Org Scope Description: For the purpose of the appraisal, 3 full-lifecycle projects were selected as focus projects and 2 verification projects were selected as non-focus projects (VER, VAL, and CM were mapped). There are presently 137 employees and 80 active projects at AVISTA totaling 220,477 hours. The sample projects comprise 33.8% of the engineering staff and 26.4% (58,049) of the hours.
Appraisal Description
Appraisal End Date: Jul 27, 2007
Appraisal Expiration Date: Jul 27, 2010
Appraisal Method Used: SEI SCAMPI v1.2 A
CMMI Information: CMMI v1.1
Appraised Functional Areas Included:
Model Scope and Appraisal Ratings
Level 2: REQM – Satisfied; PP – Satisfied; PMC – Satisfied; SAM – Satisfied; MA – Satisfied; PPQA – Satisfied; CM – Satisfied
Level 3: RD – Satisfied; TS – Satisfied; PI – Satisfied; VER – Satisfied; VAL – Satisfied; OPF – Satisfied; OPD – Satisfied; OT – Satisfied; IPM – Satisfied; RSKM – Satisfied; DAR – Satisfied; IT – Not Rated; ISM – Not Rated; OEI – Not Rated
Level 4: OPP – Satisfied; QPM – Satisfied
Level 5: OID – Satisfied; CAR – Satisfied
Organizational Unit Maturity Level Rating: 5
Additional Information for Appraisals Resulting in Capability or Maturity Level 4 or 5 Ratings:
Process Performance Baselines
1. Number of defects injected into test cases developed internally - data collected at test
reviews. This deals with the Enterprise quality objectives and an organizational objective dealing
with defects injected per unit. It is a measure of defect removal efficiency as well as the quality
of the product.
2. The number of hours spent developing a test case for an average or difficult complexity
requirement. This deals with the business objective of performance to budget.
3. The number of hours spent developing a test case for a simple complexity requirement. This
deals with the business objective of performance to budget.
4. The number of defects detected in the testing of requirements (from all sources). This
supports their business quality objectives and is a measure of defect removal efficiency.
5. Number of defects injected per SLOC developed in-house. This supports their business quality
objectives and is a measure of defect removal efficiency as well as the quality of the product.
6. Number of defects found per SLOC, regardless of the development source. This supports their
business quality objectives and is a measure of review efficiency.
7. Hours spent in the test review for an average to difficult requirement covered in a
requirement based test. This supports the performance to budget objective.
8. Hours spent in the test review for a simple requirement covered in a requirement based test.
This supports the performance to budget objective.
9. Number of defects detected in structural verification (product integration phase). This supports their business quality objectives and is a measure of defect removal efficiency.
10. Number of trivial action items resulting from internally injected defects per requirement
covered in requirements based test. This supports their business quality objectives and is a
measure of defect removal efficiency and review efficiency.
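Several of these baselines are described as measures of defect removal efficiency. As a hedged aside, the formula below is the commonly used industry definition of DRE, not necessarily AVISTA's exact computation, and the numbers are hypothetical:

# Sketch: defect removal efficiency (DRE), common industry definition.
# DRE = defects removed by an activity / (defects removed + defects that escaped).
def dre(removed, escaped):
    return removed / (removed + escaped)

# Hypothetical numbers: reviews caught 45 defects, 5 escaped to test.
print(f"Review DRE: {dre(45, 5):.0%}")  # -> 90%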
Models
The models were developed by an AVISTA engineer who is also a professor at the University of Wisconsin-Platteville (UWP) with advanced degrees in statistics and computer science. His statistical analysis with the baselines in the development of the models is well documented and exhaustive.
1. Based on the number of defects injected (Baseline 1 above, or the Organizational Mean if the project running mean is unknown), the total Cost of Quality (COQ) or remaining COQ can be projected. COQ is the number of hours and cost associated with "clean up" activities (rework, etc.). Using the measures available for defects/rqmt, the COQ Hrs/Rqmt can be projected. This predictor is associated with schedule and cost.
2. Based on the means of development hours/SW requirement tested and review hours/SW requirement tested in a review (Baselines 2 & 3 and 7 & 8 above, or the Organizational Mean if the project running mean is unknown), the total COQ or remaining COQ can be projected. Using the measures listed for development hours/SW rqmt tested and review hours/SW rqmt tested in review, the COQ Hrs/Rqmt can be projected. This predictor is associated with schedule and cost.
3. Based on the number of defects detected (Baselines 4, 6, and 9 above, or the Organizational Mean if the project running mean is unknown), the CPI for the SW RBT activity can be projected. This predictor is associated with schedule and cost. The organization makes broad use of Earned Value as a tracking mechanism. This model is an enabler for the prediction of EV. (A small worked sketch of the COQ and CPI ideas follows.)
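As a purely illustrative sketch (all numbers are hypothetical, and the simple linear COQ projection below only gestures at the kind of statistical model described; AVISTA's actual models are not published here), the COQ and CPI ideas work out like this. The CPI formula itself is the standard earned-value definition:

# Sketch: projecting remaining Cost of Quality and CPI (numbers hypothetical;
# the linear model is a simplification of the projections described above).
defects_per_rqmt = 0.8         # project running mean, or organizational mean
rework_hours_per_defect = 3.5  # hypothetical baseline value
remaining_rqmts = 120

# Remaining COQ: hours expected to go to "clean up" (rework, re-review, retest).
remaining_coq_hours = defects_per_rqmt * rework_hours_per_defect * remaining_rqmts
print(f"Projected remaining COQ: {remaining_coq_hours:.0f} hours")  # -> 336 hours

# CPI (cost performance index) from earned value: CPI = earned value / actual cost.
earned_value_hours = 4100   # budgeted hours for work actually completed
actual_cost_hours = 4556    # hours actually spent
cpi = earned_value_hours / actual_cost_hours
print(f"CPI: {cpi:.2f}")    # < 1.0 means over budget for the work done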
2.7 Rationale For Government Requirement
Below is a rationale for requiring government contractors to obtain at least Level 3. Think back to the first lab that we did in the class – perhaps the State of Wisconsin should require a bit more of its contractors.
08/12/02; Vol. 21 No. 23
Feds must weigh worth of vendors' CMM claims
http://www.gcn.com/21_23/outsourcing/19576-1.html
By Susan M. Menke
GCN Staff
Agencies often rely on Capability Maturity Model ratings touted by vendors when deciding whether to
hire them for software development jobs. But are the rating claims made by many vendors legit?
Maybe, maybe not.
Since the Software Engineering Institute at Carnegie Mellon University began using the CMM to take
the pulse of software development 15 years ago, the number of participating software shops has
soared more than tenfold.
SEI's 1993 software maturity profile cited 156 organizations, 65 percent of them government agencies
or their contractors. The most recent profile--issued in March by the institute--cited 1,638
organizations, and only 31 percent were government agencies or their contractors.
Despite the last 15 years' tremendous growth in commercial software, which now far overshadows
government development, the Defense Department has always been the financial force behind SEI's
model.
The CMM for software ranks projects and organizations at levels 1 through 5. The higher levels are
supposed to indicate fewer software defects, more repeatable development processes and better
project management.
DOD acquisition policy for major systems, such as weapons systems, requires contractors to have what
is called "CMM Level 3 equivalence." Bidders that lack such credentials must submit risk-mitigation
plans with their bids.
Civilian acquisition officials are far less strict about equivalence ratings, even for projects costing
hundreds of millions of dollars.
"Typical DOD IT projects in the $100 million range, which account for most of the problems and
failures, are not covered" by the Level 3 equivalence requirement, said Lloyd K. Mosemann II, a former
deputy assistant secretary of the Air Force and now a senior vice president with Science Applications
International Corp. "At the very least, government [should] specify that the performing organization
be Level 3" on the software CMM, Mosemann said in his keynote speech at the recent Software
Technology Conference in Salt Lake City.
"Virtually every large DOD contractor can boast at least one organization at Level 4 or above, and
several at Level 3," Mosemann said. "On the other hand, most DOD software is still being developed in
less mature organizations, mainly because the program executive office or program manager doesn't
demand that the part of the company that will actually build the software be Level 3."
Reaching Level 5
The economic pressure to obtain a prestigious Level 3, 4 or 5 rating has led to a proliferation of SEI and
non-SEI models--not only for software but for acquisition, personnel, product development, systems
integration and other areas.
Only one software shop in 1993 had a strong enough grip on its development practices to reach the
rarefied Level 5. In contrast, the latest list shows that 86 organizations say they have Level 5
certification--but SEI does not guarantee the accuracy of these claims.
An SEI disclaimer, at www.sei.cmu.edu/sema/pub_ml.html, says, "This list of published maturity levels
is by no means exhaustive."
Why is that?
Software quality assessment, like real estate appraisal, is partly a science and partly an art. SEI
maintains a list of hundreds of appraisers, assessors and consultants who will undertake to rate
software strengths and weaknesses according to the SEI model.
That wide dispersion of authority, coupled with the enormous growth of the software industry, leaves
SEI in the position of neither confirming nor denying the claims that are made using its model.
"As a federally funded research and development center, SEI must avoid any statement that might be
perceived to validate or certify the assessment results that an organization chooses to make public,"
SEI spokesman Bill Pollak said. "The most we can do is to validate the conduct of an assessment--for
example, 'An SEI-authorized lead assessor and trained team performed the assessment.' We do
receive results from SEI-authorized lead assessors, but we keep those results confidential."
SEI senior technical staff member Mary Beth Chrissis said there are "many different flavors of
appraisals. Many other organizations have developed their own appraisals" based on SEI's public-domain model. A number of such organizations are offshore.
Where does that leave agencies that want to make sure they hire competent contractors whose CMM
certifications are current?
A starting point is SEI's CMM for software acquisition (SA-CMM), developed in the mid-1990s by a
government-industry team. Many agencies, including the General Accounting Office and the IRS, have
used its methods to evaluate contracting or outsourcing practices. But there is no recommended list of
SA-CMM vendors, and SEI does not qualify them. In choosing such vendors, SEI says, the key is
experience: "The experience should be demonstrated and not just claimed."
Mosemann, one of the instigators of the SA-CMM, said it was not meant to apply to contractors but
rather to government program and acquisition offices.
"The problem that I perceived--and it clearly exists today--is that a gross mismatch occurs when a DOD
program office that can barely spell the word 'software' oversees a Level 3 or 4 contractor
organization," he said.
"The government program manager has no appreciation for the tools, techniques and methods--and
their cost--that are necessary to develop software on a predictable schedule at a predictable cost with
predictable performance results," Mosemann said. "That is why there is no list of SA-CMM contractors
and why SEI has no plan to qualify them."
Meanwhile, Defense is negotiating with its service acquisition executives to use SEI's newer CMM for
integration, said Joe Jarzombek, deputy director for software-intensive systems in the Office of the
Undersecretary of Defense for Acquisition, Technology and Logistics.
Managers of major software programs are required to choose contractors that have succeeded at
comparable systems and that have mature software development processes in place, Jarzombek said.
DOD's present Software Development Capability Evaluation Core is equivalent to the software CMM
Level 3 criteria, he said.
©2011 Mike Rowe