The ISO/IEC 15504 Measurement Framework for Process
Capability and CMMI®
Terence P. Rout and Angela Tuffley
Software Quality Institute
Griffith University
Queensland, Australia
T.Rout@griffith.edu.au
The requirements for conformance of a process model to the international standard for process
assessment, ISO/IEC 15504, cover a broad range. The most significant of these is the need to
establish a complete and unambiguous mapping between the model and a relevant Process
Reference Model, and to develop a mechanism for translating the outputs from an assessment into
the standard process profiles defined in ISO/IEC 15504. A key issue in the mapping is the need to
determine the relationship between the candidate Process Assessment Model and the Measurement
Framework for process capability defined in ISO/IEC 15504-2. This paper reports the results of an
analysis of the relationship between CMMI®, as a “candidate conformant assessment model”, and
the Measurement Framework; it is shown that, while in general terms a complete mapping can be
established, there are specific areas where coverage is weak, and it is necessary to move outside the
capability aspects of CMMI® to obtain complete coverage.
Introduction
The increasing adoption of process assessment, both for internal process improvement and as a
technique for supplier selection and management, led to the development of the international
standard for software process assessment, ISO/IEC 15504 [1]. Two different classes of process
models are identified in the framework established by this International Standard [2]; they are:
1. Process Reference Models (PRM): The purpose of these models is to provide the descriptions
of the process entities to be evaluated - to define what is to be measured. Process Reference
Models are, in a very real sense, standards, in that they provide a common terminology and
description of scope for process assessment.
2. Process Assessment Models (PAM): The purpose of these models is to support the conduct of
an assessment. Models may have significant differences in structure and content, but can be
referenced to a common source (a Process Reference Model), providing a mechanism for
harmonisation between different approaches to assessment.
® CMMI is registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.
A Process Reference Model provides the basis for conformance of any Process Assessment Model
and contributes the overall definitions of the processes within the scope of the Process Assessment
Model. The derivation of requirements for the definition of processes is an essential element of the
development of the assessment framework.
A Process Assessment Model describes processes in terms of the evidence that may be identified to
demonstrate that the process has in fact been implemented. Such models generally comprise sets of
practices and descriptions of work products that serve as indicators of process performance and
process capability.
Conformance of Process Assessment Models
The requirements for conformance of Process Assessment Models in the revised Standard are
defined on the basis of relationships between the Process Assessment Model and both the external
Process Reference Model and the in-built measurement framework of ISO/IEC 15504. These
requirements are expressed in the following terms [1]:
Scope
A Process Assessment Model relates to at least one process from the specified Process Reference
Model(s), and must address all or a continuous subset of the levels (starting at level 1) of the
Measurement Framework for process capability for each of the processes within its scope. Since
each level includes all the attributes of lower levels, any PAM must start from Level 1. However,
the PAM may cover only part of the scale - for example, from Level 1 to Level 3. The PAM must
declare its scope of coverage in terms of the following (a minimal sketch of such a declaration
follows the list):
a) the selected Process Reference Model(s);
b) the selected processes taken from the Process Reference Model(s); and
c) the capability levels selected from the Measurement Framework.
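As a purely illustrative sketch, such a scope declaration could be captured as simple structured data; the field names and example values below are our own and are not taken from the Standard.

```python
# Hypothetical scope declaration for a Process Assessment Model.
# The three fields correspond to items a), b) and c) above; the values are
# illustrative placeholders only.
PAM_SCOPE = {
    "process_reference_models": ["ISO/IEC 12207 Amd 1"],
    "processes": ["Project Planning", "Configuration Management"],  # example subset
    "capability_levels": [1, 2, 3],  # continuous subset starting at Level 1
}
```

Recording the capability level subset explicitly makes it straightforward to check that it is continuous and starts at Level 1, as the requirement demands.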
Process Assessment Model indicators
A Process Assessment Model is based on a set of indicators that explicitly address the purposes and
outcomes, as defined in the selected Process Reference Model, of all the processes within the scope
of the Process Assessment Model, and that demonstrate the achievement of the process attributes
within the capability level scope of the Process Assessment Model. The indicators focus attention
on the implementation of the processes in the scope of the model. The model elements must be
shown to include indicators of both process performance and capability; this will enable judgments
of process capability to be soundly based on objective evidence.
Mapping
The relevant elements of the Process Assessment Model must be mapped to the processes of the
selected Process Reference Model and to the relevant process attributes of the Measurement
Framework; the assessor has to have access to the details of the mapping of the elements of the
model to the reference model.
The indicators, within the Process Assessment Model, are to be mapped to the purposes and
outcomes of the processes in the specified Process Reference Model, and also to the process
attributes (including all of the results of achievements listed for each process attribute) in the
Measurement Framework. This enables Process Assessment Models that are structurally different to
be related to the same Process Reference Model.
The mapping must be complete, clear and unambiguous; such mapping helps to substantiate the
claims of scope of coverage of the model. The mapping may be simple, as is the case in the
embedded assessment model contained in ISO/IEC 15504-5; however, where the structure of the
model is significantly different from the reference model, the mapping may be quite complex.
Expression of assessment results
A Process Assessment Model has to provide a formal and verifiable mechanism for representing the
results of an assessment as a set of process attribute ratings for each process selected from the
specified Process Reference Model(s). The expression of results may involve a direct translation of
Process Assessment Model ratings into a process profile as defined in ISO/IEC 15504, or the
conversion of the data collected during the assessment (with the possible inclusion of additional
information) through further judgement on the part of the assessor.
Thus, all models that meet the requirements for compatibility should be able to provide results in
the form of Process Profiles as defined in the Standard: a set of process attribute ratings for each
process assessed. This is a significant issue for any acquisition agency wishing to consider
capability-related issues in supplier management, as it allows suppliers to maintain their own
desired approach to assessment, reporting results as common profiles.
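To make the idea of such a translation concrete, the sketch below shows one possible representation of a process profile and a conservative conversion rule. The N-P-L-F rating scale is the attribute rating scale of ISO/IEC 15504-2, but the data structures, function, and aggregation rule are illustrative assumptions; as noted above, a real conversion may also involve further assessor judgement.

```python
from dataclasses import dataclass
from typing import Dict, List

# ISO/IEC 15504-2 rates each Process Attribute on an ordinal N-P-L-F scale:
# Not / Partially / Largely / Fully achieved.
RATING_SCALE = ("N", "P", "L", "F")

@dataclass
class ProcessProfile:
    """A process profile: one attribute rating per Process Attribute for a
    single process selected from the Process Reference Model."""
    process_id: str
    ratings: Dict[str, str]  # e.g. {"PA 1.1": "F", "PA 2.1": "L"}

def translate(model_ratings: Dict[str, str],
              element_to_pa: Dict[str, List[str]],
              process_id: str) -> ProcessProfile:
    """Hypothetical conversion of ratings recorded against a Process
    Assessment Model's own elements into ISO/IEC 15504 attribute ratings,
    using a mapping from model elements to Process Attributes.
    Aggregation rule (an assumption): an attribute is rated no higher than
    the weakest model element mapped to it."""
    pa_ratings: Dict[str, str] = {}
    for element, attributes in element_to_pa.items():
        element_rating = model_ratings.get(element, "N")
        for pa in attributes:
            current = pa_ratings.get(pa, "F")
            if RATING_SCALE.index(element_rating) < RATING_SCALE.index(current):
                pa_ratings[pa] = element_rating
            else:
                pa_ratings[pa] = current
    return ProcessProfile(process_id=process_id, ratings=pa_ratings)
```

Whatever aggregation rule is chosen, the Standard requires that the mechanism be formal and verifiable, so the rule itself would need to be documented as part of the model's conformance claim.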
The Measurement Framework for Process Capability
ISO/IEC 15504-2 [1] defines an ordinal scale for the evaluation of process capability, based upon
six defined Capability Levels:
Level 0: Incomplete process - the process is not implemented, or fails to achieve its process
purpose.
Level 1: Performed process - the implemented process achieves its process purpose.
Level 2: Managed process - the Performed process is planned, monitored and adjusted, and its work
products are appropriately established, controlled and maintained.
Level 3: Established process - the Managed process is implemented using a defined process,
tailored from a set of standard process assets, that is capable of achieving its process outcomes.
Level 4: Predictable process - the Established process operates within defined limits to achieve its
process outcomes.
Level 5: Optimizing process - the Predictable process is continuously improved to meet relevant
current and projected business goals.
Table 1 – Capability Levels and Process Attributes
Level 1 – Performed Process
    PA 1.1  Process Performance
Level 2 – Managed Process
    PA 2.1  Performance Management
    PA 2.2  Work Product Management
Level 3 – Established Process
    PA 3.1  Process Definition
    PA 3.2  Process Deployment
Level 4 – Predictable Process
    PA 4.1  Process Measurement
    PA 4.2  Process Control
Level 5 – Optimising Process
    PA 5.1  Process Innovation
    PA 5.2  Process Optimization
The Measurement Framework provides a schema that can be used to characterise the capability of
an implemented process with respect to a specific Process Assessment Model. Within this
Measurement Framework, the measure of capability is based upon a set of Process Attributes (PA).
Each attribute defines a particular aspect of process capability; the extent of process attribute
achievement is characterised on a defined rating scale. The combination of process attribute
achievement and a defined grouping of process attributes together determine the process capability
level. The Process Attributes within the framework are listed in Table 1.
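The precise rule for deriving a capability level from attribute ratings is defined in ISO/IEC 15504-2. The sketch below assumes the commonly cited form of that rule (a level is achieved when its own attributes are at least Largely achieved and all lower-level attributes are Fully achieved) and should be read as an illustration rather than a restatement of the Standard.

```python
# Process Attributes grouped by Capability Level, as in Table 1.
LEVEL_ATTRIBUTES = {
    1: ["PA 1.1"],
    2: ["PA 2.1", "PA 2.2"],
    3: ["PA 3.1", "PA 3.2"],
    4: ["PA 4.1", "PA 4.2"],
    5: ["PA 5.1", "PA 5.2"],
}

def capability_level(ratings: dict) -> int:
    """Return the highest capability level supported by the attribute ratings,
    under the assumed rule: own attributes rated L or F, lower attributes F."""
    achieved = 0
    for level in sorted(LEVEL_ATTRIBUTES):
        own = LEVEL_ATTRIBUTES[level]
        lower = [pa for lvl in range(1, level) for pa in LEVEL_ATTRIBUTES[lvl]]
        if all(ratings.get(pa) in ("L", "F") for pa in own) and \
           all(ratings.get(pa) == "F" for pa in lower):
            achieved = level
        else:
            break
    return achieved

# Example: PA 1.1 Fully achieved, PA 2.1 and PA 2.2 at least Largely achieved.
print(capability_level({"PA 1.1": "F", "PA 2.1": "L", "PA 2.2": "F"}))  # -> 2
```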
Capability Maturity Model Integration
The CMMI® Project [3] is a collaborative effort sponsored by the Office of the Under Secretary of
Defense for Acquisition, Technology, and Logistics (OUSD/AT&L), and the National Defense
Industrial Association (NDIA), with participation by government, industry, and the Software
Engineering Institute (SEI SM). The project's objective is to develop a product suite that provides
industry and government with a set of integrated products to support process and product
improvement. The intent is for the CMMI models to eventually replace existing Capability
Maturity Models®, including the SW-CMM®, SE-CMM and IPD-CMM.
CMMI models provide two “representations” [3, 4] for presenting the same model components – a
Staged Representation, in which the elements are arranged from the perspective of organizational
process maturity, and a Continuous Representation, in which the elements provide a view of process
capability for individual Process Areas.
In the Continuous Representation, a two-dimensional structure is defined, with one dimension
comprising sets of Process Areas, describing the processes implemented within an organization,
while the other dimension describes the capability with which the processes are implemented, in
terms of a set of six Capability Levels. The Levels are broadly equivalent to those described in the
ISO/IEC 15504 Measurement Framework. Each Capability Level is seen as being achieved through
the performance of a set of Generic Practices (GPs); the number varies from level to level, but the
same set of GPs applies across all Process Areas. A list of the Generic Goals (defining the
Capability Levels) and the Generic Practices is given in Table 2.
One objective of the CMMI Project is to establish consistency and compatibility with the
international standard for process assessment. In order to determine the feasibility of this goal, a
critical initial step is to establish mappings between CMMI and the ISO/IEC 15504 Measurement
Framework. The Continuous Representation is most similar to the model established in ISO/IEC
15504, and provides the best starting point for the establishment of a relationship between the two
models.
In earlier work [5], we examined the relationship between Release 1.0 of CMMI and the Process
Reference Model embedded in ISO/IEC TR 15504-2. Although the coverage of the model was
found to be good, there were a number of areas that caused concern in relation to their impact on the
achievement of a mechanism for conversion of CMMI assessment data to ISO/IEC 15504 standard
process profiles. In some instances the areas of concern constituted potential problems in the
structure or content of the CMMI Model.
Since the earlier work, the ISO/IEC 15504 Measurement Framework has been extensively revised,
and there have also been significant changes to the CMMI in the release of Version 1.1. The work
reported here presents one key aspect of the demonstration of conformance of CMMI as a Process
Assessment Model within the terms of ISO/IEC 15504.
® Capability Maturity Model, Capability Maturity Modeling, CMM, and CMMI are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.
SM SEI is a service mark of Carnegie Mellon University.
Table 2 – Generic Goals and Generic Practices
GG 1 – Achieve Specific Goals
    GP 1.1   Perform Base Practices
GG 2 – Institutionalize a Managed Process
    GP 2.1   Establish an Organizational Policy
    GP 2.2   Plan the Process
    GP 2.3   Provide Resources
    GP 2.4   Assign Responsibility
    GP 2.5   Train People
    GP 2.6   Manage Configurations
    GP 2.7   Identify and Involve Relevant Stakeholders
    GP 2.8   Monitor and Control the Process
    GP 2.9   Objectively Evaluate Adherence
    GP 2.10  Review Status with Higher Level Management
GG 3 – Institutionalize a Defined Process
    GP 3.1   Establish a Defined Process
    GP 3.2   Collect Improvement Information
GG 4 – Institutionalize a Quantitatively Managed Process
    GP 4.1   Establish Quantitative Objectives for the Process
    GP 4.2   Stabilize Subprocess Performance
GG 5 – Institutionalize an Optimizing Process
    GP 5.1   Ensure Continuous Process Improvement
    GP 5.2   Correct Root Causes of Problems
In another report [6], we have presented a mapping from CMMI to the Process Reference Model in
ISO/IEC 12207 (Amd 1) – Software Life Cycle Processes. Some progress has also been made in
developing a mapping to ISO/IEC 15288 – System Life Cycle Processes.
This paper presents the results of our mapping of CMMI to the ISO/IEC 15504 Measurement
Framework. The results of this mapping can be combined with a mapping to a relevant Process
Reference Model to enable expression of results as standard Process Profiles. Equally, any
concerns in relation to this mapping will impact on the overall conformance of the model.
Mapping Principles
The rating elements in the CMMI are the Goals (Specific and Generic). The rating of goals is
performed on the basis of evidence recorded against each Practice (Specific and Generic). It is
evident, therefore, that the Practices are "indicators" of process performance and process capability
in the terms of ISO/IEC 15504.
Capability issues in the CMMI are addressed through the Generic Practices. These are most clearly
set out in the Continuous Representation. They are elements in CMMI that apply across all of the
Process Areas. In mapping to the Measurement Framework, therefore, we have focussed on the
Generic Practices.
Tables 1 and 2 list, respectively, the Process Attributes from the Measurement Framework in
ISO/IEC 15504 and the Generic Practices from CMMI.
We have explicitly mapped every item in the description of each Generic Practice to the
Measurement Framework components. Where elements of the Measurement Framework were not
covered, we have examined other elements in CMMI for coverage.
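As a concrete illustration of how such a mapping can be recorded and checked for coverage gaps, the sketch below encodes GP-to-PA associations as simple sets. Only the GP 1.1 entry reflects a mapping stated in the results below; the Level 2 entries are placeholders rather than the mapping results reported in this paper.

```python
from typing import Dict, Set

# GP-to-PA mapping recorded as sets of Process Attributes per Generic Practice.
GP_TO_PA: Dict[str, Set[str]] = {
    "GP 1.1": {"PA 1.1"},
    "GP 2.2": {"PA 2.1"},            # placeholder entry
    "GP 2.8": {"PA 2.1", "PA 2.2"},  # placeholder entry
    # ... remaining Generic Practices would be added here
}

def uncovered_attributes(mapping: Dict[str, Set[str]],
                         attributes: Set[str]) -> Set[str]:
    """Process Attributes within scope that no Generic Practice maps to."""
    covered = set().union(*mapping.values()) if mapping else set()
    return attributes - covered

# Restrict the check to the attributes within the capability level scope.
IN_SCOPE = {"PA 1.1", "PA 2.1", "PA 2.2"}
print(uncovered_attributes(GP_TO_PA, IN_SCOPE))  # -> set() when the scope is covered
```

A finer-grained version of the same check, applied at the level of individual achievements within each Process Attribute, underlies the gap analysis reported in the following sections.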
Mapping Results
The results of the mapping are presented for each Capability Level.
Capability Level 1
Generic Practice 1.1 relates strongly to PA 1.1 in the Measurement Framework. Any concerns in
CMMI will be in relation to the Process Dimension.
Capability Level 2
The ten Generic Practices at Level 2 have been mapped to the six achievements in PA 2.1 and the
four achievements in PA 2.2. The results are shown in Figure 1.
[Figure 1 – GP to PA Mapping, Capability Level 2: matrix of Generic Practices GP 2.1 to GP 2.10 against the achievements a) to f) of PA 2.1 and a) to d) of PA 2.2]
The principal concern is that the coverage of PA 2.2d is very weak:
“work products are reviewed in accordance with planned arrangements and adjusted as
necessary to meet requirements.”
The only reference in CMMI is under GP 2.8 – Monitor and Control the Process:
SubPractice 6: Take corrective action when requirements and objectives are not being satisfied,
when issues are identified, or when progress differs significantly from the plan for performing
the process.
Corrective action may include the following:
• Taking remedial action to repair defective work products or services
It has been noted that there appears to be an inconsistency in the Generic Practice definitions in
relation to the coverage of Product and Process Quality Assurance; in the preamble to Chapter 4 in
CMMI, it is clear that Capability Level 2 embraces both product and process issues, but in the
definition of the Generic Practices, all reference to Product Quality Assurance is lost.
Other less significant issues in relation to this Capability Level have been identified. The
Measurement Framework does not explicitly address the establishment of policy, and the linkage
between GP 2.1 and PA 2.1a addresses only the establishment of process performance objectives
(“organisational expectations”). PA 2.1e addresses control of resources and information, while the
CMMI elements tend to focus exclusively on resources.
Capability Level 3
The two Generic Practices at Level 3 have been mapped to the five achievements in PA 3.1 and
the six achievements in PA 3.2. The results are shown in Figure 2.
[Figure 2 – GP to PA Mapping, Capability Level 3: matrix of Generic Practices GP 3.1 and GP 3.2 against the achievements a) to e) of PA 3.1 and a) to f) of PA 3.2]
The CMMI Generic Practices for Level 3 do not explicitly require establishment of a Standard
Process: this is subsumed in the definition of the generic process. Mapping of PA 3.1 is therefore
by reference to the Organizational Process Definition Process Area, where the contents of the
Standard Process are defined.
The CMMI does not specify that specific aspects of the Defined Process must be drawn from the
Standard Process. PA 3.2 requires that the roles, responsibilities, resources, and infrastructure
specified be tailored from standard process elements. This is partly addressed in the general
description of Capability Level 3; however, it is a significant issue, and raises the possibility that a
process meeting CMMI requirements for Capability Level 3 might be significantly deficient when
viewed from the ISO/IEC 15504 perspective.
PA 3.2c, e and f are addressed to some extent in the Integrated Project Management process area,
although there is still no coverage of PA 3.2b and d. This provides some confidence that better
coverage will be obtained, but this extends only to the scope of IPM.
Capability Level 4
The two Generic Practices at Level 4 have been mapped to the six achievements in PA 4.1 and
the five achievements in PA 4.2. The results are shown in Figure 3.
[Figure 3 – GP to PA Mapping, Capability Level 4: matrix of Generic Practices GP 4.1 and GP 4.2 against the achievements a) to f) of PA 4.1 and a) to e) of PA 4.2]
Although the partitioning is slightly different, there are no real gaps. There could be an issue with
the CMMI concept of “sub-process” in relation to statistical control. The focus in PA 4.1 and PA 4.2
is clearly on overall process performance; whether the identification and control of “sub-processes”
fully addresses this can only be determined as a result of experience.
Capability Level 5
The two Generic Practices at Level 5 have been mapped to the five achievements in PA 5.1 and
the three achievements in PA 5.2. The results are shown in Figure 4.
[Figure 4 – GP to PA Mapping, Capability Level 5: matrix of Generic Practices GP 5.1 and GP 5.2 against the achievements a) to e) of PA 5.1 and a) to c) of PA 5.2]
The Level 5 Process Attributes are almost entirely addressed by the single Generic Practice 5.1.
This could result in a lack of detail for conversion of results.
Discussion and Conclusions
The Generic Practices of CMMI have been successfully mapped to the Process Attributes in the
Measurement Framework of ISO/IEC 15504. Several significant issues have been identified.
The lack of reference in CMMI to the review and adjustment of work products as a characteristic of
Level 2 capability is a significant issue. It may reflect a general weakness in CMMI ratings in
relation to ISO/IEC 15504 ratings.
The lack of a specific requirement in the Level 3 Generic Practices to establish a Standard Process
is a concern. While the issue is addressed by considering the definition of “defined process”, it
could result in confusion.
CMMI does not address the characteristics of the Defined Process in the same detail as ISO/IEC
15504. Questions of roles, responsibilities, resources and infrastructure are not considered. This
could result in problems, particularly in processes not specifically implemented in Projects (where
Integrated Project Management provides some coverage).
The Capability Dimension of CMMI covers most of the matters addressed in the Measurement
Framework defined in ISO/IEC 15504, providing a basis for conversion of data. Matters not
adequately addressed will need special consideration in ISO/IEC 15504-conformant CMMI
appraisals.
References
1. ISO/IEC 15504-2: 2002, Information Technology – Process Assessment – Part 2: Requirements
2. T.P. Rout, “ISO/IEC 15504 – Evolution to an International Standard” Softw. Process Improve.
Pract. 2003; 8: 27–40.
3. M.B. Chrissis, M. Konrad and S. Shrum, CMMI®: Guidelines for Process Integration and
Product Improvement. Addison-Wesley, 2003.
4. Mark C. Paulk, Michael D. Konrad, and Suzanne M. Garcia, “CMM Versus SPICE
Architectures,” IEEE Computer Society Technical Council on Software Engineering, Software
Process Newsletter, No. 3, Spring 1995, pp. 7-11.
5. T.P. Rout and A. Tuffley, “SPICE and CMMI: Conformance of the CMMI Models to ISO/IEC
15504”, SPICE 2002, Venice, Italy, March 2002.
6. A. Tuffley, T.P. Rout and B. Acworth, “CMMI Conformance to ISO/IEC 15504: 2. Mapping to
ISO/IEC 12207:1995, Amd 1:2002, Amd 2:2004”, SEPG Australia Conference, Adelaide,
South Australia, September 2004.