PREFACE
Software engineering is an emerging discipline and there are unmistakable trends indicating an increasing level of
maturity:

- Several universities throughout the world offer undergraduate degrees in software engineering. For example, such degrees are offered at the University of New South Wales (Australia), McMaster University (Canada), the Rochester Institute of Technology (US), the University of Sheffield (UK), and other universities.
- In the US, the Engineering Accreditation Commission of the Accreditation Board for Engineering and Technology (ABET) is responsible for the accreditation of undergraduate software engineering programs.
- The Canadian Information Processing Society has published criteria to accredit software engineering undergraduate university programs.
- The Software Engineering Institute's Capability Maturity Model for Software (SW-CMM) and the new Capability Maturity Model Integration (CMMI) are used to assess organizational capability for software engineering. The famous ISO 9000 quality management standards have been applied to software engineering by the new ISO/IEC 90003.
- The Texas Board of Professional Engineers has begun to license professional software engineers.
- The Association of Professional Engineers and Geoscientists of British Columbia (APEGBC) has begun registering software professional engineers, and the Professional Engineers of Ontario (PEO) has also announced requirements for licensing.
- The Association for Computing Machinery (ACM) and the Computer Society of the Institute of Electrical and Electronics Engineers (IEEE) have jointly developed and adopted a Code of Ethics and Professional Practice for software engineering professionals.1
- The IEEE Computer Society offers the Certified Software Development Professional certification for software engineering. The Institute for Certification of Computing Professionals (ICCP) has long offered a certification for computing professionals.
All of these efforts are based upon the presumption that there is a Body of Knowledge that should be mastered by
practicing software engineers. The Body of Knowledge exists in the literature that has accumulated over the past
thirty years. This book provides a Guide to that Body of Knowledge.
PURPOSE
The purpose of the Guide to the Software Engineering Body of Knowledge is to provide a consensually validated
characterization of the bounds of the software engineering discipline and to provide a topical access to the Body of
Knowledge supporting that discipline. The Body of Knowledge is subdivided into ten software engineering
Knowledge Areas (KA) plus an additional chapter providing an overview of the KAs of strongly related disciplines.
The descriptions of the KAs are designed to discriminate among the various important concepts, permitting readers
to find their way quickly to subjects of interest. Upon finding a subject, readers are referred to key papers or book
chapters selected because they succinctly present the knowledge.
In browsing the Guide, readers will note that the content is markedly different from computer science. Just as
electrical engineering is based upon the science of physics, software engineering should be based, among other
things, upon computer science. In these two cases, though, the emphasis is necessarily different. Scientists extend
our knowledge of the laws of nature while engineers apply those laws of nature to build useful artifacts, under a
number of constraints. Therefore, the emphasis of the Guide is placed on the construction of useful software
artifacts.
Readers will also notice that many important aspects of information technology that may constitute important
software engineering knowledge are not covered in the Guide, including specific programming languages, relational
databases, and networks. This is a consequence of an engineering-based approach. In all fields, not only
computing, the designers of engineering curricula have realized that specific technologies are replaced much more
rapidly than the engineering workforce. An engineer must be equipped with the essential knowledge that supports
the selection of the appropriate technology at the appropriate time in the appropriate circumstance. For example,
software might be built in Fortran using functional decomposition or in C++ using object-oriented techniques. The
techniques for configuring instances of those systems would be quite different, but the principles and
objectives of configuration management remain the same. The Guide therefore does not focus on the rapidly
changing technologies, although their general principles are described in relevant KAs.
These exclusions demonstrate that this Guide is necessarily incomplete. The Guide covers software engineering
knowledge that is necessary but not sufficient for a software engineer. Practicing software engineers will need to
know many things about computer science, project management, and systems engineering, to name a few, that fall
outside the Body of Knowledge characterized by this Guide. However, stating that this information should be known
by software engineers is not the same as stating that this knowledge falls within the bounds of the software
engineering discipline. Instead, it should be stated that software engineers need to know some things taken from
other disciplines, and that is the approach adopted in this Guide. So, this Guide characterizes the Body of
Knowledge falling within the scope of software engineering and provides references to relevant information from
other disciplines. A chapter of the Guide provides a taxonomical overview of the related disciplines derived from
authoritative sources.
The emphasis on engineering practice leads the Guide toward a strong relationship with the normative literature.
Most of the computer science, information technology, and software engineering literature provides information
useful to software engineers, but a relatively small portion is normative. A normative document prescribes what an
engineer should do in a specified situation rather than providing information that might be helpful. The normative
literature is validated by consensus formed among practitioners and is concentrated in standards and related
documents. From the beginning, the SWEBOK project was conceived as having a strong relationship to the
normative literature of software engineering. The two major standards bodies for software engineering (IEEE
Computer Society Software Engineering Standards Committee and ISO/IEC JTC1/SC7) are represented in the
project. Ultimately, we hope that software engineering practice standards will contain principles directly traceable to
the Guide.
INTENDED AUDIENCE
The Guide is oriented toward a variety of audiences, all over the world. It aims to serve public and private
organizations in need of a consistent view of software engineering for defining education and training requirements,
classifying jobs, developing performance evaluation policies, or specifying software development tasks. It also
addresses practicing, or managing, software engineers and the officials responsible for making public policy
regarding licensing and professional guidelines. In addition, professional societies and educators defining the
certification rules, accreditation policies for university curricula, and guidelines for professional practice will benefit
from SWEBOK, as well as the students learning the software engineering profession and educators and trainers
engaged in defining curricula and course content.
EVOLUTION OF THE GUIDE
From 1993 to 2000, the IEEE Computer Society and the Association for Computing Machinery (ACM) cooperated
in promoting the professionalization of software engineering through their joint Software Engineering Coordinating
Committee (SWECC). The Code of Ethics was completed under stewardship of the SWECC primarily through
volunteer efforts. The SWEBOK project was initiated by the SWECC in 1998.
The SWEBOK project's scope, the variety of communities involved, and the need for broad participation
suggested a need for full-time rather than volunteer management. For this purpose, the IEEE Computer Society
contracted the Software Engineering Management Research Laboratory at the Université du Québec à
Montréal (UQAM) to manage the effort. In recent years, UQAM has been joined by the École de technologie
supérieure, Montréal, Québec.
The project plan comprised three successive phases: Strawman, Stoneman, and Ironman. An early prototype,
Strawman, demonstrated how the project might be organized. The publication of the widely circulated Trial Version
of the Guide in 2001 marked the end of the Stoneman phase of the project and initiated a period of trial usage. The
current Guide marks the end of the Ironman period by providing a Guide that has achieved consensus through broad
review and trial application.
The project team developed two important principles for guiding the project: transparency and consensus. By
transparency, we mean that the development process is itself documented, published, and publicized so that
important decisions and status are visible to all concerned parties. By consensus, we mean that the only practical
method for legitimizing a statement of this kind is through broad participation and agreement by all significant
sectors of the relevant community. Literally hundreds of contributors, reviewers, and trial users have played a part in
producing the current document.
Like any software project, the SWEBOK project has many stakeholders, some of which are formally represented.
An Industrial Advisory Board (IAB), composed of representatives from industry (Boeing, Construx Software, the MITRE
Corporation, Rational Software, Raytheon Systems, and SAP Labs-Canada), research agencies (National Institute of
Standards and Technology, National Research Council of Canada), the Canadian Council of Professional Engineers,
and the IEEE Computer Society, has provided financial support for the project. The IAB's generous support
permits us to make the products of the SWEBOK project publicly available without any charge (see
http://www.swebok.org). IAB membership is supplemented with the chairs of ISO/IEC JTC1/SC7 and the related
Computing Curricula 2001 initiative. The IAB reviews and approves the project plans, oversees consensus building
and review processes, promotes the project, and lends credibility to the effort. In general, it ensures the relevance of
the effort to real-world needs.
The Trial Version of the Guide was the product of extensive review and comment. In three public review cycles, a
total of roughly 500 reviewers from 42 countries provided roughly 9,000 comments, all of which are available at
www.swebok.org. To produce the current version, we released the Trial Version for extensive trial usage. Trial
application in specialized studies resulted in 17 papers describing good aspects of the Guide, as well as aspects
needing improvement. A Web-based survey captured additional experience: 573 individuals from 55 countries
registered for the survey; 124 reviewers from 21 countries actually provided comments, 1,020 in all. Additional
suggestions for improvement resulted from liaison with related organizations and efforts: IEEE-CS/ACM
Computing Curricula Software Engineering; the IEEE CS Certified Software Development Professional project;
ISO/IEC JTC1/SC7 (software and systems engineering standards); the IEEE Software Engineering Standards
Committee; the American Society for Quality, Software Division; and an engineering professional society, the
Canadian Council of Professional Engineers.
CHANGES SINCE THE TRIAL VERSION
The overall goal of the current revision was to improve the readability, consistency, and usability of the Guide. This
implied a general rewrite of the entire text to make the style consistent throughout the document. In several cases,
the topical breakdown of the KA was rearranged to make it more usable, but we were careful to include the same
information that was approved by the earlier consensus process. We updated the reference list so that it would be
easier to obtain the references.
Trial usage resulted in the recommendation that three KAs should be rewritten. Practitioners remarked that the
Construction KA was difficult to apply in a practical context. The Management KA was perceived as being too close
to general management and not sufficiently specific to software engineering concerns. The Quality KA was viewed
as an uncomfortable mix of process quality and product quality; it was revised to emphasize the latter.
Finally, some KAs were revised to remove material duplicating that of other KAs.
LIMITATIONS
Even though the Guide has gone through an elaborate development and review process, the following limitations of
this process must be recognized and stated:

- Software engineering continues to be infused with new technology and new practices. Acceptance of new techniques grows while older techniques are discarded. The topics listed as "generally accepted" in this Guide were carefully selected at this time; inevitably, though, the selection will need to evolve.
- The amount of literature that has been published on software engineering is considerable, and the reference material included in this Guide should be seen not as a definitive selection but as a reasonable one. Obviously, there are other excellent authors and references besides those included in the Guide. References were selected because they are written in English, readily available, recent, and easily readable, and, taken as a group, they provide coverage of the topics within the KA.
- Important and highly relevant reference material written in languages other than English has been omitted from the selected reference material.

Additionally, one must consider that

- Software engineering is an emerging discipline. This is especially true if you compare it to certain more established engineering disciplines. This means notably that the boundaries between the KAs of software engineering and between software engineering and its related disciplines remain a matter for continued evolution.
The contents of this Guide must therefore be viewed as an "informed and reasonable" characterization of the
software engineering Body of Knowledge and as a baseline for future evolution. Additionally, please note that the
Guide neither attempts nor claims to replace or amend in any way the laws, rules, and procedures that have been
defined by official public policy makers around the world regarding the practice and definition of engineering and of
software engineering in particular.
CHAPTER 1
INTRODUCTION TO THE GUIDE
In spite of the millions of software professionals worldwide and the ubiquitous presence of software in our society,
software engineering has only recently reached the status of a legitimate engineering discipline and a recognized
profession.
Achieving consensus by the profession on a core body of knowledge is a key milestone in all disciplines and has
been identified by the IEEE Computer Society as crucial for the evolution of software engineering toward
professional status. This Guide, written under the auspices of the Professional Practices Committee, is part of a
multi-year project designed to reach such a consensus.
WHAT IS SOFTWARE ENGINEERING?
The IEEE Computer Society defines software engineering as "(1) The application of a systematic, disciplined,
quantifiable approach to the development, operation, and maintenance of software; that is, the application of
engineering to software. (2) The study of approaches as in (1)."1
WHAT IS A RECOGNIZED PROFESSION?
For software engineering to be fully established as a legitimate engineering discipline and a recognized profession,
consensus on a core body of knowledge is imperative. This fact is well illustrated by Starr when he defines what
can be considered a legitimate discipline and a recognized profession. In his Pulitzer Prize-winning book on the
history of the medical profession in the USA, he states:

"The legitimization of professional authority involves three distinctive claims: first, that the knowledge and
competence of the professional have been validated by a community of his or her peers; second, that this
consensually validated knowledge rests on rational, scientific grounds; and third, that the professional's judgment
and advice are oriented toward a set of substantive values, such as health. These aspects of legitimacy correspond
to the kinds of attributes – collegial, cognitive, and moral – usually embodied in the term 'profession.'"2
WHAT ARE THE CHARACTERISTICS OF A PROFESSION?
Gary Ford and Norman Gibbs studied several recognized professions, including medicine, law, engineering, and
accounting.3 They concluded that an engineering profession is characterized by several components:
- An initial professional education in a curriculum validated by society through accreditation
- Registration of fitness to practice via voluntary certification or mandatory licensing
- Specialized skill development and continuing professional education
- Communal support via a professional society
- A commitment to norms of conduct often prescribed in a code of ethics
This Guide contributes to the first three of these components. Articulating a Body of Knowledge is an essential step
toward developing a profession because it represents a broad consensus regarding what a software engineering
professional should know. Without such a consensus, no licensing examination can be validated, no curriculum can
prepare an individual for an examination, and no criteria can be formulated for accrediting a curriculum. The
development of consensus is also a prerequisite to the adoption of coherent skills development and continuing
professional education programs in organizations.
WHAT ARE THE OBJECTIVES OF THE SWEBOK PROJECT?
The Guide should not be confused with the Body of Knowledge itself, which already exists in the published
literature. The purpose of the Guide is to describe what portion of the Body of Knowledge is generally accepted, to
organize that portion, and to provide a topical access to it. Additional information on the meaning given to
"generally accepted" can be found below and in Appendix A.
The Guide to the Software Engineering Body of Knowledge (SWEBOK) was established with the following five
objectives:
1. To promote a consistent view of software engineering worldwide
2. To clarify the place of, and set the boundary for, software engineering with respect to other disciplines such as computer science, project management, computer engineering, and mathematics
3. To characterize the contents of the software engineering discipline
4. To provide a topical access to the Software Engineering Body of Knowledge
5. To provide a foundation for curriculum development and for individual certification and licensing material
The first of these objectives, a consistent worldwide view of software engineering, was supported by a
development process which engaged approximately 500 reviewers from 42 countries in the Stoneman phase
(1998–2001) leading to the Trial Version, and over 120 reviewers from 21 countries in the Ironman phase (2003)
leading to the 2004 version. More information regarding the development process can be found in the Preface and
on the Web site (www.swebok.org). Professional and learned societies and public agencies involved in software
engineering were officially contacted, made aware of this project, and invited to participate in the review process.
Associate editors were recruited from North America, the Pacific Rim, and Europe. Presentations on the project
were made at various international venues and more are scheduled for the upcoming year.
The second of the objectives, the desire to set a boundary for software engineering, motivates the fundamental
organization of the Guide. The material that is recognized as being within this discipline is organized into the first
ten Knowledge Areas (KAs) listed in Table 1. Each of these KAs is treated as a chapter in this Guide.
Table 1 The SWEBOK Knowledge Areas (KAs)
Software requirements
Software design
Software construction
Software testing
Software maintenance
Software configuration management
Software engineering management
Software engineering process
Software engineering tools and methods
Software quality
In establishing a boundary, it is also important to identify what disciplines share that boundary, and often a
common intersection, with software engineering. To this end, the Guide also recognizes eight related disciplines,
listed in Table 2 (see Chapter 12, "Related Disciplines of Software Engineering"). Software engineers should, of
course, have knowledge of material from these fields (and the KA descriptions may make reference to them). It is
not, however, an objective of the SWEBOK Guide to characterize the knowledge of the related disciplines, but
rather what knowledge is viewed as specific to software engineering.
Table 2 Related disciplines
- Computer engineering
- Computer science
- Management
- Mathematics
- Project management
- Quality management
- Software ergonomics
- Systems engineering
HIERARCHICAL ORGANIZATION
The organization of the KA descriptions or chapters supports the third of the project's objectives: a
characterization of the contents of software engineering. The detailed specifications provided by the project's
editorial team to the associate editors regarding the contents of the KA descriptions can be found in Appendix A.
The Guide uses a hierarchical organization to decompose each KA into a set of topics with recognizable labels. A
two- or three-level breakdown provides a reasonable way to find topics of interest. The Guide treats the selected
topics in a manner compatible with major schools of thought and with breakdowns generally found in industry and
in software engineering literature and standards. The breakdowns of topics do not presume particular application
domains, business uses, management philosophies, development methods, and so forth. The extent of each topic's
description is only that needed to understand the generally accepted nature of the topic and for the reader to
successfully find reference material. After all, the Body of Knowledge is found in the reference materials
themselves, not in the Guide.
REFERENCE MATERIAL AND MATRIX
To provide a topical access to the knowledge (the fourth of the project's objectives), the Guide identifies
reference material for each KA, including book chapters, refereed papers, and other recognized sources of
authoritative information. Each KA description also includes a matrix relating the reference material to the listed
topics. The total volume of cited literature is intended to be suitable for mastery through the completion of an
undergraduate education plus four years of experience.
In this edition of the Guide, all KAs were allocated around 500 pages of reference material, and this was the
specification the associate editors were invited to apply. It may be argued that some KAs, such as software design
for instance, deserve more pages of reference material than others. Such modulation may be applied in future
editions of the Guide.
It should be noted that the Guide does not attempt to be comprehensive in its citations. Much material that is both
suitable and excellent is not referenced. Material was selected in part because�taken as a collection�it provides
coverage of the topics described.
DEPTH OF TREATMENT
From the outset, the question arose as to the depth of treatment the Guide should provide. The project team adopted
an approach which supports the fifth of the project's objectives: providing a foundation for curriculum
development, certification, and licensing. The editorial team applied the criterion of generally accepted knowledge,
to be distinguished from advanced and research knowledge (on the grounds of maturity) and from specialized
knowledge (on the grounds of generality of application). The definition comes from the Project Management
Institute: "The generally accepted knowledge applies to most projects most of the time, and widespread consensus
validates its value and effectiveness."4
Figure 1 Categories of knowledge
However, the term "generally accepted" does not imply that the designated knowledge should be uniformly
applied to all software engineering endeavors (each project's needs determine that), but it does imply that
competent, capable software engineers should be equipped with this knowledge for potential application. More
precisely, generally accepted knowledge should be included in the study material for the software engineering
licensing examination that graduates would take after gaining four years of work experience. Although this
criterion is specific to the US style of education and does not necessarily apply to other countries, we deem it
useful. However, the two definitions of generally accepted knowledge should be seen as complementary.
LIMITATIONS RELATED TO THE BOOK FORMAT
The book format for which this edition was conceived has its limitations. The nature of the contents would be
better served using a hypertext structure, where a topic would be linked to topics other than the ones immediately
preceding and following it in a list.
Some boundaries between KAs, subareas, and so on are also sometimes relatively arbitrary. These boundaries are
not to be given too much importance. As much as possible, pointers and links have been given in the text where
relevant and useful.
Links between KAs are not of the input-output type. The KAs are meant to be views on the knowledge one should
possess in software engineering with respect to the KA in question. The decomposition of the discipline within
KAs and the order in which the KAs are presented are not to be assimilated with any particular method or model.
The methods are described in the appropriate KA in the Guide, and the Guide itself is not one of them.
THE KNOWLEDGE AREAS
Figures 2 and 3 map out the eleven KA chapters and the important topics incorporated within them. The first five
KAs are presented in traditional waterfall life-cycle sequence. However, this does not imply that the Guide adopts or
encourages the waterfall model, or any other model. The subsequent KAs are presented in alphabetical order, and
those of the related disciplines are presented in the last chapter. This is the sequence in which they are
presented in this Guide.
STRUCTURE OF THE KA DESCRIPTIONS
The KA descriptions are structured as follows.
In the introduction, a brief definition of the KA and an overview of its scope and of its relationship with other KAs
are presented.
The breakdown of topics constitutes the core of each KA description, describing the decomposition of the KA into
subareas, topics, and sub-topics. For each topic or sub-topic, a short description is given, along with one or more
references.
The reference material was chosen because it is considered to constitute the best presentation of the knowledge
relative to the topic, taking into account the limitations imposed on the choice of references (see above). A matrix
links the topics to the reference material.
The last part of the KA description is the list of recommended references. Appendix A of each KA includes
suggestions for further reading for those users who wish to learn more about the KA topics. Appendix B presents
the list of standards most relevant to the KA. Note that citations enclosed in square brackets "[ ]" in the text
identify recommended references, while those enclosed in parentheses "( )" identify the usual references used to
write or justify the text. The former are to be found in the corresponding section of the KA and the latter in
Appendix A of the KA.
Brief summaries of the KA descriptions and appendices are given next.
SOFTWARE REQUIREMENTS KA (SEE FIGURE 2, COLUMN A)
A requirement is defined as a property that must be exhibited in order to solve some real-world problem.
The first knowledge subarea is Software Requirements Fundamentals. It includes definitions of software
requirements themselves, but also of the major types of requirements: product vs. process, functional vs.
nonfunctional, emergent properties. The subarea also describes the importance of quantifiable requirements and
distinguishes between systems and software requirements.
The second knowledge subarea is the Requirements Process, which introduces the process itself, orienting the
remaining five subareas and showing how requirements engineering dovetails with the other software engineering
processes. It describes process models, process actors, process support and management, and process quality and
improvement.
The third subarea is Requirements Elicitation, which is concerned with where software requirements come from
and how the software engineer can collect them. It includes requirement sources and elicitation techniques.
The fourth subarea, Requirements Analysis, is concerned with the process of analyzing requirements to

- Detect and resolve conflicts between requirements
- Discover the bounds of the software and how it must interact with its environment
- Elaborate system requirements to software requirements
Requirements analysis includes requirements classification, conceptual modeling, architectural design and
requirements allocation, and requirements negotiation.
The fifth subarea is Requirements Specification. Requirements specification typically refers to the production of a
document, or its electronic equivalent, that can be systematically reviewed, evaluated, and approved. For complex
systems, particularly those involving substantial non-software components, as many as three different types of
documents are produced: system definition, system requirements specification, and software requirements
specification. The subarea describes all three documents and the underlying activities.
The sixth subarea is Requirements Validation, the aim of which is to pick up any problems before resources are
committed to addressing the requirements. Requirements validation is concerned with the process of examining the
requirements documents to ensure that they are defining the right system (that is, the system that the user expects).
It is subdivided into descriptions of the conduct of requirements reviews, prototyping, and model validation and
acceptance tests.
The seventh and last subarea is Practical Considerations. It describes topics which need to be understood in
practice. The first topic is the iterative nature of the requirements process. The next three topics are fundamentally
about change management and the maintenance of requirements in a state which accurately mirrors the software to
be built, or that has already been built. It includes change management, requirements attributes, and requirements
tracing. The final topic is requirements measurement.
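For illustration only, requirements tracing of this kind is often supported by simple records linking each requirement to the artifacts derived from it. The Python sketch below is a minimal hypothetical example; the record structure, attribute names, and identifiers are assumptions, not a scheme prescribed by the Guide.

```python
# Hypothetical sketch of a requirements traceability record.
# The attribute names and identifiers are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str                                        # unique identifier, e.g. "REQ-12"
    text: str                                          # the requirement statement
    source: str                                        # stakeholder or document it came from
    design_items: list = field(default_factory=list)   # forward trace to design
    test_cases: list = field(default_factory=list)     # verification trace to tests

def untested(requirements):
    """Return requirements with no trace to any test case."""
    return [r for r in requirements if not r.test_cases]

reqs = [
    Requirement("REQ-1", "The system shall log all login attempts.",
                "security policy", ["DES-4"], ["TC-7"]),
    Requirement("REQ-2", "Reports shall be exportable as CSV.",
                "user workshop", ["DES-9"], []),
]
print([r.req_id for r in untested(reqs)])   # -> ['REQ-2']
```

A query such as `untested` illustrates why tracing matters in practice: it exposes requirements that no planned verification activity covers.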
SOFTWARE DESIGN KA (SEE FIGURE 2, COLUMN B)
According to the IEEE definition [IEEE 610.12-90], design is both "the process of defining the architecture,
components, interfaces, and other characteristics of a system or component" and "the result of [that] process."
The KA is divided into six subareas.
The first subarea presents Software Design Fundamentals, which form an underlying basis to the understanding of
the role and scope of software design. These are general software concepts, the context of software design, the
software design process, and the enabling techniques for software design.
The second subarea groups together the Key Issues in Software Design. They include concurrency, control and
handling of events, distribution of components, error and exception handling and fault tolerance, interaction and
presentation, and data persistence.
The third subarea is Software Structure and Architecture, the topics of which are architectural structures and
viewpoints, architectural styles, design patterns, and, finally, families of programs and frameworks.
The fourth subarea describes software Design Quality Analysis and Evaluation. While there is an entire KA devoted
to software quality, this subarea presents the topics specifically related to software design. These aspects are quality
attributes, quality analysis, and evaluation techniques and measures.
The fifth subarea is Software Design Notations, which are divided into structural and behavioral descriptions.
The last subarea describes Software Design Strategies and Methods. First, general strategies are described,
followed by function-oriented design methods, object-oriented design methods, data-structure-centered design,
component-based design, and others.
SOFTWARE CONSTRUCTION KA (SEE FIGURE 2, COLUMN C)
Software construction refers to the detailed creation of working, meaningful software through a combination of
coding, verification, unit testing, integration testing, and debugging. The KA includes three subareas.
The first subarea is Software Construction Fundamentals. The first three topics are basic principles of construction:
minimizing complexity, anticipating change, and constructing for verification. The last topic discusses standards
for construction.
The second subarea describes Managing Construction. The topics are construction models, construction planning,
and construction measurement.
The third subarea covers Practical Considerations. The topics are construction design, construction languages,
coding, construction testing, reuse, construction quality, and integration.
SOFTWARE TESTING (SEE FIGURE 2, COLUMN D)
Software Testing consists of the dynamic verification of the behavior of a program on a finite set of test cases,
suitably selected from the usually infinite executions domain, against the expected behavior. It includes five
subareas.
It begins with a description of Software Testing Fundamentals. First, the testing-related terminology is presented,
then key issues of testing are described, and finally the relationship of testing to other activities is covered.
The second subarea is Test Levels. These are divided between the targets of the tests and the objectives of the tests.
The third subarea is Test Techniques. The first category includes the tests based on the tester's intuition and
experience. A second group comprises specification-based techniques, followed by code-based techniques, fault-based
techniques, usage-based techniques, and techniques relative to the nature of the application. A discussion of
how to select and combine the appropriate techniques is also presented.
The fourth subarea covers Test-Related Measures. The measures are grouped into those related to the evaluation of
the program under test and the evaluation of the tests performed.
The last subarea describes the Test Process and includes practical considerations and the test activities.
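To make the definition above concrete, the following minimal Python sketch illustrates dynamic verification: the program is run on a small, deliberately selected set of inputs and its actual behavior is compared with the expected behavior. The function under test and the chosen cases are invented for illustration.

```python
# Minimal illustration of dynamic verification: execute the program on a
# finite, deliberately selected set of test cases and compare actual to
# expected behavior. The function under test is hypothetical.
def leap_year(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Specification-based cases chosen by equivalence class and boundary
# analysis rather than exhaustively (the input domain is effectively
# unbounded).
cases = [(2000, True), (1900, False), (2024, True), (2023, False)]

for given, expected in cases:
    actual = leap_year(given)
    assert actual == expected, f"leap_year({given}) = {actual}, expected {expected}"
print("all selected test cases pass")
```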
SOFTWARE MAINTENANCE (SEE FIGURE 2, COLUMN E)
Once in operation, anomalies are uncovered, operating environments change, and new user requirements surface.
The maintenance phase of the life cycle commences upon delivery, but maintenance activities occur much earlier.
The Software Maintenance KA is divided into four subareas.
The first one presents Software Maintenance Fundamentals: definitions and terminology, the nature of
maintenance, the need for maintenance, the majority of maintenance costs, the evolution of software, and the
categories of maintenance.
The second subarea groups together the Key Issues in Software Maintenance. These are the technical issues, the
management issues, maintenance cost estimation, and software maintenance measurement.
The third subarea describes the Maintenance Process. The topics here are the maintenance processes and
maintenance activities.
Techniques for Maintenance constitute the fourth subarea.
These include program comprehension, re-engineering, and reverse engineering.
SOFTWARE CONFIGURATION MANAGEMENT (SEE FIGURE 3, COLUMN F)
Software Configuration Management (SCM) is the discipline of identifying the configuration of software at distinct
points in time for the purpose of systematically controlling changes to the configuration and of maintaining the
integrity and traceability of the configuration throughout the system life cycle. This KA includes six subareas.
The first subarea is Management of the SCM Process. It covers the topics of the organizational context for SCM,
constraints and guidance for SCM, planning for SCM, the SCM plan itself, and surveillance of SCM.
The second subarea is Software Configuration Identification, which identifies items to be controlled, establishes
identification schemes for the items and their versions, and establishes the tools and techniques to be used in
acquiring and managing controlled items. The first topics in this subarea are identification of the items to be
controlled and the software library.
The third subarea is Software Configuration Control, which is the management of changes during the software life
cycle. The topics are: first, requesting, evaluating, and approving software changes; second, implementing software
changes; and third, deviations and waivers.
The fourth subarea is Software Configuration Status Accounting. Its topics are software configuration status
information and software configuration status reporting.
The fifth subarea is Software Configuration Auditing. It consists of software functional configuration auditing,
software physical configuration auditing, and in-process audits of a software baseline.
The last subarea is Software Release Management and Delivery, covering software building and software release
management.
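As a rough illustration of configuration identification and status accounting, the hypothetical Python sketch below records a controlled item, its version, and its change history; the identifier scheme and method names are assumptions, not a form mandated by SCM standards.

```python
# Hypothetical sketch of software configuration identification and
# status accounting: each controlled item carries an identifier, a
# version, and a history of approved changes.
from dataclasses import dataclass, field

@dataclass
class ConfigurationItem:
    item_id: str                              # e.g. "SRS-001"
    version: str                              # e.g. "1.0"
    history: list = field(default_factory=list)

    def apply_change(self, change_request: str, new_version: str):
        """Record an approved change; in practice, configuration control
        (evaluation and approval of the request) precedes this step."""
        self.history.append((self.version, change_request))
        self.version = new_version

srs = ConfigurationItem("SRS-001", "1.0")
srs.apply_change("CR-42: clarify export format", "1.1")
print(srs.item_id, srs.version, srs.history)   # status accounting report
```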
SOFTWARE ENGINEERING MANAGEMENT (SEE FIGURE 3, COLUMN G)
The Software Engineering Management KA addresses the management and measurement of software engineering.
While measurement is an important aspect of all KAs, it is here that the topic of measurement programs is
presented. There are six subareas for software engineering management. The first five cover software project
management and the sixth describes software measurement programs.
The first subarea is Initiation and Scope Definition, which comprises determination and negotiation of
requirements, feasibility analysis, and process for the review and revision of requirements.
The second subarea is Software Project Planning and includes process planning, determining deliverables, effort,
schedule and cost estimation, resource allocation, risk management, quality management, and plan management.
The third subarea is Software Project Enactment. The topics here are implementation of plans, supplier contract
management, implementation of measurement process, monitor process, control process, and reporting.
The fourth subarea is Review and Evaluation, which includes the topics of determining satisfaction of requirements
and reviewing and evaluating performance.
The fifth subarea describes Closure: determining closure and closure activities.
Finally, the sixth subarea describes Software Engineering Measurement, more specifically, measurement programs.
Product and process measures are described in the Software Engineering Process KA. Many of the other KAs also
describe measures specific to their KA. The topics of this subarea include establishing and sustaining measurement
commitment, planning the measurement process, performing the measurement process, and evaluating
measurement.
SOFTWARE ENGINEERING PROCESS (SEE FIGURE 3, COLUMN H)
The Software Engineering Process KA is concerned with the definition, implementation, assessment, measurement,
management, change, and improvement of the software engineering process itself. It is divided into four subareas.
The first subarea presents Process Implementation and Change. The topics here are process infrastructure, the
software process management cycle, models for process implementation and change, and practical considerations.
The second subarea deals with Process Definition. It includes the topics of software life cycle models, software life
cycle processes, notations for process definitions, process adaptation, and automation.
The third subarea is Process Assessment. The topics here include process assessment models and process
assessment methods.
The fourth subarea describes Process and Product Measurements. This KA covers process measurement in
general, as well as general product measurement. Measurements specific to particular KAs are described in
the relevant KA. The topics are process measurement, software product measurement, quality of measurement
results, software information models, and process measurement techniques.
SOFTWARE ENGINEERING TOOLS AND METHODS (SEE FIGURE 3, COLUMN I)
The Software Engineering Tools and Methods KA includes both software engineering tools and software
engineering methods.
The Software Engineering Tools subarea uses the same structure as the Guide itself, with one topic for each of the
other nine software engineering KAs. An additional topic is provided: miscellaneous tools issues, such as tool
integration techniques, which are potentially applicable to all classes of tools.
The Software Engineering Methods subarea is divided into three subsections: heuristic methods, dealing with
informal approaches; formal methods, dealing with mathematically based approaches; and prototyping methods,
dealing with software development approaches based on various forms of prototyping.
SOFTWARE QUALITY (SEE FIGURE 3, COLUMN J)
The Software Quality KA deals with software quality considerations which transcend the software life cycle
processes. Since software quality is a ubiquitous concern in software engineering, it is also considered in many of
the other KAs, and the reader will notice pointers to those KAs throughout this KA. The description of this KA
covers three subareas.
The first subarea describes the Software Quality Fundamentals such as software engineering culture and ethics, the
value and costs of quality, models and quality characteristics, and quality improvement.
The second subarea covers Software Quality Management Processes. The topics here are software quality
assurance, verification and validation, and reviews and audits.
The third and final subarea describes Practical Considerations related to software quality. The topics are software
quality requirements, defect characterization, software quality management techniques, and software quality
measurement.
RELATED DISCIPLINES OF SOFTWARE ENGINEERING (SEE FIGURE 3, COLUMN K)
The last chapter is entitled Related Disciplines of Software Engineering. In order to circumscribe software
engineering, it is necessary to identify the disciplines with which software engineering shares a common boundary.
This chapter identifies these related disciplines in alphabetical order. For each related discipline, using a
recognized consensus-based source where one was found, the following are identified:

- an informative definition (when feasible);
- a list of KAs.
The related disciplines are:
- Computer engineering
- Computer science
- Management
- Mathematics
- Project management
- Quality management
- Software ergonomics
- Systems engineering
APPENDICES
APPENDIX A. KA DESCRIPTION SPECIFICATIONS
The appendix describes the specifications provided by the editorial team to the associate editors for the content,
recommended references, format, and style of the KA descriptions.
APPENDIX B. EVOLUTION OF THE GUIDE
The second appendix describes the project's proposal for the evolution of the Guide. The 2004 Guide is simply
the current edition of a guide that will continue to evolve to meet the needs of the software engineering
community. Planning for evolution is not yet complete, but a tentative outline of the process is provided in this
appendix. As of this writing, this process has been endorsed by the project's Industrial Advisory Board and
briefed to the Board of Governors of the IEEE Computer Society, but it is not yet either funded or implemented.
APPENDIX C. ALLOCATION OF STANDARDS TO KAS
The third appendix is an annotated table of the most relevant standards, mostly from the IEEE and the ISO,
allocated to the KAs of the SWEBOK Guide.
APPENDIX D. BLOOM RATINGS
As an aid, notably to curriculum developers (and other users), in support of the project's fifth objective, the fourth
appendix rates each topic with one of a set of pedagogical categories commonly attributed to Benjamin Bloom. The
concept is that educational objectives can be classified into six categories representing increasing depth:
knowledge, comprehension, application, analysis, synthesis, and evaluation. Results of this exercise for all KAs can
be found in Appendix D. This appendix must not, however, be viewed as a definitive classification, but rather
as a starting point.
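As an illustration only, a curriculum developer might record such ratings as a simple mapping from topics to Bloom categories. The Python sketch below is hypothetical; the topics and levels shown are invented examples, not the Guide's actual Appendix D ratings.

```python
# Hypothetical sketch: recording Bloom ratings for KA topics.
# The six categories, in increasing depth, as listed above.
BLOOM_LEVELS = ["knowledge", "comprehension", "application",
                "analysis", "synthesis", "evaluation"]

# Example ratings, invented for illustration; see Appendix D for
# the actual ratings.
ratings = {
    "Requirements elicitation": "application",
    "Software configuration auditing": "comprehension",
}

def depth(topic: str) -> int:
    """Return the 0-based depth of a topic's Bloom rating."""
    return BLOOM_LEVELS.index(ratings[topic])

for topic, level in ratings.items():
    print(f"{topic}: {level} (depth {depth(topic)})")
```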
Figure 2 First five KAs
Figure 3 Last six KAs
1. "IEEE Standard Glossary of Software Engineering Terminology," IEEE Std 610.12-1990, 1990.
2. P. Starr, The Social Transformation of American Medicine, Basic Books, 1982, p. 15.
3. G. Ford and N.E. Gibbs, A Mature Profession of Software Engineering, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, Pa., tech. report CMU/SEI-96-TR-004, Jan. 1996.
4. A Guide to the Project Management Body of Knowledge, 2000 ed., Project Management Institute, www.pmi.org.
CHAPTER 8
SOFTWARE ENGINEERING MANAGEMENT
ACRONYMS
PMBOK: Guide to the Project Management Body of Knowledge
SQA: Software Quality Assurance
INTRODUCTION

Software Engineering Management can be defined as the application of management activities (planning, coordinating, measuring,
monitoring, controlling, and reporting) to ensure that the development and maintenance of software is systematic, disciplined, and
quantified (IEEE610.12-90).
The Software Engineering Management KA therefore addresses the management and measurement of software engineering. While
measurement is an important aspect of all KAs, it is here that the topic of measurement programs is presented.
While it is true to say that in one sense it should be possible to manage software engineering in the same way as any other (complex)
process, there are aspects specific to software products and the software life cycle processes which complicate effective
management; just a few of these are as follows:

- The perception of clients is such that there is often a lack of appreciation for the complexity inherent in software engineering, particularly in relation to the impact of changing requirements.
- It is almost inevitable that the software engineering processes themselves will generate the need for new or changed client requirements.
- As a result, software is often built in an iterative process rather than as a sequence of closed tasks.
- Software engineering necessarily incorporates aspects of creativity and discipline; maintaining an appropriate balance between the two is often difficult.
- The degree of novelty and complexity of software is often extremely high.
- There is a rapid rate of change in the underlying technology.
With respect to software engineering, management activities occur at three levels: organizational and infrastructure management,
project management, and measurement program planning and control. The last two are covered in detail in this KA description.
However, this is not to diminish the importance of organizational management issues.
Since the link to the related disciplines (obviously management) is important, it will be described in more detail than in the other
KA descriptions. Aspects of organizational management are important in terms of their impact on software engineering: on policy
management, for instance, organizational policies and standards provide the framework in which software engineering is undertaken.
These policies may need to be influenced by the requirements of effective software development and maintenance, and a number of
software engineering-specific policies may need to be established for effective management of software engineering at an
organizational level. For example, policies are usually necessary to establish specific organization-wide processes or procedures for
such software engineering tasks as designing, implementing, estimating, tracking, and reporting. Such policies are essential to
effective long-term software engineering management, by establishing a consistent basis on which to analyze past performance and
implement improvements, for example.
Another important aspect of management is personnel management: policies and procedures for hiring, training, and motivating
personnel and mentoring for career development are important not only at the project level but also to the longer-term success of an
organization. Software engineering personnel may present unique training or personnel management challenges (for example,
maintaining currency in a context where the underlying technology undergoes continuous and rapid change). Communication
management is also often mentioned as an overlooked but major aspect of the performance of individuals in a field where precise
understanding of user needs and of complex requirements and designs is necessary. Finally, portfolio management, which is the
capacity to have an overall vision not only of the set of software under development but also of the software already in use in an
organization, is necessary. Furthermore, software reuse is a key factor in maintaining and improving productivity and
competitiveness. Effective reuse requires a strategic vision that reflects the unique power and requirements of this technique.
In addition to understanding the aspects of management that are uniquely influenced by software, software engineers must have some
knowledge of the more general aspects, even within the first four years after graduation, the period targeted by the Guide.
Organizational culture and behavior, and functional enterprise management in terms of procurement, supply chain management,
marketing, sales, and distribution, all have an influence, albeit indirectly, on an organization�s software engineering process.
Relevant to this KA is the notion of project management, as "the construction of useful software artifacts" is normally managed in
the form of (perhaps programs of) individual projects. In this regard, we find extensive support in the Guide to the Project
Management Body of Knowledge (PMBOK) (PMI00), which itself includes the following project management KAs: project
integration management, project scope management, project time management, project cost management, project quality
management, project human resource management, and project communications management. Clearly, all these topics have direct
relevance to the Software Engineering Management KA. To attempt to duplicate the content of the Guide to the PMBOK here would
be both impossible and inappropriate. Instead, we suggest that the reader interested in project management beyond what is specific to
software engineering projects consult the PMBOK itself. Project management is also found in the Related Disciplines of Software
Engineering chapter.
The Software Engineering Management KA consists of both the software project management process, in its first five subareas, and
software engineering measurement in the last subarea. While these two subjects are often regarded as being separate, and indeed they
do possess many unique aspects, their close relationship has led to their combined treatment in this KA. Unfortunately, a common
perception of the software industry is that it delivers products late, over budget, and of poor quality and uncertain functionality.
Measurement-informed management, an assumed principle of any true engineering discipline, can help to turn this perception
around. In essence, management without measurement, qualitative and quantitative, suggests a lack of rigor, and measurement
without management suggests a lack of purpose or context. In the same way, however, management and measurement without expert
knowledge is equally ineffectual, so we must be careful to avoid over-emphasizing the quantitative aspects of Software Engineering
Management (SEM). Effective management requires a combination of both numbers and experience.
The following working definitions are adopted here:

- Management process refers to the activities that are undertaken in order to ensure that the software engineering processes are performed in a manner consistent with the organization's policies, goals, and standards.
- Measurement refers to the assignment of values and labels to aspects of software engineering (products, processes, and resources as defined by [Fen98]) and to the models derived from them, whether these models are developed using statistical, expert-knowledge, or other techniques.
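To illustrate the measurement definition just given, the brief Python sketch below assigns example measures to the three entity classes cited from [Fen98] (products, processes, and resources); the specific measure names and values are assumptions for illustration only.

```python
# Illustrative assignment of measures to the entity classes cited
# from [Fen98]: products, processes, and resources. The measures and
# values shown are invented for illustration.
measures = {
    "product":  {"size_loc": 12_400, "defect_density_per_kloc": 0.8},
    "process":  {"review_coverage_pct": 85, "cycle_time_days": 14},
    "resource": {"team_size": 6, "effort_person_months": 18},
}

for entity, values in measures.items():
    for name, value in values.items():
        print(f"{entity}: {name} = {value}")
```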
The software engineering project management subareas make extensive use of the software engineering measurement subarea.
Not unexpectedly, this KA is closely related to others in the Guide to the SWEBOK, and reading the following KA descriptions in
conjunction with this one would be particularly useful.

- Software Requirements, where some of the activities to be performed during the Initiation and Scope Definition phase of the project are described
- Software Configuration Management, as this deals with the identification, control, status accounting, and audit of the software configuration along with software release management and delivery
- Software Engineering Process, because processes and projects are closely related (this KA also describes process and product measurement)
- Software Quality, as quality is constantly a goal of management and is an aim of many activities that must be managed
BREAKDOWN OF TOPICS FOR SOFTWARE ENGINEERING MANAGEMENT
Figure 1 Breakdown of topics for the Software Engineering Management KA
As the Software Engineering Management KA is viewed here as an organizational process which incorporates the notion of process
and project management, we have created a breakdown that is both topic-based and life cycle-based. However, the primary basis for
the top-level breakdown is the process of managing a software engineering project. There are six major subareas. The first five
subareas largely follow the IEEE/EIA 12207 Management Process. The six subareas are:

- Initiation and scope definition, which deals with the decision to initiate a software engineering project
- Software project planning, which addresses the activities undertaken to prepare for successful software engineering from a management perspective
- Software project enactment, which deals with generally accepted software engineering management activities that occur during software engineering
- Review and evaluation, which deals with assurance that the software is satisfactory
- Closure, which addresses the post-completion activities of a software engineering project
- Software engineering measurement, which deals with the effective development and implementation of measurement programs in software engineering organizations (IEEE12207.0-96)
The breakdown of topics for the Software Engineering Management KA is shown in Figure 1.
1. Initiation and Scope Definition
The focus of this set of activities is on the effective determination of software requirements via various elicitation methods and the
assessment of the project's feasibility from a variety of standpoints. Once feasibility has been established, the remaining task within
this process is the specification of requirements validation and change procedures (see also the Software Requirements KA).
1.1. Determination and Negotiation of Requirements [Dor02: v2c4; Pfl01: c4; Pre04: c7; Som05: c5]
Methods for software requirements elicitation (for example, observation), analysis (for example, data modeling, use-case
modeling), specification, and validation (for example, prototyping) must be selected and applied, taking into account the various
stakeholder perspectives. This leads to the determination of project scope, objectives, and constraints. This is always an important
activity, as it sets the visible boundaries for the set of tasks being undertaken, and is particularly so where the novelty of the
undertaking is high. Additional information can be found in the Software Requirements KA.
1.2. Feasibility Analysis (Technical, Operational, Financial, Social/Political) [Pre04: c6; Som05: c6]
Software engineers must be assured that adequate capability and resources are available in the form of people, expertise, facilities,
infrastructure, and support (either internally or externally) to ensure that the project can be successfully completed in a timely and
cost-effective manner (using, for example, a requirement-capability matrix). This often requires some "ballpark" estimation of
effort and cost based on appropriate methods (for example, expert-informed analogy techniques).
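As a hypothetical illustration of such "ballpark" estimation by analogy, the Python sketch below scales the effort of a completed reference project linearly by relative size. Real analogy techniques adjust for many more factors (expert judgment, risk, team capability), and all figures here are invented.

```python
# Hypothetical "ballpark" effort estimate by analogy with a completed
# reference project, scaled linearly by relative size. Real analogy
# techniques adjust for many additional factors.
def effort_by_analogy(ref_effort_pm: float, ref_size: float, new_size: float) -> float:
    """Scale a reference project's effort by the size ratio."""
    return ref_effort_pm * (new_size / ref_size)

# Invented reference project: 2,000 function points, 120 person-months.
estimate = effort_by_analogy(ref_effort_pm=120, ref_size=2000, new_size=2600)
print(f"ballpark effort: {estimate:.0f} person-months")   # ~156
```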
1.3. Process for the Review and Revision of Requirements
Given the inevitability of change, it is vital that agreement among stakeholders is reached at this early point as to the means by which
scope and requirements are to be reviewed and revised (for example, via agreed change management procedures). This clearly
implies that scope and requirements will not be "set in stone" but can and should be revisited at predetermined points as the
process unfolds (for example, at design reviews, management reviews). If changes are accepted, then some form of traceability
analysis and risk analysis (see topic 2.5 Risk Management) should be used to ascertain the impact of those changes. A managed-change approach should also be useful when it comes time to review the outcome of the project, as the scope and requirements should
form the basis for the evaluation of success. [Som05: c6] See also the software configuration control subarea of the Software
Configuration Management KA.
2. Software Project Planning
The iterative planning process is informed by the scope and requirements and by the establishment of feasibility. At this point,
software life cycle processes are evaluated and the most appropriate process (given the nature of the project, its degree of novelty, its functional and technical complexity, its quality requirements, and so on) is selected. Where relevant, the project itself is then planned
in the form of a hierarchical decomposition of tasks, the associated deliverables of each task are specified and characterized in terms
of quality and other attributes in line with stated requirements, and detailed effort, schedule, and cost estimation is undertaken.
Resources are then allocated to tasks so as to optimize personnel productivity (at individual, team, and organizational levels),
equipment and materials utilization, and adherence to schedule. Detailed risk management is undertaken and the "risk profile" of
the project is discussed among, and accepted by, all relevant stakeholders. Comprehensive software quality management processes
are determined as part of the planning process in the form of procedures and responsibilities for software quality assurance,
verification and validation, reviews, and audits (see the Software Quality KA). As an iterative process, it is vital that the processes
and responsibilities for ongoing plan management, review, and revision are also clearly stated and agreed.
2.1. Process Planning
Selection of the appropriate software life cycle model (for example, spiral, evolutionary prototyping) and the adaptation and
deployment of appropriate software life cycle processes are undertaken in light of the particular scope and requirements of the
project. Relevant methods and tools are also selected. [Dor02: v1c6,v2c8; Pfl01: c2; Pre04: c2; Rei02: c1,c3,c5; Som05: c3; Tha97:
c3] At the project level, appropriate methods and tools are used to decompose the project into tasks, with associated inputs, outputs,
and completion conditions (for example, work breakdown structure). [Dor02: v2c7; Pfl01: c3; Pre04: c21; Rei02: c4,c5; Som05: c4;
Tha97: c4,c6] This in turn influences decisions on the project's high-level schedule and organization structure.
2.2. Determine Deliverables
The product(s) of each task (for example, architectural design, inspection report) are specified and characterized. [Pfl01: c3; Pre04:
c24; Tha97: c4] Opportunities to reuse software components from previous developments or to utilize off-the-shelf software products
are evaluated. The use of third parties and procured software is planned, and suppliers are selected.
2.3. Effort, Schedule, and Cost Estimation
Based on the breakdown of tasks, inputs, and outputs, the expected effort range required for each task is determined using a calibrated
estimation model based on historical size-effort data where available and relevant, or other methods like expert judgment. Task
dependencies are established and potential bottlenecks are identified using suitable methods (for example, critical path analysis).
Bottlenecks are resolved where possible, and the expected schedule of tasks with projected start times, durations, and end times is
produced (for example, PERT chart). Resource requirements (people, tools) are translated into cost estimates. [Dor02: v2c7; Fen98:
c12; Pfl01: c3; Pre04: c23, c24; Rei02: c5,c6; Som05: c4,c23; Tha97: c5] This is a highly iterative activity which must be negotiated
and revised until consensus is reached among affected stakeholders (primarily engineering and management).
2.4. Resource Allocation [Pfl01: c3; Pre04: c24; Rei02: c8,c9; Som05: c4; Tha97: c6,c7]
Equipment, facilities, and people are associated with the scheduled tasks, including the allocation of responsibilities for completion
(using, for example, a Gantt chart). This activity is informed and constrained by the availability of resources and their optimal use
under these circumstances, as well as by issues relating to personnel (for example, productivity of individuals/teams, team dynamics,
organizational and team structures).
2.5. Risk Management
Risk identification and analysis (what can go wrong, how and why, and what are the likely consequences), critical risk assessment
(which are the most significant risks in terms of exposure, which can we do something about in terms of leverage), risk mitigation
and contingency planning (formulating a strategy to deal with risks and to manage the risk profile) are all undertaken. Risk
assessment methods (for example, decision trees and process simulations) should be used in order to highlight and evaluate risks.
Project abandonment policies should also be determined at this point in discussion with all other stakeholders. [Dor02: v2c7; Pfl01:
c3; Pre04: c25; Rei02: c11; Som05: c4; Tha97: c4] Software-unique aspects of risk, such as software engineers' tendency to add unwanted features or the risks attendant on software's intangible nature, must influence the project's risk management.
2.6. Quality Management [Dor02: v1c8,v2c3-c5; Pre04: c26; Rei02: c10; Som05: c24,c25; Tha97: c9,c10]
Quality is defined in terms of pertinent attributes of the specific project and any associated product(s), perhaps in both quantitative
and qualitative terms. These quality characteristics will have been determined in the specification of detailed software requirements.
See also the Software Requirements KA.
Thresholds for adherence to quality are set for each indicator as appropriate to stakeholder expectations for the software at hand.
Procedures relating to ongoing SQA throughout the process and for product (deliverable) verification and validation are also specified
at this stage (for example, technical reviews and inspections) (see also the Software Quality KA).
2.7. Plan Management [Som05: c4; Tha97: c4]
How the project will be managed and how the plan will be managed must also be planned. Reporting, monitoring, and control of the
project must fit the selected software engineering process and the realities of the project, and must be reflected in the various artifacts
that will be used for managing it. But, in an environment where change is an expectation rather than a shock, it is vital that plans are
themselves managed. This requires that adherence to plans be systematically directed, monitored, reviewed, reported, and, where
appropriate, revised. Plans associated with other management-oriented support processes (for example, documentation, software
configuration management, and problem resolution) also need to be managed in the same manner.
3. Software Project Enactment
The plans are then implemented, and the processes embodied in the plans are enacted. Throughout, there is a focus on adherence to
the plans, with an overriding expectation that such adherence will lead to the successful satisfaction of stakeholder requirements and
achievement of the project objectives. Fundamental to enactment are the ongoing management activities of measuring, monitoring,
controlling, and reporting.
3.1. Implementation of Plans [Pfl01: c3; Som05: c4]
The project is initiated and the project activities are undertaken according to the schedule. In the process, resources are utilized (for
example, personnel effort, funding) and deliverables are produced (for example, architectural design documents, test cases).
3.2. Supplier Contract Management [Som05:c4]
Prepare and execute agreements with suppliers, monitor supplier performance, and accept supplier products, incorporating them as
appropriate.
3.3. Implementation of Measurement Process [Fen98: c13,c14; Pre04: c22; Rei02: c10,c12; Tha97: c3,c10]
The measurement process is enacted alongside the software project, ensuring that relevant and useful data are collected (see also
topics 6.2 Plan the Measurement Process and 6.3 Perform the Measurement Process).
3.4. Monitor Process [Dor02: v1c8, v2c2-c5,c7; Rei02: c10; Som05: c25; Tha97: c3;c9]
Adherence to the various plans is assessed continually and at predetermined intervals. Outputs and completion conditions for each
task are analyzed. Deliverables are evaluated in terms of their required characteristics (for example, via reviews and audits). Effort
expenditure, schedule adherence, and costs to date are investigated, and resource usage is examined. The project risk profile is
revisited, and adherence to quality requirements is evaluated.
Measurement data are modeled and analyzed. Variance analysis based on the deviation of actual from expected outcomes and values
is undertaken. This may be in the form of cost overruns, schedule slippage, and the like. Outlier identification and analysis of quality
and other measurement data are performed (for example, defect density analysis). Risk exposure and leverage are recalculated, and
decision trees, simulations, and so on are rerun in the light of new data. These activities enable problem detection and exception
identification based on exceeded thresholds. Outcomes are reported as needed and certainly where acceptable thresholds are
surpassed.
3.5. Control Process [Dor02: v2c7; Rei02: c10; Tha97: c3,c9]
The outcomes of the process monitoring activities provide the basis on which action decisions are taken. Where appropriate, and
where the impact and associated risks are modeled and managed, changes can be made to the project. This may take the form of
corrective action (for example, retesting certain components), it may involve the incorporation of contingencies so that similar
occurrences are avoided (for example, the decision to use prototyping to assist in software requirements validation), and/or it may
entail the revision of the various plans and other project documents (for example, requirements specification) to accommodate the
unexpected outcomes and their implications.
In some instances, it may lead to abandonment of the project. In all cases, change control and software configuration management
procedures are adhered to (see also the Software Configuration Management KA), decisions are documented and communicated to all
relevant parties, plans are revisited and revised where necessary, and relevant data is recorded in the central database (see also topic
6.3 Perform the Measurement Process).
3.6. Reporting [Rei02: c10; Tha97: c3,c10]
At specified and agreed periods, adherence to the plans is reported, both within the organization (for example to the project portfolio
steering committee) and to external stakeholders (for example, clients, users). Reports of this nature should focus on overall
adherence as opposed to the detailed reporting required frequently within the project team.
4. Review and Evaluation
At critical points in the project, overall progress towards achievement of the stated objectives and satisfaction of stakeholder
requirements are evaluated. Similarly, assessments of the effectiveness of the overall process to date, the personnel involved, and the
tools and methods employed are also undertaken at particular milestones.
4.1. Determining Satisfaction of Requirements [Rei02: c10; Tha97: c3,c10]
Since attaining stakeholder (user and customer) satisfaction is one of our principal aims, it is important that progress towards this aim
be formally and periodically assessed. This occurs on achievement of major project milestones (for example, confirmation of
software design architecture, software integration technical review). Variances from expectations are identified and appropriate action
is taken. As in the control process activity above (see topic 3.5 Control Process), in all cases change control and software
configuration management procedures are adhered to (see the Software Configuration Management KA), decisions are documented
and communicated to all relevant parties, plans are revisited and revised where necessary, and relevant data are recorded in the central
database (see also topic 6.3 Perform the Measurement Process). More information can also be found in the Software Testing KA, in
topic 2.2 Objectives of Testing and in the Software Quality KA, in topic 2.3 Reviews and Audits.
4.2. Reviewing and Evaluating Performance [Dor02: v1c8,v2c3,c5; Pfl01: c8,c9; Rei02: c10; Tha97: c3,c10]
Periodic performance reviews for project personnel provide insights as to the likelihood of adherence to plans as well as possible
areas of difficulty (for example, team member conflicts). The various methods, tools, and techniques employed are evaluated for their
effectiveness and appropriateness, and the process itself is systematically and periodically assessed for its relevance, utility, and
efficacy in the project context. Where appropriate, changes are made and managed.
5. Closure
The project reaches closure when all the plans and embodied processes have been enacted and completed. At this stage, the criteria
for project success are revisited. Once closure is established, archival, post mortem, and process improvement activities are
performed.
5.1. Determining Closure [Dor02: v1c8,v2c3,c5; Rei02: c10; Tha97: c3,c10]
The tasks as specified in the plans are complete, and satisfactory achievement of completion criteria is confirmed. All planned
products have been delivered with acceptable characteristics. Requirements are checked off and confirmed as satisfied, and the
objectives of the project have been achieved. These processes generally involve all stakeholders and result in the documentation of
client acceptance and any remaining known problem reports.
5.2. Closure Activities [Pfl01: c12; Som05: c4]
After closure has been confirmed, archival of project materials takes place in line with stakeholder-agreed methods, location, and
duration. The organization's measurement database is updated with final project data, and post-project analyses are undertaken. A project post mortem is undertaken so that issues, problems, and opportunities encountered during the process (particularly via review and evaluation; see subarea 4, Review and Evaluation) are analyzed, and lessons are drawn from the process and fed into organizational learning and improvement endeavors (see also the Software Engineering Process KA).
6. Software Engineering Measurement [ISO15939-02]
The importance of measurement and its role in better management practices is widely acknowledged, and this role can only grow in the coming years. Effective measurement has become one of the cornerstones of organizational maturity.
Key terms on software measures and measurement methods have been defined in [ISO15939-02] on the basis of the ISO international
vocabulary of metrology [ISO93]. Nevertheless, readers will encounter terminology differences in the literature; for example, the
term "metrics" is sometimes used in place of "measures."
This topic follows the international standard ISO/IEC 15939, which defines the activities and tasks necessary to implement a software measurement process and also specifies a measurement information model.
6.1. Establish and Sustain Measurement Commitment
- Accept requirements for measurement. Each measurement endeavor should be guided by organizational objectives and driven by a set of measurement requirements established by the organization and the project. For example, an organizational objective might be "first-to-market with new products." [Fen98: c3,c13; Pre04: c22] This in turn might engender a requirement that the factors contributing to this objective be measured so that projects might be managed to meet it.
  o Define the scope of measurement. The organizational unit to which each measurement requirement is to be applied must be established. This may consist of a functional area, a single project, a single site, or even the whole enterprise. All subsequent measurement tasks related to this requirement should be within the defined scope. In addition, the stakeholders should be identified.
  o Commitment of management and staff to measurement. The commitment must be formally established, communicated, and supported by resources (see the next item).
- Commit resources for measurement. The organization's commitment to measurement is an essential factor for success, as evidenced by the assignment of resources for implementing the measurement process. Assigning resources includes allocating responsibility for the various tasks of the measurement process (such as user, analyst, and librarian) and providing adequate funding, training, tools, and support to conduct the process in an enduring fashion.
6.2. Plan the Measurement Process
- Characterize the organizational unit. The organizational unit provides the context for measurement, so it is important to make this context explicit and to articulate the assumptions that it embodies and the constraints that it imposes. Characterization can be in terms of organizational processes, application domains, technology, and organizational interfaces. An organizational process model is also typically an element of the organizational unit characterization [ISO15939-02: 5.2.1].
- Identify information needs. Information needs are based on the goals, constraints, risks, and problems of the organizational unit. They may be derived from business, organizational, regulatory, and/or product objectives. They must be identified and prioritized. Then, a subset to be addressed must be selected and the results documented, communicated, and reviewed by stakeholders [ISO15939-02: 5.2.2].
- Select measures. Candidate measures must be selected, with clear links to the information needs. Measures must then be selected based on the priorities of the information needs and other criteria such as cost of collection, degree of process disruption during collection, ease of analysis, ease of obtaining accurate and consistent data, and so on [ISO15939-02: 5.2.3 and Appendix C]. (A small illustration of this selection step follows this list.)
- Define data collection, analysis, and reporting procedures. This encompasses collection procedures and schedules, storage, verification, analysis, reporting, and configuration management of data [ISO15939-02: 5.2.4].
- Define criteria for evaluating the information products. Criteria for evaluation are influenced by the technical and business objectives of the organizational unit. Information products include those associated with the product being produced, as well as those associated with the processes being used to manage and measure the project [ISO15939-02: 5.2.5 and Appendices D, E].
- Review, approve, and provide resources for measurement tasks.
  o The measurement plan must be reviewed and approved by the appropriate stakeholders. This includes all data collection procedures; storage, analysis, and reporting procedures; evaluation criteria; schedules; and responsibilities. Criteria for reviewing these artifacts should have been established at the organizational unit level or higher and should be used as the basis for these reviews. Such criteria should take into consideration previous experience, availability of resources, and potential disruptions to projects when changes from current practices are proposed. Approval demonstrates commitment to the measurement process [ISO15939-02: 5.2.6.1 and Appendix F].
  o Resources should be made available for implementing the planned and approved measurement tasks. Resource availability may be staged in cases where changes are to be piloted before widespread deployment. Consideration should be given to the resources necessary for successful deployment of new procedures or measures [ISO15939-02: 5.2.6.2].
- Acquire and deploy supporting technologies. This includes evaluation of available supporting technologies, selection of the most appropriate technologies, acquisition of those technologies, and their deployment [ISO15939-02: 5.2.7].
6.3. Perform the Measurement Process
- Integrate measurement procedures with relevant processes. The measurement procedures, such as data collection, must be integrated into the processes they are measuring. This may involve changing current processes to accommodate data collection or generation activities. It may also involve analysis of current processes to minimize additional effort and evaluation of the effect on employees to ensure that the measurement procedures will be accepted. Morale issues and other human factors need to be considered. In addition, the measurement procedures must be communicated to those providing the data, training may need to be provided, and support must typically be provided. Data analysis and reporting procedures must typically be integrated into organizational and/or project processes in a similar manner [ISO15939-02: 5.3.1].
- Collect data. The data must be collected, verified, and stored [ISO15939-02: 5.3.2].
- Analyze data and develop information products. Data may be aggregated, transformed, or recoded as part of the analysis process, using a degree of rigor appropriate to the nature of the data and the information needs. The results of this analysis are typically indicators such as graphs, numbers, or other indications that must be interpreted, resulting in initial conclusions to be presented to stakeholders. The results and conclusions must be reviewed, using a process defined by the organization (which may be formal or informal). Data providers and measurement users should participate in reviewing the data to ensure that they are meaningful and accurate, and that they can result in reasonable actions [ISO15939-02: 5.3.3 and Appendix G]. (A small illustration of this step follows this list.)
- Communicate results. Information products must be documented and communicated to users and stakeholders [ISO15939-02: 5.3.4].
6.4. Evaluate Measurement
- Evaluate information products. Evaluate information products against the specified evaluation criteria and determine the strengths and weaknesses of the information products. This may be performed by an internal process or an external audit and should include feedback from measurement users. Record lessons learned in an appropriate database [ISO15939-02: 5.4.1 and Appendix D].
- Evaluate the measurement process. Evaluate the measurement process against the specified evaluation criteria and determine the strengths and weaknesses of the process. This may be performed by an internal process or an external audit and should include feedback from measurement users. Record lessons learned in an appropriate database [ISO15939-02: 5.4.1 and Appendix D].
- Identify potential improvements. Such improvements may be changes in the format of indicators, changes in units measured, or reclassification of categories. Determine the costs and benefits of potential improvements and select appropriate improvement actions. Communicate proposed improvements to the measurement process owner and stakeholders for review and approval. Also communicate the lack of potential improvements if the analysis fails to identify any [ISO15939-02: 5.4.2].
MATRIX OF TOPICS VS. REFERENCE MATERIAL
RECOMMENDED REFERENCES FOR SOFTWARE ENGINEERING MANAGEMENT
[Dor02] M. Dorfman and R.H. Thayer, eds., Software Engineering, IEEE Computer Society Press, 2002, Vol. 1, Chap. 6, 8; Vol. 2, Chap. 3, 4, 5, 7, 8.
[Fen98] N.E. Fenton and S.L. Pfleeger, Software Metrics: A Rigorous & Practical Approach, second ed., International Thomson Computer Press, 1998, Chap. 1-14.
[ISO15939-02] ISO/IEC 15939:2002, Software Engineering - Software Measurement Process, ISO and IEC, 2002.
[Pfl01] S.L. Pfleeger, Software Engineering: Theory and Practice, second ed., Prentice Hall, 2001, Chap. 2-4, 8, 9, 12, 13.
[Pre04] R.S. Pressman, Software Engineering: A Practitioner's Approach, sixth ed., McGraw-Hill, 2004, Chap. 2, 6, 7, 22-26.
[Rei02] D.J. Reifer, ed., Software Management, IEEE Computer Society Press, 2002, Chap. 1-13.
[Som05] I. Sommerville, Software Engineering, seventh ed., Addison-Wesley, 2005, Chap. 3-6, 23-25.
[Tha97] R.H. Thayer, ed., Software Engineering Project Management, IEEE Computer Society Press, 1997, Chap. 1-10.
APPENDIX A. LIST OF FURTHER READINGS
(Adl99) T.R. Adler, J.G. Leonard, and R.K. Nordgren, "Improving Risk Management: Moving from Risk Elimination to Risk Avoidance," Information and Software Technology, vol. 41, 1999, pp. 29-34.
(Bai98) R. Baines, "Across Disciplines: Risk, Design, Method, Process, and Tools," IEEE Software, July/August 1998, pp. 61-64.
(Bin97) R.V. Binder, "Can a Manufacturing Quality Model Work for Software?" IEEE Software, September/October 1997, pp. 101-102, 105.
(Boe97) B.W. Boehm and T. DeMarco, "Software Risk Management," IEEE Software, May/June 1997, pp. 17-19.
(Bri96) L.C. Briand, S. Morasca, and V.R. Basili, "Property-Based Software Engineering Measurement," IEEE Transactions on Software Engineering, vol. 22, iss. 1, 1996, pp. 68-86.
(Bri96a) L. Briand, K.E. Emam, and S. Morasca, "On the Application of Measurement Theory in Software Engineering," Empirical Software Engineering, vol. 1, 1996, pp. 61-88.
(Bri97) L.C. Briand, S. Morasca, and V.R. Basili, "Response to: Comments on 'Property-Based Software Engineering Measurement: Refining the Additivity Properties,'" IEEE Transactions on Software Engineering, vol. 23, iss. 3, 1997, pp. 196-197.
(Bro87) F.P. Brooks, Jr., "No Silver Bullet: Essence and Accidents of Software Engineering," Computer, April 1987, pp. 10-19.
(Cap96) C. Jones, Applied Software Measurement: Assuring Productivity and Quality, second ed., McGraw-Hill, 1996.
(Car97) M.J. Carr, "Risk Management May Not Be for Everyone," IEEE Software, May/June 1997, pp. 21-24.
(Cha96) R.N. Charette, "Large-Scale Project Management Is Risk Management," IEEE Software, July 1996, pp. 110-117.
(Cha97) R.N. Charette, K.M. Adams, and M.B. White, "Managing Risk in Software Maintenance," IEEE Software, May/June 1997, pp. 43-50.
(Col96) B. Collier, T. DeMarco, and P. Fearey, "A Defined Process for Project Postmortem Review," IEEE Software, July 1996, pp. 65-72.
(Con97) E.H. Conrow and P.S. Shishido, "Implementing Risk Management on Software Intensive Projects," IEEE Software, May/June 1997, pp. 83-89.
(Dav98) A.M. Davis, "Predictions and Farewells," IEEE Software, July/August 1998, pp. 6-9.
(Dem87) T. DeMarco and T. Lister, Peopleware: Productive Projects and Teams, Dorset House Publishing, 1987.
(Dem96) T. DeMarco and A. Miller, "Managing Large Software Projects," IEEE Software, July 1996, pp. 24-27.
(Fav98) J. Favaro and S.L. Pfleeger, "Making Software Development Investment Decisions," ACM SIGSOFT Software Engineering Notes, vol. 23, iss. 5, 1998, pp. 69-74.
(Fay96) M.E. Fayad and M. Cline, "Managing Object-Oriented Software Development," Computer, September 1996, pp. 26-31.
(Fen98) N.E. Fenton and S.L. Pfleeger, Software Metrics: A Rigorous & Practical Approach, second ed., International Thomson Computer Press, 1998.
(Fle99) R. Fleming, "A Fresh Perspective on Old Problems," IEEE Software, January/February 1999, pp. 106-113.
(Fug98) A. Fuggetta et al., "Applying GQM in an Industrial Software Factory," ACM Transactions on Software Engineering and Methodology, vol. 7, iss. 4, 1998, pp. 411-448.
(Gar97) P.R. Garvey, D.J. Phair, and J.A. Wilson, "An Information Architecture for Risk Assessment and Management," IEEE Software, May/June 1997, pp. 25-34.
(Gem97) A. Gemmer, "Risk Management: Moving beyond Process," Computer, May 1997, pp. 33-43.
(Gla97) R.L. Glass, "The Ups and Downs of Programmer Stress," Communications of the ACM, vol. 40, iss. 4, 1997, pp. 17-19.
(Gla98) R.L. Glass, "Short-Term and Long-Term Remedies for Runaway Projects," Communications of the ACM, vol. 41, iss. 7, 1998, pp. 13-15.
(Gla98a) R.L. Glass, "How Not to Prepare for a Consulting Assignment, and Other Ugly Consultancy Truths," Communications of the ACM, vol. 41, iss. 12, 1998, pp. 11-13.
(Gla99) R.L. Glass, "The Realities of Software Technology Payoffs," Communications of the ACM, vol. 42, iss. 2, 1999, pp. 74-79.
(Gra99) R. Grable et al., "Metrics for Small Projects: Experiences at the SED," IEEE Software, March/April 1999, pp. 21-29.
(Gra87) R.B. Grady and D.L. Caswell, Software Metrics: Establishing a Company-Wide Program, Prentice Hall, 1987.
(Hal97) T. Hall and N. Fenton, "Implementing Effective Software Metrics Programs," IEEE Software, March/April 1997, pp. 55-64.
(Hen99) S.M. Henry and K.T. Stevens, "Using Belbin's Leadership Role to Improve Team Effectiveness: An Empirical Investigation," Journal of Systems and Software, vol. 44, 1999, pp. 241-250.
(Hoh99) L. Hohmann, "Coaching the Rookie Manager," IEEE Software, January/February 1999, pp. 16-19.
(Hsi96) P. Hsia, "Making Software Development Visible," IEEE Software, March 1996, pp. 23-26.
(Hum97) W.S. Humphrey, Managing Technical People: Innovation, Teamwork, and the Software Process, Addison-Wesley, 1997.
(IEEE12207.0-96) IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes, IEEE, 1996.
(Jac98) M. Jackman, "Homeopathic Remedies for Team Toxicity," IEEE Software, July/August 1998, pp. 43-45.
(Kan97) K. Kansala, "Integrating Risk Assessment with Cost Estimation," IEEE Software, May/June 1997, pp. 61-67.
(Kar97) J. Karlsson and K. Ryan, "A Cost-Value Approach for Prioritizing Requirements," IEEE Software, September/October 1997, pp. 67-74.
(Kar96) D.W. Karolak, Software Engineering Risk Management, IEEE Computer Society Press, 1996.
(Kau99) K. Kautz, "Making Sense of Measurement for Small Organizations," IEEE Software, March/April 1999, pp. 14-20.
(Kei98) M. Keil et al., "A Framework for Identifying Software Project Risks," Communications of the ACM, vol. 41, iss. 11, 1998, pp. 76-83.
(Ker99) B. Kernighan and R. Pike, "Finding Performance Improvements," IEEE Software, March/April 1999, pp. 61-65.
(Kit97) B. Kitchenham and S. Linkman, "Estimates, Uncertainty, and Risk," IEEE Software, May/June 1997, pp. 69-74.
(Lat98) F. van Latum et al., "Adopting GQM-Based Measurement in an Industrial Environment," IEEE Software, January/February 1998, pp. 78-86.
(Leu96) H.K.N. Leung, "A Risk Index for Software Producers," Software Maintenance: Research and Practice, vol. 8, 1996, pp. 281-294.
(Lis97) T. Lister, "Risk Management Is Project Management for Adults," IEEE Software, May/June 1997, pp. 20-22.
(Mac96) K. Mackey, "Why Bad Things Happen to Good Projects," IEEE Software, May 1996, pp. 27-32.
(Mac98) K. Mackey, "Beyond Dilbert: Creating Cultures that Work," IEEE Software, January/February 1998, pp. 48-49.
(Mad97) R.J. Madachy, "Heuristic Risk Assessment Using Cost Factors," IEEE Software, May/June 1997, pp. 51-59.
(McC96) S.C. McConnell, Rapid Development: Taming Wild Software Schedules, Microsoft Press, 1996.
(McC97) S.C. McConnell, Software Project Survival Guide, Microsoft Press, 1997.
(McC99) S.C. McConnell, "Software Engineering Principles," IEEE Software, March/April 1999, pp. 6-8.
(Moy97) T. Moynihan, "How Experienced Project Managers Assess Risk," IEEE Software, May/June 1997, pp. 35-41.
(Nes98) P. Nesi, "Managing OO Projects Better," IEEE Software, July/August 1998, pp. 50-60.
(Nol99) A.J. Nolan, "Learning From Success," IEEE Software, January/February 1999, pp. 97-105.
(Off97) R.J. Offen and R. Jeffery, "Establishing Software Measurement Programs," IEEE Software, March/April 1997, pp. 45-53.
(Par96) K.V.C. Parris, "Implementing Accountability," IEEE Software, July/August 1996, pp. 83-93.
(Pfl97) S.L. Pfleeger, "Assessing Measurement (Guest Editor's Introduction)," IEEE Software, March/April 1997, pp. 25-26.
(Pfl97a) S.L. Pfleeger et al., "Status Report on Software Measurement," IEEE Software, March/April 1997, pp. 33-43.
(Put97) L.H. Putnam and W. Myers, Industrial Strength Software: Effective Management Using Measurement, IEEE Computer Society Press, 1997.
(Rob99) P.N. Robillard, "The Role of Knowledge in Software Development," Communications of the ACM, vol. 42, iss. 1, 1999, pp. 87-92.
(Rod97) A.G. Rodrigues and T.M. Williams, "System Dynamics in Software Project Management: Towards the Development of a Formal Integrated Framework," European Journal of Information Systems, vol. 6, 1997, pp. 51-66.
(Rop97) J. Ropponen and K. Lyytinen, "Can Software Risk Management Improve System Development: An Exploratory Study," European Journal of Information Systems, vol. 6, 1997, pp. 41-50.
(Sch99) C. Schmidt et al., "Disincentives for Communicating Risk: A Risk Paradox," Information and Software Technology, vol. 41, 1999, pp. 403-411.
(Sco92) R.L. Van Scoy, "Software Development Risk: Opportunity, Not Problem," Software Engineering Institute, Carnegie Mellon University, CMU/SEI-92-TR-30, 1992.
(Sla98) S.A. Slaughter, D.E. Harter, and M.S. Krishnan, "Evaluating the Cost of Software Quality," Communications of the ACM, vol. 41, iss. 8, 1998, pp. 67-73.
(Sol98) R. van Solingen, E. Berghout, and F. van Latum, "Interrupts: Just a Minute Never Is," IEEE Software, September/October 1998, pp. 97-103.
(Whi95) N. Whitten, Managing Software Development Projects: Formulas for Success, Wiley, 1995.
(Wil99) B. Wiley, Essential System Requirements: A Practical Guide to Event-Driven Methods, Addison-Wesley, 1999.
(Zel98) M.V. Zelkowitz and D.R. Wallace, "Experimental Models for Validating Technology," Computer, vol. 31, iss. 5, 1998, pp. 23-31.
APPENDIX B. LIST OF STANDARDS
(IEEE610.12-90) IEEE Std 610.12-1990 (R2002), IEEE Standard Glossary of Software Engineering Terminology, IEEE, 1990.
(IEEE12207.0-96) IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes, IEEE, 1996.
(ISO15939-02) ISO/IEC 15939:2002, Software Engineering - Software Measurement Process, ISO and IEC, 2002.
(PMI00) Project Management Institute Standards Committee, A Guide to the Project Management Body of Knowledge (PMBOK), Project Management Institute, 2000.
CHAPTER 9
SOFTWARE ENGINEERING PROCESS
ACRONYMS
CMMI: Capability Maturity Model Integration
EF: Experience Factory
FP: Function Point
HRM: Human Resources Management
IDEAL: Initiating-Diagnosing-Establishing-Acting-Learning (model)
OMG: Object Management Group
QIP: Quality Improvement Paradigm
SCAMPI: CMM-Based Appraisal for Process Improvement using the CMMI
SCE: Software Capability Evaluation
SEPG: Software Engineering Process Group
INTRODUCTION
The Software Engineering Process KA can be examined on two levels. The first level encompasses the technical and managerial
activities within the software life cycle processes that are performed during software acquisition, development, maintenance, and
retirement. The second is the meta-level, which is concerned with the definition, implementation, assessment, measurement,
management, change, and improvement of the software life cycle processes themselves. The first level is covered by the other KAs in
the Guide. This KA is concerned with the second.
The term "software engineering process" can be interpreted in different ways, and this may cause confusion.
- One meaning, where the word "the" is used, as in "the software engineering process," could imply that there is only one right way of performing software engineering tasks. This meaning is avoided in the Guide, because no such process exists. Standards such as IEEE12207 speak of software engineering processes, meaning that there are many processes involved, such as the Development Process or the Configuration Management Process.
- A second meaning refers to the general discussion of processes related to software engineering. This is the meaning intended in the title of this KA, and the one most often intended in the KA description.
- Finally, a third meaning could signify the actual set of activities performed within an organization, which could be viewed as one process, especially from within the organization. This meaning is used in the KA in a very few instances.
This KA applies to any part of the management of software life cycle processes where procedural or technological change is being
introduced for process or product improvement.
Software engineering process is relevant not only to large organizations. On the contrary, process-related activities can be, and have been, performed successfully by small organizations, teams, and individuals.
The objective of managing software life cycle processes is to implement new or better processes in actual practices, be they
individual, project, or organizational.
This KA does not explicitly address human resources management (HRM), as embodied, for example, in the People CMM (Cur02), or systems engineering processes [ISO15288-02; IEEE1220-98].
It should also be recognized that many software engineering process issues are closely related to other disciplines, such as
management, albeit sometimes using a different terminology.
BREAKDOWN OF TOPICS FOR SOFTWARE ENGINEERING PROCESS
Figure 1 shows the breakdown of topics in this KA.
Figure 1 Breakdown of topics for the Software Engineering Process KA
1. Process Implementation and Change
This subarea focuses on organizational change. It describes the infrastructure, activities, models, and practical considerations for
process implementation and change.
Described here is the situation in which processes are deployed for the first time (for example, introducing an inspection process
within a project or a method covering the complete life cycle), and where current processes are changed (for example, introducing a
tool, or optimizing a procedure). This can also be termed process evolution. In both instances, existing practices have to be modified.
If the modifications are extensive, then changes in the organizational culture may also be necessary.
1.1. Process Infrastructure [IEEE12207.0-96; ISO15504-98; SEL96]
This topic includes the knowledge related to the software engineering process infrastructure.
To establish software life cycle processes, it is necessary to have an appropriate infrastructure in place, meaning that the resources
must be available (competent staff, tools, and funding) and the responsibilities assigned. When these tasks have been completed, it is an indication of management's commitment to, and ownership of, the software engineering process effort.
Various committees may have to be established, such as a steering committee to oversee the software engineering process effort. A
description of an infrastructure for process improvement in general is provided in [McF96]. Two main types of infrastructure are used
in practice: the Software Engineering Process Group and the Experience Factory.
1.1.1. Software Engineering Process Group (SEPG)
The SEPG is intended to be the central focus of software engineering process improvement, and it has a number of responsibilities in
terms of initiating and sustaining it. These are described in [Fow90].
1.1.2. Experience Factory (EF)
The concept of the EF separates the project organization (the software development organization, for example) from the improvement
organization. The project organization focuses on the development and maintenance of software, while the EF is concerned with
software engineering process improvement.
The EF is intended to institutionalize the collective learning of an organization by developing, updating, and delivering to the project
organization experience packages (for example, guides, models, and training courses), also referred to as process assets. The project
organization offers the EF their products, the plans used in their development, and the data gathered during development and
operation. Examples of experience packages are presented in [Bas92].
1.2. Software Process Management Cycle [Bas92; Fow90; IEEE12207.0-96; ISO15504-98; McF96; SEL96]
The management of software processes consists of four activities sequenced in an iterative cycle allowing for continuous feedback and improvement of the software process:
- The Establish Process Infrastructure activity consists of establishing commitment to process implementation and change (including obtaining management buy-in) and putting in place an appropriate infrastructure (resources and responsibilities) to make it happen.
- The goal of the Planning activity is to understand the current business objectives and process needs of the individual, project, or organization, to identify its strengths and weaknesses, and to make a plan for process implementation and change.
- The goal of Process Implementation and Change is to execute the plan, deploy new processes (which may involve, for example, the deployment of tools and the training of staff), and/or change existing processes.
- Process Evaluation is concerned with finding out how well the implementation and change went, and whether or not the expected benefits materialized. The results are then used as input for subsequent cycles.
1.3. Models For Process Implementation And Change
Two general models that have emerged for driving process implementation and change are the Quality Improvement Paradigm (QIP)
[SEL96] and the IDEAL model [McF96]. The two paradigms are compared in [SEL96]. Evaluation of process implementation and
change outcomes can be qualitative or quantitative.
1.4. Practical Considerations
Process implementation and change constitute an instance of organizational change. Most successful organizational change efforts
treat the change as a project in its own right, with appropriate plans, monitoring, and review.
Guidelines about process implementation and change within software engineering organizations, covering both processes and tools and including action planning, training, management sponsorship, commitment, and the selection of pilot projects, are given in [Moi98; San98; Sti99]. Empirical studies on success factors for process change are reported in (ElE99a).
The role of change agents in this activity is discussed in (Hut94). Process implementation and change can also be seen as an instance
of consulting (either internal or external).
One can also view organizational change from the perspective of technology transfer (Rog83). Software engineering articles which
discuss technology transfer and the characteristics of recipients of new technology (which could include process-related technologies)
are (Pfl99; Rag89).
There are two ways of approaching the evaluation of process implementation and change, either in terms of changes to the process
itself or in terms of changes to the process outcomes (for example, measuring the return on investment from making the change). A
pragmatic look at what can be achieved from such evaluation studies is given in (Her98).
Overviews of how to evaluate process implementation and change, and examples of studies that do so, can be found in [Gol99],
(Kit98; Kra99; McG94).
2. Process Definition
A process definition can be a procedure, a policy, or a standard. Software life cycle processes are defined for a number of reasons,
including increasing the quality of the product, facilitating human understanding and communication, supporting process
improvement, supporting process management, providing automated process guidance, and providing automated execution support.
The types of process definitions required will depend, at least partially, on the reason for the definition.
It should also be noted that the context of the project and organization will determine the type of process definition that is most
useful. Important variables to consider include the nature of the work (for example, maintenance or development), the application
domain, the life cycle model, and the maturity of the organization.
2.1. Software Life Cycle Models [Pfl01:c2; IEEE12207.0-96]
Software life cycle models serve as a high-level definition of the phases that occur during development. They are not aimed at
providing detailed definitions but at highlighting the key activities and their interdependencies. Examples of software life cycle
models are the waterfall model, the throwaway prototyping model, evolutionary development, incremental/iterative delivery, the
spiral model, the reusable software model, and automated software synthesis. Comparisons of these models are provided in [Com97],
(Dav88), and a method for selecting among many of them in (Ale91).
2.2. Software Life Cycle Processes
Definitions of software life cycle processes tend to be more detailed than software life cycle models. However, software life cycle
processes do not attempt to order their processes in time. This means that, in principle, the software life cycle processes can be
arranged to fit any of the software life cycle models. The main reference in this area is IEEE/EIA 12207.0: Information Technology - Software Life Cycle Processes [IEEE12207.0-96].
The IEEE 1074:1997 standard on developing life cycle processes also provides a list of processes and activities for software
development and maintenance [IEEE1074-97], as well as a list of life cycle activities which can be mapped into processes and
organized in the same way as any of the software life cycle models. In addition, it identifies and links other IEEE software standards
to these activities. In principle, IEEE Std 1074 can be used to build processes conforming to any of the life cycle models. Standards
which focus on maintenance processes are IEEE Std 1219-1998 and ISO 14764:1998 [IEEE1219-98].
Other important standards providing process definitions include:
- IEEE Std 1540: Software Risk Management (IEEE1540-01)
- IEEE Std 1517: Software Reuse Processes (IEEE1517-99)
- ISO/IEC 15939: Software Measurement Process [ISO15939-02]. See also the Software Engineering Management KA for a detailed description of this process.
In some situations, software engineering processes must be defined taking into account the organizational processes for quality
management. ISO 9001 [ISO9001-00] provides requirements for quality management processes, and ISO/IEC 90003 interprets those
requirements for organizations developing software (ISO90003-04).
Some software life cycle processes emphasize rapid delivery and strong user participation, namely agile methods such as Extreme Programming [Bec99]. A form of the selection problem concerns the choice along the agile versus plan-driven method axis. A risk-based approach to making that decision is described in (Boe03a).
2.3. Notations for Process Definitions
Processes can be defined at different levels of abstraction (for example, generic definitions vs. adapted definitions, descriptive vs.
prescriptive vs. proscriptive) [Pfl01]. Various elements of a process can be defined, for example, activities, products (artifacts), and
resources. Detailed frameworks which structure the types of information required to define processes are described in (Mad94).
There are a number of notations used to define processes (SPC92). A key difference among them lies in the type of information the frameworks mentioned above define, capture, and use. The software engineer should be aware of the following approaches: data flow diagrams; definitions in terms of process purpose and outcomes [ISO15504-98]; lists of processes decomposed into constituent activities and tasks defined in natural language [IEEE12207.0-96]; Statecharts (Har98); ETVX (Rad85); Actor-Dependency modeling (Yu94); SADT notation (Mcg93); Petri nets (Ban95); IDEF0 (IEEE1320.1-98); and rule-based notations (Bar95). More recently, the OMG has published a process modeling standard intended to harmonize modeling notations: the SPEM (Software Process Engineering Meta-Model) specification [OMG02].
2.4. Process Adaptation [IEEE 12207.0-96; ISO15504-98; Joh99]
It is important to note that predefined processes, even standardized ones, must be adapted to local needs, for example, the organizational context, project size, regulatory requirements, industry practices, and corporate culture. Some standards, such as IEEE/EIA 12207, contain mechanisms and recommendations for accomplishing this adaptation.
2.5. Automation [Pfl01:c2]
Automated tools either support the execution of the process definitions or they provide guidance to humans performing the defined
processes. In cases where process analysis is performed, some tools allow different types of simulations (for example, discrete event
simulation).
In addition, there are tools which support each of the above process definition notations. Moreover, these tools can execute the
process definitions to provide automated support to the actual processes, or to fully automate them in some instances. An overview of
process-modeling tools can be found in [Fin94] and of process-centered environments in (Gar96). Work on applying the Internet to
the provision of real-time process guidance is described in (Kel98).
3. Process Assessment
Process assessment is carried out using both an assessment model and an assessment method. In some instances, the term "appraisal" is used instead of assessment, and the term "capability evaluation" is used when the appraisal is for the purpose of awarding a contract.
3.1. Process Assessment Models
An assessment model captures what is recognized as good practices. These practices may pertain only to technical software engineering activities, or may also refer to, for example, management, systems engineering, and human resources management activities.
ISO/IEC 15504 [ISO15504-98] defines an exemplar assessment model and conformance requirements on other assessment models.
Specific assessment models available and in use are SW-CMM (SEI95), CMMI [SEI01], and Bootstrap [Sti99]. Many other capability and maturity models have been defined, for example, for design, documentation, and formal methods. ISO 9001 is another common assessment model which has been applied by software organizations (ISO9001-00).
A maturity model for systems engineering has also been developed, which would be useful where a project or organization is
involved in the development and maintenance of systems, including software (EIA/IS731-99).
The applicability of assessment models to small organizations is addressed in [Joh99; San98].
There are two general architectures for an assessment model that make different assumptions about the order in which processes must
be assessed: continuous and staged (Pau94). They are very different, and should be evaluated by the organization considering them to
determine which would be the most pertinent to their needs and objectives.
3.2. Process Assessment Methods [Gol99]
In order to perform an assessment, a specific assessment method needs to be followed to produce a quantitative score which
characterizes the capability of the process (or maturity of the organization).
The CBA-IPI assessment method, for example, focuses on process improvement (Dun96), and the SCE method focuses on evaluating
the capability of suppliers (Bar95). Both of these were developed for the SW-CMM. Requirements on both types of methods which
reflect what are believed to be good assessment practices are provided in [ISO15504-98], (Mas95). The SCAMPI methods are geared
toward CMMI assessments [SEI01]. The activities performed during an assessment, the distribution of effort on these activities, as
well as the atmosphere during an assessment are different when they are for improvement than when they are for a contract award.
There have been criticisms of process assessment models and methods, for example (Fay97; Gra98). Most of these criticisms have
been concerned with the empirical evidence supporting the use of assessment models and methods. However, since the publication of
these articles, there has been some systematic evidence supporting the efficacy of process assessments (Cla97; Ele00; Ele00a; Kri99).
4. Process and Product Measurement
While the application of measurement to software engineering can be complex, particularly in terms of modeling and analysis
methods, there are several aspects of software engineering measurement which are fundamental and which underlie many of the more
advanced measurement and analysis processes. Furthermore, achievement of process and product improvement efforts can only be
assessed if a set of baseline measures has been established.
Measurement can be performed to support the initiation of process implementation and change or to evaluate the consequences of
process implementation and change, or it can be performed on the product itself.
Key terms on software measures and measurement methods have been defined in ISO/IEC 15939 on the basis of the ISO international vocabulary of metrology [VIM93]. ISO/IEC 15939 also provides a standard process for measuring both process and product characteristics.
Nevertheless, readers will encounter terminological differences in the literature; for example, the term "metric" is sometimes used in place of "measure."
4.1. Process Measurement [ISO15939-02]
The term �process measurement� as used here means that quantitative information about the process is collected, analyzed, and
interpreted. Measurement is used to identify the strengths and weaknesses of processes and to evaluate processes after they have been
implemented and/or changed.
Process measurement may serve other purposes as well. For example, process measurement is useful for managing a software
engineering project. Here, the focus is on process measurement for the purpose of process implementation and change.
The path diagram in Figure 2 illustrates an important assumption made in most software engineering projects: that the process has an impact on process outcomes. The context affects the relationship between the process and process outcomes; that is, the process-to-process-outcome relationship depends on the context.
Not every process will have a positive impact on all outcomes. For example, the introduction of software inspections may reduce
testing effort and cost, but may increase elapsed time if each inspection introduces long delays due to the scheduling of large
inspection meetings. (Vot93) Therefore, it is preferable to use multiple process outcome measures which are important to the
organization�s business.
While some effort can be made to assess the utilization of tools and hardware, the primary resource that needs to be managed in
software engineering is personnel. As a result, the main measures of interest are those related to the productivity of teams or
processes (for example, using a measure of function points produced per unit of person-effort) and their associated levels of
experience in software engineering in general, and perhaps in particular technologies. [Fen98: c3, c11; Som05: c25]
Process outcomes could, for example, be product quality (faults per KLOC (Kilo Lines of Code) or per Function Point (FP)),
maintainability (the effort to make a certain type of change), productivity (LOC (Lines of Code) or Function Points per person-month), time-to-market, or customer satisfaction (as measured through a customer survey). This relationship depends on the
particular context (for example, size of the organization or size of the project).
In general, we are most concerned about process outcomes. However, in order to achieve the process outcomes that we desire (for
example, better quality, better maintainability, greater customer satisfaction), we have to implement the appropriate process.
Of course, it is not only the process that has an impact on outcomes. Other factors, such as the capability of the staff and the tools that
are used, play an important role. When evaluating the impact of a process change, for example, it is important to factor out these other
influences. Furthermore, the extent to which the process is institutionalized (that is, process fidelity) is important, as it may explain
why "good" processes do not always give the desired outcomes in a given situation.
Figure 2 Path diagram showing the relationship between process and outcomes (results).
4.2. Software Product Measurement [ISO9126-01]
Software product measurement includes, notably, the measurement of product size, product structure, and product quality.
4.2.1. Size measurement
Software product size is most often assessed by measures of length (for example, lines of source code in a module, pages in a
software requirements specification document), or functionality (for example, function points in a specification). The principles of
functional size measurement are provided in IEEE Std 14143.1. International standards for functional size measurement methods include ISO/IEC 19761, 20926, and 20968 [IEEE14143.1-00; ISO19761-03; ISO20926-03; ISO20968-02].
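Length-based size measurement is straightforward to automate. The sketch below is a rough, non-normative Python illustration that counts physical, blank, comment, and non-comment source lines for one file; the '#' comment convention and the counting rules are assumptions, since real counters must follow the language syntax and the organization's counting standard.

```python
import sys

def count_loc(path: str) -> dict:
    """Count simple line-based size measures for one source file.

    The comment convention here ('#' prefix) is an assumption for
    illustration; real counters must follow the language's syntax
    and the organization's counting standard.
    """
    physical = blank = comment = 0
    with open(path, encoding="utf-8") as f:
        for line in f:
            physical += 1
            stripped = line.strip()
            if not stripped:
                blank += 1
            elif stripped.startswith("#"):
                comment += 1
    return {
        "physical_lines": physical,
        "blank_lines": blank,
        "comment_lines": comment,
        "ncloc": physical - blank - comment,  # non-comment, non-blank LOC
    }

if __name__ == "__main__":
    print(count_loc(sys.argv[1]))
```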
4.2.2. Structure measurement
A diverse range of measures of software product structure may be applied to both high- and low-level design and code artifacts to reflect control flow (for example, the cyclomatic number, code knots), data flow (for example, measures of slicing), nesting (for example, the nesting polynomial measure, the BAND measure), control structures (for example, the vector measure, the NPATH measure), and modular structure and interaction (for example, information flow, tree-based measures, coupling and cohesion). [Fen98: c8; Pre04: c15]
4.2.3. Quality measurement
Because quality is a multi-dimensional attribute, its measurement is less straightforward to define than the measures above. Furthermore, some of the dimensions of quality are likely to require measurement in qualitative rather than quantitative form. A more detailed discussion of software quality measurement is provided in the Software Quality KA, topic 3.4. ISO models of software product quality and of related measurements are described in ISO 9126, parts 1 to 4 [ISO9126-01]. [Fen98: c9, c10; Pre04: c15; Som05: c24]
4.3. Quality of Measurement Results
The quality of the measurement results (accuracy, reproducibility, repeatability, convertibility, random measurement errors) is essential if measurement programs are to provide effective and bounded results. Key characteristics of measurement results and the related quality of measuring instruments have been defined in the ISO International Vocabulary of Metrology. [VIM93]
The theory of measurement establishes the foundation on which meaningful measurements can be made; measurement theory and scale types are discussed in [Kan02]. Measurement is defined in the theory as "the assignment of numbers to objects in a systematic way to represent properties of the object."
An appreciation of software measurement scales and the implications of each scale type for the subsequent selection of data analysis methods is especially important. [Abr96; Fen98: c2; Pfl01: c11] Meaningful scales are related to a classification of scales, for which measurement theory provides a succession of more and more constrained ways of assigning the measures. If the numbers assigned merely provide labels to classify the objects, they are called nominal. If they are assigned in a way that ranks the objects (for example, good, better, best), they are called ordinal. If they deal with magnitudes of the property relative to a defined measurement unit, they are called interval (the intervals between the numbers are uniform unless otherwise specified, and are therefore additive). Measurements are at the ratio level if they also have an absolute zero point, so that ratios of distances to the zero point are meaningful.
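The practical consequence of scale types is that each type restricts which summary statistics are meaningful. The following Python sketch encodes one common, simplified textbook mapping from scale type to permissible statistics; the mapping and the example are illustrative, not a complete treatment of measurement theory.

```python
# Simplified, conventional mapping from scale type to statistics that
# are meaningful on that scale (each level also permits those above it).
PERMISSIBLE_STATS = {
    "nominal":  ["mode", "frequency counts"],
    "ordinal":  ["median", "percentiles"],
    "interval": ["mean", "standard deviation"],
    "ratio":    ["geometric mean", "ratios (e.g., 'twice as large')"],
}

def check_statistic(scale: str, statistic: str) -> bool:
    """Return True if `statistic` is meaningful for data on `scale`."""
    order = ["nominal", "ordinal", "interval", "ratio"]
    allowed = []
    for level in order[: order.index(scale) + 1]:
        allowed.extend(PERMISSIBLE_STATS[level])
    return statistic in allowed

# Example: severity rankings are ordinal, so a median is fine
# but a mean is not meaningful.
print(check_statistic("ordinal", "median"))  # True
print(check_statistic("ordinal", "mean"))    # False
```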
4.4. Software Information Models
As the data are collected and the measurement repository is populated, it becomes possible to build models using both data and knowledge. These models exist for the purposes of analysis, classification, and prediction. Such models need to be evaluated to ensure that their levels of accuracy are sufficient and that their limitations are known and understood. The refinement of models, which takes place both during and after projects are completed, is another important activity.
4.4.1. Model building
Model building includes both calibration and evaluation of the model. The goal-driven approach to measurement informs the model
building process to the extent that models are constructed to answer relevant questions and achieve software improvement goals. This
process is also influenced by the implied limitations of particular measurement scales in relation to the choice of analysis method.
The models are calibrated (by using particularly relevant observations, for example, recent projects or projects using similar technology) and their effectiveness is evaluated (for example, by testing their performance on holdout samples). [Fen98: c4, c6, c13; Pfl01: c3, c11, c12; Som05: c25]
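As an illustration of calibration and holdout evaluation, the following Python sketch fits a one-parameter effort model to recent projects and checks it against held-out projects using the Mean Magnitude of Relative Error (MMRE); the data, the linear model, and the accuracy measure are illustrative choices, not prescriptions of this Guide.

```python
# (size_in_FP, effort_in_person_months) pairs; all values invented.
calibration_set = [(100, 10.0), (200, 22.0), (300, 31.0), (400, 44.0)]
holdout_set = [(150, 17.0), (350, 36.0)]

# Calibrate a one-parameter model: effort = rate * size
# (least-squares fit through the origin).
num = sum(size * effort for size, effort in calibration_set)
den = sum(size * size for size, _ in calibration_set)
rate = num / den

# Evaluate on the holdout sample using Mean Magnitude of
# Relative Error (MMRE), a common accuracy measure.
errors = [abs(rate * size - effort) / effort for size, effort in holdout_set]
mmre = sum(errors) / len(errors)

print(f"Calibrated rate: {rate:.4f} person-months per FP")
print(f"Holdout MMRE:    {mmre:.2%}")
```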
4.4.2. Model implementation
Model implementation includes both the interpretation and the refinement of models: the calibrated models are applied to the process, their outcomes are interpreted and evaluated in the context of the process/project, and the models are then refined where appropriate. [Fen98: c6; Pfl01: c3, c11, c12; Pre04: c22; Som05: c25]
4.5. Process Measurement Techniques
Measurement techniques may be used to analyze software engineering processes and to identify strengths and weaknesses. This can
be performed to initiate process implementation and change, or to evaluate the consequences of process implementation and change.
The quality of measurement results (accuracy, repeatability, reproducibility) is an issue in the measurement of software engineering processes, since there are both instrument-based and judgmental measurements, as, for example, when assessors assign scores to a particular process. A discussion of, and a method for achieving, quality of measurement are presented in [Gol99].
Process measurement techniques have been classified into two general types: analytic and benchmarking. The two types of
techniques can be used together since they are based on different types of information. (Car91)
4.5.1. Analytical techniques
The analytical techniques are characterized as relying on "quantitative evidence to determine where improvements are needed and whether an improvement initiative has been successful." The analytical type is exemplified by the Quality Improvement Paradigm (QIP), consisting of a cycle of understanding, assessing, and packaging [SEL96]. The techniques presented next are intended as other examples of analytical techniques, and reflect what is done in practice. [Fen98; Mus99], (Lyu96; Wei93; Zel98) Whether or not a specific organization uses all these techniques will depend, at least partially, on its maturity.
 Experimental Studies: Experimentation involves setting up controlled or quasi experiments in the organization to evaluate processes. (McG94) Usually, a new process is compared with the current process to determine whether or not the former has better process outcomes. Another type of experimental study is process simulation, which can be used to analyze process behavior, explore process improvement potentials, predict process outcomes if the current process is changed in a certain way, and control process execution. Initial data about the performance of the current process need to be collected, however, as a basis for the simulation.
 Process Definition Review is a means by which a process definition (either a descriptive or a prescriptive one, or both) is reviewed, and deficiencies and potential process improvements are identified. Typical examples of this are presented in (Ban95; Kel98). An easy operational way to analyze a process is to compare it to an existing standard (national, international, or professional body), such as IEEE/EIA 12207.0 [IEEE12207.0-96]. With this approach, quantitative data are not collected on the process, or, if they are, they play a supportive role. The individuals performing the analysis of the process definition use their knowledge and capabilities to decide what process changes would potentially lead to desirable process outcomes. Observational studies can also provide useful feedback for identifying process improvements. (Agr99)
 Orthogonal Defect Classification is a technique which can be used to link faults found with potential causes. It relies on a mapping between fault types and fault triggers. (Chi92; Chi96) The IEEE standard on the classification of faults (or anomalies) may be useful in this context (IEEE1044-93).
 Root Cause Analysis is another common analytical technique used in practice. It involves tracing back from detected problems (for example, faults) to identify the process causes, with the aim of changing the process to avoid those problems in the future. Examples for different types of processes are described in (Col93; ElE-97; Nak91).
 The Orthogonal Defect Classification technique described above can be used to find categories in which many problems exist, at which point they can be analyzed. Orthogonal Defect Classification is thus a technique used to make a quantitative selection of where to apply Root Cause Analysis.
 Statistical Process Control is an effective way to identify stability, or the lack of it, in the process through the use of control charts and their interpretations; a minimal computational sketch follows this list. A good introduction to SPC in the context of software engineering is presented in (Flo99).
 The Personal Software Process defines a series of improvements to an individual's development practices in a specified order [Hum95]. It is "bottom-up" in the sense that it stipulates personal data collection and improvements based on the interpretation of those data.
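As referenced in the Statistical Process Control item above, the following minimal Python sketch computes the center line and control limits of an XmR (individuals and moving range) chart for a series of review defect densities; the observations are invented, and 2.66 is the standard XmR chart factor (3/d2 with d2 = 1.128 for moving ranges of size two).

```python
# Defect densities (defects/KLOC) observed in successive reviews;
# values are invented for illustration.
observations = [3.1, 2.8, 3.5, 2.9, 3.2, 3.0, 2.7, 3.3, 6.0]

# Individuals (X) chart limits from the average moving range (XmR chart).
moving_ranges = [abs(b - a) for a, b in zip(observations, observations[1:])]
mean_x = sum(observations) / len(observations)
mean_mr = sum(moving_ranges) / len(moving_ranges)

# 2.66 = 3 / d2, with d2 = 1.128 for subgroups of size 2 (standard factor).
ucl = mean_x + 2.66 * mean_mr
lcl = max(0.0, mean_x - 2.66 * mean_mr)

print(f"Center line: {mean_x:.2f}, UCL: {ucl:.2f}, LCL: {lcl:.2f}")
for i, x in enumerate(observations, start=1):
    if not lcl <= x <= ucl:
        print(f"Observation {i} ({x}) is outside the control limits:"
              " investigate an assignable cause.")
```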
4.5.2. Benchmarking techniques
The second type of technique, benchmarking, "depends on identifying an 'excellent' organization in a field and documenting its practices and tools." Benchmarking assumes that if a less-proficient organization adopts the practices of the excellent organization, it will also become excellent. Benchmarking involves assessing the maturity of an organization or the capability of its processes, and is exemplified by software process assessment work. A general introductory overview of process assessments and their application is provided in (Zah98).
RECOMMENDED REFERENCES
[Abr96] A. Abran and P.N. Robillard, "Function Points Analysis: An Empirical Study of Its Measurement Processes," IEEE Transactions on Software Engineering, vol. 22, 1996, pp. 895-909.
[Bas92] V. Basili et al., "The Software Engineering Laboratory - An Operational Software Experience Factory," presented at the International Conference on Software Engineering, 1992.
[Bec99] K. Beck, Extreme Programming Explained: Embrace Change, Addison-Wesley, 1999.
[Boe03] B. Boehm and R. Turner, "Using Risk to Balance Agile and Plan-Driven Methods," Computer, June 2003, pp. 57-66.
[Com97] E. Comer, "Alternative Software Life Cycle Models," presented at the International Conference on Software Engineering, 1997.
[ElE99] K. El-Emam and N. Madhavji, Elements of Software Process Assessment and Improvement, IEEE Computer Society Press, 1999.
[Fen98] N.E. Fenton and S.L. Pfleeger, Software Metrics: A Rigorous & Practical Approach, second ed., International Thomson Computer Press, 1998.
[Fin94] A. Finkelstein, J. Kramer, and B. Nuseibeh, Software Process Modeling and Technology, Research Studies Press Ltd., 1994.
[Fow90] P. Fowler and S. Rifkin, Software Engineering Process Group Guide, Software Engineering Institute, Technical Report CMU/SEI-90-TR-24, 1990, available at http://www.sei.cmu.edu/pub/documents/90.reports/pdf/tr24.90.pdf.
[Gol99] D. Goldenson et al., "Empirical Studies of Software Process Assessment Methods," Elements of Software Process Assessment and Improvement, K. El-Emam and N. Madhavji, eds., IEEE Computer Society Press, 1999.
[IEEE1074-97] IEEE Std 1074-1997, IEEE Standard for Developing Software Life Cycle Processes, IEEE, 1997.
[IEEE12207.0-96] IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes, IEEE, 1996.
[VIM93] ISO VIM, International Vocabulary of Basic and General Terms in Metrology, ISO, 1993.
[ISO9126-01] ISO/IEC 9126-1:2001, Software Engineering - Product Quality - Part 1: Quality Model, ISO and IEC, 2001.
[ISO15504-98] ISO/IEC TR 15504:1998, Information Technology - Software Process Assessment (parts 1-9), ISO and IEC, 1998.
[ISO15939-02] ISO/IEC 15939:2002, Software Engineering - Software Measurement Process, ISO and IEC, 2002.
[Joh99] D. Johnson and J. Brodman, "Tailoring the CMM for Small Businesses, Small Organizations, and Small Projects," Elements of Software Process Assessment and Improvement, K. El-Emam and N. Madhavji, eds., IEEE Computer Society Press, 1999.
[McF96] B. McFeeley, IDEAL: A User's Guide for Software Process Improvement, Software Engineering Institute, CMU/SEI-96-HB-001, 1996, available at http://www.sei.cmu.edu/pub/documents/96.reports/pdf/hb001.96.pdf.
[Moi98] D. Moitra, "Managing Change for Software Process Improvement Initiatives: A Practical Experience-Based Approach," Software Process - Improvement and Practice, vol. 4, iss. 4, 1998, pp. 199-207.
[Mus99] J. Musa, Software Reliability Engineering: More Reliable Software, Faster Development and Testing, McGraw-Hill, 1999.
[OMG02] Object Management Group, "Software Process Engineering Metamodel Specification," 2002, available at http://www.omg.org/docs/formal/02-11-14.pdf.
[Pfl01] S.L. Pfleeger, Software Engineering: Theory and Practice, second ed., Prentice Hall, 2001.
[Pre04] R.S. Pressman, Software Engineering: A Practitioner's Approach, sixth ed., McGraw-Hill, 2004.
[San98] M. Sanders, The SPIRE Handbook: Better, Faster, Cheaper Software Development in Small Organisations, European Commission, 1998.
[SEI01] Software Engineering Institute, "Capability Maturity Model Integration, v1.1," 2001, available at http://www.sei.cmu.edu/cmmi/cmmi.html.
[SEL96] Software Engineering Laboratory, Software Process Improvement Guidebook, NASA/GSFC, Technical Report SEL-95-102, April 1996, available at http://sel.gsfc.nasa.gov/website/documents/online-doc/95-102.pdf.
[Som05] I. Sommerville, Software Engineering, seventh ed., Addison-Wesley, 2005.
[Sti99] H. Stienen, "Software Process Assessment and Improvement: 5 Years of Experiences with Bootstrap," Elements of Software Process Assessment and Improvement, K. El-Emam and N. Madhavji, eds., IEEE Computer Society Press, 1999.
Appendix A. List of Further Readings
(Agr99) W. Agresti, "The Role of Design and Analysis in Process Improvement," Elements of Software Process Assessment and Improvement, K. El-Emam and N. Madhavji, eds., IEEE Computer Society Press, 1999.
(Ale91) L. Alexander and A. Davis, "Criteria for Selecting Software Process Models," presented at COMPSAC '91, 1991.
(Ban95) S. Bandinelli et al., "Modeling and Improving an Industrial Software Process," IEEE Transactions on Software Engineering, vol. 21, iss. 5, 1995, pp. 440-454.
(Bar95) N. Barghouti et al., "Two Case Studies in Modeling Real, Corporate Processes," Software Process - Improvement and Practice, Pilot Issue, 1995, pp. 17-32.
(Boe03a) B. Boehm and R. Turner, Balancing Agility and Discipline: A Guide for the Perplexed, Addison-Wesley, 2003.
(Bur99) I. Burnstein et al., "A Testing Maturity Model for Software Test Process Assessment and Improvement," Software Quality Professional, vol. 1, iss. 4, 1999, pp. 8-21.
(Chi92) R. Chillarege et al., "Orthogonal Defect Classification - A Concept for In-Process Measurement," IEEE Transactions on Software Engineering, vol. 18, iss. 11, 1992, pp. 943-956.
(Chi96) R. Chillarege, "Orthogonal Defect Classification," Handbook of Software Reliability Engineering, M. Lyu, ed., IEEE Computer Society Press, 1996.
(Col93) J. Collofello and B. Gosalia, "An Application of Causal Analysis to the Software Production Process," Software Practice and Experience, vol. 23, iss. 10, 1993, pp. 1095-1105.
(Cur02) B. Curtis, W. Hefley, and S. Miller, The People Capability Maturity Model: Guidelines for Improving the Workforce, Addison-Wesley, 2002.
(Dav88) A. Davis, E. Bersoff, and E. Comer, "A Strategy for Comparing Alternative Software Development Life Cycle Models," IEEE Transactions on Software Engineering, vol. 14, iss. 10, 1988, pp. 1453-1461.
(Dun96) D. Dunaway and S. Masters, "CMM-Based Appraisal for Internal Process Improvement (CBA IPI): Method Description," Software Engineering Institute, CMU/SEI-96-TR-007, 1996, available at http://www.sei.cmu.edu/pub/documents/96.reports/pdf/tr007.96.pdf.
(EIA/IS731-99) EIA, "EIA/IS 731 Systems Engineering Capability Model," 1999, available at http://www.geia.org/eoc/G47/index.html.
(ElE-97) K. El-Emam, D. Holtje, and N. Madhavji, "Causal Analysis of the Requirements Change Process for a Large System," presented at the International Conference on Software Maintenance, 1997.
(ElE-99a) K. El-Emam, B. Smith, and P. Fusaro, "Success Factors and Barriers in Software Process Improvement: An Empirical Study," Better Software Practice for Business Benefit: Principles and Experiences, R. Messnarz and C. Tully, eds., IEEE Computer Society Press, 1999.
(ElE-00a) K. El-Emam and A. Birk, "Validating the ISO/IEC 15504 Measures of Software Development Process Capability," Journal of Systems and Software, vol. 51, iss. 2, 2000, pp. 119-149.
(ElE-00b) K. El-Emam and A. Birk, "Validating the ISO/IEC 15504 Measures of Software Requirements Analysis Process Capability," IEEE Transactions on Software Engineering, vol. 26, iss. 6, June 2000, pp. 541-566.
(Fay97) M. Fayad and M. Laitinen, "Process Assessment: Considered Wasteful," Communications of the ACM, vol. 40, iss. 11, November 1997.
(Flo99) W. Florac and A. Carleton, Measuring the Software Process: Statistical Process Control for Software Process Improvement, Addison-Wesley, 1999.
(Gar96) P. Garg and M. Jazayeri, "Process-Centered Software Engineering Environments: A Grand Tour," Software Process, A. Fuggetta and A. Wolf, eds., John Wiley & Sons, 1996.
(Gra97) R. Grady, Successful Software Process Improvement, Prentice Hall, 1997.
(Gra98) E. Gray and W. Smith, "On the Limitations of Software Process Assessment and the Recognition of a Required Re-Orientation for Global Process Improvement," Software Quality Journal, vol. 7, 1998, pp. 21-34.
(Har98) D. Harel and M. Politi, Modeling Reactive Systems with Statecharts: The Statemate Approach, McGraw-Hill, 1998.
(Her98) J. Herbsleb, "Hard Problems and Hard Science: On the Practical Limits of Experimentation," IEEE TCSE Software Process Newsletter, vol. 11, 1998, pp. 18-21, available at http://www.seg.iit.nrc.ca/SPN.
(Hum95) W. Humphrey, A Discipline for Software Engineering, Addison-Wesley, 1995.
(Hum99) W. Humphrey, An Introduction to the Team Software Process, Addison-Wesley, 1999.
(Hut94) D. Hutton, The Change Agent's Handbook: A Survival Guide for Quality Improvement Champions, Irwin, 1994.
(Kan02) S.H. Kan, Metrics and Models in Software Quality Engineering, second ed., Addison-Wesley, 2002.
(Kel98) M. Kellner et al., "Process Guides: Effective Guidance for Process Participants," presented at the 5th International Conference on the Software Process, 1998.
(Kit98) B. Kitchenham, "Selecting Projects for Technology Evaluation," IEEE TCSE Software Process Newsletter, iss. 11, 1998, pp. 3-6, available at http://www.seg.iit.nrc.ca/SPN.
(Kra99) H. Krasner, "The Payoff for Software Process Improvement: What It Is and How to Get It," Elements of Software Process Assessment and Improvement, K. El-Emam and N. Madhavji, eds., IEEE Computer Society Press, 1999.
(Kri99) M.S. Krishnan and M. Kellner, "Measuring Process Consistency: Implications for Reducing Software Defects," IEEE Transactions on Software Engineering, vol. 25, iss. 6, November/December 1999, pp. 800-815.
(Lyu96) M.R. Lyu, Handbook of Software Reliability Engineering, McGraw-Hill/IEEE, 1996.
(Mad94) N. Madhavji et al., "Elicit: A Method for Eliciting Process Models," presented at the Third International Conference on the Software Process, 1994.
(Mas95) S. Masters and C. Bothwell, "CMM Appraisal Framework - Version 1.0," Software Engineering Institute, CMU/SEI-TR-95-001, 1995, available at http://www.sei.cmu.edu/pub/documents/95.reports/pdf/tr001.95.pdf.
(McG94) F. McGarry et al., "Software Process Improvement in the NASA Software Engineering Laboratory," Software Engineering Institute, CMU/SEI-94-TR-22, 1994, available at http://www.sei.cmu.edu/pub/documents/94.reports/pdf/tr22.94.pdf.
(McG01) J. McGarry et al., Practical Software Measurement: Objective Information for Decision Makers, Addison-Wesley, 2001.
(McG93) C. McGowan and S. Bohner, "Model Based Process Assessments," presented at the International Conference on Software Engineering, 1993.
(Nak91) T. Nakajo and H. Kume, "A Case History Analysis of Software Error Cause-Effect Relationship," IEEE Transactions on Software Engineering, vol. 17, iss. 8, 1991.
(Pau94) M. Paulk and M. Konrad, "Measuring Process Capability Versus Organizational Process Maturity," presented at the 4th International Conference on Software Quality, 1994.
(Pfl99) S.L. Pfleeger, "Understanding and Improving Technology Transfer in Software Engineering," Journal of Systems and Software, vol. 47, 1999, pp. 111-124.
(Pfl01) S.L. Pfleeger, Software Engineering: Theory and Practice, second ed., Prentice Hall, 2001.
(Rad85) R. Radice et al., "A Programming Process Architecture," IBM Systems Journal, vol. 24, iss. 2, 1985, pp. 79-90.
(Rag89) S. Raghavan and D. Chand, "Diffusing Software-Engineering Methods," IEEE Software, July 1989, pp. 81-90.
(Rog83) E. Rogers, Diffusion of Innovations, Free Press, 1983.
(Sch99) E. Schein, Process Consultation Revisited: Building the Helping Relationship, Addison-Wesley, 1999.
(SEI95) Software Engineering Institute, The Capability Maturity Model: Guidelines for Improving the Software Process, Addison-Wesley, 1995.
(SEL96) Software Engineering Laboratory, Software Process Improvement Guidebook, NASA/GSFC, Technical Report SEL-95-102, April 1996, available at http://sel.gsfc.nasa.gov/website/documents/online-doc/95-102.pdf.
(SPC92) Software Productivity Consortium, Process Definition and Modeling Guidebook, SPC-92041-CMC, 1992.
(Som97) I. Sommerville and P. Sawyer, Requirements Engineering: A Good Practice Guide, John Wiley & Sons, 1997.
(Vot93) L. Votta, "Does Every Inspection Need a Meeting?" ACM Software Engineering Notes, vol. 18, iss. 5, 1993, pp. 107-114.
(Wei93) G.M. Weinberg, Quality Software Management, vol. 2: First-Order Measurement (ch. 8, "Measuring Cost and Value"), 1993.
(Yu94) E. Yu and J. Mylopoulos, "Understanding 'Why' in Software Process Modeling, Analysis, and Design," presented at the 16th International Conference on Software Engineering, 1994.
(Zah98) S. Zahran, Software Process Improvement: Practical Guidelines for Business Success, Addison-Wesley, 1998.
(Zel98) M.V. Zelkowitz and D.R. Wallace, "Experimental Models for Validating Technology," Computer, vol. 31, iss. 5, 1998, pp. 23-31.
Appendix B. List of Standards
(IEEE1044-93) IEEE Std 1044-1993 (R2002), IEEE Standard for the Classification of Software Anomalies, IEEE, 1993.
(IEEE1061-98) IEEE Std 1061-1998, IEEE Standard for a Software Quality Metrics Methodology, IEEE, 1998.
(IEEE1074-97) IEEE Std 1074-1997, IEEE Standard for Developing Software Life Cycle Processes, IEEE, 1997.
(IEEE1219-98) IEEE Std 1219-1998, IEEE Standard for Software Maintenance, IEEE, 1998.
(IEEE1220-98) IEEE Std 1220-1998, IEEE Standard for the Application and Management of the Systems Engineering
Process, IEEE, 1998.
(IEEE1517-99) IEEE Std 1517-1999, IEEE Standard for Information Technology-Software Life Cycle Processes-Reuse
Processes, IEEE, 1999.
(IEEE1540-01) IEEE Std 1540-2001//ISO/IEC16085:2003, IEEE Standard for Software Life Cycle Processes-Risk
Management, IEEE, 2001.
(IEEE12207.0-96) IEEE/EIA 12207.0-1996//ISO/IEC12207:1995, Industry Implementation of Int. Std ISO/IEC 12207:95,
Standard for Information Technology-Software Life Cycle Processes, IEEE, 1996.
(IEEE12207.1-96) IEEE/EIA 12207.1-1996, Industry Implementation of Int. Std ISO/IEC 12207:95, Standard for
Information Technology-Software Life Cycle Processes - Life Cycle Data, IEEE, 1996.
(IEEE12207.2-97) IEEE/EIA 12207.2-1997, Industry Implementation of Int. Std ISO/IEC 12207:95, Standard for
Information Technology-Software Life Cycle Processes - Implementation Considerations, IEEE, 1997.
(IEEE14143.1-00) IEEE Std 14143.1-2000//ISO/IEC 14143-1:1998, Information Technology - Software Measurement - Functional Size Measurement - Part 1: Definitions of Concepts, IEEE, 2000.
(ISO9001-00) ISO 9001:2000, Quality Management Systems - Requirements, ISO, 2000.
(ISO9126-01) ISO/IEC 9126-1:2001, Software Engineering-Product Quality-Part 1: Quality Model, ISO and IEC, 2001.
(ISO14764-99) ISO/IEC 14764:1999, Information Technology - Software Maintenance, ISO and IEC, 1999.
(ISO15288-02) ISO/IEC 15288:2002, Systems Engineering-System Life Cycle Process, ISO and IEC, 2002.
(ISO15504-98) ISO/IEC TR 15504:1998, Information Technology - Software Process Assessment (parts 1-9), ISO and IEC,
1998.
(ISO15939-02) ISO/IEC 15939:2002, Software Engineering-Software Measurement Process, ISO and IEC, 2002.
(ISO19761-03) ISO/IEC 19761:2003, Software Engineering - COSMIC-FFP - A Functional Size Measurement Method, ISO and IEC, 2003.
(ISO20926-03) ISO/IEC 20926:2003, Software Engineering - IFPUG 4.1 Unadjusted Functional Size Measurement Method - Counting Practices Manual, ISO and IEC, 2003.
(ISO20968-02) ISO/IEC 20968:2002, Software Engineering - Mk II Function Point Analysis - Counting Practices Manual, ISO and IEC, 2002.
(ISO90003-04) ISO/IEC 90003:2004, Software and Systems Engineering - Guidelines for the Application of ISO9001:2000
to Computer Software, ISO and IEC, 2004.
(VIM93) ISO VIM, International Vocabulary of Basic and General Terms in Metrology, ISO, 1993.
CHAPTER 10
SOFTWARE ENGINEERING TOOLS AND METHODS
ACRONYM
CASE Computer-Aided Software Engineering
INTRODUCTION
Software development tools are the computer-based tools that are intended to assist the software life cycle processes. Tools allow
repetitive, well-defined actions to be automated, reducing the cognitive load on the software engineer who is then free to concentrate
on the creative aspects of the process. Tools are often designed to support particular software engineering methods, reducing any
administrative load associated with applying the method manually. Like software engineering methods, they are intended to make
software engineering more systematic, and they vary in scope from supporting individual tasks to encompassing the complete life
cycle.
Software engineering methods impose structure on the software engineering activity with the goal of making the activity systematic
and ultimately more likely to be successful. Methods usually provide a notation and vocabulary, procedures for performing
identifiable tasks, and guidelines for checking both the process and the product. They vary widely in scope, from a single life cycle
phase to the complete life cycle. The emphasis in this KA is on software engineering methods encompassing multiple life cycle
phases, since phase-specific methods are covered by other KAs.
While there are detailed manuals on specific tools and numerous research papers on innovative tools, generic technical writings on
software engineering tools are relatively scarce. One difficulty is the high rate of change in software tools in general. Specific details
alter regularly, making it difficult to provide concrete, up-to-date examples.
The Software Engineering Tools and Methods KA covers the complete life cycle processes, and is therefore related to every KA in
the Guide.
BREAKDOWN OF TOPICS FOR SOFTWARE ENGINEERING TOOLS AND METHODS
Figure 1 Breakdown of topics in the Software Engineering Tools and Methods KA
1. Software Engineering Tools
The first five topics in the Software Engineering Tools subarea correspond to the first five KAs of the Guide (Software Requirements,
Software Design, Software Construction, Software Testing, and Software Maintenance). The next four topics correspond to the
remaining KAs (Software Configuration Management, Software Engineering Management, Software Engineering Process, and
Software Quality). An additional topic is provided, Miscellaneous, addressing areas such as tool integration techniques which are
potentially applicable to all classes of tools.
1.1. Software Requirements Tools [Dor97, Dor02]
Tools for dealing with software requirements have been classified into two categories: modeling tools and traceability tools.
 Requirements modeling tools. These tools are used for eliciting, analyzing, specifying, and validating software requirements.
 Requirements traceability tools. [Dor02] These tools are becoming increasingly important as the complexity of software grows. Since they are also relevant in other life cycle processes, they are presented separately from the requirements modeling tools.
1.2. Software Design Tools [Dor02]
This topic covers tools for creating and checking software designs. There are a variety of such tools, with much of this variety being a
consequence of the diversity of software design notations and methods. In spite of this variety, no compelling divisions for this topic
have been found.
1.3. Software Construction Tools [Dor02, Rei96]
This topic covers software construction tools, which are used to produce and translate a program representation (for instance, source code) that is sufficiently detailed and explicit to enable machine execution.
 Program editors. These tools are used for the creation and modification of programs, and possibly of the documents associated with them. They can be general-purpose text or document editors, or they can be specialized for a target language.
 Compilers and code generators. Traditionally, compilers have been non-interactive translators of source code, but there has been a trend to integrate compilers and program editors to provide integrated programming environments. This topic also covers preprocessors, linker/loaders, and code generators.
 Interpreters. These tools provide software execution through emulation. They can support software construction activities by providing a more controllable and observable environment for program execution.
 Debuggers. These tools are considered a separate category since they support the software construction process but differ from program editors and compilers.
1.4. Software Testing Tools [Dor02, Pfl01, Rei96]
 Test generators. These tools assist in the development of test cases.
 Test execution frameworks. These tools enable the execution of test cases in a controlled environment where the behavior of the object under test is observed.
 Test evaluation tools. These tools support the assessment of the results of test execution, helping to determine whether or not the observed behavior conforms to the expected behavior (a minimal sketch follows this list).
 Test management tools. These tools provide support for all aspects of the software testing process.
 Performance analysis tools. [Rei96] These tools are used for measuring and analyzing software performance, which is a specialized form of testing where the goal is to assess performance behavior rather than functional behavior (correctness).
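As referenced in the test evaluation item above, the following Python sketch illustrates the core of what a test evaluation tool automates: comparing observed outputs against expected outputs. The function under test and the test cases are invented for illustration; real tools work against recorded execution results and formal expected-result specifications.

```python
# Minimal illustration of test evaluation: comparing observed outputs
# against expected outputs. The function and cases are hypothetical.

def add(x: int, y: int) -> int:
    return x + y

# (inputs, expected output) pairs.
test_cases = [((2, 3), 5), ((0, 0), 0), ((-1, 1), 0)]

failures = []
for args, expected in test_cases:
    observed = add(*args)
    if observed != expected:
        failures.append((args, expected, observed))

if failures:
    for args, expected, observed in failures:
        print(f"FAIL add{args}: expected {expected}, observed {observed}")
else:
    print(f"All {len(test_cases)} test cases conform to expected behavior.")
```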
1.5. Software Maintenance Tools [Dor02, Pfl01]
This topic encompasses tools which are particularly important in software maintenance where existing software is being modified.
Two categories are identified: comprehension tools and reengineering tools.
 Comprehension tools. [Rei96] These tools assist in the human comprehension of programs. Examples include visualization tools such as animators and program slicers.
 Reengineering tools. In the Software Maintenance KA, reengineering is defined as the examination and alteration of the subject software to reconstitute it in a new form, and includes the subsequent implementation of the new form. Reengineering tools support that activity. Reverse engineering tools assist the process by working backwards from an existing product to create artifacts such as specification and design descriptions, which can then be transformed to generate a new product from an old one.
1.6. Software Configuration Management Tools [Dor02, Rei96, Som05]
Tools for configuration management have been divided into three categories: tracking, version management, and release tools.
 Defect, enhancement, issue, and problem-tracking tools. These tools are used in connection with the problem-tracking issues associated with a particular software product.
 Version management tools. These tools are involved in the management of multiple versions of a product.
 Release and build tools. These tools are used to manage the tasks of software release and build. The category includes installation tools, which have become widely used for configuring the installation of software products.
Additional information is given in the Software Configuration Management KA, topic 1.3 Planning for SCM.
1.7. Software Engineering Management Tools [Dor02]
Software engineering management tools are subdivided into three categories: project planning and tracking, risk management, and measurement.
 Project planning and tracking tools. These tools are used in software project effort measurement and cost estimation, as well as in project scheduling.
 Risk management tools. These tools are used in identifying, estimating, and monitoring risks.
 Measurement tools. These tools assist in performing the activities related to the software measurement program.
1.8. Software Engineering Process Tools [Dor02, Som05]
Software engineering process tools are divided into modeling tools, management tools, and software development environments.
 Process modeling tools. [Pfl01] These tools are used to model and investigate software engineering processes.
 Process management tools. These tools provide support for software engineering management.
 Integrated CASE environments. [Rei96, Som05] (ECMA55-93, ECMA69-94, IEEE1209-92, IEEE1348-95, Mul96) Integrated computer-aided software engineering tools or environments covering multiple phases of the software engineering life cycle belong in this subtopic. Such tools perform multiple functions and hence potentially interact with the software life cycle process being executed.
 Process-centered software engineering environments. [Rei96] (Gar96) These environments explicitly incorporate information on the software life cycle processes and guide and monitor the user according to the defined process.
1.9. Software Quality Tools [Dor02]
Quality tools are divided into two categories: review and audit tools, and static analysis tools.
 Review and audit tools. These tools are used to support reviews and audits.
 Static analysis tools. [Cla96, Pfl01, Rei96] These tools are used to analyze software artifacts; examples include syntactic and semantic analyzers, as well as data, control flow, and dependency analyzers. Such tools are intended for checking software artifacts for conformance or for verifying desired properties.
1.10. Miscellaneous Tool Issues [Dor02]
This topic covers issues applicable to all classes of tools. Three categories have been identified: tool integration techniques, meta-tools, and tool evaluation.
 Tool integration techniques. [Pfl01, Rei96, Som05] (Bro94) Tool integration is important for making individual tools cooperate. This category potentially overlaps with the integrated CASE environments category, where integration techniques are applied; however, it was felt to be sufficiently distinct to merit a category of its own. Typical kinds of tool integration are platform, presentation, process, data, and control.
 Meta-tools. Meta-tools generate other tools; compiler-compilers are the classic example.
 Tool evaluation. [Pfl01] (IEEE1209-92, IEEE1348-95, Mos92, Val97) Because of the continuous evolution of software engineering tools, tool evaluation is an essential topic.
2. Software Engineering Methods
The Software Engineering Methods subarea is divided into three topics: heuristic methods dealing with informal approaches, formal
methods dealing with mathematically based approaches, and prototyping methods dealing with software engineering approaches
based on various forms of prototyping. These three topics are not disjoint; rather they represent distinct concerns. For example, an
object-oriented method may incorporate formal techniques and rely on prototyping for verification and validation. Like software
engineering tools, methodologies continuously evolve. Consequently, the KA description avoids as far as possible naming particular
methodologies.
2.1. Heuristic methods [Was96]
This topic contains four categories: structured, data-oriented, object-oriented, and domain-specific. The domain-specific category includes specialized methods for developing systems which involve real-time, safety, or security aspects.
 Structured methods. [Dor02, Pfl01, Pre04, Som05] The system is built from a functional viewpoint, starting with a high-level view and progressively refining this into a more detailed design.
 Data-oriented methods. [Dor02, Pre04] Here, the starting points are the data structures that a program manipulates rather than the function it performs.
 Object-oriented methods. [Dor02, Pfl01, Pre04, Som05] The system is viewed as a collection of objects rather than functions.
2.2. Formal Methods [Dor02, Pre04, Som05]
This subsection deals with mathematically based software engineering methods, and is subdivided according to the various aspects of formal methods.
 Specification languages and notations. [Cla96, Pfl01, Pre04] This topic concerns the specification notation or language used. Specification languages can be classified as model-oriented, property-oriented, or behavior-oriented.
 Refinement. [Pre04] This topic deals with how the method refines (or transforms) the specification into a form which is closer to the desired final form of an executable program.
 Verification/proving properties. [Cla96, Pfl01, Som05] This topic covers the verification of properties that are specific to the formal approach, including both theorem proving and model checking.
2.3. Prototyping Methods [Pre04, Som05, Was96]
This subsection covers methods involving software prototyping and is subdivided into prototyping styles, targets, and evaluation techniques.
 Prototyping styles. [Dor02, Pfl01, Pre04] (Pom96) This topic identifies the various approaches: throwaway, evolutionary, and executable specification.
 Prototyping targets. [Dor97] (Pom96) Examples of the targets of a prototyping method may be the requirements, the architectural design, or the user interface.
 Prototyping evaluation techniques. This topic covers the ways in which the results of a prototype exercise are used.
RECOMMENDED REFERENCES FOR SOFTWARE ENGINEERING TOOLS AND METHODS
[Cla96] E.M. Clarke et al., "Formal Methods: State of the Art and Future Directions," ACM Computing Surveys, vol. 28, iss. 4, 1996, pp. 626-643.
[Dor97] M. Christensen, M. Dorfman, and R.H. Thayer, eds., Software Engineering, IEEE Computer Society Press, 1997.
[Dor02] M. Christensen, M. Dorfman, and R.H. Thayer, eds., Software Engineering, Vol. 1 & Vol. 2, IEEE Computer Society Press, 2002.
[Pfl01] S.L. Pfleeger, Software Engineering: Theory and Practice, second ed., Prentice Hall, 2001.
[Pre04] R.S. Pressman, Software Engineering: A Practitioner's Approach, sixth ed., McGraw-Hill, 2004.
[Rei96] S.P. Reiss, "Software Tools and Environments," The Computer Science and Engineering Handbook, CRC Press, 1996.
[Som05] I. Sommerville, Software Engineering, seventh ed., Addison-Wesley, 2005.
[Was96] A.I. Wasserman, "Toward a Discipline of Software Engineering," IEEE Software, vol. 13, iss. 6, November 1996, pp. 23-31.
APPENDIX A. LIST OF FURTHER READINGS
(Ber93) E.V. Berard, Essays on Object-Oriented Software Engineering, Prentice Hall, 1993.
(Bis92) W. Bischofberger and G. Pomberger, Prototyping-Oriented Software Development: Concepts and Tools, Springer-Verlag, 1992.
(Bro94) A.W. Brown et al., Principles of CASE Tool Integration, Oxford University Press, 1994.
(Car95) D.J. Carney and A.W. Brown, "On the Necessary Conditions for the Composition of Integrated Software Engineering Environments," presented at Advances in Computers, 1995.
(Col94) D. Coleman et al., Object-Oriented Development: The Fusion Method, Prentice Hall, 1994.
(Cra95) D. Craigen, S. Gerhart, and T. Ralston, "Formal Methods Reality Check: Industrial Usage," IEEE Transactions on Software Engineering, vol. 21, iss. 2, February 1995, pp. 90-98.
(Fin00) A. Finkelstein, ed., The Future of Software Engineering, ACM, 2000.
(Gar96) P.K. Garg and M. Jazayeri, Process-Centered Software Engineering Environments, IEEE Computer Society Press, 1996.
(Har00) W. Harrison, H. Ossher, and P. Tarr, "Software Engineering Tools and Environments: A Roadmap," The Future of Software Engineering, A. Finkelstein, ed., ACM, 2000.
(Jar98) S. Jarzabek and R. Huang, "The Case for User-Centered CASE Tools," Communications of the ACM, vol. 41, iss. 8, August 1998, pp. 93-99.
(Kit95) B. Kitchenham, L. Pickard, and S.L. Pfleeger, "Case Studies for Method and Tool Evaluation," IEEE Software, vol. 12, iss. 4, July 1995, pp. 52-62.
(Lam00) A. van Lamsweerde, "Formal Specification: A Roadmap," The Future of Software Engineering, A. Finkelstein, ed., ACM, 2000, pp. 149-159.
(Mey97) B. Meyer, Object-Oriented Software Construction, second ed., Prentice Hall, 1997.
(Moo98) J.W. Moore, Software Engineering Standards: A User's Roadmap, IEEE Computer Society Press, 1998.
(Mos92) V. Mosley, "How to Assess Tools Efficiently and Quantitatively," IEEE Software, vol. 9, iss. 3, May 1992, pp. 29-32.
(Mul96) H.A. Müller, R.J. Norman, and J. Slonim, eds., "Computer Aided Software Engineering," special issue of Automated Software Engineering, vol. 3, iss. 3/4, Kluwer, 1996.
(Mul00) H. Müller et al., "Reverse Engineering: A Roadmap," The Future of Software Engineering, A. Finkelstein, ed., ACM, 2000, pp. 49-60.
(Pom96) G. Pomberger and G. Blaschek, Object-Orientation and Prototyping in Software Engineering, Prentice Hall, 1996.
(Pos96) R.M. Poston, Automating Specification-Based Software Testing, IEEE Press, 1996.
(Ric92) C. Rich and R.C. Waters, "Knowledge Intensive Software Engineering Tools," IEEE Transactions on Knowledge and Data Engineering, vol. 4, iss. 5, October 1992, pp. 424-430.
(Son92) X. Song and L.J. Osterweil, "Towards Objective, Systematic Design-Method Comparisons," IEEE Software, vol. 9, iss. 3, May 1992, pp. 43-53.
(Tuc96) A.B. Tucker, The Computer Science and Engineering Handbook, CRC Press, 1996.
(Val97) L.A. Valaer and R.C. Babb II, "Choosing a User Interface Development Tool," IEEE Software, vol. 14, iss. 4, 1997, pp. 29-39.
(Vin90) W.G. Vincenti, What Engineers Know and How They Know It: Analytical Studies from Aeronautical History, Johns Hopkins University Press, 1990.
(Wie98) R. Wieringa, "A Survey of Structured and Object-Oriented Software Specification Methods and Techniques," ACM Computing Surveys, vol. 30, iss. 4, 1998, pp. 459-527.
APPENDIX B. LIST OF STANDARDS
(ECMA55-93) ECMA, TR/55 Reference Model for Frameworks of Software Engineering Environments, third ed., 1993.
(ECMA69-94) ECMA, TR/69 Reference Model for Project Support Environments, 1994.
(IEEE1175.1-02) IEEE Std 1175.1-2002, IEEE Guide for CASE Tool Interconnections - Classification and Description, IEEE Press, 2002.
(IEEE1209-92) IEEE Std 1209-1992, Recommended Practice for the Evaluation and Selection of CASE Tools (ISO/IEC 14102, 1995), IEEE Press, 1992.
(IEEE1348-95) IEEE Std 1348-1995, Recommended Practice for the Adoption of CASE Tools (ISO/IEC 14471), IEEE Press, 1995.
(IEEE12207.0-96) IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes, IEEE Press, 1996.
CHAPTER 11
SOFTWARE QUALITY
ACRONYMS
CMMI: Capability Maturity Model Integration
COTS: Commercial Off-the-Shelf Software
PDCA: Plan, Do, Check, Act
SQA: Software Quality Assurance
SQM: Software Quality Management
TQM: Total Quality Management
V&V: Verification and Validation
INTRODUCTION
What is software quality, and why is it so important that it be pervasive in the SWEBOK Guide? Over the years, authors and organizations have defined the term "quality" differently. To Phil Crosby (Cro79), it was "conformance to user requirements." Watts Humphrey (Hum89) refers to it as "achieving excellent levels of fitness for use," while IBM coined the phrase "market-driven quality," which is based on achieving total customer satisfaction. The Baldrige criteria for organizational quality (NIST03) use a similar phrase, "customer-driven quality," and include customer satisfaction as a major consideration. More recently, quality has been defined in (ISO9001-00) as "the degree to which a set of inherent characteristics fulfills requirements."
This chapter deals with software quality considerations which transcend the life cycle processes. Software quality is a ubiquitous
concern in software engineering, and so it is also considered in many of the KAs. In summary, the SWEBOK Guide describes a
number of ways of achieving software quality. In particular, this KA will cover static techniques, those which do not require the
execution of the software being evaluated, while dynamic techniques are covered in the Software Testing KA.
BREAKDOWN OF SOFTWARE QUALITY TOPICS
Figure 1 Breakdown of topics for the Software Quality KA
1. Software Quality Fundamentals
Agreement on quality requirements, as well as clear communication to the software engineer on what constitutes quality, requires that the many aspects of quality be formally defined and discussed.
A software engineer should understand the underlying meanings of quality concepts and characteristics and their value to the software under development or in maintenance.
The important concept is that the software requirements define the required quality characteristics of the software and influence the
measurement methods and acceptance criteria for assessing these characteristics.
1.1. Software Engineering Culture and Ethics
Software engineers are expected to share a commitment to software quality as part of their culture. A healthy software engineering
culture is described in [Wie96].
Ethics can play a significant role in software quality, the culture, and the attitudes of software engineers. The IEEE Computer Society
and the ACM [IEEE99] have developed a code of ethics and professional practice based on eight principles to help software
engineers reinforce attitudes related to quality and to the independence of their work.
1.2. Value and Costs of Quality [Boe78; NIST03; Pre04; Wei93]
The notion of "quality" is not as simple as it may seem. For any engineered product, there are many desired qualities relevant to a particular perspective of the product, to be discussed and determined at the time the product requirements are set down. Quality characteristics may be required or not, or may be required to a greater or lesser degree, and trade-offs may be made among them. [Pfl01]
The cost of quality can be differentiated into prevention cost, appraisal cost, internal failure cost, and external failure cost. [Hou99]
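The following minimal Python sketch illustrates this four-way differentiation of the cost of quality; the categories come from the text above, while the dollar amounts are purely hypothetical.

```python
# Cost-of-quality categories from the text; all dollar figures invented.
cost_of_quality = {
    "prevention":       40_000,   # e.g., training, process definition
    "appraisal":        60_000,   # e.g., reviews, inspections, testing
    "internal_failure": 80_000,   # rework on defects found before release
    "external_failure": 120_000,  # defects found by customers after release
}

total = sum(cost_of_quality.values())
print(f"Total cost of quality: ${total:,}")
for category, cost in cost_of_quality.items():
    print(f"  {category:<16} ${cost:>9,} ({cost / total:.0%})")

# A common argument: shifting spend toward prevention and appraisal
# reduces the (usually larger) failure costs.
failure = cost_of_quality["internal_failure"] + cost_of_quality["external_failure"]
print(f"Failure costs are {failure / total:.0%} of the total.")
```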
A motivation behind a software project is the desire to create software that has value, and this value may or may not be quantified as a
cost. The customer will have some maximum cost in mind, in return for which it is expected that the basic purpose of the software
will be fulfilled. The customer may also have some expectation as to the quality of the software. Sometimes customers may not have
thought through the quality issues or their related costs. Is the characteristic merely decorative, or is it essential to the software? If the
answer lies somewhere in between, as is almost always the case, it is a matter of making the customer a part of the decision process
and fully aware of both costs and benefits. Ideally, most of these decisions will be made in the software requirements process (see the
Software Requirements KA), but these issues may arise throughout the software life cycle. There is no definite rule as to how these
decisions should be made, but the software engineer should be able to present quality alternatives and their costs. A discussion
concerning cost and the value of quality requirements can be found in [Jon96:c5; Wei96:c11].
1.3. Models and Quality Characteristics [Dac01; Kia95; Lap91; Lew92; Mus99; NIST; Pre01; Rak97; Sei02; Wal96]
Terminology for software quality characteristics differs from one taxonomy (or model of software quality) to another, each model
perhaps having a different number of hierarchical levels and a different total number of characteristics. Various authors have
produced models of software quality characteristics or attributes which can be useful for discussing, planning, and rating the quality
of software products. [Boe78; McC77] ISO/IEC has defined three related models of software product quality (internal quality,
external quality, and quality in use) (ISO9126-01) and a set of related parts (ISO14598-98).
1.3.1. Software engineering process quality
Software quality management and software engineering process quality have a direct bearing on the quality of the software product.
Models and criteria which evaluate the capabilities of software organizations are primarily project organization and management
considerations, and, as such, are covered in the Software Engineering Management and Software Engineering Process KAs.
Of course, it is not possible to completely distinguish the quality of the process from the quality of the product.
Process quality, discussed in the Software Engineering Process KA of this Guide, influences the quality characteristics of software
products, which in turn affect quality-in-use as perceived by the customer.
Two important quality standards are TickIT [Llo03] and ISO 9001:2000 (ISO9001-00), which has an impact on software quality and is accompanied by guidelines for its application to software [ISO90003-04].
Another industry standard on software quality is CMMI [SEI02], also discussed in the Software Engineering Process KA. CMMI is intended to provide guidance for improving processes. Specific process areas related to quality management are (a) process and product quality assurance, (b) verification, and (c) validation. CMMI classifies reviews and audits as methods of verification, and not as specific processes like (IEEE12207.0-96).
There was initially some debate over whether ISO9001 or CMMI should be used by software engineers to ensure quality. This debate
is widely published, and, as a result, the position has been taken that the two are complementary and that having ISO9001
certification can help greatly in achieving the higher maturity levels of the CMMI. [Dac01]
1.3.2. Software product quality
The software engineer needs, first of all, to determine the real purpose of the software. In this regard, it is of prime importance to keep
in mind that the customer�s requirements come first and that they include quality requirements, not just functional requirements.
Thus, the software engineer has a responsibility to elicit quality requirements which may not be explicit at the outset and to discuss
their importance as well as the level of difficulty in attaining them. All processes associated with software quality (for example,
building, checking, and improving quality) will be designed with these requirements in mind, and they carry additional costs.
Standard (ISO9126-01) defines, for two of its three models of quality, the related quality characteristics and sub-characteristics, and
measures which are useful for assessing software product quality. (Sur03)
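To make the idea of aggregating measures over sub-characteristics concrete, the short Python sketch below rolls hypothetical sub-characteristic scores up into characteristic-level scores using a simple weighted average; the weights, the scores, and the aggregation rule are illustrative assumptions and are not defined by ISO/IEC 9126.

```python
# Hypothetical 0-1 scores for sub-characteristics, grouped under two
# ISO/IEC 9126-style characteristics; weights and scores are invented.
quality_model = {
    "reliability": {
        "maturity":        (0.5, 0.80),  # (weight, measured score)
        "fault tolerance": (0.3, 0.70),
        "recoverability":  (0.2, 0.90),
    },
    "usability": {
        "understandability": (0.4, 0.75),
        "learnability":      (0.3, 0.85),
        "operability":       (0.3, 0.65),
    },
}

for characteristic, subs in quality_model.items():
    total_weight = sum(w for w, _ in subs.values())
    score = sum(w * s for w, s in subs.values()) / total_weight
    print(f"{characteristic}: {score:.2f}")
```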
The meaning of the term �product� is extended to include any artifact which is the output of any process used to build the final
software product. Examples of a product include, but are not limited to, an entire system requirements specification, a software
requirements specification for a software component of a system, a design module, code, test documentation, or reports produced as a
result of quality analysis tasks. While most treatments of quality are described in terms of the final software and system performance,
sound engineering practice requires that intermediate products relevant to quality be evaluated throughout the software engineering
process.
1.4. Quality Improvement [NIST03; Pre04; Wei96]
The quality of software products can be improved through an iterative process of continuous improvement which requires
management control, coordination, and feedback from many concurrent processes: (1) the software life cycle processes, (2) the
process of error/defect detection, removal, and prevention, and (3) the quality improvement process. (Kin92)
The theory and concepts behind quality improvement, such as building in quality through the prevention and early detection of errors,
continuous improvement, and customer focus, are pertinent to software engineering. These concepts are based on the work of experts
in quality who have stated that the quality of a product is directly linked to the quality of the process used to create it. (Cro79,
Dem86, Jur89)
Approaches such as the Total Quality Management (TQM) process of Plan, Do, Check, and Act (PDCA) are tools by which quality
objectives can be met. Management sponsorship supports process and product evaluations and the resulting findings. Then, an
improvement program is developed identifying detailed actions and improvement projects to be addressed in a feasible time frame.
Management support implies that each improvement project has enough resources to achieve the goal defined for it. Management
sponsorship must be solicited frequently by implementing proactive communication activities. The involvement of work groups, as
well as middle-management support and resources allocated at project level, are discussed in the Software Engineering Process KA.
2. Software Quality Management Processes
Software quality management (SQM) applies to all perspectives of software processes, products, and resources. It defines processes,
process owners, and requirements for those processes, measurements of the process and its outputs, and feedback channels. (Art93)
Software quality management processes consist of many activities. Some may find defects directly, while others indicate where further examination may be valuable; the former are referred to as direct-defect-finding activities. Many activities often serve as both.
Planning for software quality involves:
1. Defining the required product in terms of its quality characteristics (described in more detail in, for instance, the Software Engineering Management KA).
2. Planning the processes to achieve the required product (described in, for instance, the Software Design and the Software Construction KAs).
These aspects differ from, for instance, the planning of the SQM processes themselves, which assess planned quality characteristics versus the actual implementation of those plans. The software quality management processes must address how well software products will, or do, satisfy customer and stakeholder requirements, provide value to the customers and other stakeholders, and provide the software quality needed to meet software requirements.
SQM can be used to evaluate the intermediate products as well as the final product.
Some of the specific SQM processes are defined in standard (IEEE12207.0-96):
 Quality assurance process
 Verification process
 Validation process
 Review process
 Audit process
These processes encourage quality and also find possible problems. But they differ somewhat in their emphasis.
SQM processes help ensure better software quality in a given project. They also provide, as a by-product, general information to
management, including an indication of the quality of the entire software engineering process. The Software Engineering Process and
Software Engineering Management KAs discuss quality programs for the organization developing the software. SQM can provide
relevant feedback for these areas.
SQM processes consist of tasks and techniques to indicate how software plans (for example, management, development,
configuration management) are being implemented and how well the intermediate and final products are meeting their specified
requirements. Results from these tasks are assembled in reports for management before corrective action is taken. The management of
an SQM process is tasked with ensuring that the results of these reports are accurate.
As described in this KA, SQM processes are closely related; they can overlap and are sometimes even combined. They seem largely
reactive in nature because they address the processes as practiced and the products as produced; but they have a major role at the
planning stage in being proactive in terms of the processes and procedures needed to attain the quality characteristics and degree of
quality needed by the stakeholders in the software.
Risk management can also play an important role in delivering quality software. Incorporating disciplined risk analysis and
management techniques into the software life cycle processes can increase the potential for producing a quality product (Cha89).
Refer to the Software Engineering Management KA for related material on risk management.
2.1. Software Quality Assurance [Ack02; Ebe94; Fre98; Gra92; Hor03; Pfl01; Pre04; Rak97; Sch99; Som05; Voa99; Wal89;
Wal96]
SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. This means ensuring that the problem is clearly and adequately stated and that the solution's requirements are properly defined and expressed. SQA seeks to maintain quality throughout the development and maintenance of the product by executing a variety of activities at each stage, enabling early identification of the problems that are an almost inevitable feature of any complex activity. The role of SQA with respect to process is to ensure that planned processes are appropriate, that they are later implemented according to plan, and that relevant measurement processes are provided to the appropriate organization.
The SQA plan defines the means that will be used to ensure that software developed for a specific product satisfies the user's requirements and is of the highest quality possible within project constraints. In order to do so, it must first ensure that the quality target is clearly defined and understood. It must consider management, development, and maintenance plans for the software. Refer to the standard (IEEE730-02) for details.
The specific quality activities and tasks are laid out, with their costs and resource requirements, their overall management objectives,
and their schedule in relation to those objectives in the software engineering management, development, or maintenance plans. The
SQA plan should be consistent with the software configuration management plan (refer to the Software Configuration Management
KA). The SQA plan identifies documents, standards, practices, and conventions governing the project and how they will be checked
and monitored to ensure adequacy and compliance. The SQA plan also identifies measures, statistical techniques, procedures for
problem reporting and corrective action, resources such as tools, techniques, and methodologies, security for physical media, training,
and SQA reporting and documentation. Moreover, the SQA plan addresses the software quality assurance activities of any other type
of activity described in the software plans, such as procurement of supplier software to the project or commercial off-the-shelf
software (COTS) installation, and service after delivery of the software. It can also contain acceptance criteria as well as reporting
and management activities which are critical to software quality.
2.2. Verification & Validation [Fre98; Hor03; Pfl01; Pre04; Som05; Wal89; Wal96]
For purposes of brevity, Verification and Validation (V&V) are treated as a single topic in this Guide rather than as two separate topics, as in the standard (IEEE12207.0-96). "Software V&V is a disciplined approach to assessing software products throughout the product life cycle. A V&V effort strives to ensure that quality is built into the software and that the software satisfies user requirements" (IEEE1059-93).
V&V addresses software product quality directly and uses testing techniques which can locate defects so that they can be addressed.
It also assesses the intermediate products, however, and, in this capacity, the intermediate steps of the software life cycle processes.
The V&V process determines whether or not products of a given development or maintenance activity conform to the requirements of
that activity, and whether or not the final software product fulfills its intended purpose and meets user requirements. Verification is an
attempt to ensure that the product is built correctly, in the sense that the output products of an activity meet the specifications imposed
on them in previous activities. Validation is an attempt to ensure that the right product is built, that is, the product fulfills its specific
intended purpose. Both the verification process and the validation process begin early in the development or maintenance phase. They
provide an examination of key product features in relation both to the product's immediate predecessor and to the specifications it
must meet.
The purpose of planning V&V is to ensure that each resource, role, and responsibility is clearly assigned. The resulting V&V plan
documents and describes the various resources and their roles and activities, as well as the techniques and tools to be used. An
understanding of the different purposes of each V&V activity will help in the careful planning of the techniques and resources needed
to fulfill their purposes. Standards (IEEE1012-98:s7 and IEEE1059-93: Appendix A) specify what ordinarily goes into a V&V plan.
The plan also addresses the management, communication, policies, and procedures of the V&V activities and their interaction, as
well as defect reporting and documentation requirements.
2.3. Reviews and Audits
For purposes of brevity, reviews and audits are treated as a single topic in this Guide, rather than as two separate topics as in
(IEEE12207.0-96). The review and audit process is broadly defined in (IEEE12207.0-96) and in more detail in (IEEE1028-97). Five
types of reviews or audits are presented in the IEEE1028-97 standard:
• Management reviews
• Technical reviews
• Inspections
• Walk-throughs
• Audits
2.3.1. Management reviews
"The purpose of a management review is to monitor progress, determine the status of plans and schedules, confirm requirements and their system allocation, or evaluate the effectiveness of management approaches used to achieve fitness for purpose" (IEEE1028-97). They support decisions about changes and corrective actions that are required during a software project. Management reviews determine the adequacy of plans, schedules, and requirements and monitor their progress or inconsistencies. These reviews may be
performed on products such as audit reports, progress reports, V&V reports, and plans of many types, including risk management,
project management, software configuration management, software safety, and risk assessment, among others. Refer to the Software
Engineering Management and to the Software Configuration Management KAs for related material.
2.3.2. Technical reviews [Fre98; Hor03; Lew92; Pfl01; Pre04; Som05; Voa99; Wal89; Wal96]
"The purpose of a technical review is to evaluate a software product to determine its suitability for its intended use. The objective is to identify discrepancies from approved specifications and standards. The results should provide management with evidence confirming (or not) that the product meets the specifications and adheres to standards, and that changes are controlled" (IEEE1028-97).
Specific roles must be established in a technical review: a decision-maker, a review leader, a recorder, and technical staff to support
the review activities. A technical review requires that mandatory inputs be in place in order to proceed:
• Statement of objectives
• A specific software product
• The specific project management plan
• The issues list associated with this product
• The technical review procedure
The team follows the review procedure. A technically qualified individual presents an overview of the product, and the examination
is conducted during one or more meetings. The technical review is completed once all the activities listed in the examination have
been completed.
2.3.3. Inspections [Ack02; Fre98; Gil93; Rad02; Rak97]
"The purpose of an inspection is to detect and identify software product anomalies" (IEEE1028-97). Two important differentiators of inspections as opposed to reviews are as follows:
1. An individual holding a management position over any member of the inspection team shall not participate in the inspection.
2. An inspection is to be led by an impartial facilitator who is trained in inspection techniques.
Software inspections always involve the author of an intermediate or final product, while other reviews might not. Inspections also
include an inspection leader, a recorder, a reader, and a few (2 to 5) inspectors. The members of an inspection team may possess
different expertise, such as domain expertise, design method expertise, or language expertise. Inspections are usually conducted on
one relatively small section of the product at a time. Each team member must examine the software product and other review inputs
prior to the review meeting, perhaps by applying an analytical technique (refer to section 3.3.3) to a small section of the product, or to
the entire product with a focus only on one aspect, for example, interfaces. Any anomaly found is documented and sent to the
inspection leader. During the inspection, the inspection leader conducts the session and verifies that everyone has prepared for the
inspection. A checklist, with anomalies and questions germane to the issues of interest, is a common tool used in inspections. The
resulting list often classifies the anomalies (refer to IEEE1044-93 for details) and is reviewed for completeness and accuracy by the
team. The inspection exit decision must correspond to one of the following three criteria:
1. Accept with no or at most minor reworking
2. Accept with rework verification
3. Reinspect
Inspection meetings typically last a few hours, whereas technical reviews and audits are usually broader in scope and take longer.
2.3.4. Walk-throughs [Fre98; Hor03; Pfl01; Pre04; Som05; Wal89; Wal96]
"The purpose of a walk-through is to evaluate a software product. A walk-through may be conducted for the purpose of educating an audience regarding a software product" (IEEE1028-97). The major objectives are to (IEEE1028-97):
• Find anomalies
• Improve the software product
• Consider alternative implementations
• Evaluate conformance to standards and specifications
The walk-through is similar to an inspection but is typically conducted less formally. The walk-through is primarily organized by the software engineer to give teammates the opportunity to review the work, as an assurance technique.
2.3.5. Audits [Fre98; Hor03; Pfl01; Pre04; Som05; Voa99; Wal89; Wal96]
"The purpose of a software audit is to provide an independent evaluation of the conformance of software products and processes to applicable regulations, standards, guidelines, plans, and procedures" (IEEE1028-97). The audit is a formally organized activity, with
participants having specific roles, such as lead auditor, another auditor, a recorder, or an initiator, and includes a representative of the
audited organization. The audit will identify instances of nonconformance and produce a report requiring the team to take corrective
action.
While there may be many formal names for reviews and audits, such as those identified in the standard (IEEE1028-97), the important
point is that they can occur on almost any product at any stage of the development or maintenance process.
3. Practical Considerations
3.1. Software Quality Requirements [Hor03; Lew92; Rak97; Sch99; Wal89; Wal96]
3.1.1. Influence factors
Various factors influence planning, management, and selection of SQM activities and techniques, including:
• The domain of the system in which the software will reside (safety-critical, mission-critical, business-critical)
• System and software requirements
• The commercial (external) or standard (internal) components to be used in the system
• The specific software engineering standards applicable
• The methods and software tools to be used for development and maintenance and for quality evaluation and improvement
• The budget, staff, project organization, plans, and scheduling of all the processes
• The intended users and use of the system
• The integrity level of the system
Information on these factors influences how the SQM processes are organized and documented, how specific SQM activities are selected, what resources are needed, and what bounds are imposed on the effort.
3.1.2. Dependability
In cases where system failure may have extremely severe consequences, overall dependability (hardware, software, and human) is the
main quality requirement over and above basic functionality. Software dependability includes such characteristics as fault tolerance,
safety, security, and usability. Reliability is also a criterion which can be defined in terms of dependability (ISO9126).
There is a large body of literature on systems which must be highly dependable ("high-confidence" or "high-integrity" systems). Terminology for traditional mechanical and electrical systems which may not include software has been imported for discussing threats or hazards, risks, system integrity, and related concepts, and may be found in the references cited for this section.
3.1.3. Integrity levels of software
The integrity level is determined based on the possible consequences of failure of the software and the probability of failure. For
software in which safety or security is important, techniques such as hazard analysis for safety or threat analysis for security may be
used to develop a planning activity which would identify where potential trouble spots lie. The failure history of similar software may
also help in identifying which techniques will be most useful in detecting faults and assessing quality. Integrity levels (for example,
gradation of integrity) are proposed in (IEEE1012-98).
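To make the idea concrete, here is a minimal sketch, in Python, of assigning an integrity level from the severity of failure consequences and the likelihood of failure. The four-by-four mapping and the category names are illustrative assumptions, not the normative scheme of IEEE 1012-98; a real project would adopt the table from its governing standard.

    # Illustrative integrity-level assignment: rows are consequence
    # categories (mild to severe), columns are likelihood categories
    # (rare to frequent). Values are integrity levels, 1 = lowest
    # rigor, 4 = highest. The table itself is an assumed example.
    CONSEQUENCES = ["negligible", "marginal", "critical", "catastrophic"]
    LIKELIHOODS = ["improbable", "occasional", "probable", "frequent"]
    TABLE = [
        [1, 1, 1, 2],  # negligible
        [1, 2, 2, 3],  # marginal
        [2, 3, 3, 4],  # critical
        [3, 4, 4, 4],  # catastrophic
    ]

    def integrity_level(consequence: str, likelihood: str) -> int:
        return TABLE[CONSEQUENCES.index(consequence)][LIKELIHOODS.index(likelihood)]

    # A critical consequence that occurs only occasionally still
    # demands substantial verification and validation effort.
    assert integrity_level("critical", "occasional") == 3

Higher levels would then drive the selection of more rigorous SQM techniques, such as the formal methods discussed in section 3.3.3.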
3.2. Defect Characterization [Fri95; Hor03; Lew92; Rub94; Wak99; Wal89]
SQM processes find defects. Characterizing those defects leads to an understanding of the product, facilitates corrections to the
process or the product, and informs project management or the customer of the status of the process or product. Many defect (fault)
taxonomies exist, and, while attempts have been made to gain consensus on a fault and failure taxonomy, the literature indicates that there are quite a few in use ([Bei90, Chi96, Gra92], IEEE1044-93). Defect (anomaly) characterization is also used in audits and reviews, with the review leader often presenting a list of anomalies provided by team members for consideration at a review meeting.
As new design methods and languages evolve, along with advances in overall software technologies, new classes of defects appear, and a great deal of effort is required to interpret previously defined classes. When tracking defects, the software engineer is interested not only in the number of defects but also in their types. A raw count, without some classification, is of little use in identifying the underlying causes of the defects, since specific types of problems must be grouped together before determinations can be made about them. The point is to establish a defect taxonomy that is meaningful to the organization and to the software engineers.
SQM discovers information at all stages of software development and maintenance. Typically, where the word "defect" is used, it refers to a "fault" as defined below. However, different cultures and standards may use somewhat different meanings for these terms, which has led to attempts to define them precisely. Partial definitions taken from the standard (IEEE610.12-90) are:
• Error: "A difference... between a computed result and the correct result"
• Fault: "An incorrect step, process, or data definition in a computer program"
• Failure: "The [incorrect] result of a fault"
• Mistake: "A human action that produces an incorrect result"
Failures found in testing as a result of software faults are included as defects in the discussion in this section. Reliability models are
built from failure data collected during software testing or from software in service, and thus can be used to predict future failures and
to assist in decisions on when to stop testing. [Mus89]
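As a hedged illustration of this use of failure data, the sketch below applies one simple reliability growth model of the exponential (Goel-Okumoto) form, m(t) = a(1 - e^(-bt)), where a is the total number of failures expected and b is the detection rate. The parameter values are invented for illustration; in practice both would be fitted to the project's own failure log.

    import math

    # Expected cumulative failures by test time t under an assumed
    # exponential reliability growth model, m(t) = a * (1 - exp(-b*t)).
    def expected_failures(t: float, a: float, b: float) -> float:
        return a * (1.0 - math.exp(-b * t))

    a, b = 120.0, 0.015   # assumed fitted parameters (illustrative)
    t = 200.0             # test hours spent so far

    found_by_model = expected_failures(t, a, b)   # about 114
    remaining = a - found_by_model                # about 6
    print(f"expected by t={t:.0f}h: {found_by_model:.1f}, "
          f"estimated remaining: {remaining:.1f}")

    # If the estimated remaining failures fall below an agreed release
    # threshold, that is one input (among several) to the decision to
    # stop testing.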
One probable action resulting from SQM findings is to remove the defects from the product under examination. Other actions enable
the achievement of full value from the findings of SQM activities. These actions include analyzing and summarizing the findings, and
using measurement techniques to improve the product and the process as well as to track the defects and their removal. Process
improvement is primarily discussed in the Software Engineering Process KA, with the SQM process being a source of information.
Data on the inadequacies and defects found during the implementation of SQM techniques may be lost unless they are recorded. For
some techniques (for example, technical reviews, audits, inspections), recorders are present to set down such information, along with
issues and decisions. When automated tools are used, the tool output may provide the defect information. Data about defects may be
collected and recorded on an SCR (software change request) form and may subsequently be entered into some type of database, either
manually or automatically, from an analysis tool. Reports about defects are provided to the management of the organization.
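As a concrete illustration, the sketch below shows the kind of record an SCR form or defect database might capture so that defects can later be grouped by type. The classification values are illustrative assumptions; a real scheme would follow an agreed taxonomy such as that of IEEE 1044-93 or an organization-specific one.

    from collections import Counter
    from dataclasses import dataclass
    from enum import Enum

    class DefectType(Enum):         # illustrative taxonomy, not IEEE 1044's
        LOGIC = "logic"
        INTERFACE = "interface"
        DATA = "data"
        DOCUMENTATION = "documentation"

    class Severity(Enum):
        MINOR = 1
        MAJOR = 2
        CRITICAL = 3

    @dataclass
    class DefectReport:
        identifier: str             # e.g. the SCR number
        work_product: str           # where the defect was found
        activity: str               # e.g. "inspection", "system test"
        defect_type: DefectType
        severity: Severity
        description: str

    reports = [
        DefectReport("SCR-101", "parser module", "inspection",
                     DefectType.LOGIC, Severity.MAJOR,
                     "off-by-one error in loop bound"),
    ]

    # Grouping by type is what turns raw counts into data that can
    # point at underlying causes.
    by_type = Counter(r.defect_type for r in reports)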
3.3. Software Quality Management Techniques [Bas84; Bei90; Con86; Chi96; Fen97; Fri95; Lev95; Mus89; Pen93; Sch99; Wak99; Wei93; Zel98]
SQM techniques can be categorized in many ways; a straightforward grouping distinguishes static, people-intensive, analytical, and dynamic techniques.
3.3.1. Static techniques
Static techniques involve examination of the project documentation and software, and other information about the software products,
without executing them. These techniques may include people-intensive activities (as defined in 3.3.2) or analytical activities (as
defined in 3.3.3) conducted by individuals, with or without the assistance of automated tools.
3.3.2. People-intensive techniques
The setting for people-intensive techniques, including reviews and audits, may vary from a formal meeting to an informal gathering or a desk-check situation, but usually two or more people are involved. Preparation ahead of time may be necessary. Resources other than the items under examination may include checklists and results from analytical techniques and testing. These activities are discussed in (IEEE1028-97) on reviews and audits, and in [Fre98, Hor03, Jon96, Rak97].
3.3.3. Analytical techniques
A software engineer generally applies analytical techniques. Sometimes several software engineers use the same technique, but each
applies it to different parts of the product. Some techniques are tool-driven; others are manual. Some may find defects directly, but
they are typically used to support other techniques. Some also include various assessments as part of overall quality analysis.
Examples of such techniques include complexity analysis, control flow analysis, and algorithmic analysis.
Each type of analysis has a specific purpose, and not all types are applied to every project. An example of a support technique is
complexity analysis, which is useful for determining whether or not the design or implementation is too complex to develop correctly,
to test, or to maintain. The results of a complexity analysis may also be used in developing test cases. Defect-finding techniques, such
as control flow analysis, may also be used to support another activity. For software with many algorithms, algorithmic analysis is
important, especially when an incorrect algorithm could cause a catastrophic result. There are too many analytical techniques to list
them all here. The list and references provided may offer insights into the selection of a technique, as well as suggestions for further
reading.
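As one concrete example of the complexity analysis mentioned above, the sketch below computes McCabe's cyclomatic complexity from the size of a routine's control-flow graph, V(G) = E - N + 2P. The threshold above which a routine is judged too complex is a local decision; around 10 is a commonly cited rule of thumb, not a rule from this Guide.

    # McCabe's cyclomatic complexity from a control-flow graph:
    # V(G) = E - N + 2P, where E = edges, N = nodes, and P = number of
    # connected components (1 for a single routine).
    def cyclomatic_complexity(edges: int, nodes: int, components: int = 1) -> int:
        return edges - nodes + 2 * components

    # Example: a routine whose control-flow graph has 9 edges and 7
    # nodes has complexity 4, i.e. four linearly independent paths to
    # cover in testing.
    assert cyclomatic_complexity(edges=9, nodes=7) == 4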
Other, more formal types of analytical techniques are known as formal methods. They are used to verify software requirements and designs, and proof of correctness is applied to critical parts of software. Formal methods have mostly been used in the verification of crucial parts of critical systems, such as specific security and safety requirements (Nas97).
3.3.4. Dynamic techniques
Different kinds of dynamic techniques are performed throughout the development and maintenance of software. Generally, these are
testing techniques, but techniques such as simulation, model checking, and symbolic execution may be considered dynamic. Code
reading is considered a static technique, but experienced software engineers may execute the code as they read through it. In this
sense, code reading may also qualify as a dynamic technique. This discrepancy in categorizing indicates that people with different
roles in the organization may consider and apply these techniques differently.
Some testing may thus be performed in the development process, SQA process, or V&V process, again depending on project
organization. Because SQM plans address testing, this section includes some comments on testing. The Software Testing KA
provides discussion and technical references to theory, techniques for testing, and automation.
3.3.5. Testing
The assurance processes described in SQA and V&V examine every output relative to the software requirement specification to ensure the output's traceability, consistency, completeness, correctness, and performance. This confirmation also covers the outputs of the development and maintenance processes, and includes collecting, analyzing, and measuring the results. SQA ensures that appropriate types of tests are planned, developed, and implemented, and V&V develops test plans, strategies, cases, and procedures.
Testing is discussed in detail in the Software Testing KA. Two types of testing may fall under the headings SQA and V&V, because
of their responsibility for the quality of the materials used in the project:
• Evaluation and test of tools to be used on the project (IEEE1462-98)
• Conformance test (or review of conformance test) of components and COTS products to be used in the product; there now exists a standard for software packages (IEEE1465-98)
Sometimes an independent V&V organization may be asked to monitor the test process and sometimes to witness the actual
execution to ensure that it is conducted in accordance with specified procedures. Again, V&V may be called upon to evaluate the
testing itself: adequacy of plans and procedures, and adequacy and accuracy of results.
Another type of testing that may fall under the heading of the V&V organization is third-party testing. The third party is not the developer, nor is it in any way associated with the development of the product. Instead, the third party is an independent facility, usually accredited by some body of authority. Its purpose is to test a product for conformance to a specific set of requirements.
3.4. Software Quality Measurement [Gra92]
The models of software product quality often include measures to determine the degree of each quality characteristic attained by the
product.
If they are selected properly, measures can support software quality (among other aspects of the software life cycle processes) in
multiple ways. They can help in the management decision-making process. They can find problematic areas and bottlenecks in the
software process; and they can help the software engineers assess the quality of their work for SQA purposes and for longer-term
process quality improvement.
With the increasing sophistication of software, questions of quality go beyond whether or not the software works to how well it
achieves measurable quality goals.
Measurement also supports SQM directly in a few other areas; for example, it assists in deciding when to stop testing. For this, reliability models and benchmarks, both using fault and failure data, are useful.
The cost of SQM processes is an issue which is almost always raised in deciding how a project should be organized. Often, generic
models of cost are used, which are based on when a defect is found and how much effort it takes to fix the defect relative to finding
the defect earlier in the development process. Project data may give a better picture of cost. Discussion on this topic can be found in
[Rak97: pp. 39-50]. Related information can be found in the Software Engineering Process and Software Engineering Management
KAs.
Finally, the SQM reports themselves provide valuable information not only on these processes, but also on how all the software life
cycle processes can be improved. Discussions on these topics are found in [McC04] and (IEEE1012-98).
While the measures for quality characteristics and product features may be useful in themselves (for example, the number of defective requirements or the proportion of defective requirements), mathematical and graphical techniques can be applied to aid in the interpretation of the measures. These fit into the following categories and are discussed in [Fen97, Jon96, Kan02, Lyu96, Mus99]; a small sketch of the first category follows the list.
• Statistically based (for example, Pareto analysis, run charts, scatter plots, normal distribution)
• Statistical tests (for example, the binomial test, chi-squared test)
• Trend analysis
• Prediction (for example, reliability models)
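As a minimal sketch of the first category, assuming invented defect counts: Pareto analysis ranks defect categories by count and identifies the few categories that account for most of the defects.

    # Pareto analysis over defect categories: rank categories by count
    # and report the cumulative share each one contributes. The counts
    # below are invented for illustration.
    defect_counts = {
        "logic": 87, "interface": 42, "build/config": 27,
        "data": 15, "documentation": 9,
    }

    total = sum(defect_counts.values())
    cumulative = 0
    for category, count in sorted(defect_counts.items(),
                                  key=lambda kv: kv[1], reverse=True):
        cumulative += count
        print(f"{category:15s} {count:4d}  cumulative {100 * cumulative / total:5.1f}%")

    # The categories reached before the cumulative share crosses ~80%
    # (here, "logic" and "interface") are natural first targets for
    # corrective action.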
The statistically based techniques and tests often provide a snapshot of the more troublesome areas of the software product under examination. The resulting charts and graphs are visualization aids which the decision-makers can use to focus resources where they appear most needed. Results from trend analysis may indicate that a schedule has not been respected, such as in testing, or that certain classes of faults will become more intense unless some corrective action is taken in development. The predictive techniques assist in
planning test time and in predicting failure. More discussion on measurement in general appears in the Software Engineering Process
and Software Engineering Management KAs. More specific information on testing measurement is presented in the Software Testing
KA.
References [Fen97, Jon96, Kan02, Pfl01] provide discussion on defect analysis, which consists of measuring defect occurrences and then applying statistical methods to understand which types of defects occur most frequently and to assess their density. Defect analysis also aids in understanding the trends, how well detection techniques are working, and how well the development and maintenance processes are progressing. Measurement of test coverage helps to estimate how much test effort
remains to be done, and to predict possible remaining defects. From these measurement methods, defect profiles can be developed for
a specific application domain. Then, for the next software system within that organization, the profiles can be used to guide the SQM
processes, that is, to expend the effort where the problems are most likely to occur. Similarly, benchmarks, or defect counts typical of
that domain, may serve as one aid in determining when the product is ready for delivery.
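As a small worked example of using such a benchmark, the sketch below compares a product's observed defect density against an assumed domain-typical value; all numbers are invented for illustration.

    # Defect density against a domain benchmark (all values assumed).
    defects_found = 96
    size_kloc = 48.0
    density = defects_found / size_kloc       # 2.0 defects per KLOC

    domain_benchmark = 2.5                    # assumed typical for the domain
    print(f"observed density: {density:.2f} defects/KLOC "
          f"(benchmark: {domain_benchmark:.2f})")
    # A density well below the benchmark may mean a cleaner product, or
    # it may mean detection effort was insufficient; the comparison is
    # one aid among several in judging readiness for delivery.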
Discussion on using data from SQM to improve development and maintenance processes appears in the Software Engineering
Management and the Software Engineering Process KAs.
RECOMMENDED REFERENCES FOR SOFTWARE QUALITY
• [Ack02] F.A. Ackerman, "Software Inspections and the Cost Effective Production of Reliable Software," Software Engineering, Volume 2: The Supporting Processes, Richard H. Thayer and Mark Christensen, eds., Wiley-IEEE Computer Society Press, 2002.
• [Bas84] V.R. Basili and D.M. Weiss, "A Methodology for Collecting Valid Software Engineering Data," IEEE Transactions on Software Engineering, vol. SE-10, iss. 6, November 1984, pp. 728-738.
• [Bei90] B. Beizer, Software Testing Techniques, International Thomson Press, 1990.
• [Boe78] B.W. Boehm et al., "Characteristics of Software Quality," TRW Series on Software Technologies, vol. 1, 1978.
• [Chi96] R. Chillarege, "Orthogonal Defect Classification," Handbook of Software Reliability Engineering, M. Lyu, ed., IEEE Computer Society Press, 1996.
• [Con86] S.D. Conte, H.E. Dunsmore, and V.Y. Shen, Software Engineering Metrics and Models, Benjamin/Cummings, 1986.
• [Dac01] G. Dache, "IT Companies will gain competitive advantage by integrating CMM with ISO9001," Quality System Update, vol. 11, iss. 11, November 2001.
• [Ebe94] R.G. Ebenau and S. Strauss, Software Inspection Process, McGraw-Hill, 1994.
• [Fen97] N.E. Fenton and S.L. Pfleeger, Software Metrics: A Rigorous & Practical Approach, second ed., International Thomson Computer Press, 1997.
• [Fre98] D.P. Freedman and G.M. Weinberg, Handbook of Walkthroughs, Inspections, and Technical Reviews, Little, Brown and Company, 1998.
• [Fri95] M.A. Friedman and J.M. Voas, Software Assessment: Reliability, Safety, Testability, John Wiley & Sons, 1995.
• [Gil93] T. Gilb and D. Graham, Software Inspections, Addison-Wesley, 1993.
• [Gra92] R.B. Grady, Practical Software Metrics for Project Management and Process Improvement, Prentice Hall, 1992.
• [Hor03] J.W. Horch, Practical Guide to Software Quality Management, Artech House, 2003.
• [Hou99] D. Houston, "Software Quality Professional," ASQC, vol. 1, iss. 2, 1999.
• [IEEE-CS-99] IEEE-CS-1999, "Software Engineering Code of Ethics and Professional Practice," IEEE-CS/ACM, 1999, available at http://www.computer.org/certification/ethics.htm.
• [ISO9001-00] ISO 9001:2000, Quality Management Systems – Requirements, ISO, 2000.
• [ISO90003-04] ISO/IEC 90003:2004, Software and Systems Engineering – Guidelines for the Application of ISO9001:2000 to Computer Software, ISO and IEC, 2004.
• [Jon96] C. Jones, Applied Software Measurement: Assuring Productivity and Quality, second ed., McGraw-Hill, 1996.
• [Kan02] S.H. Kan, Metrics and Models in Software Quality Engineering, second ed., Addison-Wesley, 2002.
• [Kia95] D. Kiang, "Harmonization of International Software Standards on Integrity and Dependability," Proc. IEEE International Software Engineering Standards Symposium, IEEE Computer Society Press, 1995.
• [Lap91] J.C. Laprie, Dependability: Basic Concepts and Terminology in English, French, German, Italian and Japanese, IFIP WG 10.4, Springer-Verlag, 1991.
• [Lev95] N.G. Leveson, Safeware: System Safety and Computers, Addison-Wesley, 1995.
• [Lew92] R.O. Lewis, Independent Verification and Validation: A Life Cycle Engineering Process for Quality Software, John Wiley & Sons, 1992.
• [Llo03] Lloyd's Register, "TickIT Guide," iss. 5, 2003, available at http://www.tickit.org.
• [Lyu96] M.R. Lyu, Handbook of Software Reliability Engineering, McGraw-Hill/IEEE, 1996.
• [Mcc77] J.A. McCall, "Factors in Software Quality," General Electric, n77C1502, June 1977.
• [McC04] S. McConnell, Code Complete: A Practical Handbook of Software Construction, second ed., Microsoft Press, 2004.
• [Mus89] J.D. Musa and A.F. Ackerman, "Quantifying Software Validation: When to Stop Testing?" IEEE Software, vol. 6, iss. 3, May 1989, pp. 19-27.
• [Mus99] J. Musa, Software Reliability Engineering: More Reliable Software, Faster Development and Testing, McGraw-Hill, 1999.
• [NIST03] National Institute of Standards and Technology, "Baldrige National Quality Program," available at http://www.quality.nist.gov.
• [Pen93] W.W. Peng and D.R. Wallace, "Software Error Analysis," National Institute of Standards and Technology, Gaithersburg, NIST SP 500-209, December 1993, available at http://hissa.nist.gov/SWERROR/.
• [Pfl01] S.L. Pfleeger, Software Engineering: Theory and Practice, second ed., Prentice Hall, 2001.
• [Pre04] R.S. Pressman, Software Engineering: A Practitioner's Approach, sixth ed., McGraw-Hill, 2004.
• [Rad02] R. Radice, High Quality Low Cost Software Inspections, Paradoxicon, 2002.
• [Rak97] S.R. Rakitin, Software Verification and Validation: A Practitioner's Guide, Artech House, 1997.
• [Rub94] J. Rubin, Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests, John Wiley & Sons, 1994.
• [Sch99] G.C. Schulmeyer and J.I. McManus, Handbook of Software Quality Assurance, third ed., Prentice Hall, 1999.
• [SEI02] "Capability Maturity Model Integration for Software Engineering (CMMI)," CMU/SEI-2002-TR-028, ESC-TR-2002-028, Software Engineering Institute, Carnegie Mellon University, 2002.
• [Som05] I. Sommerville, Software Engineering, seventh ed., Addison-Wesley, 2005.
• [Voa99] J. Voas, "Certifying Software for High-Assurance Environments," IEEE Software, vol. 16, iss. 4, July-August 1999, pp. 48-54.
• [Wak99] S. Wakid, D.R. Kuhn, and D.R. Wallace, "Toward Credible IT Testing and Certification," IEEE Software, July/August 1999, pp. 39-47.
• [Wal89] D.R. Wallace and R.U. Fujii, "Software Verification and Validation: An Overview," IEEE Software, vol. 6, iss. 3, May 1989, pp. 10-17.
• [Wal96] D.R. Wallace, L. Ippolito, and B. Cuthill, "Reference Information for the Software Verification and Validation Process," NIST SP 500-234, NIST, April 1996, available at http://hissa.nist.gov/VV234/.
• [Wei93] G.M. Weinberg, "Measuring Cost and Value," Quality Software Management: First-Order Measurement, vol. 2, chap. 8, Dorset House, 1993.
• [Wie96] K. Wiegers, Creating a Software Engineering Culture, Dorset House, 1996.
• [Zel98] M.V. Zelkowitz and D.R. Wallace, "Experimental Models for Validating Technology," Computer, vol. 31, iss. 5, 1998, pp. 23-31.
APPENDIX A. LIST OF FURTHER READINGS
• (Abr96) A. Abran and P.N. Robillard, "Function Points Analysis: An Empirical Study of Its Measurement Processes," IEEE Transactions on Software Engineering, 1996.
• (Art93) L.J. Arthur, Improving Software Quality: An Insider's Guide to TQM, John Wiley & Sons, 1993.
• (Bev97) N. Bevan, "Quality and Usability: A New Framework," Achieving Software Product Quality, E. van Veenendaal and J. McMullan, eds., Uitgeverij Tutein Nolthenius, 1997.
• (Cha89) R.N. Charette, Software Engineering Risk Analysis and Management, McGraw-Hill, 1989.
• (Cro79) P.B. Crosby, Quality Is Free, McGraw-Hill, 1979.
• (Dem86) W.E. Deming, Out of the Crisis, MIT Press, 1986.
• (Dod00) Department of Defense and US Army, "Practical Software and Systems Measurement: A Foundation for Objective Project Management, Version 4.0b," October 2000, available at http://www.psmsc.com.
• (Hum89) W. Humphrey, Managing the Software Process, chap. 8, 10, 16, Addison-Wesley, 1989.
• (Hya96) L.E. Hyatt and L. Rosenberg, "A Software Quality Model and Metrics for Identifying Project Risks and Assessing Software Quality," presented at the 8th Annual Software Technology Conference, 1996.
• (Inc94) D. Ince, ISO 9001 and Software Quality Assurance, McGraw-Hill, 1994.
• (Jur89) J.M. Juran, Juran on Leadership for Quality, The Free Press, 1989.
• (Kin92) M.R. Kindl, "Software Quality and Testing: What DoD Can Learn from Commercial Practices," U.S. Army Institute for Research in Management Information, Communications and Computer Sciences, Georgia Institute of Technology, August 1992.
• (Nas97) NASA, "Formal Methods Specification and Analysis Guidebook for the Verification of Software and Computer Systems, Volume II: A Practitioner's Companion," 1997, available at http://eis.jpl.nasa.gov/quality/Formal_Methods/.
• (Pal97) J.D. Palmer, "Traceability," Software Engineering, M. Dorfman and R. Thayer, eds., 1997, pp. 266-276.
• (Ros98) L. Rosenberg, "Applying and Interpreting Object-Oriented Metrics," presented at the Software Technology Conference, 1998, available at http://satc.gsfc.nasa.gov/support/index.html.
• (Sur03) W. Suryn, A. Abran, and A. April, "ISO/IEC SQuaRE: The Second Generation of Standards for Software Product Quality," presented at IASTED 2003.
• (Vin90) W.G. Vincenti, What Engineers Know and How They Know It – Analytical Studies from Aeronautical History, Johns Hopkins University Press, 1990.
APPENDIX B. LIST OF STANDARDS
• (FIPS140.1-94) FIPS 140-1, Security Requirements for Cryptographic Modules, 1994.
• (IEC61508-98) IEC 61508:1998, Functional Safety – Safety-Related Systems, Parts 1, 2, 3, IEC, 1998.
• (IEEE610.12-90) IEEE Std 610.12-1990 (R2002), IEEE Standard Glossary of Software Engineering Terminology, IEEE, 1990.
• (IEEE730-02) IEEE Std 730-2002, IEEE Standard for Software Quality Assurance Plans, IEEE, 2002.
• (IEEE982.1-88) IEEE Std 982.1-1988, IEEE Standard Dictionary of Measures to Produce Reliable Software, IEEE, 1988.
• (IEEE1008-87) IEEE Std 1008-1987 (R2003), IEEE Standard for Software Unit Testing, IEEE, 1987.
• (IEEE1012-98) IEEE Std 1012-1998, IEEE Standard for Software Verification and Validation, IEEE, 1998.
• (IEEE1028-97) IEEE Std 1028-1997 (R2002), IEEE Standard for Software Reviews, IEEE, 1997.
• (IEEE1044-93) IEEE Std 1044-1993 (R2002), IEEE Standard for the Classification of Software Anomalies, IEEE, 1993.
• (IEEE1059-93) IEEE Std 1059-1993, IEEE Guide for Software Verification and Validation Plans, IEEE, 1993.
• (IEEE1061-98) IEEE Std 1061-1998, IEEE Standard for a Software Quality Metrics Methodology, IEEE, 1998.
• (IEEE1228-94) IEEE Std 1228-1994, IEEE Standard for Software Safety Plans, IEEE, 1994.
• (IEEE1462-98) IEEE Std 1462-1998//ISO/IEC 14102, Information Technology – Guideline for the Evaluation and Selection of CASE Tools, IEEE, 1998.
• (IEEE1465-98) IEEE Std 1465-1998//ISO/IEC 12119:1994, IEEE Standard Adoption of International Standard ISO/IEC 12119:1994(E), Information Technology – Software Packages – Quality Requirements and Testing, IEEE, 1998.
• (IEEE12207.0-96) IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of International Standard ISO/IEC 12207:95, Standard for Information Technology – Software Life Cycle Processes, IEEE, 1996.
• (ISO9001-00) ISO 9001:2000, Quality Management Systems – Requirements, ISO, 2000.
• (ISO9126-01) ISO/IEC 9126-1:2001, Software Engineering – Product Quality, Part 1: Quality Model, ISO and IEC, 2001.
• (ISO14598-98) ISO/IEC 14598:1998, Software Product Evaluation, ISO and IEC, 1998.
• (ISO15026-98) ISO/IEC 15026:1998, Information Technology – System and Software Integrity Levels, ISO and IEC, 1998.
• (ISO15504-98) ISO/IEC TR 15504:1998, Information Technology – Software Process Assessment (parts 1-9), ISO and IEC, 1998.
• (ISO15939-00) ISO/IEC 15939:2000, Information Technology – Software Measurement Process, ISO and IEC, 2000.
• (ISO90003-04) ISO/IEC 90003:2004, Software and Systems Engineering – Guidelines for the Application of ISO9001:2000 to Computer Software, ISO and IEC, 2004.
CHAPTER 12
RELATED DISCIPLINES OF SOFTWARE
ENGINEERING
INTRODUCTION
In order to circumscribe software engineering, it is necessary to identify the disciplines with which software engineering shares a
common boundary. This chapter identifies, in alphabetical order, these Related Disciplines. Of course, the Related Disciplines also
share many common boundaries between themselves.
Using a consensus-based recognized source, this chapter identifies for each Related Discipline:
• An informative definition (when feasible)
• A list of knowledge areas
LIST OF RELATED DISCIPLINES AND THEIR KNOWLEDGE AREAS
Figure 1 gives a graphical representation of these Related Disciplines.
Figure 1 Related Disciplines of Software Engineering
Computer Engineering
The draft report of the volume on computer engineering of the Computing Curricula 2001 project (CC2001)¹ states that "computer engineering embodies the science and technology of design, construction, implementation and maintenance of software and hardware components of modern computing systems and computer-controlled equipment."
This report identifies the following Knowledge Areas (known as areas in the report) for computer engineering:
• Algorithms and Complexity
• Computer Architecture and Organization
• Computer Systems Engineering
• Circuits and Systems
• Digital Logic
• Discrete Structures
• Digital Signal Processing
• Distributed Systems
• Electronics
• Embedded Systems
• Human-Computer Interaction
• Information Management
• Intelligent Systems
• Computer Networks
• Operating Systems
• Programming Fundamentals
• Probability and Statistics
• Social and Professional Issues
• Software Engineering
• Test and Verification
• VLSI/ASIC Design
Computer Science
The final report of the volume on computer science of the Computing Curricula 2001 project (CC2001)² identifies the following list of knowledge areas (identified as areas in the report) for computer science:
• Discrete Structures
• Programming Fundamentals
• Algorithms and Complexity
• Architecture and Organization
• Operating Systems
• Net-Centric Computing
• Programming Languages
• Human-Computer Interaction
• Graphics and Visual Computing
• Intelligent Systems
• Information Management
• Social and Professional Issues
• Software Engineering
• Computational Science and Numerical Methods
Management
The European MBA Guidelines defined by the European association of national accreditation bodies (EQUAL)³ state that the Master of Business Administration degree should include coverage of and instruction in
• Accounting
• Finance
• Marketing and Sales
• Operations Management
• Information Systems Management
• Law
• Human Resource Management
• Economics
• Quantitative Analysis
• Business Policy and Strategy
Mathematics
Two sources are selected to identify the list of knowledge areas for mathematics. The report titled "Accreditation Criteria and Procedures"⁴ of the Canadian Engineering Accreditation Board identifies that appropriate elements of the following areas should be present in an undergraduate engineering curriculum:
• Linear Algebra
• Differential and Integral Calculus
• Differential Equations
• Probability
• Statistics
• Numerical Analysis
• Discrete Mathematics
A more focused list of mathematical topics (called units and topics in the report) that underpin software engineering can be found in the draft report of the volume on software engineering of the Computing Curricula 2001 project (CC2001).⁵
Project Management
Project management is defined in the 2000 Edition of A Guide to the Project Management Body of Knowledge (PMBOK® Guide⁶), published by the Project Management Institute and adopted as IEEE Std 1490-2003, as "the application of knowledge, skills, tools, and techniques to project activities to meet project requirements."
The Knowledge Areas identified in the PMBOK Guide for project management are
• Project Integration Management
• Project Scope Management
• Project Time Management
• Project Cost Management
• Project Quality Management
• Project Human Resource Management
• Project Communications Management
• Project Risk Management
• Project Procurement Management
Quality Management
Quality management is defined in ISO 9000-2000 as "coordinated activities to direct and control an organization with regard to quality." The three selected references on quality management are
• ISO 9000:2000, Quality management systems – Fundamentals and vocabulary
• ISO 9001:2000, Quality management systems – Requirements
• ISO 9004:2000, Quality management systems – Guidelines for performance improvements
The American Society for Quality identifies the following Knowledge Areas (first-level breakdown topics in their outline) in their Body of Knowledge for certification as a Quality Engineer:⁷
• Management and Leadership in Quality Engineering
• Quality Systems Development, Implementation, and Verification
• Planning, Controlling, and Assuring Product and Process Quality
• Reliability and Risk Management
• Problem Solving and Quality Improvement
• Quantitative Methods
Software Ergonomics
The field of ergonomics is defined by ISO Technical Committee 159 on Ergonomics as follows: "Ergonomics (or human factors) is the scientific discipline concerned with the understanding of the interactions among human and other elements of a system, and the profession that applies theory, principles, data and methods to design in order to optimize human well-being and overall system performance."⁸
A list of Knowledge Areas for ergonomics as it applies to software is proposed below:⁹
• Cognition
• Cognitive AI I: Reasoning
• Machine Learning and Grammar Induction
• Formal Methods in Cognitive Science: Language
• Formal Methods in Cognitive Science: Reasoning
• Formal Methods in Cognitive Science: Cognitive Architecture
• Cognitive AI II: Learning
• Foundations of Cognitive Science
• Information Extraction from Speech and Text
• Lexical Processing
• Computational Language Acquisition
• The Nature of HCI
• (Meta-)Models of HCI
• Use and Context of Computers
• Human Social Organization and Work
• Application Areas
• Human-Machine Fit and Adaptation
• Human Characteristics
• Human Information Processing
• Language, Communication, Interaction
• Ergonomics
• Computer System and Interface Architecture
• Input and Output Devices
• Dialogue Techniques
• Dialogue Genre
• Computer Graphics
• Dialogue Architecture
• Development Process
• Design Approaches
• Implementation Techniques
• Evaluation Techniques
• Example Systems and Case Studies
A more focused list of topics on human-computer interface design (called units and topics in the report) for software engineering curriculum purposes can be found in the draft report of the volume on software engineering of the Computing Curricula 2001 project (CC2001).¹⁰
Systems Engineering
The International Council on Systems Engineering (INCOSE)¹¹ states that "Systems Engineering is an interdisciplinary approach and means to enable the realization of successful systems. It focuses on defining customer needs and required functionality early in the development cycle, documenting requirements, then proceeding with design synthesis and system validation while considering the complete problem: operations, performance, test, manufacturing, cost and schedule, training and support, and disposal."
Systems engineering integrates all the disciplines and specialty groups into a team effort forming a structured development process
that proceeds from concept to production to operation. Systems engineering considers both the business and the technical needs of all
customers with the goal of providing a quality product that meets user needs.
The International Council on Systems Engineering (INCOSE, www.incose.org) is working on a Guide to the Systems Engineering
Body of Knowledge. Preliminary versions include the following first-level competency areas:
1. Business Processes and Operational Assessment (BPOA)
2. System/Solution/Test Architecture (SSTA)
3. Life Cycle Cost & Cost-Benefit Analysis (LCC & CBA)
4. Serviceability/Logistics (S/L)
5. Modeling, Simulation, & Analysis (MS&A)
6. Management: Risk, Configuration, Baseline (Mgt)
1. http://www.eng.auburn.edu/ece/CCCE/Iron_Man_Draft_October_2003.pdf
2. http://www.computer.org/education/cc2001/final/cc2001.pdf
3. http://www.efmd.be/
4. http://www.ccpe.ca/e/files/report_ceab.pdf
5. http://sites.computer.org/ccse/volume/FirstDraft.pdf
6. PMBOK is a registered trademark in the United States and other nations.
7. http://www.asq.org/cert/types/cqe/bok.html
8. http://isotc.iso.ch/livelink/livelink.exe/fetch/2000/2122/687806/ISO_TC_159__Ergonomics_.pdf?nodeid=1162319&vernum=0
9. This list was compiled for the 2001 edition of the SWEBOK Guide from the list of courses offered at the Johns Hopkins University Department of Cognitive Sciences and from the ACM SIGCHI Curricula for Human-Computer Interaction. The list was then refined by three experts in the field: two from Université du Québec à Montréal and W. W. McMillan, from Eastern Michigan University. They were asked to indicate which of these topics should be known by a software engineer. The topics that were rejected by two of the three respondents were removed from the original list.
10. http://sites.computer.org/ccse/volume/FirstDraft.pdf
11. www.incose.org
APPENDIX D
CLASSIFICATION OF TOPICS ACCORDING TO
BLOOM�S TAXONOMY
INTRODUCTION
Bloom's taxonomy¹ is a well-known and widely used classification of cognitive educational goals. In order to help audiences who wish to use the Guide as a tool in defining course material, university curricula, university program accreditation criteria, job descriptions, role descriptions within a software engineering process definition, professional development paths and professional training programs, and other needs, Bloom's taxonomy levels for SWEBOK Guide topics are proposed in this appendix for a software engineering graduate with four years of experience. A software engineering graduate with four years of experience is in essence the "target" of the SWEBOK Guide as defined by what is meant by generally accepted knowledge (see Introduction of the SWEBOK Guide).
Since this appendix only pertains to what can be considered "generally accepted" knowledge, it is very important to remember that a software engineer must know substantially more than this category of knowledge. In addition to "generally accepted" knowledge, a software engineering graduate with four years of experience must possess some elements from the Related Disciplines as well as certain elements of specialized knowledge, advanced knowledge, and possibly even research knowledge (see Introduction of the SWEBOK Guide). The following assumptions were made when specifying the proposed taxonomy levels:
• The evaluations are proposed for a "generalist" software engineer and not a software engineer working in a specialized group such as a software configuration management team, for instance. Obviously, such a software engineer would require or would attain much higher taxonomy levels in the specialty area of their group;
• A software engineer with four years of experience is still at the beginning of their career and would be assigned relatively few management duties, or at least not for major endeavors. "Management-related topics" are therefore not given priority in the proposed evaluations. For the same reason, taxonomy levels tend to be lower for "early life cycle topics" such as those related to software requirements than for more technically oriented topics such as those within software design, software construction, or software testing;
• So that the evaluations can be adapted for more senior software engineers or software engineers specializing in certain knowledge areas, no topic is given a taxonomy level higher than Analysis. This is consistent with the approach taken in the Software Engineering Education Body of Knowledge (SEEK), where no topic is assigned a taxonomy level higher than Application.² The purpose of SEEK is to define a software engineering education body of knowledge appropriate for guiding the development of undergraduate software engineering curricula. Though distinct, notably in terms of scope, SEEK and the SWEBOK Guide are closely related.³
Bloom's Taxonomy of the Cognitive Domain, proposed in 1956, contains six levels. Table 1⁴ presents these levels and keywords often associated with each level.
Table 1 Bloom's Taxonomy
The breakdown of topics in the tables does not perfectly match the breakdown in the Knowledge Areas. The evaluation for this appendix was prepared while some comments were still coming in.
Finally, please bear in mind that the evaluations in this appendix should be seen only as a proposal to be further developed and validated.
SOFTWARE REQUIREMENTS⁵
SOFTWARE DESIGN
SOFTWARE CONSTRUCTION
SOFTWARE TESTING
SOFTWARE MAINTENANCE
SOFTWARE CONFIGURATION MANAGEMENT
SOFTWARE ENGINEERING MANAGEMENT
APPENDIX C
ALLOCATION OF IEEE AND ISO SOFTWARE
ENGINEERING STANDARDS TO
SWEBOK KNOWLEDGE AREAS
This table lists software engineering standards produced by IEEE and ISO/IEC JTC1/SC7, as well as a few selected standards from other sources. For each standard, its number and title are provided. In addition, the "Description" column provides material excerpted from the standard's abstract or other introductory material. Each of the standards is mapped to the Knowledge Areas of the SWEBOK Guide. In a few cases, like vocabulary standards, the standard applies equally to all KAs; in such cases, an X appears in every column. In most cases, each standard has a forced allocation to a single primary Knowledge Area; this allocation is marked with a "P". In many cases, there are secondary allocations to other KAs; these are marked with an "S". The list is ordered by standard number, regardless of its category (IEEE, ISO, etc.).