Guide for the Qualification of Software Products

COG-95-179-1
Guide for the Qualification of Software Products
D.R. Tremaine, D.S. Ahluwalia, J.F.P. de Grosbois, E.G. Echlin
1995 October
COG-95-179-1
AECL
GUIDE FOR THE QUALIFICATION OF SOFTWARE PRODUCTS
D.R. Tremaine
SWI Systemware Inc.
151 Eglinton Ave. W.
Toronto, Ontario
D.S. Ahluwalia
Instrumentation &
Control Branch
Chalk River Laboratories
Chalk River, Ontario
J.F.P. de Grosbois
Control Centre
Technology Branch
Chalk River Laboratories
Chalk River, Ontario
E.G. Echlin
Instrumentation &
Control Branch
Chalk River Laboratories
Chalk River, Ontario
This document has been reclassified as a public document with the formal written
approval of COG Technical Committee 16. It may be released without prior approval
to outside commercial vendors or to any other interested group, agency, or individual.
Both COG and AECL retain copyright permissions on the document. The document
may not be reproduced, published, or sold in any form without written permission
from COG or AECL. Single printed copies may be made for personal use only
provided the cover pages are included. This report is not a formal publication. The
sole source of this document is Atomic Energy of Canada Ltd., Chalk River, Ontario,
K0J 1J0.
Disclaimer of Liability: This document was prepared as a result of a CANDU
Owners Group (COG) sponsored initiative. Neither COG nor its members, nor any of
their employees, officers, or directors, makes any warranty, expressed or implied, or
assumes any legal liability or responsibility for the accuracy, completeness, or
usefulness of any information, apparatus, product, or process disclosed, or represents
that its use would not infringe on privately owned rights.
The work reported in this document was funded by the COG R&D Program: Working
Party No. 16, WPIR 1652.
Instrumentation and Control Branch
Chalk River Laboratories
Chalk River, Ontario K0J 1J0
1995, October
COG-95-179
AECL
Guide for the Qualification of Software Products
by
D.R. Tremaine*, D.S. Ahluwalia, J.F.P. de Grosbois, and E.G. Echlin
ABSTRACT
The Ontario Hydro and AECL Software Engineering Standards (OASES) committee has developed a series of
standards and procedures to help ensure that software custom-engineered for CANDU stations will be of
acceptable quality. These OASES standards are not used by vendors outside of AECL and Ontario Hydro for
the engineering of their software products. Therefore, the level of quality of software products considered for
use in CANDU stations may not be consistent with that defined by the OASES standards. Hence, there is a
need to evaluate software products to determine whether they are suitable before using them in nuclear
applications. This evaluation involves investigating a variety of quality objectives, including safety,
functionality, reliability, reviewability, usability, efficiency, maintainability, and portability. These objectives are
typically evaluated by project team members intimately familiar with the requirements and design specifics of
the application. However, the objectives of reliability, reviewability and maintainability, and some aspects of
safety, can readily be investigated as a separate undertaking by specialists in software quality assurance. This
separate undertaking is referred to as qualification. This document is intended as a guide for someone (the
"qualifier") performing qualification work. It defines the objectives of qualification and describes the methods
available to the qualifier for assessing product suitability. These methods are accompanied with a collection of
checklists and questionnaires to be used as tools for evidence gathering. This document also provides step by
step guidance on the qualification process and an outline of the qualification report. The focus of this guide is
OASES Nuclear Safety Category II and Category III applications.
*Systemware Inc.
151 Eglinton Ave. West
Toronto, Ontario
Instrumentation and Control Branch
Chalk River Laboratories
Chalk River, Ontario K0J 1J0
1995 October
COG-95-179
AECL
Guide for the Qualification of Software Products
by
D.R. Tremaine*, D.S. Ahluwalia, J.F.P. de Grosbois, and E.G. Echlin
VALUE AND IMPLICATIONS
This guide provides a clear, concise and usable method for the qualification of software products
for use in OASES Nuclear Safety Category II and Category III applications in CANDU stations. It
describes methods for assessing the suitability of software products, and provides checklists and
questionnaires to assist in the qualification process. The guide thus provides a consistent approach
to qualification of software products for different applications in CANDU.
R.R. Shah, Manager
Instrumentation and Control Branch
Chalk River Laboratories
Chalk River, Ontario K0J 1J0
1995 October
REVISION HISTORY

Revision   Date            Description                   By
0          1995 October    Issued for Use                D.R. Tremaine, D.S. Ahluwalia,
                                                         J.F.P. de Grosbois, & E.G. Echlin
           1996 November   Reclassified to "Available"   J. Gierlach, COG
TABLE OF CONTENTS

1.  INTRODUCTION ............................................................. 1
    1.1  Scope ............................................................... 2
    1.2  Intended Audience ................................................... 2
    1.3  Terminology ......................................................... 2
    1.4  How to Use this Guide ............................................... 3

2.  OVERVIEW ................................................................. 4
    2.1  What Qualification Is ............................................... 4
    2.2  What Qualification is NOT ........................................... 5
    2.3  The Qualifier's Skill Set ........................................... 6
    2.4  The Qualification Process ........................................... 6

3.  THE QUALIFICATION METHODS ................................................ 8
    3.1  Software Quality Assurance Assessment ............................... 8
    3.2  Operating History ................................................... 8
    3.3  Reference Sites ..................................................... 8
    3.4  Failure Modes Data Assessment ....................................... 9
    3.5  Goodness-of-Design Assessment ....................................... 9
    3.6  Maintenance Process Assessment ...................................... 9
    3.7  Anecdotal Evidence .................................................. 10

4.  THE STEPS IN PERFORMING A QUALIFICATION .................................. 11
    4.1  STEP 1: Determine the Qualification Requirements .................... 11
    4.2  STEP 2: Determine the Focus of the Qualification .................... 12
    4.3  STEP 3: Plan the Qualification ...................................... 13
         4.3.1  Choose a Preliminary Approach ................................ 13
         4.3.2  The Use of Previous Qualifications Performed on the Product .. 14
         4.3.3  Prepare the Plan ............................................. 15
         4.3.4  Example Qualification Strategies ............................. 17
    4.4  STEP 4: Execute the Qualification Plan .............................. 21
         4.4.1  Prepare Qualification Checklists and Questionnaires .......... 21
         4.4.2  Gather Reference Site Data ................................... 21
         4.4.3  Review Anecdotal Evidence .................................... 22
         4.4.4  Review Available Product SQA Documentation ................... 22
         4.4.5  Visit the Vendor ............................................. 23
         4.4.6  Determine the Complexity of the Software Product ............. 24
         4.4.7  Filling out Checklists and Questionnaires .................... 25
    4.5  STEP 5: Analyze the Evidence ........................................ 26
         4.5.1  Software Quality Assurance Process Assessment ................ 26
         4.5.2  Operating History Assessment ................................. 28
                4.5.2.1  Calculating Unit-Years of Operation ................. 28
                4.5.2.2  Operating History Calculation Example ............... 29
                4.5.2.3  Interpreting Operating History Results .............. 31
         4.5.3  Reference Site Assessment .................................... 32
         4.5.4  Failure Modes Analysis ....................................... 33
         4.5.5  Goodness-of-Design Assessment ................................ 33
         4.5.6  Maintenance Process Assessment ............................... 35
         4.5.7  Anecdotal Evidence ........................................... 36
    4.6  STEP 6: Issue the Qualification Report .............................. 36
         4.6.1  Draw Conclusions and Determine Recommendations ............... 36
         4.6.2  Perform Follow-up Activities ................................. 37

5.  THE QUALIFICATION REPORT ................................................. 38
    5.1  Title Page .......................................................... 38
    5.2  Confidentiality Statement ........................................... 39
    5.3  Executive Summary ................................................... 39
    5.4  Table of Contents ................................................... 39
    5.5  Introduction (Section 1.) ........................................... 39
    5.6  Qualification Approach (Section 2.) ................................. 40
    5.7  Analysis of the Assessment Results (Section 3.) ..................... 40
    5.8  Conclusions and Recommendations (Section 4.) ........................ 40
    5.9  References (Section 5.) ............................................. 40
    5.10 Appendices .......................................................... 41

6.  REFERENCES ............................................................... 42

APPENDICES

A.  Category II Computer System Requirements Checklist ....................... A1
B.  Category II Development Requirements Checklist ........................... B1
C.  Category II Verification Requirements Checklist .......................... C1
D.  Category II Process Support Outputs Requirements Checklist ............... D1
E.  Category III Computer System Requirements Checklist ...................... E1
F.  Category III Development Requirements Checklist .......................... F1
G.  Category III Verification Requirements Checklist ......................... G1
H.  Category III Process Support Outputs Requirements Checklist .............. H1
I.  Goodness-of-Design Requirements Checklist ................................ I1
J.  Safety Net Requirements Checklist ........................................ J1
K.  Failure Mode Analysis Questionnaire ...................................... K1
L.  Operating History Questionnaire .......................................... L1
M.  Maintenance Requirements Checklist ....................................... M1
N.  Reference Site Questionnaire ............................................. N1
O.  Maintenance Questionnaire ................................................ O1
P.  Distribution List ........................................................ P1
1. INTRODUCTION
Reducing operating costs is a key priority for all CANDU stations. To help achieve this, the use of
software products is being considered more frequently as an alternative to traditional custom software
engineering. The aim is to obtain the best dollar value without sacrificing quality. Software products
are also being purchased in the less conspicuous form of embedded systems controlling such devices
as programmable logic controllers (PLCs), panel meters, actuators and sensors. Some of the common
target applications for software products in CANDU stations are: safety systems, mitigating systems,
process control systems, and monitoring and surveillance systems. Whenever software is used in a
CANDU station, whether custom software or a software product, it is incumbent upon the station to
ensure that its quality is acceptable before placing it into operation.
Software used within CANDU station applications is categorized, with respect to the effect that its
failure has on nuclear safety, from Category I (safety-critical) to Category IV (fully non-safety-critical)
[1]. The Ontario Hydro and AECL Software Engineering Standards (OASES) committee has
developed a series of standards and procedures to help ensure that software custom engineered for
CANDU stations will be of acceptable quality. There is an OASES standard for the software
engineering of Category I, Category II, and Category III software [2, 3, 4].
The OASES standards are not used by vendors outside of AECL and Ontario Hydro for the
engineering of their software products. Therefore, the level of quality of software products considered
for use in CANDU stations may not be consistent with that defined by the OASES standards. Hence,
there is a need to evaluate software products to determine whether they are of a sufficient level of
quality for use in a given application and category. The evaluation of software products is a complex
undertaking involving the investigation of a variety of quality objectives, including safety,
functionality, reliability, reviewability, usability, efficiency, maintainability, and portability. These objectives are
typically evaluated by project team members intimately familiar with the requirements
and design specifics of the application. However, the objectives of reliability, reviewability and
maintainability, and some aspects of safety, can readily be investigated as a separate undertaking by
specialists in software quality assurance (the qualifier). This separate undertaking is referred to as
qualification. Qualification exists to facilitate the use of software products in CANDU stations. Its
goal is to help assess the risk involved in using a specific software product for a specific application
and category. The output of a qualification is a qualification report which provides the qualifier's
opinion, supported by the collected evidence, as to whether or not the software product meets the
qualification requirements.
The objective of this guide is to define a clear and practical process for qualifying software products.
It defines the objectives of qualification, describes the qualification methods available and provides
guidance in their selection and use. Finally, the guide defines a format for a typical qualification
report.
The terms software product qualification and qualification are used interchangeably throughout this
guide.
1.1 Scope
This guide will be a part of the OASES family of standards and procedures for defining the acceptable
quality of software used in nuclear applications. It is intended for use in the qualification of software
products targeted for Category II and Category III applications. This guide is not applicable to
Category I applications.
This guide is based on experience gained from actual qualification work performed by the authors and
other Ontario Hydro and AECL staff. The basic framework was derived from Reference [5].
1.2 Intended Audience
This guide is primarily written for someone wishing to perform a qualification for a software product
(i.e., a qualifier).
1.3 Terminology
For a complete list of OASES terminology see [2, 3 or 4]. The following terminology is used in this
guide.
application
A specific use for a software product. Also referred to as target application.
assessment
The process of determining whether a software product meets a set of
qualification requirements.
category
Also referred to as "OASES nuclear safety category". A means of indicating
the criticality of an application based on its impact on nuclear safety and
possibly other concerns related to non-safety risks (e.g., economics). The
OASES family of standards and procedures comprises four categories related
to nuclear safety, Category I being the most critical and Category IV being
fully non-safety-critical.
duty cycle
The percentage of time that the software product is actually performing its
primary functions in an operational system.
evaluation
The process of determining if a software product is suitable for use in a target
application by investigating a variety of quality objectives.
firmware
A combination of software and data that resides in read-only memory.
project team
The project team is responsible for the overall specification, design,
development, integration, verification and validation of an application. If
software products are proposed as design solutions, the project team is
responsible for the evaluation, selection, integration and testing of those
products.
qualification
The process of determining, using tangible evidence, whether a software
product meets a set of qualification requirements.
qualification method
A means of gathering and analyzing evidence to determine a software
product's ability to meet one or more qualification requirements.
qualification requirements
The requirements that a software product must meet in order to be qualified
for use in a specific target application (or set of applications). The
requirements identify the category of the application and address those quality
objectives among safety, reliability, maintainability and reviewability that are
of importance to the target application.
qualifier
The person (or persons) performing all or part of a qualification. (Due to the
subjective nature of qualification and the variety of experience required, it may
be desirable to have more than one person involved.)
quality
The totality of features and characteristics of a software product that bear on
its ability to satisfy stated or implied needs.
quality objectives
Those objectives which must be met by a software product to be considered of
acceptable quality. The general set of quality objectives are safety,
functionality, reliability, reviewability, usability, efficiency, maintainability,
and portability [3, 4]. The importance and applicability of these quality
objectives will vary from application to application.
software product
A piece of software that has been developed to satisfy a general market need
and is readily available from an external vendor (or from within the
organization), but has not specifically been developed and verified to meet the
requirements of the target application. It may be delivered in many forms,
such as binaries, sources, configurable software, embedded firmware in a
hardware product, linkable libraries, utilities, etc. It may be a
complete system or a component of a system containing other software
products and/or custom-developed software. Also referred to as product.
1.4 How to Use this Guide
The body of this guide should be read in its entirety. A number of appendices have been included to
be used as tools for performing a qualification. They may or may not be applicable to a given
qualification, depending on the category of the application and the qualification methods chosen.
Section 2 provides an overview of the evaluation process and describes the objectives of performing a
qualification. Section 3 describes the seven methods available for qualifying Category II and Category
III software products. Section 4 describes each step involved in performing a qualification. Section 5
provides an outline of the qualification report.
2. OVERVIEW

2.1 What Qualification Is
The purpose of qualification is to determine a software product's acceptability, for use in a target
application, with respect to the quality objectives: safety, reliability, maintainability and reviewability.
A qualification is always performed within the context of the target application's OASES nuclear
safety category.
A qualification:

1. Establishes the qualification requirements of the target application; i.e., the specific
   requirements that the software product must meet. Note that not all four quality objectives
   will be relevant for every application.

2. Assesses the software product's ability to meet the qualification requirements by applying one
   or more qualification methods.

3. Provides an opinion, supported by the collected evidence, as to whether the software product
   meets the qualification requirements.

4. Identifies any deficiencies that should be guarded against in using the software product in the
   target application and recommends any necessary compensating measures.

5. Identifies any additional assessment or verification work that should be undertaken to
   compensate for unavailable information, or to address specific deficiencies in the product.
A qualification may be performed before, during or after the procurement of a software product, and
may even occur after it has been placed into operation at a station. A qualification may be performed:
- As part of an evaluation, conducted by a project team, to determine the suitability of a
  software product for a specific target application.
- As part of a feasibility study, to prequalify a product for one or more applications.
- To provide evidence for the licensing of an application that contains a software product.
When a qualification is performed on a software product before it has been placed into operation, the
qualifier always assumes that verification and validation, consistent with the applicable OASES
standard, will be performed on the target application before it is commissioned for use. When a
qualification is performed on a software product that is part of an operating application, the qualifier
may use, as evidence, the verification and validation performed before the application was
commissioned for use.
This guide treats qualification as a distinct activity from the overall evaluation of software products,
because:
- A qualification may take place either before an evaluation begins or after it has ended.
- Qualifications are typically performed by a specialist (i.e., qualifier) who has a specific skill
  set and can work independently of the project team.
2.2 What Qualification is NOT
Qualification is NOT:

- An endorsement that a software product is safe for use within a target application. Safety is a
  key quality objective driving the qualification work. However, the qualification process is not
  a substitute for a comprehensive assessment of a product's ability to meet application-specific
  safety requirements. Although a qualification may provide evidence related to the safety of a
  software product, it will not provide a conclusion that a product is safe for use.
- An assessment of the suitability of any hardware within which the software product may be
  bundled, in the form of firmware.
- An assessment of a vendor's qualification, except where it relates to the specific qualification
  requirements. For example, the qualifier does not assess corporate health and stability,
  manufacturing capabilities or ability to meet delivery schedules.
- The selection of candidate products.
- The determination of a product's fitness for purpose; i.e., its ability to meet the functionality,
  efficiency, usability and portability requirements of the target application.
- The determination of the adequacy and appropriateness of the target application's overall
  system design and principles.
- A guarantee that the software product is perfect.
- An attack on a vendor.
2.3 The Qualifier's Skill Set

The following are necessary prerequisites of someone performing a qualification:

- Experience in systems and software engineering and software quality assurance (SQA).
- Familiarity with widely accepted industry standards and methods.
- Ability to understand the functionality of applications and software products typically used in
  CANDU stations.
- An understanding of the significance that safety has in the applications and software products
  typically used in CANDU stations.
- A working knowledge of the OASES Nuclear Safety Category II and Category III standards.
- Good analytical, communication and writing skills.

2.4 The Qualification Process
Figure 1 illustrates the qualification process. It shows the six steps involved, the various data that can
be investigated to provide evidence, and the typical outputs.
The qualifier begins the process by documenting the qualification requirements relevant to the target
application, and determining the focus of the work. The focus will be the entire software product or,
if the product is a complex system, the qualifier may choose to concentrate on a critical subset of
software components. Based on the qualification requirements, a preliminary qualification approach is
chosen by selecting a preferred set of methods from the various methods available. A number of
factors are then considered to determine the feasibility of the approach. These factors may include the
maturity of the software product, the existence of a documented SQA process, the willingness and
ability of the vendor and reference sites to provide information, and the qualification schedule and
budget constraints. As a result, a qualification plan is prepared and appropriate checklists and
questionnaires are compiled. These checklists and questionnaires are completed by conducting
interviews and reviewing available documentation. The collected evidence is then analyzed and the
conclusions and recommendations are documented in the form of a qualification report.
[Figure 1 - Software Product Qualification Process: the figure shows the six steps of the process, the
data sources that can be investigated to provide evidence, and the typical outputs (whether the
software product meets the qualification requirements, things to watch for in using the software
product, and any additional verification work recommended).]
3. THE QUALIFICATION METHODS
The following sections describe the various qualification methods available. They also describe the
tools (checklists and questionnaires) available to assist the qualifier in collecting evidence. The
effectiveness of each method in providing useful evidence to assess safety, reliability, maintainability
and reviewability is also addressed. The methods presented have been derived from the OASES
standards [3, 4] and from experience gained in actual qualification work.
3.1 Software Quality Assurance Assessment
This method involves assessing the process by which the software product was engineered. This is
achieved by filling out the checklists in appendices A through D (for Category II applications) or
appendices E through H (for Category III applications). These checklists are taken verbatim from
Appendices A through D of the corresponding OASES software engineering standard [3, 4].
In providing a definition of software quality, the OASES software engineering standards address all of
the relevant objectives: safety, reliability, maintainability and reviewability. Any deficiencies
appearing in the vendor's SQA process can be analyzed to determine the resulting impact on these
objectives.
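
The checklist findings are easier to analyze if each item is recorded together with the quality objectives it
bears on. The following sketch shows one possible way of recording findings for later analysis; it is
illustrative only, the item identifiers and texts shown are hypothetical, and the actual items are those of
Appendices A through H.

# Illustrative sketch only: the checklist items below are hypothetical placeholders;
# the actual items are taken verbatim from Appendices A through H (OASES [3, 4]).
from dataclasses import dataclass

OBJECTIVES = ("safety", "reliability", "maintainability", "reviewability")

@dataclass
class ChecklistItem:
    item_id: str        # appendix/item reference
    text: str           # requirement wording from the checklist
    objectives: tuple   # quality objectives the item bears on
    met: bool = False
    evidence: str = ""  # document or interview note supporting the finding

def deficiencies_by_objective(items):
    """Group unmet checklist items by the quality objectives they affect."""
    impact = {obj: [] for obj in OBJECTIVES}
    for item in items:
        if not item.met:
            for obj in item.objectives:
                impact[obj].append(item.item_id)
    return impact

# Hypothetical findings:
findings = [
    ChecklistItem("B-12", "A documented software development plan exists",
                  ("reviewability", "reliability"), met=True,
                  evidence="Development plan reviewed during vendor visit"),
    ChecklistItem("C-03", "Unit test procedures and results are retained",
                  ("reliability", "reviewability"), met=False,
                  evidence="Vendor retains results for the latest release only"),
]
print(deficiencies_by_objective(findings))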
3.2 Operating History
This method involves assessing the demonstrated reliability of the product by analyzing the amount
and type of operating history, the history of errors detected and the history of the changes made to the
software product. The assessment is based on sales, error and revision history data provided by the
vendor. Appendix L contains a questionnaire used to collect the relevant data. The analysis
performed on the data can produce a usage statistic that, when interpreted with other factors such as
product complexity, product stability, type of usage and error rates, can serve as a useful measure of
the product's overall reliability.
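
The usage statistic referred to above is developed in Section 4.5.2. As a rough illustration of the
arithmetic involved, the sketch below accumulates in-service time across installed units, weighted by the
duty cycle defined in Section 1.3; the governing calculation is the one described in Section 4.5.2.1, and
the figures shown are invented.

# Rough illustration only: assumes operating history is accumulated as
# (units) x (years in service) x (duty cycle). The governing calculation is
# the one described in Section 4.5.2.1; the data below are invented.
def unit_years(deployments):
    """deployments: iterable of (units, months_in_service, duty_cycle in 0..1)."""
    return sum(units * (months / 12.0) * duty_cycle
               for units, months, duty_cycle in deployments)

# Hypothetical data gathered with the Appendix L questionnaire:
deployments = [
    (40, 18, 1.00),  # 40 units, 18 months of continuous operation
    (12, 9, 0.25),   # 12 units, 9 months, product active about 25% of the time
]
print(f"Approximate operating history: {unit_years(deployments):.1f} unit-years")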
3.3 Reference Sites
This method involves questioning reference sites, using the software product in a similar way, to
determine the reliability they have witnessed, the level of service provided by the vendor and the cost
to their own staff in maintaining the product. Appendix N contains a reference site questionnaire to
be filled in by the site contact. This method can provide very useful, direct evidence on product
reliability, maintainability and failure modes (safety). Details addressed may help identify specifics,
such as what features should be used or avoided, and what workarounds for known problems can be
employed.
3.4 Failure Modes Data Assessment
The failure mode analysis performed during qualification is an independent investigation into the ways
the product can fail, the special features the product has to handle those failures, and the effects and
indications those failures have via external interfaces. Appendix K contains a questionnaire for
collecting this information. Also, the reference site questionnaire (Appendix N) contains a section on
failure modes. The qualifier must ensure that those failure modes identified as safety-related in the
qualification requirements are thoroughly addressed by this assessment.
This method provides direct information regarding the product's ability to appropriately handle a given
set of failure modes.
3.5 Goodness-of-Design Assessment
This method involves assessing the goodness-of-design of the software product by reviewing design
documentation and source code and by interviewing design staff. A subjective opinion of the design's
goodness is made based on quality attributes defined in the OASES standards [3, 4]. Appendix I lists
the appropriate attributes along with their definition. This provides a guide to the qualifier, who must
use his or her judgement in determining, based on the sample of the design presented by the vendor,
whether the product meets the spirit of the definitions.
Appendix J contains a checklist of Safety Net Requirements that represent design decisions which
should be considered when using software in a safety-related application. Although a software
product may not have been specifically developed for use in a safety-related application, aspects of its
design can be scrutinized against the requirements of Appendix J to help determine its safety and
robustness.
All of the objectives (safety, reliability, maintainability and reviewability) are addressed, to a degree, by
this method.
3.6 Maintenance Process Assessment
This method involves assessing the vendor's product-support process against acceptable practice.
Appendix M provides a checklist of requirements for the evaluation of a vendor's processes and
procedures for maintaining the software product after "first customer ship". Appendix O contains a
questionnaire related to a variety of vendor- and product-related issues that impact on the
maintainability of the product. Those appendices apply to both Category II and III applications.
This method can help evaluate the vendor's ability to upgrade and support the product over the long
term.
3.7 Anecdotal Evidence
Anecdotal Evidence is similar to Reference Site Data, in that the information is testimonial in nature;
i.e., someone's opinion on the goodness/suitability of the product, based on their experience using,
reviewing or testing the software product. Where they differ is that, with anecdotal evidence, it is not
always possible to determine how relevant the experience is to the target application. Also, there may
not be the same degree of access to specific details. Anecdotal evidence may be obtained from a
variety of sources, including:
- Commercial journals (these often have product reviews or comparisons).
- Trial usage or prototyping performed.
- FAQs (Frequently Asked Questions): depositories, available over the Internet, where
  questions and answers on the product's usage can be obtained. This is a good source of bug
  and workaround information.
- Support groups (large vendors usually have support groups that can provide access to other
  users, product problems and issues, etc.).
- Original Equipment Manufacturer (OEM) and Value Added Reseller (VAR) experience (these
  contacts may be useful for answering specific questions on the product; their objectivity as a
  source of opinion, though, should be treated with scepticism).
The reference site questionnaire (see Appendix N) can be used as a guide for collecting the anecdotal
evidence. The information provided by this method may be useful in assessing the product's overall
reliability, maturity and stability. It may also provide a list of specific errors for failure mode
assessment and may provide useful workarounds. This method may also address a number of
maintenance issues.
4. THE STEPS IN PERFORMING A QUALIFICATION
This section describes the major steps involved in performing a qualification. These steps should not
be thought of as strictly sequential. It may be necessary to iterate through a number of them,
depending on information or circumstances. For example, the plan may have to be reevaluated if
data from a particular method is not adequate to provide meaningful evidence.
4.1 STEP 1: Determine the Qualification Requirements
The first step of a qualification is to determine and document the qualification requirements so that
there is a clear set of criteria against which to assess the software product. These requirements should
be derived within the context of the target application. Factors that do not have a direct impact on the
application need not be included.
This step is usually performed by interviewing the project team (or other party requesting the
qualification), reviewing project documentation (proposals, requirements specifications, design
descriptions, design manuals, etc.) and reviewing product documentation (proposals, product
descriptions, user manuals, reference manuals, FAQs, commercial journals, etc.).
In determining the qualification requirements, the following should be done:
a. A description of the target application's primary functionality should be obtained. Interfaces
   with other station equipment should be identified, and any backup or mitigating systems
   related to the target application should be identified.

b. A description of the role that the software product will play in the target application should be
   obtained. Interfaces of the software product with the target application and with other station
   equipment should be described.

c. The OASES Nuclear Safety Category of the application should be identified. If categorization
   work has been done, what category has been assigned? If categorization has not been
   performed, what worst-case category should be assumed for the qualification?

d. The importance of safety must be determined. Category II systems typically will have a
   safety implication. Some Category III systems may have no safety implication. Functional
   safety requirements can be broadly classed as systems that must be able to continue operation
   even in the event of failures (e.g., fault-tolerant systems), and those that, in the event of
   failure, fail to a known, safe state (e.g., fail-safe systems). To properly analyze safety, the
   qualifier must identify the unsafe failure modes and determine what effect they have on the
   application. This is important preparation for assessing how the product handles or mitigates
   against these failure modes.

e. In terms of the quality objective of reliability, all software products are expected to be reliable.
   However, there may be specific requirements that are more fundamental to the overall
   reliability of the application. These should be identified so that the qualification can give
   them due attention. There may also be specific requirements to ensure the robustness of the
   system (e.g., redundancy, graceful degradation, error detection and recovery, etc.).

f. The importance of maintainability depends on the probability of future upgrades, modifications
   or enhancements to the target application that may impact the software product. Related to
   this are any special software support requirements (e.g., in-house engineering support, custom
   upgrade, etc.).

g. The use of the qualification report should be determined. For example, if licensing is a
   requirement, the third-party reviewability of the vendor's SQA process becomes an issue. Also,
   the formality of the qualification report becomes more important; e.g., ensuring that all
   supporting evidence for the conclusions and recommendations is provided in an auditable
   way.

h. The expected form of the software product should be identified (stand-alone system, software
   library, firmware, etc.). This should include a description of any hardware with which the
   software must be packaged.

i. Permissible versions of the software product should be identified, including vendor name,
   product name, model number, revision number, option package, etc. The identification should
   be complete enough that no confusion exists over what is being qualified. The qualifier
   should determine the appropriate vendor contact.
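
One convenient working aid is to record the answers to items a. through i. in a single structured
summary, so that each later assessment can be traced back to an explicit requirement. The record below
is a hypothetical example rather than a prescribed format; its fields simply mirror the items above.

# Hypothetical working aid: one record per qualification, with fields that mirror
# items a. through i. above. Not a prescribed format.
from dataclasses import dataclass, field
from typing import List

@dataclass
class QualificationRequirements:
    application_description: str          # item a: primary functionality, interfaces
    product_role: str                     # item b: role of the product in the application
    oases_category: str                   # item c: "II" or "III" (worst case if unassigned)
    safety_relevant: bool                 # item d
    unsafe_failure_modes: List[str] = field(default_factory=list)      # item d
    reliability_requirements: List[str] = field(default_factory=list)  # item e
    maintainability_concerns: List[str] = field(default_factory=list)  # item f
    report_use: str = ""                  # item g: e.g., evaluation input, licensing evidence
    product_form: str = ""                # item h: stand-alone system, library, firmware, ...
    permissible_versions: List[str] = field(default_factory=list)      # item i
    vendor_contact: str = ""              # item i

req = QualificationRequirements(
    application_description="Monitoring system for ...",
    product_role="Embedded firmware controlling ...",
    oases_category="III",
    safety_relevant=True,
    unsafe_failure_modes=["silent loss of output", "frozen output value"],
)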
4.2 STEP 2: Determine the Focus of the Qualification
Typically, the focus of the qualification will be a single software product. There may be cases,
however, when a qualification is for a complex collection of software products (e.g., a distributed
control system) or a software product that is, itself, a complex collection of components (e.g. an
operating system). In such cases it may not be practical to perform a qualification of the entire
collection. The qualifier may, instead, choose to select only those components or products that are
most important to the application.
Once the qualification requirements have been established, the critical points of risk (e.g., safety-critical
failure modes, high-reliability features) are defined. Using these as a starting point, the
qualifier can perform the equivalent of a fault tree analysis on the system, to identify the components
that have failure modes that critically impact the application. These components then become the
focus of the qualification. This allows the qualifier to turn an unmanageably large task into one that
is both practical to achieve and will provide the key information on the quality of the product(s).
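
For a complex collection of components, the narrowing described above amounts to tracing each critical
point of risk back through the components whose failures can contribute to it. The sketch below
illustrates that idea with an invented failure-contribution map; it is not a substitute for a properly
conducted fault tree analysis.

# Illustration only: the failure-contribution map below is invented, and this
# simple traversal is not a substitute for a proper fault tree analysis.
def focus_components(contributes_to, critical_failure_points):
    """Return components whose failures can propagate to a critical failure point.

    contributes_to: dict mapping a component to the components or failure points
    that its failure can contribute to.
    """
    # Invert the map: what can contribute to each node?
    contributors = {}
    for comp, effects in contributes_to.items():
        for effect in effects:
            contributors.setdefault(effect, set()).add(comp)

    focus, frontier = set(), list(critical_failure_points)
    while frontier:
        node = frontier.pop()
        for comp in contributors.get(node, ()):
            if comp not in focus:
                focus.add(comp)
                frontier.append(comp)
    return focus

# Hypothetical example for a PLC-based product:
contributes_to = {
    "comms driver": ["I/O scan"],
    "I/O scan": ["control loop output"],
    "logging module": ["historian"],
}
print(focus_components(contributes_to, ["control loop output"]))
# -> {'comms driver', 'I/O scan'}; the logging module falls outside the focus.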
4.3 STEP 3: Plan the Qualification
The planning process begins by selecting a qualification approach that should provide sufficient
evidence for assessing the product's ability to meet the qualification requirements. The approach is
then refined into a plan based on factors such as the existence of a documented SQA process, the
willingness and ability of the vendor and reference sites to provide information, the maturity of the
software product, and the existence of a previous qualification. The plan evolves throughout the
qualification in response to the evidence that is uncovered. When deficiencies are encountered in the
evidence, other methods can be employed to help determine whether any perceived risk is somehow
offset by other evidence.
4.3.1 Choose a Preliminary Approach
To qualify a software product generally (i.e., does it have an adequate level of quality for a given
application and category?), it is sufficient to show that the SQA process used for engineering it
conforms with OASES standards. Therefore, SQA Process Assessment should always be selected for
the preliminary approach. This method is also necessary for collecting evidence where reviewability
must be demonstrated.
If empirical evidence is available, it must be investigated to show how the product has performed in
actual use. Operating History Assessment and Reference Site Assessment are both effective methods
for this. Reference Site Assessment has the added benefit of providing data that is more objective,
since reference sites usually do not have a conflict of interest. The evidence from these methods may
also help assess the risk of any deficiencies uncovered in the vendor's SQA process.
The core methods recommended for the preliminary approach are therefore:
- SQA Process Assessment,
- Operating History Assessment, and
- Reference Site Assessment.
If safety is a concern in the target application, the qualifier should include the following method:

- Failure Modes Assessment.
If maintainability is a concern for the target application, the qualifier should include the following
method:
- Maintenance Process Assessment.
The qualifier must review the effectiveness of the approach in addressing all of the qualification
requirements.
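
The selection rules above can be restated mechanically, as in the sketch below. This is only a reminder
of the defaults; the qualifier must still confirm that the resulting set of methods addresses every
qualification requirement.

# Illustrative restatement of the selection rules above; the qualifier must still
# confirm that the chosen methods cover every qualification requirement.
def preliminary_approach(safety_is_a_concern, maintainability_is_a_concern):
    methods = [
        "SQA Process Assessment",        # always selected; also needed where reviewability matters
        "Operating History Assessment",  # empirical evidence of performance in actual use
        "Reference Site Assessment",     # empirical evidence that is usually more objective
    ]
    if safety_is_a_concern:
        methods.append("Failure Modes Assessment")
    if maintainability_is_a_concern:
        methods.append("Maintenance Process Assessment")
    return methods

print(preliminary_approach(safety_is_a_concern=True, maintainability_is_a_concern=False))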
4.3.2 The Use of Previous Qualifications Performed on the Product
If a qualification was previously performed on the product of interest, the report from that
qualification should be reviewed, as part of the planning step, to determine the applicability of the
conclusions and recommendations to the current target application.
The review of a previous qualification should address the following issues:
a. Is the qualification relevant to the revision of the software product being assessed now? This
   includes indirect relevance (e.g., an assessment of an SQA process that may also have been
   applied to the product of interest).

b. Does the qualification address all or any of the qualification requirements?

c. Was the qualification based on criteria that are consistent with the appropriate OASES standard
   for the category of the application?

d. Do the methods used provide the same level of sufficiency as that determined by the
   preliminary approach (i.e., do the methods chosen provide an acceptable level of confidence in
   the conclusions drawn)?

e. Has the qualification produced documented evidence to show the extent of the assessment, the
   decision criteria used and the conclusions reached?

f. Was the author of the previous qualification financially independent of the vendor?
If the qualifier chooses to use part or all of the qualification, the preliminary approach can be altered
appropriately, to remove work no longer necessary. The approach can also be modified, where
necessary, to provide confidence that the previous qualification is valid. For example, if the previous
qualification indicated that the vendor has an SQA process that meets the appropriate OASES
standard, the qualifier would plan to confirm that process was used on the revision of the product of
interest to the target application. Also, if the previous qualification identified any deficiencies in the
software product or the software engineering process, and recommended corrective action, the qualifier
would plan to look for evidence that the corrective action has been carried out.
4.3.3 Prepare the Plan
Once a preliminary approach has been selected and the impact of any previous qualifications has been
assessed, the qualifier can begin to prepare the actual qualification plan. If the preliminary approach
represents what should be done to determine suitability, the plan represents what can be achieved
given the availability of information, and what can provide additional evidence to mitigate the
deficiencies uncovered.
The software product's vendor should be contacted early in the process. The qualification objectives
should be explained clearly to the vendor, and the target end date of the qualification should be
discussed, along with the implications it will have on scheduling a visit. The main aim of contacting
the vendor is to determine the degree of cooperation that may be expected, and to give the vendor
some lead time to arrange for a potential visit. The vendor should be asked to provide names of
reference sites and contact personnel. The most desirable reference sites are those that have used the
product for a reasonable period of time, and in a way that is similar to the target application.
Table 4.3.3 lists criteria, for each qualification method, to help determine whether the method will
provide useful evidence. Each of the methods chosen in the preliminary approach should be assessed
for its potential effectiveness. If a method is not likely to be effective, then other methods should be
investigated that will provide similar evidence. For example, if it is determined that the vendor is not
willing to have their SQA process scrutinized, or if the SQA process used is neither comprehensive
nor well documented, the qualifier may attempt to investigate the goodness-of-design of the product,
instead.
Other factors that should be taken into consideration in preparing the plan are:
- If the qualification is being done in conjunction with an evaluation effort, is there evidence
  from that work that may be useful for the qualification? This could be evidence relevant to
  the qualification requirements, or prototyping or testing activities that could be used to
  supplement the qualification findings.
- If the application is already operational, what verification and validation was
  performed on it that reveals something about the quality of the software product?
- Do the methods together adequately address all of the objectives of the qualification?
- Is adequate objectivity provided by the methods as a whole? Is there a reasonable mix of
  points of view (e.g., some from the vendor, some from a source that does not have the
  vendor's interest in mind)?
Once a qualification plan has been prepared, it should be circulated to those requesting the
qualification so that the objectives, scope of work and schedule are clearly understood. The plan
should also outline the proposed qualification methods. Once documented, they can be reviewed as a
whole to determine whether they are likely to provide adequate coverage of the objectives.
Table 4.3.3 - Criteria for Method to be Effective

Software Quality Assurance Process Assessment:
- a comprehensive SQA process existed, with documented evidence (e.g., software development
  plan, procedures, requirements and design descriptions, test procedures)
- the vendor is willing to have their process documents assessed
- the vendor is willing to provide staff to answer specific questions

Operating History Assessment:
- the software product has been on the market for at least 6 months
- the vendor is willing to provide sales figures
- a significant portion of the operating history uses the product in a way similar to the target
  application
- error information is available
- details of changes made to the product are available

Reference Site Assessment:
- the vendor is willing/able to identify one or more reference sites
- at least one site is willing to cooperate
- the reference site(s) has used the product for at least three months
- if the application is a poised system, the system has been called upon to work
- the reference application(s) is similar to the target application in: scope, functions used,
  options chosen, platform used, configuration
- the reference site's means of supporting the product is similar to that intended by the
  application (maintainability)
- the reference site(s) has had occasion to use the services of the vendor or support
  organization (maintainability)

Failure Modes Assessment:
- the vendor has performed an assessment of the product's failure modes, and identified the
  resulting actions of the product and the states it is left in
- the vendor is willing to describe the failure modes and their consequence

Goodness-of-Design Assessment:
- the vendor is willing to present their design and answer specific questions
- the vendor is willing to allow their requirements, design and source code documents to be
  examined

Maintenance Process Assessment:
- the vendor is willing to present their maintenance process and answer specific questions
- the vendor is willing to allow their maintenance process documents to be examined

Anecdotal Evidence Assessment:
- one or more sources of anecdotal information are available (the vendor may be able to point
  to specific journal articles, OEM/VAR and support organizations, user groups, FAQs, etc.)
- the source provides information pertinent to the qualification
- the source is credible (i.e., provides a technical assessment of the product, and not marketing
  material)
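
When refining the approach into a plan, each chosen method can be screened against the criteria of
Table 4.3.3 before effort is committed to it. The sketch below shows one possible way of recording that
screening; the criterion texts are abbreviations of the table entries and the answers are invented.

# Sketch of screening chosen methods against Table 4.3.3; the criterion texts are
# abbreviated from the table and the answers below are invented.
CRITERIA = {
    "Operating History Assessment": [
        "on the market for at least 6 months",
        "vendor willing to provide sales figures",
        "significant portion of history similar to target application",
        "error information available",
        "details of changes to the product available",
    ],
    "Reference Site Assessment": [
        "vendor willing/able to identify reference sites",
        "at least one site willing to cooperate",
        "site has used the product for at least three months",
    ],
}

def likely_effective(method, answers, threshold=1.0):
    """answers: dict of criterion -> bool. By default all criteria must hold;
    threshold can be relaxed to accept a partially satisfied method."""
    criteria = CRITERIA[method]
    satisfied = sum(bool(answers.get(c)) for c in criteria)
    return satisfied / len(criteria) >= threshold

answers = {c: True for c in CRITERIA["Operating History Assessment"]}
answers["on the market for at least 6 months"] = False  # e.g., a very new product
print(likely_effective("Operating History Assessment", answers))  # -> False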
4.3.4 Example Qualification Strategies
Example 1
The station wishes to use software product A in a Category II application. The application is safety
related. The intended function is very specific and there is no anticipation of enhancements for the
foreseeable future. The application will require licensing before being placed in service.
Preliminary Approach
The following methods are selected for the preliminary approach:
- SQA Process Assessment,
- Operating History Assessment,
- Reference Site Assessment, and
- Failure Modes Assessment.
Product Background
Company A has a well-documented SQA procedure that has been certified to ISO 9001 using ISO
9000-3. They have developed a medium-complexity product that they hope to sell to the nuclear
industry. The vendor is willing to describe their SQA process during a visit, and will provide
documented evidence at that time. The product has been sold to a handful of customers, and first
customer ship was nine months ago. Sales figures are low, but a comprehensive list of error data is
available. One customer is using the product in a similar way to the target application, and is willing
to cooperate. No failure modes analysis has been performed by the vendor, but they are willing to
provide an engineer to answer questions related to how known failure modes are handled.
Qualification
The qualifier decides to assess the SQA process against the OASES Category II standard, and to
ensure that the process was actually followed for the engineering of the product. The SQA process
scores high overall, but has some significant deficiencies that may impact on product reliability. The
qualifier then looks for other evidence that may demonstrate good reliability.
The sales figures are felt to be too low and of too short a life to make operating history a useful
avenue. However, the error data is examined and will be recorded in the report. It shows a product
that is reasonably stable. A number of bugs have been fixed in the most recent revision of the
software and the qualifier recommends that, if that revision is purchased, the application verification
and validation add specific tests to demonstrate their correct operation.
Three reference sites are sent questionnaires and two are returned completed. One points to a problem
not listed in the vendor's error log, and the vendor is questioned on it in detail. The other reference
site is the one using the product in a way similar to the target application. Their experience overall is
very positive, and they agree to talk with the project team to answer specific questions regarding
functionality.
The failure modes assessment is performed by meeting with one of the vendor's engineers. The
meeting consists of identifying all probable failures, their source, means of detection and effect on the
software product and its external interfaces. All failure modes identified in the qualification
requirements are taken into consideration. This information is recorded for the report. The engineer
agrees to provide more specific details and forward them to the project team.
Example 2
The station has purchased product B for use in a Category III application. The product is a hardware
component controlled by firmware. The application has no safety implications. The intended function
is very specific, and there is no anticipation of enhancements for the foreseeable future. Licensing is
not an issue.
Preliminary Approach
The following methods are selected for the preliminary approach:
- SQA Process Assessment,
- Operating History Assessment, and
- Reference Site Assessment.
Product Background
Company B is a small company. Product B is a simple product they have been selling for years for a
wide variety of uses. The original developers no longer work at company B and the SQA process
used is not documented. The vendor is willing to describe the design of the product, and to allow
access to the source code. A number of customers are willing to provide reference site data.
Qualification
Three reference sites are selected, based on the similarity of product use to the target application.
Their completed questionnaires show a product that is stable and mature. They are all using revisions
of the product that are older than the most recent.
A detailed investigation of the SQA process is not possible. Instead, the qualifier spends a day with
an engineer reviewing the design and examining the code. The conclusions are that the functionality
is simple and well defined, and that the design, although not very structured, does not break any basic
tenets (e.g., no self-modifying code, uses a deterministic loop, etc.).
The sales figures are examined and the amount of usage of each product revision is quantified. The
latest revision of the product is fairly immature. The new features included in it are itemized so that
the project team can determine whether they can use an older, more stable version.
The vendor's maintenance process is assessed to ensure that the company can provide an older version,
and that any later upgrade can be performed reliably. The vendor's test procedures are also evaluated,
to determine their suitability.
Example 3
The station wishes to purchase a complex PLC (Product C) to be used for a Category III application.
The external environment is expected to change within the next 10 years. If the primary functionality
of the PLC is not met, there will be a safety-related consequence. Rather than investigate all of the
PLC's components, the qualifier performs a high-level hazards analysis, to identify failure modes of
concern and to focus on the critical components. Licensing is not an issue.
Preliminary Approach
The following methods are selected for the preliminary approach:
- SQA Process Assessment,
- Operating History Assessment,
- Reference Site Assessment,
- Failure Modes Assessment, and
- Maintenance Process Assessment.
Product Background
Company C is a large company with a large customer base. Product C has been selling well for the
past year-and-a-half. It is a new generation of a previously successful product. Although the vendor
professes to have a very good SQA process, they invested heavily in its development and are reluctant
to have it or their maintenance process reviewed. They are, however, willing to provide the source
code for review. The vendor is also willing to provide sales and error figures, and to discuss the
product's error modes and defences. A number of customers are willing to provide reference site data.
Qualification
The reference sites are approached first, to determine whether there are any deficiencies or limitations
that should be discussed with the vendor. Any problems encountered are added to the list of failure
modes assessment questions. The sales figures are examined and the amount of usage of each product
revision is quantified. The amount of experience is reasonable, but not convincing, given the
complexity of the product.
A failure modes assessment is performed by meeting with one of the vendor's engineers. The qualifier
determines that the fault tolerance of the system is not adequate to meet the safety and reliability
requirements of the target application. It is recommended that the project team design-in some
redundancy to counter this. Possible options for this are discussed with the vendor and documented in
a report.
The code is reviewed by the qualifier, with special attention given to the Safety Net Criteria. The
quality of the design is high, giving credence to the vendor's claims of a sound SQA system. No
major problems are identified in the design.
Although it is not possible to review the vendor's maintenance process, the Maintenance Questionnaire
(Appendix O) is filled out with the vendor's assistance, to ensure a reasonable level of customer
support.
4.4 STEP 4: Execute the Qualification Plan
4.4.1 Prepare Qualification Checklists and Questionnaires
The execution of the qualification plan typically involves preparing a set of checklists and
questionnaires which are later completed by conducting interviews and reviewing available
documentation. This guide contains template checklists and questionnaires for all of the qualification
methods (see Section 3). Once a plan is in place, the qualifier selects those checklists and
questionnaires that are relevant to the chosen methods, reviews them and then modifies them as
necessary, so that they adequately address the qualification requirements.
4.4.2 Gather Reference Site Data
The selected reference sites should be contacted initially by phone, to determine whether they are
willing to cooperate in completing the reference site questionnaire (see Appendix N). Those that are
willing should be sent a copy of the questionnaire, with a short cover letter explaining the purpose of
the investigation. In certain cases, it may be feasible to visit a reference site to have a first-hand look
at their application, and to discuss their responses to the reference site questionnaire. In other cases,
the evidence may be gathered by reviewing the completed questionnaire and discussing specific details
and outstanding issues by telephone.
The reference site may also be willing to participate in more specific discussions with members of the
project team. The qualifier should inform the project team of this option.
The answers to the reference site questionnaire may identify issues that should be investigated further.
For example, the reference site may have experienced specific problems with the product, or with the
service provided by the vendor. It is therefore desirable to collect the reference site data before
visiting the vendor, so that any new issues can be addressed by the vendor. For example, an
identified problem may have been corrected in a later revision or by a revamped vendor-support
process.
4.4.3 Review Anecdotal Evidence
As with reference site assessment, the qualifier should attempt to review anecdotal evidence before
visiting the vendor. In that way, any specific deficiencies or limitations uncovered can be dealt with
directly with the vendor.
4.4.4 Review Available Product SQA Documentation
The vendor should be given the option of sending SQA documentation related to the product's
engineering to the qualifier before a visit. This could shorten the duration of the visit. Such
documents may include:
- software quality assurance policies and procedures,
- the software development plan for the product,
- the product's requirements and design documents and sample code,
- test procedures and test reports, and
- requirements and design review reports.
Much of this information is highly proprietary, and will often be available to the qualifier only during
a site visit. The qualifier should be prepared to sign a nondisclosure agreement with the vendor.
Before reviewing the SQA documentation it is necessary to establish whether the documents are:
1. directly pertinent to the way in which the product was engineered, and
2. pertinent to the way ongoing changes to the product are engineered.
An attempt should be made to answer as many questionnaire items as possible, since the vendor may
be providing the material in hopes of reducing the site visit time. Any additional questions that arise
from the review of the material should be noted on the checklists or questionnaires. Any "NO"
answers entered as a result of the review should be revisited during the interview with the vendor, to
ensure that a correct assessment has been made.
4.4.5 Visit the Vendor
The operating history questionnaire (see Appendix L) can be sent to the vendor, to be completed and
returned. The other checklists and questionnaires, however, are best completed by the qualifier during
a visit to the vendor's site, where appropriate vendor staff can give presentations and be interviewed.
Such a visit usually lasts two full days, with time being required by vendor personnel involved in the
development, verification, validation and support of the product.
During the visit, the qualifier must translate the checklist and questionnaire questions into a form that
the vendor can understand in the context of their organization. The vendor will have their own model
for SQA and their own set of terminology. It is important for the qualifier to establish a
correspondence between the vendor's model and the one provided in the OASES standards. Note that
a model that differs from the OASES model can still meet the OASES quality requirements.
In assessing the vendor's SQA process, it is important that the focus be on the process that was
actually used for the product, and not on a process used for other products or a new process that is
currently in use. If the vendor does have a documented process used during the engineering of the
product, then that process should be examined. However, if that documented process does not comply
in some aspect, the qualifier should determine whether the vendor went beyond their documented
process and met the checklist requirements in the product. They should, of course, be given credit if
they did.
A proposed agenda should be sent to the vendor before the visit so that they can appreciate the effort
involved and schedule the appropriate staff. In coordinating the visit and other details of the
qualification, it may be necessary to deal with a local sales account manager, especially in a large
company. If possible, arrange to talk to someone involved in the engineering of the product, to ensure
that when the visit is made the correct people and documentation are on hand. If this is not possible,
keep in close contact with the sales representative, to make sure they understand what your
expectations are and what is necessary to make the visit effective.
The qualifier should take control of the visit from the beginning, by explaining the scope of work and
indicating what specific topics must be covered. The qualifier should also indicate the number of
checklists and questionnaires that must be covered, to impress upon the vendor staff the effort
involved.
At the beginning of the meeting, the qualifier should give a brief background on:
- nuclear safety categories,
- the target application,
- the purpose and nature of qualification, and
- the use of the qualification report.
Next, it is recommended that an architectural overview of the product (i.e. basic functional
components and interfaces) be presented by the vendor. The remainder of the meeting should be
centred around completing the checklists and questionnaires. It is recommended that the vendor
provide a short talk at the beginning of each topic; however, coverage of the topic should be driven by
the qualifier asking the specific questionnaire or checklist questions.
It may be possible for interested members of the project team to attend the visit or participate in a
conference call during the meeting. The qualifier should inform the project team of this option before
the visit. If they accept, the qualifier must monitor the visit time carefully, to ensure that the
qualification requirements are adequately addressed.
Some questions may not have ready answers from the vendor. These should be noted as they are
encountered, and reviewed at the end of the meeting, to establish a list of actions that must be
resolved at a later time. It is also important to establish contacts with appropriate personnel in case it
is necessary to follow up with someone to clarify answers that are not clear, or to delve into more
detail as a result of the analysis of the data.
4.4.6 Determine the Complexity of the Software Product
To assist in analyzing the qualification evidence, it is necessary to establish a general rating of the
software product's complexity. To achieve this, the following parameters must be estimated
(see Section 4.5.2.3):
- number of lines of code,
- number of external interfaces, and
- number of internal software modules (e.g., programs, subprograms).
These estimates can be obtained by either interviewing members of the product's software
development team or requesting that the vendor complete a questionnaire.
4.4.7 Filling out Checklists and Questionnaires
The following general rules should be followed for completing the questionnaires and checklists:
a. Recognize the subjectivity of the endeavour, and establish a level of interpretation of requirements that is consistent throughout the qualification.
b. Whenever quantitative data is available, use it instead of qualitative data.
c. Do not trust your memory. Write things down during the meeting, and compile them at the end of the day or shortly after the visit is completed.
d. Checklists have four columns. The first column contains the requirements that must be met. The second is used to indicate whether the vendor complies with the requirement. An answer of "Yes", "No", "Not Applicable" or "Not Assessed" should be used. Those answers have the following meaning:
Yes: The purpose of the qualification is not to determine exact compliance but to determine equivalent practice. An answer of "Yes" means that the spirit of the requirement is met; i.e., that the important benefit of the requirement is realized by the way the vendor does things. This can be a very subjective measure, and the qualifier must attempt to interpret the underlying benefit of the requirements in a consistent and methodical way. The criticality of the target application should also be kept in mind, as well as the importance of reviewability.
No: The spirit of the requirement is not met.
Not Applicable: The requirement is not applicable to the product being reviewed. For example, if the product is designed as a single-thread deterministic loop, then questions regarding the use of multitasking are not applicable.
Not Assessed: The requirement could not be assessed, due to the unavailability of relevant information.
The third column of the checklists is used to record references to vendor documentation that
support the answer to the requirement. As the meeting proceeds, the qualifier should list all
documentation titles used as evidence by the vendor. This list should be numbered and the
numbers placed in the reference column. Eventually, the list of references will be added to
the qualification report (see Section 5.9). Documents defined by the checklists are generic
entities and may correspond to one or more documents in the vendor's model. For example, a
Computer System Requirements document may correspond to a file containing memos,
drawings, design notes, etc.
The fourth and last column of the checklists is used to record comments. These comments
should provide a description of any rationale the qualifier used in drawing the conclusion as to
the vendor's compliance. It may also be used to show the correspondence between the
vendor's terminology and that of the checklists.
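As an illustration only, the four-column checklist row described above could be captured in a simple record. The sketch below is in Python; the class and field names are hypothetical and not part of the OASES checklists.

```python
from dataclasses import dataclass, field
from typing import List

# Allowed answers for the "Complies" column, as described in Section 4.4.7.
COMPLIES_VALUES = ("Yes", "No", "Not Applicable", "Not Assessed")

@dataclass
class ChecklistEntry:
    """One row of a qualification checklist (hypothetical record layout)."""
    requirement: str                # column 1: the requirement to be met
    complies: str = "Not Assessed"  # column 2: Yes / No / Not Applicable / Not Assessed
    references: List[int] = field(default_factory=list)  # column 3: numbers into the report's reference list
    comment: str = ""               # column 4: rationale and terminology mapping

    def __post_init__(self):
        if self.complies not in COMPLIES_VALUES:
            raise ValueError(f"complies must be one of {COMPLIES_VALUES}")

# Example row, using a requirement quoted from Appendix A:
entry = ChecklistEntry(
    requirement="Contain or reference a revision history.",
    complies="Yes",
    references=[3],
    comment="Revision history kept in the vendor's change log.",
)
```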
4.5 STEP 5: Analyze the Evidence
The analysis should include the data collected from the current qualification plus any relevant data
from previous qualifications. The analysis from a previous qualification may be referenced in part or
in whole, depending upon its applicability (see Section 4.3.2). The following subsections provide
guidance on analyzing the data collected from each of the qualification methods.
4.5.1 Software Quality Assurance Process Assessment
If reviewability is an issue, the degree to which the vendor can demonstrate their SQA process with
supporting documentation becomes very important. The checklists call for specific documents to be
produced during the engineering process. The corresponding documents produced by the vendor may
have different names or formats and may be spread over a number of documents, minutes of meetings
and memos. It is important that the qualifier make a mapping between the vendor's document set and
the document set called for by the checklists. This can be achieved using the reference section of the
qualification report and the references column in the checklists.
If reviewability is not an issue, the qualifier can afford to be more flexible in interpreting the checklist
requirements. For example, if the vendor began the process with a requirements specification and
design description, but as changes occurred just updated the code, then the vendor can be given credit
for a requirements and design phase, but it should be noted that the maintainability of the software
suffers. If the design team is not likely to modify the software in the future, then this may not be too
serious a deficiency, but needs to be weighed in the balance of all other considerations about the
product. Any allowances given in determining compliance should be noted in the comment column of
the checklist. Any deficiencies to reviewability should also be noted, indicating their severity.
The overall degree of SQA process compliance can be expressed by including a table containing the
percentage of "Yes" answers relative to the total number of requirements, excluding "Not Applicable"
and "Not Assessed" requirements. The table could break the scores down using the section names from
Appendices A through D or E through H.
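As a minimal sketch of that calculation (using the hypothetical ChecklistEntry record from the Section 4.4.7 example), the score for a checklist or checklist section could be computed as follows; the function name is illustrative only.

```python
def compliance_percentage(entries):
    """Percentage of "Yes" answers, with "Not Applicable" and "Not Assessed"
    items excluded from the denominator (see Section 4.5.1)."""
    assessed = [e for e in entries if e.complies in ("Yes", "No")]
    if not assessed:
        return None  # nothing in this section could be assessed
    yes_count = sum(1 for e in assessed if e.complies == "Yes")
    return 100.0 * yes_count / len(assessed)
```

Such a score should be read together with the paragraph that follows: a handful of serious deficiencies can matter more than the overall percentage.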
The actual focus of the analysis should be to examine the specific deficiencies, to determine their
impact on the safety, reliability, maintainability and reviewability of the product. Even a score that
seems quite high may be misleading, because of a small number of serious deficiencies in the process.
In the "Development Processes Output Requirements" checklists (Appendices B and F), and the
"Support Processes Output Requirements" checklists (Appendices D and H),many of the lequirements
are annotated with the OASES quality attributes (see Section 4.5.5). This is helpful in determining the
impact of the deficient requirements. Ultimately, the qualifier must use his or her judgement, based
on experience. as to the consequence and severity of a deficiency. Things to keep in mind are:
e
the serious impact that a poor configuration management or change control process can have
on future generations of the product,
e
the problems involved in maintaining a poorly documented product,
e
the impact on reIiability if a product has not been thoroughly documented, reviewed and
tested,
e
whether verification and validation has been objective enough by introducing independence
into those processes, and
the reliability of the product can be seriously impacted by how methodical the engineering
process was.
Once the qualifier has determined the impact of the SQA process deficiencies, evidence from other
methods can be used to assess their ultimate level of risk to the application.
4.5.2 Operating History Assessment
The analysis of the operating history data attempts to determine:
- the amount of time a given version of the software product has been run in operational environments,
- the similarity of the operating history to the target application (i.e., have the portions of the software product that the target application intends to use been well exercised?),
- the breadth of the operating history (i.e., have all portions of the software product been well exercised?),
- the error record of the software product (i.e., has the operating history demonstrated that the product is reliable?), and
- the degree of change that has occurred from version to version (i.e., has the product reached a state of stability? Is it possible to take credit for the operating history of past versions because of the minor nature or isolation of the changes?)
Other factors considered are:
- the complexity of the software product, and
- the category of the target application.
4.5.2.1 Calculating Unit-Years of Operation
The amount of time a given version has been run is calculated in terms of Unit-Years of operation.
This is done by first determining the Sales Year, which represents the total number of units and
length of time these units have been in the field. Then the Unit-Years of operation is determined by
multiplying the Sales Year by the duty cycle, and by the percentage of units that are actually in
operation. The formula is as follows:
a) Sales Year = (cumulative total sales of the revision, up to three months before the present [assuming it typically takes three months before a unit is operational]) * (number of years the product has been on the market) / 2 [to provide a straight-line approximation]
b) Unit-Years of operation = Sales Year * duty cycle [percentage of the time the software is operational] * percentage of units actually in operation [assuming some units of embedded software are kept as spares, or that some copies of the software are no longer used or have been upgraded]
The determination of duty cycle depends on the type of applications in which the product is used.
Poised systems (trip and alarm systems) may have a very low duty cycle, since they only perform
their primary function during emergency situations. Control and monitoring systems that operate on a
24-hour basis would have a very high duty cycle. The duty cycle for peripherals, such as a disk drive
or printer, may vary greatly, depending on the application and the hours of use. If it is not possible to
determine a representative value for the types of application the product is used in, a value of 50% is
a reasonable default for non-poised systems.
The percentage of units actually in operation is applicable for embedded software, since a certain
number of copies of the system will be kept as spares. A reasonable estimate is 80%. This parameter
can also be used to represent the number of copies of the software that are estimated to be still in
operation, assuming that some may have fallen out of use, or have been replaced by a new revision.
If it is possible for the vendor to estimate the percentage of units that are actually used in a similar
application to that of the target application, then the calculated Unit-Years should be multiplied by that
percentage to give a final value for analysis.
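The calculation above can be expressed compactly. The following is a sketch only; the function and parameter names are illustrative, and the default values are the 50% duty cycle and 80% in-operation figures suggested in the text.

```python
def sales_year(total_sales, years_on_market):
    """Straight-line approximation of cumulative unit-time in the field.
    total_sales should already exclude units sold in the last three months,
    which are assumed not yet to be operational."""
    return total_sales * years_on_market / 2.0

def unit_years(total_sales, years_on_market, duty_cycle=0.50,
               fraction_in_operation=0.80, fraction_similar_use=1.0):
    """Unit-Years of operation as defined in Section 4.5.2.1.
    fraction_similar_use applies the vendor's estimate of the percentage of
    units used in applications similar to the target application."""
    return (sales_year(total_sales, years_on_market)
            * duty_cycle
            * fraction_in_operation
            * fraction_similar_use)
```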
4.5.2.2 Operating History Calculation Example
A hypothetical product "Product X" has been released in three revisions: 1.0, first released in 1991
November; 1.1, first released in 1993 January; and 2.0, first released in 1994 July. The date of the
qualification is 1994 December. The sales figures for each revision are listed in Table 4.5.2.2.
Table 4.5.2.2 - Sales Figures for Product X

                                              Revision 1.0   Revision 1.1   Revision 2.0
Sales in 1991                                       45              0              0
Sales in 1992                                      478              0              0
Sales in 1993                                      130            780              0
Sales in 1994 (up to the end of September)           0            930            285
Total Sales                                        653           1710            285
The Unit-Years of operation for revision 1.0 are calculated as follows:
a) Sales Year(1.0) = 653 * 3.083 [37 months since first release] / 2 = 1006.60
b) Unit-Years(1.0) = 1006.60 * 0.50 [50% duty cycle] * 0.80 [20% spares] = 403

The Unit-Years of operation for revision 1.1 are calculated as follows:
a) Sales Year(1.1) = 1710 * 2.0 [24 months since first release] / 2 = 1710.0
b) Unit-Years(1.1) = 1710.0 * 0.50 [50% duty cycle] * 0.80 [20% spares] = 684

The Unit-Years of operation for revision 2.0 are calculated as follows:
a) Sales Year(2.0) = 285 * 0.5 [6 months since first release] / 2 = 71.25
b) Unit-Years(2.0) = 71.25 * 0.50 [50% duty cycle] * 0.80 [20% spares] = 28.5
It is estimated, by the vendor, that 60% of their market uses the product in a way that is similar to the
target application. Therefore, the final Unit-Years values are calculated to be 241.8 for revision 1.0,
410.4 for revision 1.1 and 17.1 for revision 2.0. The vendor also indicates that the product is used in
a wide enough variety of applications that approximately 90% of the total functionality of the product
is used.
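For reference, the Product X figures can be reproduced with the unit_years sketch given in Section 4.5.2.1 (the revision data below are the example's assumptions):

```python
# (total sales, years since first release as of 1994 December) per revision
revisions = {"1.0": (653, 37 / 12.0), "1.1": (1710, 24 / 12.0), "2.0": (285, 6 / 12.0)}

for rev, (sales, years) in revisions.items():
    uy = unit_years(sales, years, duty_cycle=0.50,
                    fraction_in_operation=0.80, fraction_similar_use=0.60)
    print(rev, round(uy, 1))
# Prints approximately 241.6, 410.4 and 17.1 (the report rounds the
# intermediate Unit-Years to 403 before applying 60%, giving 241.8).
```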
4.5.2.3 Interpreting Operating History Results
The analysis of operating history depends partly on the complexity of the software product. Software
complexity can be roughly categorized using the number of lines of code, number of external
interfaces and number of internal software modules (e.g., programs, subprograms). Using these
parameters, Table 4.5.2.3-1 lists four levels of software complexity. When the parameters fall into
different levels of complexity, the highest is selected.
Table 4.5.2.3-1 - Levels of Software Complexity

Level of Complexity   Lines of Code       External Interfaces   Internal Modules
Low                   < 1,000             < 5                   < 20
Medium                1,000 - 9,999       5 - 9                 20 - 199
High                  10,000 - 99,999     10 - 20               200 - 1,999
Very High             > 100,000           > 20                  > 2,000
In analyzing the operating history, the qualifier should also ask whether it demonstrates:
- a low rate of reported errors,
- a very low rate of serious errors (those that hamper the functionality of the product),
- a low rate of change (i.e., updates to the revision have been isolated to specific modules and have had little impact on the system as a whole),
- a high rate of use in similar applications, or
- broad coverage in the use of all of its functionality?
When the criteria provided in Table 4.5.2.3-2 are met, a reasonable level of product reliability has
been established. When they are not met, the qualifier may pursue other methods to further evaluate
reliability, or may recommend an exception to the criteria by providing adequate justification.
Table 4.5.2.3-2 - Minimum Unit-Years of Operation Required

                        Category II    Category III
Low Complexity              200            100
Medium Complexity           500            200
High Complexity           1,000            500
Very High Complexity      5,000          1,000
Continuing with the Product X example, the vendor indicates that Product X has just less than 10,000
lines of code, six external interfaces (one high-alarm indicator, one low-alarm indicator, one contact
output, one process value display, a process input, and a manual reset) and 50 internal modules.
Referring to Table 4.5.2.3-1 we conclude that the complexity is medium. The category of the
target application is III; therefore, from Table 4.5.2.3-2 we require at least 200 Unit-Years of
operation. As a result, either revision 1.0 or 1.1 is adequate, provided that the error history and
extent of change (for revision 1.1) are satisfactorily low.
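The two table lookups can be sketched in code as follows. The thresholds are taken directly from Tables 4.5.2.3-1 and 4.5.2.3-2; the function and variable names are illustrative only.

```python
def complexity_level(lines_of_code, external_interfaces, internal_modules):
    """Return the highest complexity level indicated by the three parameters
    (Table 4.5.2.3-1)."""
    def level(value, low, medium, high):
        if value < low:
            return 0      # Low
        if value <= medium:
            return 1      # Medium
        if value <= high:
            return 2      # High
        return 3          # Very High
    levels = (level(lines_of_code, 1000, 9999, 99999),
              level(external_interfaces, 5, 9, 20),
              level(internal_modules, 20, 199, 1999))
    return ("Low", "Medium", "High", "Very High")[max(levels)]

# Minimum Unit-Years of operation required (Table 4.5.2.3-2).
REQUIRED_UNIT_YEARS = {
    ("Low", "II"): 200,        ("Low", "III"): 100,
    ("Medium", "II"): 500,     ("Medium", "III"): 200,
    ("High", "II"): 1000,      ("High", "III"): 500,
    ("Very High", "II"): 5000, ("Very High", "III"): 1000,
}

# Product X: just under 10,000 lines, 6 interfaces, 50 modules, Category III.
lvl = complexity_level(9999, 6, 50)
print(lvl, REQUIRED_UNIT_YEARS[(lvl, "III")])   # Medium 200
```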
4.5.3 Reference Site Assessment
In analyzing the data from the reference site questionnaires, the degree of similarity between the
reference site's use of the product and the intended use of the target application must be established.
This does not mean that the applications are necessarily the same. That is highly desirable, but what
is actually pertinent is that the set of the product's functionality that is used by the application should
coincide with the set that will be used by the target application.
It is of primary importance to establish the revisions used by the reference site, and the length of time
they have been used. Reference site evidence for a different revision than that intended by the target
application is usable, but only if an analysis is done to show that the functionality of interest has not
changed substantially. The length of time should be measured from when the product went into actual
operation at the reference site. Credit can also be given for any pre-operation test time, provided that
the testing exercised the functionality that the target application will use.
There are no rules for how long reference site use should be. It depends, in part, on the product's
duty cycle within the application. For poised systems, it is important to establish whether and how
often the product has been called upon to act (including periodic testing). The amount of use of the
product and the number of copies used at the reference site should be conveyed in the qualification
report.
In terms of assessing reliability, the qualifier should gain as much information as possible regarding
the error history of the product. The frequency, nature and effects of the errors should be documented
in the qualification report. It is also essential to record any workarounds or recovery techniques the
reference site has instituted. The qualifier should also determine whether the reference site
recommends that specific product options or configurations be avoided due to error proneness.
In terms of maintainability, the qualifier should establish the support record of the vendor. Have they
been responsive and competent in dealing with problems? The qualifier can also establish experience
in upgrading the software, in terms of impact on the application, time involved and any problems that
arise. The ongoing maintenance cost of the system, from an administration point of view, can also be
established. This may involve such activities as backup, cleanup and setup. If the vendor has been
involved in customizing the product for the reference site, it is possible to determine their track record
for implementing, installing and testing those customizations.
Reference sites can provide valuable information on failure modes, and how they are handled by the
product. They may have encountered failure modes that are not described in the product's
documentation. They can also provide information on abnormal operating modes, fault-tolerant or
fail-safe capabilities, failure indications provided and the possibility of indeterminate operating modes.
4.5.4 Failure Modes Analysis
The completed Appendix K questionnaire should be assessed keeping in mind the following issues:
- Are there any failures that are not handled by the product and could cause it to go to an indeterminate or dangerous state?
- Are there any options or configurations of the product that should be avoided because of potentially serious failure modes?
- Should special protection be instituted to address failure modes (e.g., external watchdog device, redundancy)?
- Is there an unacceptable common mode failure?
- Do all failures have a unique indication?
4.5.5 Goodness-of-Design Assessment
The following provides guidance for assessing the ability of the software product to meet the
goodness-of-design criteria defined in Appendices I and J.
Verifiability
In investigating verifiability, the qualifier should determine whether the design provides adequate
access for testing the correctness of the software (e.g., does the design make critical information
available externally for testing purposes?).
Understandability
A number of characteristics should be looked for in determining the level of understandability. They
include:
- use of meaningful variable names,
- variable and terminology definitions,
- high-level view of the software structure and interfaces,
- high-level view of the flow of data through the system,
- complete definition of interfaces,
- use of proper grammar,
- commentary that has meaning in the problem domain,
- commentary that improves the maintainability of the software,
- consistent layout that makes it easy to locate specific topics,
- code structure and formatting that makes the structure visible,
- simplicity/elegance of design, and
- use of revision histories.
Robustness
Appendix J contains a "Safety Net Requirements" checklist, which addresses design issues that help to
ensure the robustness of a software product. Besides these, other issues that help determine the
robustness of the design include whether the developers have:
- investigated the failure modes external to the software product and included design aspects to mitigate against or handle those failure modes,
- investigated the failure modes internal to the software product and included the capability to check for and handle those failure modes, and
- defined design principles relevant to the fail-safeness or fault-tolerance of the system.
Predictability
In investigating predictability, the qualifier should determine whether the developer:
- opted for a single-thread deterministic design over multi-tasking,
- in the case of a multi-tasking design, used a thread-checking algorithm to monitor proper execution,
- used static data structures, where possible, instead of dynamic data structures,
- investigated the failure modes of the software and included the capability to check for and handle those failure modes, and
- created a simple design.
Modifiability
In investigating modifiability, the qualifier should determine whether the developer:
- defined those aspects of the design that are most likely to change, and designed the system to make those changes as unobtrusive as possible,
- followed the tenets of information hiding in designing the software,
- clearly defined the structure of the design and all internal interfaces, and
- produced a simple design.
Modularity
In investigating the modularity of the design, the qualifier should determine whether the developer:
- divided the software into a collection of modules and programs that are of a reasonable size,
- defined modules based on the principle of separation of concerns,
- defined each module's function to maximize cohesion, and
- defined each module's interface to minimize coupling.
Structuredness
In investigating the structuredness of the design, the qualifier should determine whether the developer:
- followed basic structured programming principles,
- used the features of a structured language in a standard, accepted way, and
- formatted the code in a way that makes its structure stand out clearly.
Consistency
In investigating consistency, the qualifier should determine whether the developer:
- applied the coding style consistently throughout the software, and
- applied a similar design to similar parts of the system.
4.5.6 Maintenance Process Assessment
The results of the Appendix M checklist should be analyzed to determine how well the vendor scores
overall in their ability to maintain the product, and also to determine the impact on the product's
maintainability due to any specific deficiencies.
With respect to the Appendix O questionnaire, the qualifier should review the answers with respect to
the target application to identify any potential problem areas or desirable features.
In assessing the vendor's maintenance process, the qualifier should keep in mind the following issues:
- What is the impact of the expected resources and effort required in performing day-to-day maintenance of the product?
- What is the anticipated risk of upgrading the product in the future, given the vendor's support and configuration management systems?
- What is the likelihood and frequency of product patches, given the complexity and stability of the product?
- What are the long-term issues related to upgrading the product, given the vendor's future plans and commitment to standards?
- How well do the vendor's support options and support staff expertise fit with the type of support required by the application?
- If in-house modifications or support of the product (e.g., source code changes) are being considered, what material, resources and facilities are required, and what support will the vendor be willing to provide?
- Is the service-oriented training that is available to the end user adequate for them to perform their maintenance functions?
4.5.7 Anecdotal Evidence
Any information gained from anecdotal evidence sources that may be of use to the project should be
documented or referenced by the qualifier.
In analyzing anecdotal evidence the qualifier must assess the credibility of the source. Is the source
financially independent from the vendor? Has the source gathered their data in an objective and
systematic way? Does the source provide a comparison with products from other vendors? Are the
circumstances/conditions under which any benchmarks or tests were conducted described and
consistently applied?
The qualifier should also determine how applicable the information is to the revision of interest and to
how the station intends to use the product (platform used, options selected, functions exercised, etc.).
The qualifier should document any specific errors reported in the evidence along with their effects and
any recommendations for their avoidance.
4.6 STEP 6: Issue the Qualification Report
It is always assumed that there is a minimum set of verification and validation activities performed on
the target application, as required by appropriate OASES standards. These activities may be explicitly
credited in the qualification report if other evidence is insufficient. They may also be explicitly
required as conditions of qualifying a software product.
4.6.1 Draw Conclusions and Determine Recommendations
The conclusions should:
a. summarize the qualifier's opinion as to how well the software product satisfies each qualification requirement (e.g., Is the product reliable enough? Is the product maintainable? Is the product reviewable? Are the failure modes handled adequately?), and
b. refer to supporting rationale from the analysis of the qualification data.
The attitude, degree of experience, intelligence, quality ownership and commitment, and smallness and
cohesiveness of the development, verification and validation teams are all intangible concepts, but are
at the heart of a well-engineered product. They should be given credit, where appropriate.
Comments from the vendor can be useful and should be factored into the conclusions. For example,
the qualifier can ask the vendor why they think the product is of good quality. The vendor may
provide evidence the qualifier did not think to ask for. However, the qualifier should make the vendor
defend their comments with specific evidence.
The recommendations should, where necessary, provide specific instructions related to such issues as:
- limitations in using the software product (e.g., "only version 4.2 should be used at present"),
- conditions for using the software product (e.g., specific verification that should be performed before using the software),
- conditions for modifying the software product (e.g., if modifications of the software are required, the vendor should institute a list of stated improvements in their SQA process),
- conditions on the user for maintaining the software product (e.g., the deployment of a full-time system administrator to maintain the system),
- application-specific issues (e.g., the inclusion of an external watchdog in the system design to provide a fail-safe output),
- any special advice provided by the vendor or reference sites (e.g., "do not use option x"),
- possible changes that could be made to the way in which the software product will be used in the target application, that will reduce its OASES Nuclear Safety Category, and
- recommendations regarding the use of features such as redundancy or configuration locks (to prevent inadvertent modification).
Section 5 describes the format and content of the qualification report.
4.6.2 Perform Follow-up Activities
After the qualification results have been presented, the qualifier may be requested to perform
additional, follow-up activities. These may include:
a. investigating specific issues in more detail,
b. performing some or all of the additional evaluation or verification work recommended by the qualification report, and
c. helping to plan the verification activities to be carried out after the product is integrated into the target system.
Depending on the nature and extent of the follow-up activities, the qualifier may choose either to
document the additional findings in the form of a memo, or to release a new revision of the
qualification report.
5. The Qualification Report
The qualification report contains the conclusions and recommendations of the qualification work and
evidence supporting them. Below is a general format that should be followed in the report. The
following subsections describe the contents of the report.
Title Page
Confidentiality Statement
Executive Summary
Table of Contents
1. Introduction
2. Evaluation Approach
3. Analysis of the Assessment Results
4. Conclusions and Recommendations
5. References
Appendices
5.1 Title Page
The title page should contain:
- a title that clearly identifies the product being qualified and the target application it is being qualified for,
- the revision of the report (e.g., "Preliminary Draft", "Draft", "Revision 00"), and
- the authors of the report, along with their department or organization.
5.2 Confidentiality Statement
The qualification report will in some form refer to the material that is proprietary to the vendor of the
software product. It is important that the sensitivity of the information be relayed to the reader of the
report by placing a confidentiality statement immediately after the title page. The confidentiality
statement should indicate the proprietary nature of the document as a whole, and indicate the
acceptable circulation of the document. For example, "employees of the qualifying organization and
their agents".
5.3 Executive Summary
The executive summary is an optional section of the report. If it is used, it should contain:
- an identification of the target application and software product,
- a brief statement of the qualification requirements,
- a brief list of the major activities conducted during the qualification,
- a summary of the conclusions of the qualification, and
- a summary of the recommendations of the qualification.
The executive summary should not exceed one page in length.
5.4 Table of Contents
The table of contents should list all sections and subsections appearing in the report. A list of tables
and figures and their page numbers is optional. All of the appendices should be listed following the
title "Appendices".
5.5 Introduction (Section 1.)
The introduction to the report should provide a clear understanding of why the qualification was
performed and what specific qualification requirements were assessed.
A brief description of the target application for which the product is intended should be provided.
The OASES Nuclear Safety Category used for the qualification work should be identified.
This section also defines all terminology used in the report that may not be readily understood by the
report's intended audience. Examples include vendor- or product-specific terms, qualification
methodology terms or terms normally used in a different way.
5.6 Qualification Approach (Section 2.)
This section should outline the methods used to perform the qualification. It should provide the
rationale for why certain methods were chosen over others, or why specific methods could not be
conducted (e.g., vendor was not willing to reveal certain proprietary information).
5.7 Analysis of the Assessment Results (Section 3.)
The qualification report should capture the highlights of the analysis process in enough detail to help
the reader understand the extent of the work performed, and the rationale behind the conclusions and
recommendations that follow later in the report. Each of the major methods used should be treated in
separate subsections.
5.8 Conclusions and Recommendations (Section 4.)
Section 4 of the report should contain all the conclusions and recommendations derived from the
analysis of the qualification data. The section should be divided into subsections addressing, where
applicable,
- Safety,
- Reliability,
- Maintainability,
- Reviewability, and
- Specific Project Issues.
Recommendations should be numbered, e.g., "RECOMMENDATION 1: ..." in bold italics, to make
them stand out easily.
5.9 References (Section 5.)
The references section should identify all pertinent documentation used during the qualification.
References should appear in a sequentially numbered list so that, in the body of the report, the
checklists and the questionnaires can refer to them via a number in square brackets "[n]", where "n"
refers to the nth item listed in the References section.
Items that appear in the references section should include any project documentation, any product
documentation describing the functionality and configuration details of the product, and any product
SQA documentation provided by the vendor describing how the product was engineered, or containing
requirements specifications, design descriptions or code.
5.10 Appendices
The appendices should contain the completed checklists and questionnaires. Vendor-supplied
documentation, in general, should simply be referenced and filed by the qualifier for future use.
However, some portions of that information may be useful to include in the Appendices (e.g., lists of
corrected and outstanding product errors, letters of reference from customers).
6. REFERENCES
1. Atchinoff, G.H., Lau, D.K., deGrosbois, J. and Bowman, W.C., "Guideline for Categorization of Software in Nuclear Power Plant Safety, Control, Monitoring and Testing Systems", Revision 1, COG 95-264, 1995 September.
2. Joannou, P.K., Harauz, J., Viola, M., Cijanic, R., Chan, D., Whitall, R., and Tremaine, D.R., "Standard for Software Engineering of Safety Critical Software", Revision 1, CE-1001-STD, 1995 January.
3. Austin, L.F., Harauz, J., Hinton, G.J. and Lynn, B.A., "Software Engineering of Category II Software", Ontario Hydro Report number 907-C-H69002-0100, 1993 May.
4. Austin, L.T., Harauz, J. and Hinton, G.J., "Software Engineering of Category III Software", Ontario Hydro Report number 907-C-H69002-0200, 1993 May.
5. deRuyter, M. and Waddington, R., "Guideline for the Qualification of Predeveloped Software", Revision 1.0, 1994 March.
Appendix A
Software Engineering of Category II Software
Requirements for the Computer System Requirements Document
Product:
Company:
Company Representative(s):
Date:
SOFTWARE ENGINEERING OF CATEGORY II SOFTWARE
REQUIREMENTS FOR THE COMPUTER SYSTEM REQUIREMENTS DOCUMENT

REQUIREMENT | COMPLIES | REFERENCE | COMMENT
The CSR documents define the requirements that must be met by the
computer system. There may be a single requirements document for the
whole computer system, or the requirements may be contained in several
documents. All references herein refer to sections or tables in the
Category II standard.
The CSR shall:
(a) Define the functional, performance, safety, reliability, and maintainability requirements of the computer system, clearly identifying the safety requirements.
(b) Define the scope and boundary of the computer system and define the interfaces to other systems. It shall contain or reference a description of the problem domain including natural restrictions on the required functions.
(c) Define human interface requirements.
(d) Define all accuracy requirements and tolerances.
(e) Define any constraints placed on the design options.
(f) Define any Quality Assurance and Software Quality Assurance (SQA) requirements to be met by the computer system. Include or reference the categorization analysis that provides the rationale for choosing the SQA requirements.
(g) Define the design quality objectives, if different to those in Section E.1, and their relative priority.
(h) Define anticipated changes to the computer system based on an analysis of experience with changes to similar systems and on projections of future needs.
(i) Provide a clear definition of terms.
(j) Explicitly identify a comparable reference design or the absence of any.
(k) Contain no requirements that are in conflict with each other.
(l) Identify each requirement uniquely and define each requirement completely in one location to prevent inconsistent updates and to facilitate easy referencing by subsequent documents and verification processes.
(m) Contain or reference a revision history.
Appendix B
Software Engineering of Category II Software
Requirements for the Development Process Outputs
Product:
Company:
Company Representative(s):
Date:
SOFTWARE ENGINEERING OF CATEGORY II SOFTWARE
REQUIREMENTS FOR THE DEVELOPMENT PROCESS OUTPUTS

REQUIREMENT | COMPLIES | REFERENCE | COMMENT
This appendix defines the requirements which the outputs from the development processes must meet to be of acceptable quality. All references herein refer to sections or tables in the Category II standard. The outputs are:
(a) Computer System Design,
(b) Computer Hardware Design,
(c) Software Requirements Specification,
(d) Software Design Description,
(e) Source Code.
The following detailed requirements reference, using brackets "[...]", the quality attribute(s) from Appendix E which they are intended to address.
B.1 COMPUTER SYSTEM DESIGN (CSD)
(b) Decompose the computer system into functional subsystems with logical inputs and outputs, and uniquely identify the functional subsystems. Safety-related functional subsystems shall be isolated from other subsystems and identified as such so that the requirements of this standard can be appropriately applied. [Understandability, Correctness, Structuredness, Modularity]
(c) Define the logical interfaces between the functional subsystems. A diagram or diagrams illustrating the logical interfaces between functional subsystems shall be provided. An example of a suitable type of diagram is a data flow diagram. [Understandability, Verifiability, Correctness]
(d) Allocate system requirements to the functional subsystems. [Completeness, Traceability, Verifiability, Modularity, Consistency, Understandability]
(e) Define the reliability requirements for each functional subsystem. [Robustness, Predictability, Correctness]
(f) Define the accuracy requirements and tolerances. [Correctness, Predictability]
(g) Define any constraints placed on the software design options for each functional subsystem. [Consistency]
(h) For each functional subsystem, define which functions must be easy to change or reconfigure. [Modifiability]
(i) Map the functional subsystems to one or more computers or non-programmed hardware, predeveloped software and custom-developed software. [Understandability, Verifiability, Completeness, Traceability]
(j) Identify and describe all failure modes of the system based on failure modes analysis.
(k) Provide an overview of required fault-tolerance and graceful degradation features provided by the system hardware architecture. Examples would include hot-standby computers, warm-restart functions, backups, etc. Describe or reference the failure, restart, and re-synchronization procedures and requirements. [Robustness, Predictability]
B.1.2 Hardware Architecture
For functional subsystems allocated to hardware, the CSD shall:
(a) Identify each computer, physical interface between computers, and non-programmed hardware device within the computer system. Diagrams of the computer system hardware architecture shall be provided. [Understandability, Completeness, Verifiability]
(b) For each computer, specify required characteristics, such as performance, input and output capabilities, user interface device capabilities, memory and mass storage capacities, support for predeveloped software (such as operating systems or database managers), reliability, and maintainability. The definition of required computer capabilities shall be sufficient to allow software requirements to be specified with confidence that the computer equipment will be sufficient to support them. [Completeness, Correctness]
(c) Describe the required characteristics of each physical interface between computers, including requirements such as performance, reliability, maintainability, and any design constraints, such as the use of specific contention resolution algorithms. The definition of required communications capabilities shall be sufficient to allow software requirements to be specified with confidence that the communications equipment will be sufficient to meet them. [Completeness, Predictability]
(d) Describe the required characteristics of each non-programmed hardware device within the computer system. Identify the relevant design documentation for the device. The definition of required non-programmed hardware capabilities shall be sufficient to allow software requirements to be specified with confidence that any non-programmed hardware will be sufficient to meet them. [Completeness, Consistency]
B.1.3 Predeveloped Software
For functional subsystems allocated to predeveloped software, the CSD shall identify the types of predeveloped software to be used in the target system. This includes operating systems, database managers, libraries, compilers, etc. [Understandability, Completeness, Verifiability]
B.2 COMPUTER HARDWARE DESIGN (CHD)
The CHD documentation describes the computer hardware. There may be a single computer hardware design document for the whole computer system, or the information may be divided into several documents. References to the manufacturer's technical specifications may be used.
The requirements provided below address only computer hardware design issues directly related to software engineering. Other information may be required to engineer the hardware. Each of the requirements addresses the Correctness and Completeness quality attributes.
The CHD shall, for each computer in the computer system, describe the computer in detail, including items such as:
(a) Bus type and speed,
(b) Hardware-designated address mapping,
(c) Direct Memory Access (DMA) capabilities,
(d) Hardware interrupts and interrupt controller,
(e) CPU type, instruction set, and speed,
(f) Coprocessor types and speeds,
(g) Memory types, sizes, and speeds,
(h) Memory management facilities, including caching,
(i) Mass storage device types, sizes, and speeds,
(j) I/O devices, including controller types, specifications, addressing, protocols, etc.,
(k) Required initialization logic,
(l) Required diagnostic facilities,
(m) Power up, power fail, and restart logic,
(n) Other information as required to completely characterize the software interface to the computer hardware.
B.3 SOFTWARE REQUIREMENTS SPECIFICATION (SRS)
The SRS contains all requirements from the CSD which are relevant to the
software as well as all other software specific requirements which arise due
to the environment. Normally there will be one SRS for each functional
subsystem.
Alternative organizations are possible as long as the SRS is arranged to
facilitate mapping to the functional subsystems defined in the CSD.
B.3.1 Requirements
The SRS shall:
(a) Describe all requirements from the CSD which are relevant to the software. [Completeness]
(b) Describe all additional requirements. [Completeness]
(c) Describe any software implementation design constraints. This might include the processors on which the software is required to execute, and the predeveloped software required to be used. [Completeness]
(d) Describe the logical inputs to and the logical outputs from the functional subsystem and relate them to physical inputs and outputs in the problem domain. [Completeness]
(e) Describe the required behaviour of the software in terms of what the software must do. Natural language shall not be used to specify requirements. Examples of acceptable methods include the following: [Completeness, Correctness, Verifiability]
(1) Define the required behaviour of the logical outputs in terms of the logical inputs with the use of mathematical functions. The entire domain of the input variables shall be covered by these functions. Mathematical functions can be represented using techniques such as algebraic expressions and tables (such as decision, truth, event, or condition tables).
(2) Use diagrammatic techniques to show specific relationships, for example between functions or between data items. Examples of appropriate diagrammatic techniques include data flow diagrams, entity-relationship diagrams, state transition diagrams, and others.
(3) Use representative examples to describe functions. For example, include pictures or diagrams of required displays.
(4) Use constrained language techniques such as problem description languages, pseudo-code or structured English.
(f) Describe any mathematical function using a notation which has precisely defined syntax and semantics. [Verifiability]
(g) Describe the timing tolerances and the accuracy requirements by specifying the allowable deviation from the specified behaviour of the output variables. [Completeness]
(h) Describe the characteristics of the inputs and outputs addressing such issues as types, formats, units, and valid ranges. [Completeness]
(i) Specify the response to exception or error conditions. [Predictability]
(j) Identify all those requirements for which future changes are anticipated. [Modifiability]
(k) Explicitly demonstrate the mapping and complete coverage of all relevant requirements and design constraints in the CSD by such means as a coverage matrix or cross-reference. [Traceability]
(l) Uniquely identify each requirement so that it can be readily referenced by the SDD. [Traceability]
(m) Reference design notes which document key design decisions relevant to the software requirements. [Traceability]
(n) Describe requirements for fault tolerance and graceful degradation. [Robustness]
B.3.2 Constraints on the SRS
The SRS shall:
(a) Require no more than what is specified in the CSD without providing justification. [Correctness]
(b) Define each unique requirement once to prevent inconsistency. [Consistency, Modifiability]
(c) Be written strictly in terms of requirements and design constraints; it shall limit the range of valid solutions, but shall not insist on a particular design. [Completeness]
(d) Be written according to any additional SPH guidelines.
(e) Use standard terminology and definitions throughout the document. [Consistency]
(f) Be written according to guidelines in the SPH. [Completeness, Consistency, Correctness]
B.4 SOFTWARE DESIGN DESCRIPTION (SDD)
The Software Design Description document (SDD) is a representation of the software design. Normally the SDD will be organized into an overview of the software architecture and sections or subdocuments corresponding to each of the functional subsystems. Alternative organizations are possible as long as the SDD is arranged to facilitate mapping to the SRS. The SDD will be the primary document issued for the client to allow him/her to understand, use and maintain the software.
B.4.1 Software Architecture
The Software Architecture section of the SDD shall:
(a) Describe a software architecture which meets the requirements of the SRS to a level of detail that requires no further refinement of the predeveloped software, tasks, databases, inter-task communications, or input/output.
(b) Document the data flow through the software using graphical techniques such as Data Flow Diagrams. Provide a data use cross-reference. Provide a mapping between the logical interfaces described in the SRS and the physical interfaces used in the SDD. [Understandability, Verifiability]
(c) Where multiple tasks are used, document the task architecture (the relationship between tasks) as required below. Provide the following information for each computer:
(1) which tasks activate which other tasks (and which are scheduled by the operating system),
(2) a graphical overview of the interfaces between tasks, and the interfaces between tasks and the computer system input/output. Data Flow Diagrams are a type of diagram that could be used to meet this requirement,
(3) show task scheduling and allocation of processor resources to tasks. This information shall be sufficient to ensure that real-time performance requirements will be met by the specified architecture if the architecture is implemented as specified.
Diagrams, charts, tables, Gantt-type charts, data flow diagrams, and other techniques could be used to meet part or all of this requirement.
For the purposes of this requirement, each instance of each task shall be identified explicitly. Both developed and predeveloped software shall be included in this description. [Understandability, Verifiability, Predictability]
(d) Provide a mapping between functional subsystems, tasks, and computers. Every requirement of the SRS shall be allocated to a task. [Traceability, Understandability, Verifiability]
(e) Provide detailed information about each task. This information shall include a unique identifying name, the scheduling method and priority, interfaces to other tasks, resource requirements (processor, memory, disk, etc.), and other information as required. [Understandability, Verifiability]
(f) Provide detailed information about each database. This information shall include a unique identifying name, the logical record structure (i.e., the field names and types in each record), the physical record structure (i.e., the physical size and location of each field within a record), the storage method (e.g., memory-resident, disk-based, etc.), the access method (e.g., indexed sequential access method (ISAM), unkeyed, array indexes, etc.) plus all indexes or keys, relevant sizes and limits (including record sizes, average and maximum number of records, and average and maximum database size), and other information as required. (An illustrative record-structure sketch appears at the end of this subsection.) [Understandability, Verifiability]
(g) Provide detailed information about each intertask communications service. This information shall include a unique identifying name, the type of service (e.g., queue, pipe, semaphore, message box, shared memory segment, etc.), message formats, data rates, and other information as required. [Understandability, Verifiability]
(h) Provide information about each input/output device; this information shall be sufficient to describe the software interface to the device. For each device, provide at least a unique identifying name, the type of device, data or message rates, and the software interface method. [Understandability, Verifiability]
(i) Document the software construction architecture by providing an overview of the partitioning of the software into design entities. This shall include how the software is to be organized into libraries (if these are to be used), where code is to be shared between tasks, and what major sets of modules are to be developed. This requirement could be met by providing a "module guide".
(j) Describe each library to be shared between tasks. For each such library, the SDD shall provide a unique identifying name, an overview description of the services provided, a list of the functions performed, and a list of the tasks using the library. For object-oriented designs, the functions shall be listed by object class, and an inheritance hierarchy diagram shall be provided. [Understandability, Verifiability]
(k) Describe how the executable software for each functional subsystem and computer is organized in the target computer system, for example, directory structures or memory maps. [Understandability, Verifiability]
(l) Describe how each predeveloped software component incorporated into the software relates to the developed software. Provide diagrams as necessary to describe the configuration of the software. Define the detailed interface between the predeveloped software and the developed software. References to the manufacturer's documentation may be used. [Understandability, Verifiability]
(m) Identify the language(s) and translator(s) to be used. Document the rationale for the selection of the language(s) and of the translator(s). [Verifiability]
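As an illustration of the level of detail intended by item (f) above, the following fragment sketches how one hypothetical logical record structure might be documented. It is not part of the standard; the record name, field names, and limits are invented for the example.

    /* Illustrative sketch only: a hypothetical "pump status" record,
     * showing the field names, types, sizes, and limits that the SDD
     * is required to document for each database.
     */
    #include <time.h>

    #define PUMP_ID_LEN   8      /* physical size of the key field         */
    #define MAX_PUMP_RECS 64     /* maximum number of records in the table */

    struct pump_status_rec {
        char   pump_id[PUMP_ID_LEN]; /* unique key, fixed-length text      */
        double flow_rate_lps;        /* measured flow, litres per second   */
        double discharge_kpa;        /* discharge pressure, kilopascals    */
        int    running;              /* 1 = running, 0 = stopped           */
        time_t sample_time;          /* time the record was last updated   */
    };

    /* Storage method: memory-resident table; access method: unkeyed
     * array index.                                                        */
    static struct pump_status_rec pump_table[MAX_PUMP_RECS];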
B.4.2 Software Detailed Design
The Software Detailed Design section of the SDD shall:
(a) For each task or library identified in the Software Architecture section, describe a software design which meets the relevant requirements of the SRS to a level of detail that requires no further refinement of the module structure, module functions, module interfaces, libraries, data structures, or databases in the code. Provide the decomposition of the task or library into design entities. Provide diagrams showing the program calling hierarchies. Where object-oriented programming techniques are used, provide diagrams of the class inheritance hierarchies. [Understandability, Verifiability]
(b) Provide a "module guide" containing the following information for each module: [Understandability, Verifiability]
(1) unique identifying name,
(2) the purpose of the module,
(3) the data structure(s) maintained by the module,
(4) the list of programs contained in the module,
(5) the programming language(s) to be used.
(c) Provide the following information for each program: [Understandability, Verifiability]
(1) unique identifying name,
(2) program type (e.g., main program, subroutine, function, macro, interrupt handler),
(3) the purpose of the program (what requirement it satisfies),
(4) the function of each program, using a notation that has a well-defined syntax and semantics, and covering the entire domain of the program input,
(5) program dependencies (necessary relationships with other programs, e.g., interactions, data flow, timing, order of execution, data sharing, etc.),
(6) program interfaces (method of invocation or interruption, communication via parameters, database access, message passing, protocols, data formats, ranges of validity),
(7) required resources (elements external to the design, e.g., devices, software services (libraries), operating system services, processing resource requirements),
(8) program processing details, if necessary, to ensure that a particular algorithm is used (refinement of function, algorithm, contingency responses to error cases, timing, sequence of events, priorities, processing steps).
(d) Provide the following information for each local data item: [Understandability, Verifiability]
(1) data type (e.g., integer, Boolean, floating point, structure, file),
(2) the purpose of the data structure (what data it contains),
(3) the range of validity,
(4) initial value.
(e) Describe how the software for each task or library is built or generated. [Understandability, Verifiability]
B.4.3 Other Requirements To Be Met By The SDD
The SDD shall:
(a) Describe error handling which is not specified in the SRS. [Robustness]
(b) Describe how relevant experience from previous systems defined in the CSD has been factored into the design. [Correctness, Traceability]
(c) Identify all functions contained in the SDD which are outside the scope of the SRS. Design notes shall be referenced to provide the rationale for the existence of these functions. [Traceability]
(d) Reference the design notes which document key design decisions relevant to the software design. [Traceability]
(e) Identify each function in the design so that it can be uniquely referenced by the code. The identifiers shall be consistent with those used in the code. [Traceability]
(f) Contain or reference a revision history which identifies all modifications to the design and the rationale for these changes. [Traceability]
(g) Use standard terminology and definitions throughout the document. [Consistency]
B.5 SOURCE CODE
The source code is a complete and accurate translation of the design
described in the SDD. It includes both executable statements and data
declarations.
The source code shall precisely implement the design as described in the SDD. [Verifiability, Understandability, Correctness]
B.5.1 Code Contents
The source code shall:
(a) Follow the program and module structure and implement the data structures defined in the SDD. [Consistency, Understandability, Verifiability, Correctness]
(b) Have consistent layout and format. [Consistency, Understandability]
(c) Have consistently useful comments. [Understandability, Consistency]
(d) Use consistently descriptive naming conventions. [Understandability, Traceability, Verifiability]
(e) Be documented using readable, reliable, and consistent listings. [Understandability, Consistency]
(f) Have revisions documented. [Traceability, Verifiability]
(g) Use structured programming. [Structuredness, Understandability, Verifiability]
(h)
Have limited complexity. [Correctness, Robustness,
Understandability, Predictability]
(i) Take advantage of compiler-implemented type checking where this is feasible. A rationale shall be provided in the SDD if compiler-implemented type checking is not used. (An illustrative sketch appears at the end of this subsection.) [Correctness]
(j) Defend against detectable run-time errors, either through coding standards or through compiler-provided checks. [Robustness, Correctness]
(k) Be compilable and debugged. [Correctness]
(l) Follow the standards in the SPH. [Consistency, Understandability, Verifiability]
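The following fragment is an illustrative sketch, not a requirement of the standard, of the compiler-implemented type checking referred to in item (i): full prototypes and an enumerated type let the compiler reject mismatched calls. All names are hypothetical.

    /* Illustrative sketch only: full prototypes and an enumerated type
     * let the compiler check argument types at every call site.
     */
    enum valve_cmd { VALVE_CLOSE = 0, VALVE_OPEN = 1 };

    static enum valve_cmd last_cmd;      /* records the most recent command */

    /* With this definition visible, a call such as set_valve("open")
     * draws a compile-time diagnostic instead of failing at run time.   */
    static void set_valve(enum valve_cmd cmd)
    {
        last_cmd = cmd;                  /* stand-in for real output logic  */
    }

    static int trip_required(double pressure_kpa, double setpoint_kpa)
    {
        return pressure_kpa > setpoint_kpa;   /* double compared to double */
    }

    void control_step(double pressure_kpa)
    {
        if (trip_required(pressure_kpa, 450.0)) {
            set_valve(VALVE_CLOSE);
        } else {
            set_valve(VALVE_OPEN);
        }
    }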
COG-95-179
Appendix C
Software Engineering of Category II Software
Requirements for the Verification Process Outputs
Product:
Company:
Company Representative(s):
Date:
This appendix defines the requirements which the outputs from the verification processes must meet to be of acceptable quality. All references herein refer to sections or tables in the Category II standard. The outputs are:
Computer System Design Review Report,
Software Requirements Review Report,
Software Design Review Report,
Code Review Report,
Unit Test Procedures,
Unit Test Report,
Subsystem Test Procedures,
Subsystem Test Report,
System Integration Test Procedures,
System Integration Test Report,
Validation Test Procedures,
Validation Test Report.
Many of the requirements on the review and verification reports are generic and
are listed in Subsection C.1. Those requirements which are specific to the various
review and verification reports are dealt with in separate subsections. The
requirements on the test procedure outputs are presented individually in the
following subsections. The requirements for all test reports are presented in
Subsection C.10 as a common set of requirements.
C.1 GENERIC VERIFICATION REPORT REQUIREMENTS
The Verification Reports have a common set of requirements. To be of acceptable
quality, a report shall:
(a)
Identify the versions of the relevant documents.
(b)
Summarize the review or verification activities performed and the
methods and tools used.
(c)
Summarize the discrepancies.
(d)
Summarize the positive findings.
(e)
Describe the conclusions and recommendations.
(f) Identify the review or verification participants.
(g)
Identify and comply with the applicable SPH requirements for Review
Reports.
(h)
Identify or reference corrective action lists resulting from the review or
test.
C.2 COMPUTER SYSTEM DESIGN REVIEW REPORT (CSDRR)
To be of acceptable quality, the CSDRR shall:
(a) Provide objective evidence that a Technical Review of the design has been performed.
(b) Provide objective evidence that the review has covered all requirements and design constraints in the CSR and all functional subsystems, computers, and interfaces in the CSD.
(c)
Provide objective evidence that the review has covered all standards and
procedures in the SPH applicable to the CSD.
C.3 SOFTWARE REQUIREMENTS REVIEW REPORT (RRR)
To be of acceptable quality, the RRR shall:
(a) Provide objective evidence that a Technical Review of the requirements has been performed.
(b) Provide objective evidence that the review has covered all requirements and design constraints in the CSD and SRS.
(c)
Provide objective evidence that the review has covered all requirements
and design constraints appearing in the SRS which are not derived from
the CSD.
(d)
Provide objective evidence that the review has covered all standards and
procedures in the SPH applicable to the SRS.
C.4 SOFTWARE DESIGN REVIEW REPORT (DRR)
To be of acceptable quality, the DRR shall:
(a) Provide objective evidence that a Technical Review of the design has been performed.
(b)
Provide objective evidence that the review has covered all requirements
and design constraints in the SRS and all programs, data structures, and
data bases in the SDD.
(c)
Provide objective evidence that the review has covered all standards and
procedures in the SPH applicable to the SDD.
C.5 CODE REVIEW REPORT (CRR)
To be of acceptable quality, the CRR shall:
(a) Provide objective evidence that the review has covered all programs, data structures, and data bases in the source code.
(b) Provide objective evidence that the source code has been reviewed against the applicable sub-tier standards and procedures in the SPH.
C.6 UNIT TEST PROCEDURES (UTP)
The Unit Test Procedures shall:
(a)
Define tests which are executed on hardware which accurately represents
the target processor and any other hardware required to operate the
software unit, including emulators and simulators.
(b) Define a sufficient number of test cases, derived from the analysis of the SDD, to ensure that the executable code for each program behaves as specified in the SDD. The number of test cases shall be considered sufficient when they include:
(1) all possible decision outcomes,
(2) all possible conditions for each decision,
(3) tests on each boundary and values on each side of each boundary for each input. The test values shall be determined by using boundary value analysis and equivalence partitioning,
(4) tests based on postulated coding implementation errors.
(An illustrative test-case sketch appears at the end of this subsection.)
(c) Define a sufficient number of test cases, derived from the analysis of the code, to ensure that the executable code for each program behaves as specified in the SDD. The number of test cases shall be considered sufficient when they cause to be executed at least once:
(1) every program statement,
(2) all possible decision outcomes,
(3) all possible conditions for each decision,
(4) each loop with minimum, maximum, and at least one intermediate number of repetitions,
(5) cause a read and write to every memory location used for variable data,
(6) cause a read of every memory location used for constant data.
(d)
Define a sufficient number of tests to cause each interface to be
exercised.
(e) Describe the expected results of each test case so that a pass/fail determination can be made as to the outcome of the test. The expected results shall be based on the information contained in the SDD.
(f) Identify all equipment (and its required calibration), tools, and support software required to perform the test and provide adequate setup and test execution instructions so that the test can be repeated by personnel who did not perform the original test. Each test procedure shall describe or reference how the executable code to be tested is built.
(i) Provide a cross-reference between the sections of the SDD and the test procedures to document the test coverage.
(j) Comply with the applicable sub-tier standards and procedures in the SPH.
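As an informal illustration of items (b) and (c), the sketch below derives unit test cases for one hypothetical program using boundary value analysis, equivalence partitioning, and decision coverage. It is an example constructed for this guide, not part of the standard; the program and its specified range are invented.

    /* Illustrative sketch only: test cases for a hypothetical program,
     * range_check(), whose SDD specification is taken to be
     * "return 1 if 0 <= x <= 100, otherwise 0".
     */
    #include <assert.h>

    static int range_check(int x)
    {
        return (x >= 0) && (x <= 100);
    }

    int main(void)
    {
        /* Boundary value analysis: each boundary and a value on each side. */
        assert(range_check(-1)  == 0);
        assert(range_check(0)   == 1);
        assert(range_check(1)   == 1);
        assert(range_check(99)  == 1);
        assert(range_check(100) == 1);
        assert(range_check(101) == 0);

        /* Equivalence partitioning: the classes "below range", "in range",
         * and "above range" are each represented above.  Together the
         * cases also execute every decision outcome of the two conditions
         * inside range_check().                                           */
        return 0;
    }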
C.7 SUBSYSTEM TEST PROCEDURES (STP)
The Subsystem Test Procedures shall:
(a) Require all tests to be done with the predeveloped software and test hardware that accurately represent the capabilities of the target and that are required for the functional subsystem, including emulators and simulators.
(b)
Define test cases to test each functional requirement in the SRS.
(c) Define tests to test the performance requirements as described in the SRS.
(d) Identify all resources used by the functional sub-system and define test cases which test the functional subsystem under conditions that attempt to overload these resources in order to determine if the functional and performance requirements defined in the SRS are met.
(e) Define tests to exercise any interfaces between the software and the hardware, and the software and the pre-developed software.
(f) Define tests to show that the functional subsystem meets its requirements under each hardware configuration and operational option.
(g) Define tests to test the ability of the functional sub-system to respond as specified in the SRS to software, hardware, and data errors.
(h)
Define test cases which attempt to subvert any existing safety or security
mechanisms.
(i) Describe expected results of each test case so that a pass/fail determination can be made as to the outcome of the test. The expected results shall be based on the information contained in the SRS.
(j) Identify all equipment (and its required calibration), tools, and support software required to perform the test and provide adequate setup and test execution instructions so that the test can be repeated by personnel who did not perform the original test. Each test procedure shall describe or reference how the executable code to be tested is built.
(k) Identify the SRS name and version.
(l) Provide a cross-reference between the sections of the SRS and the test procedures to document the test coverage.
(m) Comply with the applicable sub-tier standards and procedures in the SPH.
C.8 SYSTEM INTEGRATION TEST PROCEDURES (SITP)
The System Integration Test Procedures shall:
(d) Define test cases to test that the interfaces between functional subsystems behave as specified in the CSD.
(e) Define test cases to test that the physical interfaces between computers behave as specified in the CSD.
(f) Define test cases to test that the functional sub-systems perform in an integrated environment as specified in the CSD.
(g)
Define test cases to test that the system architectural features perform as
specified in the CSD.
(h) Describe expected results of each test case so that a pass/fail determination can be made as to the outcome of each test from information provided in the CSD.
(i) Identify all equipment (and its required calibration), tools, and support software required to perform the test and provide adequate set-up and test execution instructions so that the test can be repeated by personnel who did not perform the original test. Each test procedure shall describe or reference how the executable code to be tested is built.
(j) Identify the CSD name and revision.
(k) Provide a cross-reference between the sections of the CSD components and the test procedures to document the test coverage.
(l) Comply with the applicable sub-tier standards and procedures in the SPH.
C.9 VALIDATION TEST PROCEDURES (VTP)
The Validation Test Procedures shall:
(a) Require all tests to be done with predeveloped software and hardware that duplicates to the greatest practical extent the complete target system architecture, including emulators and simulators.
(b) Define test cases to test each functional requirement in the CSR that is applicable to the system.
(c) Define test cases to test each performance requirement in the CSR that is applicable to the system.
(d) Define test cases, using dynamic simulation of input signals, to cover normal operation, anticipated operational occurrences, abnormal incidents, and accident conditions requiring system action as indicated in the CSR.
(e) Describe expected results of each test case so that a pass/fail determination can be made as to the outcome of the test from information provided in the CSR.
(f) Identify all equipment (and its required calibration), tools, and support software required to perform the test and provide adequate setup and test execution instructions so that the test can be repeated by personnel who did not perform the original test. Each test procedure shall describe or reference how the executable code to be tested is built.
(g) Identify the CSR components and versions.
(h) Provide a cross-reference between the sections of the CSR components and the test procedures to document the test coverage.
(i) Comply with the applicable sub-tier standards and procedures in the SPH.
C.10 VALIDATION TEST REPORTS (VTR)
A test report must document the results of each test activity.
The test report output from each test activity shall:
(a)
Identify or describe the tests performed.
(b)
Identify the test procedures.
(c)
Include the comparison of actual and expected test results as defined in
the referenced test procedures.
(d)
Summarize the discrepancies.
(e)
Summarize the positive findings.
(f) Describe the conclusions and recommendations.
(g)
Identify the date and time of the performance of each test.
(h) Identify the program, functional subsystem, or system, and the versions being tested.
(i) Identify the testers involved.
(j) Reference the detailed test results.
(k)
Comply with the applicable sub-tier standards and procedures in the
SPH.
COG-95-179
Appendix D
Software Engineering of Category II Software
Requirements for the Support Process Outputs
Product:
Company:
Company Representative(s):
Date:
This appendix contains the requirements which the outputs from the support process shall meet to be of acceptable quality. All references herein refer to sections or tables in the Category II standard. The outputs are:
(a) Software Development Plan,
(b) Standards and Procedures Handbook.
D.1 SOFTWARE DEVELOPMENT PLAN (SDP)
The SDP provides the comprehensive plan for the management of the software engineering process.
The requirements of the SDP are subdivided into requirements for the portions which make up the SDP:
(a) Project Plan Portion,
(b) Development Plan Portion,
(c) Verification Plan Portion,
(d) Documentation Plan Portion,
(e) Configuration Management Plan Portion,
(f) Training Plan Portion.
The SDP shall not duplicate information contained in overall project management documentation, but rather shall reference it. The SDP shall be revised when there are major changes to either the software scope of work or to the project team organizational structure.
In the following requirements, the term "provide" allows developers to use either descriptions or references to supply the required information.
D.1.1 Project Plan Portion of SDP
The project plan portion of the SDP describes the scope of the software engineering effort, the organization and responsibilities, and the key milestones and dates based on external detailed budget and schedule documentation used for monitoring the work effort.
The project plan portion of the SDP shall:
(a) Identify the CSRDID.
(b) Identify any overall project management plans, and describe the relationship of the SDP to them.
(c) Adopt a specific software life cycle model, such as waterfall, iterative waterfall, evolutionary, builds, pre-planned product improvement, etc., as a focus for planning so that a systematic software engineering process is followed over the entire life of the software. If iterations are planned, then the model shall show the intended number of iterations. In addition, it shall treat each of development and maintenance as an integral, continuous, and interactive process.
(d) Describe unambiguously the scope of work for the software project.
(e) Partition the project planning effort into uniquely identifiable support processes with well defined inputs and outputs (see Table 1).
(f) Describe the organizational structure for the project and relationships to external organizational units, including the scope, authority and responsibility of each and including the user. Show all relevant job positions and the number of resources required within the project and list the independence requirements of each job position.
The following independence rules shall be met:
(1) Define the following independence roles:
- Developer,
- Verifier,
- Validator.
(2) Responsibility for the development, verification, and support processes shall be assigned to the defined roles as described in Table D-1.
(3) The immediate supervisors of the developers shall be different from the immediate supervisors of the verifiers and validators.
(4) Developers may be reassigned to verification roles. Verification shall not be performed by the same person who developed the item being verified.
(5) Developers shall not be involved in roles assigned to validators.
(i) Specify the approach and methodologies to be used for the support processes.
(j) Specify facilities, tools, and aids to be used during support. Identify personnel required to maintain and manage the facilities.
(k) Include a policy statement that the SDP shall be followed by all personnel who are responsible for the production of an output required by the SDP.
(l) Include a policy statement that the personnel who produce each output required by the SDP have primary responsibility for the output's quality.
(m) Mandate the project-specific standards and procedures in the SPH based on the requirements in Appendix D.2.
(n) Require that the SDP be revised when there are major changes to either the software scope of work or to the organizational structure.
(o) Identify any previously developed systems from which developers may obtain relevant design and operating experience which can be factored into their current design.
D.1.2 Development Plan Portion of SDP
The development plan portion of the SDP describes the approach to
development, key design and implementation issues, and identifies the
support tools and facilities.
The development plan portion of the SDP shall:
(a) Partition the development effort into uniquely identifiable development processes with well defined inputs and outputs (see Table 1).
(b) Specify the approach and methodologies to be used for the development processes.
(c) Specify facilities, tools, and aids to be used during software development. Identify personnel required to maintain and manage the facilities.
(d) Identify any key design and implementation issues and preliminary studies, simulation modelling, and/or prototyping required to resolve them.
D.1.3 Verification Plan Portion of SDP
The verification plan portion of the SDP describes the approach to the verification of the software product. It identifies the verification activities and support tools and facilities required to perform the verification.
The verification plan portion of the SDP shall:
(a) Partition the verification effort into uniquely identifiable verification processes with well-defined inputs and outputs (see Table 1).
(b) Identify appropriate verification activities for process outputs prior to performing the verification processes identified in Appendix D.1.3(a). These activities shall consist of one or more of the following: animation, supervisory review, walkthrough, review and comment, and prototyping.
(c)
Specify the approach and methodologies to be used for the
verification processes.
(d)
Specify facilities, tools, and aids to be used during verification.
Identify personnel required to maintain and manage the facilities.
(e) Require that verification activities shall not start until the outputs being verified are completed and placed under configuration control.
(f)
Identify the qualification requirements for pre-developed software
for the target system and software tools.
D.1.4 Documentation Plan Portion of SDP
The documentation plan portion of the SDP lists and provides summary descriptions of all project documentation related to software. It also identifies responsibility for producing, reviewing, approving, and issuing documents. At the very least, the plan covers all document outputs identified in Table 1.
The documentation plan portion of the SDP shall:
(a) List and briefly describe all project documents and records that are a result of the software project covered by the SDP. It shall distinguish between deliverable end-user documentation and non-deliverable documentation.
(b) Provide a list of all deliverable software, test software, and support software.
(c) Provide a list of all deliverable documents obtained from external suppliers as they become known.
(d) Require the use of design notes to document the experience, options, trade-offs, decisions, rationale, design preferences, etc. that have been incorporated into the design. A separate design notes file shall be defined for each development and verification output. Developers and verifiers shall be directed to file copies of all correspondence, calculations, and analyses relating to a given output in the corresponding design notes file.
(e) Identify responsibility for producing, reviewing, approving, filing, and issuing documents, including design notes.
(f) Provide document distribution lists.
(g)
Identify facilities, tools and aids used to produce, manage, and
publish documentation.
D.1.5 Configuration Management Plan Portion of SDP
The configuration management plan portion of the SDP identifies the configuration management activities, the organizational structure and independence requirements, and the support tools and facilities.
The configuration management plan portion of the SDP shall:
(a) Uniquely identify all categories of configuration items that form a developmental baseline for software configuration management: application software (source, object, listing, load images), software media, all deliverable and non-deliverable documents, support tools (compilers, linkers, operating systems), data, test software, command files or scripts, and pre-developed software. Specify the identification scheme within each type.
(b) Specify when and under what conditions all configuration items come under change control. As a minimum, every document and piece of source code shall go under change control before it is given to an independent party for verification purposes.
(c) Require that each change be implemented starting at the highest affected output and proceeding to subsequent outputs (e.g., CSR to CSDDID to SRS to SDD to source code) and that each undergo the same degree of verification as provided for the original outputs.
(d) Define under what circumstances technical reviews must be performed, and define a mechanism to adjudicate submitted change requests. The mechanism must allow for participation by members representing each organizational unit that is responsible for one of the project outputs identified in Table 1.
(e) Identify a centralized support library to provide storage for, and controlled access to, software and documentation in both human-readable and machine-readable forms as well as records to support audit and review of the software configuration.
(f) Identify a mechanism for analysis of the errors detected during verification as an aid to on-going improvement of sub-tier standards and procedures in the SPH and the various test procedures (STP, SITP, and VTP).
(g)
Identify how to integrate predeveloped software and documentation
into the adopted configuration management system.
D.1.6 Training Plan Portion of SDP
The training plan portion of the SDP identifies the types of training, the
training organization, training tools, resources, and documents.
The training plan portion of the SDP shall:
(a) For each of the job positions defined in Section D.1.1, establish the minimum qualification criteria for the positions performing each set of tasks consistent with accepted engineering practice for systems of this nature.
(b) Specifically tailor training to job positions.
(c) Provide for the evaluation of training effectiveness.
(d) Identify orientation training for new personnel with specific emphasis on project standards in the SPH.
(e) Identify training required when changes in the project occur (e.g., changes to project requirements or standards and procedures).
(f) Identify requirements for training courses and materials with respect to the use of software tools and methodologies.
(g) Identify training facilities, tools, responsibility for training, and the contents of the training records.
D.2 STANDARDS AND PROCEDURES HANDBOOK (SPH)
The purpose of the SPH is to document the sub-tier standards, procedures, work practices, and conventions adopted by a specific project to meet the requirements of this standard. The requirements of the Standards and Procedures Handbook (SPH) are subdivided into requirements for the portions which make up the SPH:
(a) Project Plan Standards and Procedures,
(b) Development Standards and Procedures,
(c) Verification Standards and Procedures,
(d) Documentation Standards and Procedures,
(e) Configuration Management Standards and Procedures,
(f) Training Standards and Procedures.
All standards and procedures shall be consistent with requirements specified in this standard.
D.2.1 Project Plan Standards and Procedures
The project plan sub-tier standards and procedures shall:
(a) Identify standards for the SPH.
(b) Identify procedures for using tools and facilities in planning processes.
D.2.2 Development Standards and Procedures
The development standards and procedures shall:
(a) Provide sub-tier standards, procedures, and guidelines for each of the development processes and outputs listed in Table 1 (except computer hardware design and software acquisition, which are outside the scope of this standard). The processes include:
(1) Computer System Design,
(2) Software Requirements Definition,
(3) Software Design,
(4) Code Implementation.
(b) For each development process listed above and the corresponding output, provide the following practices or guidelines:
(1) Guidelines describing the methodologies, techniques, and principles to be employed during the development process.
(2) Guidelines identifying the tools to be used during the development process, and any procedures that must be followed in using them.
(3) Standards describing the organization and contents of the development outputs.
(c) Provide the following guidelines for the design process:
(1) Provide guidelines on structuring the software, including Structured Design, Information Hiding, Structured Programming, and heuristics to be followed. [Modularity, Structuredness, Modifiability, Understandability]
(2) Provide guidelines for complexity metrics, including maximum and minimum values. The metrics shall address at minimum module coupling, cohesion, and size, with the objective of achieving a loosely coupled and highly cohesive design. [Modularity, Structuredness, Modifiability, Understandability]
(3) Provide guidelines on scheduling to ensure that real-time requirements are met, such as requiring operating systems and communications facilities with deterministic worst case behaviour, and restricting the use of interrupts. [Predictability]
(4) Provide guidelines on plausibility checking and the handling of error conditions. [Robustness]
(5) Describe under which circumstances functions not specified in the SRS may be performed by the design, and what documentation requirements must be met. [Correctness, Traceability]
(d) Coding standards and procedures shall:
(1) Define code layout and formatting guidelines for:
- indentation and spacing,
- capitalization, and
- consistent sequence (e.g., grouping all data declarations or include files). [Understandability, Consistency]
(2) Define commenting guidelines for:
- the use of module and program headers,
- block comments,
- commenting of data declarations,
- commenting of logic,
- commenting of expressions, and
- comments that provide cross references to the SDD. [Understandability, Consistency, Traceability]
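A commenting guideline of the kind described in item (2) might, for example, require a program header of the following form. The layout, field names, and cross-reference shown are hypothetical, offered only as a sketch of one possible convention.

    /*--------------------------------------------------------------------
     * Program  : flow_average                 (hypothetical example only)
     * Module   : FLOW
     * Purpose  : Return the average of the most recent flow samples.
     * SDD ref  : illustrative cross-reference to the relevant SDD section
     * Inputs   : samples - array of flow readings, litres per second
     *            count   - number of valid readings in the array
     * Outputs  : returns the arithmetic mean of the samples
     * History  : 1995-10-02  initials  initial version (example entry)
     *-------------------------------------------------------------------*/
    double flow_average(const double *samples, int count);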
(3) Define naming conventions for:
- Identifier conventions shall be defined for: tasks, files, modules, libraries, programs, data types, structure tags, variables, constants, enumeration values, parameters, macros, and other entities where applicable. [Understandability, Traceability, Verifiability]
- The naming and capitalization conventions shall assist reviewers to determine the type and scope of each identifier.
- The naming and capitalization conventions shall assist in the preparation of cross-references.
- Identifiers shall be consistent in the SDD and the code.
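One possible naming convention of the kind item (3) asks the SPH to define is sketched below. The prefixes shown are hypothetical; the point is only that the convention lets a reviewer infer the kind and scope of an identifier from its name.

    /* Illustrative sketch only: prefixes identify the kind and scope of
     * each identifier.  The prefixes (K_, t_, s_, g_, chan_) are
     * hypothetical, not mandated by the standard.
     */
    #define K_MAX_CHANNELS 16               /* K_    : constants            */

    typedef unsigned int t_channel_id;      /* t_    : data types           */

    static int s_scan_count;                /* s_    : file-scope (static)  */
    int        g_system_mode;               /* g_    : global variables     */

    void chan_read_input(t_channel_id id);  /* chan_ : module name prefix   */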
(4) Define guidelines for listings, maps, and cross references as follows:
- The coding standards shall define practices and conventions for listings, maps, and cross references, including the format of listings, the layout of code in listings, and the format of maps and cross references. [Understandability, Consistency]
- Tools used for producing listings, maps, and cross-references shall be identified in the SPH. [Traceability]
(5) Define guidelines for the documentation of revisions as follows:
- The coding standards shall define standards, practices, and conventions for detailed documentation of modifications to source code, including the information required to be kept for each revision (e.g., date, programmer, change reference, and description), and the location of this documentation.
- It shall be possible to determine which lines were modified, when, why, and by whom.
- Any tools used for documenting revisions shall be identified in the SPH. [Traceability, Verifiability]
(6) Define guidelines for structured programming and other aspects of control flow as follows:
- structured programming guidelines,
- prohibitions on branches into loops,
- definition of legal looping constructs,
- definition of legal conditional branching constructs,
- restrictions on use of unconditional branches,
- restrictions on the number of entry and exit points in programs,
- maximum size of programs,
- maximum number and complexity of parameters to programs,
- allowed type of invocation methods (e.g., call by reference vs. call by value),
- prohibitions on unreachable code,
- exception handling. [Structuredness, Understandability, Verifiability]
(7) Define guidelines for algorithms and expressions as follows:
- restrictions on recursion and complexity,
- restrictions on self-modifying code,
- expression complexity guidelines (such as number of operators or operands),
- guidelines on the use of floating point arithmetic. [Correctness, Robustness, Understandability, Predictability]
(8) Define guidelines for data structures and types, variables, and constants as follows:
- when to define types (e.g., to permit compilers to perform type checking, or for portability),
- restrictions on types (e.g., for accuracy or range),
- restrictions on the use of global variables,
- restrictions on updates to variables at multiple locations,
- the use of symbolic rather than literal constants,
- the use of constant expressions to initialize constants,
- the use of typed constants,
- restrictions on data declared but not used. [Correctness, Consistency, Understandability, Modifiability, Robustness]
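The fragment below sketches, with hypothetical names and values, several of the practices item (8) asks the coding standards to address: symbolic rather than literal constants, typed constants initialized with constant expressions, and a defined type that documents intent.

    /* Illustrative sketch only: symbolic constants, a defined type, and
     * typed constants initialized with constant expressions.
     */
    #define MAX_SENSORS 32                       /* symbolic, not literal   */

    typedef double celsius;                      /* type documents units    */

    static const celsius k_alarm_limit = 85.0;         /* typed constant    */
    static const celsius k_trip_limit  = 85.0 + 10.0;  /* constant expr.    */

    static celsius sensor_value[MAX_SENSORS];    /* sized by the constant   */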
(9) Define guidelines for run-time error checking as follows:
- plausibility checking,
- range and bounds checking,
- division by zero,
- numeric overflow and underflow,
- stack and memory allocation checks,
- type checks. [Robustness, Predictability, Correctness]
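A sketch of the run-time checks listed in item (9) is given below for a hypothetical routine; the limits and names are invented, and the fragment is illustrative rather than mandated.

    /* Illustrative sketch only: plausibility, bounds, and divide-by-zero
     * checks for a hypothetical averaging routine.
     */
    #define FLOW_MIN 0.0
    #define FLOW_MAX 500.0

    /* Returns 0 on success, -1 if an input is implausible or would
     * provoke a numeric fault.                                        */
    int average_flow(const double *samples, int count, double *result)
    {
        double sum = 0.0;
        int i;

        if (samples == 0 || count <= 0 || result == 0) {
            return -1;                        /* null / bounds checks       */
        }
        for (i = 0; i < count; i++) {
            if (samples[i] < FLOW_MIN || samples[i] > FLOW_MAX) {
                return -1;                    /* plausibility check         */
            }
            sum = sum + samples[i];
        }
        *result = sum / (double)count;        /* count > 0: no divide by 0  */
        return 0;
    }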
(10) Define guidelines for language-dependency as follows:
- use of macros,
- conditional compilation or assembly,
- in-line programs. [Predictability, Modifiability]
(11) Define machine-dependent guidelines for:
- use of floating point arithmetic,
- word length and alignment,
- byte-ordering,
- input/output type (programmed vs. memory mapped),
- use of absolute addressing,
- memory models,
- use of traps or interrupts. [Robustness, Predictability, Modifiability]
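Item (11) can be illustrated by isolating machine-dependent details behind a small access routine, as sketched below with a hypothetical absolute register address; the technique, not the address or register layout, is the point of the example.

    /* Illustrative sketch only: machine-dependent details (an absolute
     * address and memory-mapped input) kept behind one small routine.
     */
    #define STATUS_REG_ADDR 0xFFC0UL          /* hypothetical absolute address */

    static unsigned char read_status_register(void)
    {
        /* volatile: every call performs a real read of the hardware register */
        volatile unsigned char *status =
            (volatile unsigned char *)STATUS_REG_ADDR;
        return *status;
    }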
D.2.3 Verification Standards and Procedures
The verification standards and procedures shall:
(a) Provide sub-tier standards, procedures, and guidelines for each of the
verification processes and outputs listed in Table 1. The processes
include:
(1) Computer System Design Review,
(2) Software Requirements Review,
(3) Software Design Review,
(4) Code Review,
(5) Unit Testing,
(6) Subsystem Testing,
(7) System Integration Testing,
(8) Validation Testing.
(b) For each verification process identified above and the corresponding outputs, provide the following sub-tier standards or guidelines:
(1) Guidelines describing the methodologies, techniques, and principles to be employed during the verification process.
(2) Guidelines identifying the tools to be used during the verification process, and any procedures that must be followed in using them.
(3) Standards describing the organization and contents of the verification output or outputs.
D.2.4 Documentation Standards and Procedures
The documentation standards and procedures shall:
(a) Identify a standard document style and format to be used for all documents. The style and format guidelines shall be chosen to make the documents consistent, understandable, reviewable, and maintainable. The guidelines shall address:
(1) fonts and attributes,
(2) headers and footers,
(3) revision history,
(4) labelling of figures and tables,
(5) page identification,
(6) section identification,
(7) cross-referencing between documents,
(8) glossary,
(9) index,
(10) references,
(11) date, signature, and file number.
(b) Define a comprehensive glossary to ensure common understanding and consistent usage of project-specific and industry-accepted terminology. The use of unique terminology and definitions shall be minimized.
D.2.5 Configuration Management Standards and Procedures
The configuration management standards and procedures shall:
(a) Specify guidelines to handle and document exception cases or problems arising from errors or exceptions which are detected during software development and verification (i.e., any nonconformances to the SDP or SPH, errors found during all verification activities, changes arising during installation and operation, tests deferred to a later testing phase, etc.).
(b) Identify procedures for change requests to fix problems or exception cases.
(c) Identify a procedure for ensuring that all personnel are working from
the most current version of the documentation.
(d) Identify procedures for unique identification of:
(1) software changes,
(2) software and software components,
(3) release, version, and update designations,
(4) documentation,
(5) media.
(e) Define procedures for issuing, analyzing, estimating the impact of, classifying, approving or disapproving, distributing, scheduling implementation of, and tracking software change requests to completion. The procedures shall require that the activities necessary to verify the change are identified and justified.
(f) Identify procedures for support library operation (i.e., security, disaster recovery, archive maintenance, protection, access control, retention period, change history, withdrawal, submission, issue, and approval).
(g) Define procedures for change request documentation, error analysis summary reports, and configuration management status reports in order to identify the most current configuration and to provide a traceable history of software changes.
(h) Define procedures for approving and issuing software releases to the
user.
(i) Define procedures defining the extent to which verification activities (such as technical reviews) must be performed following a change.
(j) Define procedures for ensuring that the parts of the software not affected by a change will still work correctly following a change implementation (i.e., regression testing).
(k) Identify tools to be used for configuration management and procedures
for their use.
D.2.6 Training Standards and Procedures
Training standards and procedures shall identify sub-tier standards and procedures for maintaining training records and skills inventories.
COG-95-179
Appendix E
Software Engineering of Category III Software
Requirements for the Computer System Requirements Document
Product:
Company:
Company Representative(s):
Date:
The CSR documents define the requirements that must be met by the computer system. There may be a single requirements document for the whole computer system, or the requirements may be contained in several documents. All references herein refer to sections or tables in the Category III standard.
The CSR shall:
(a) Define the functional, performance, safety, reliability, and maintainability requirements of the computer system, clearly identifying the safety requirements.
(b)
Define the scope and boundary of the computer system and define the
interfaces to other systems. It shall contain or reference a description of
the problem domain including natural restrictions on the required
functions.
(c) Define human interface requirements.
(d)
Define all accuracy requirements and tolerances.
(e)
Define any constraints placed on the design options.
(f) Define any Quality Assurance and Software Quality Assurance (SQA) requirements to be met by the computer system. Include or reference the categorization analysis that provides the rationale for choosing these SQA requirements.
(g) Define the design quality objectives, if different from those in Section E.1, and their relative priority.
(h)
Define anticipated changes to the computer system based on an analysis
of experience with changes to similar systems and on projections of
future needs.
(i) Provide a clear definition of terms.
(j) Explicitly identify a comparable reference design or the absence of any.
(k) Contain no requirements that are in conflict with each other.
(l) Define each requirement uniquely and completely in one location to prevent inconsistent updates and to facilitate easy referencing by subsequent documents and verification processes.
(m)
Contain or reference a revision history.
COG-95-179
Appendix F
Software Engineering of Category III Software
Requirements for the Development Process Outputs
Product:
Company:
Company Representative(s):
Date:
This appendix defines the requirements which the outputs from the development processes must meet to be of acceptable quality. All references herein refer to sections or tables in the Category III standard.
The outputs are:
(a) Computer System Design,
(b) Computer Hardware Design,
(c) Predeveloped Software Description,
(d) Software Requirements Specification,
(e) Software Design Description,
(f) Source Code.
The following detailed requirements reference, using brackets "[...]", the quality attribute(s) from Appendix E which they are intended to address.
B.1 COMPUTER SYSTEM DESIGN (CSD)
The CSD documentation defines the functional decomposition, the required predeveloped software, and the hardware architecture of the computer system. There may be a single computer system design document for the whole computer system, or the information may be divided into several documents.
The requirements provided below address only computer system design issues related to software engineering. Specifically, requirements related to hardware engineering are not addressed.
B.1.1 Functional Decomposition
The CSD shall:
(a) Accurately define the computer system context. It shall define all inputs to the computer system ("monitored variables") and outputs from the computer system ("controlled variables"). A context diagram shall be provided. [Understandability, Verifiability]
(b)
Decompose the computer system into functional subsystems with
logical inputs and outputs, and uniquely identify the functional
subsystems. [Understandability. Correcmess, Structuredness.
Modularity)
~~
(c)
Define the logical interfaces between the functional subsystems.
Diagrams illustrating the interfaces between functional subsystems
shall be provided. [Understandability, Verifiability. Correcmess]
(d)
Allocate system requirements to the functional subsystems.
[Completeness, Verifiability, Modularity. Consistency,
Understandability]
(e)
Define the reliability requirements for each functional subsystem.
[Robustness, Predictability, Correctness]
(f)
Define the accuracy requirements and tolerances. [Correctness,
Predictability]
(g)
Define any constraints placed on the software design options.
[Consistency]
(h)
For each functional subsystem, define which functions must be
easy to change or reconfigure. [Modifiability]
(i)
Map the functional subsystems to one or more computers or
non-programmed hardware, predeveloped software, and custom-developed
software. [Understandability, Verifiability,
Completeness]
(j)
Identify and describe significant failure modes of the system.
Provide an overview of required fault-tolerance and graceful
degradation features provided by the system hardware architecture.
Examples would include hot-standby computers, warm-restart
functions, backups, etc. Describe or reference the failure, restart,
and resynchronization procedures and requirements. [Robustness,
Predictability]
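
By way of illustration only, the short C sketch below shows one way the context of a single
functional subsystem might be expressed, with hypothetical monitored (input) and controlled
(output) variables; the variable names, units, thresholds, and the subsystem itself are
invented for this example and are not taken from this guide. A context diagram in the CSD
would convey the same information graphically for all subsystems.

    #include <stdio.h>

    /* Illustrative only: hypothetical monitored and controlled variables
     * for one functional subsystem.  Names, units, and limits are invented. */
    typedef struct {
        double coolant_temp_degC;   /* monitored variable: sensed input  */
        double coolant_flow_kgps;   /* monitored variable: sensed input  */
    } monitored_t;

    typedef struct {
        double valve_demand_pct;    /* controlled variable: output, 0..100 */
        int    trip_request;        /* controlled variable: 0 = no trip    */
    } controlled_t;

    /* Hypothetical subsystem logic: open the valve fully on high temperature
     * and request a trip above a higher limit. */
    static void subsystem_flow_control(const monitored_t *in, controlled_t *out)
    {
        out->valve_demand_pct = (in->coolant_temp_degC > 80.0) ? 100.0 : 40.0;
        out->trip_request     = (in->coolant_temp_degC > 95.0);
    }

    int main(void)
    {
        monitored_t  in  = { 82.5, 12.0 };
        controlled_t out = { 0.0, 0 };
        subsystem_flow_control(&in, &out);
        printf("valve=%.1f%% trip=%d\n", out.valve_demand_pct, out.trip_request);
        return 0;
    }
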
B.1.2 Hardware Architecture
For functional subsystems allocated to hardware, the CSD shall:
(a)
Identify each computer, physical interface between computers, and
non-programmed hardware devices within the computer system.
Diagrams of the computer system hardware architecture shall be
provided. [Understandability, Completeness, Verifiability]
(b)
Describe the general high-level characteristics of each computer,
including types and approximate numbers of inputs, outputs, and
I/O devices, approximate amount of memory and mass storage,
approximate CPU performance, and other relevant information.
The definition of required computer capabilities shall be sufficient
to allow software requirements to be specified with confidence that
the computer equipment will be sufficient to support them.
[Completeness, Correctness]
(c)
Describe the general high-level characteristics of each physical
interface between computers, including type of communications
link, performance, determinism and other relevant information.
[Completeness, Predictability]
(d)
Describe the general high-level characteristics of each non-programmed hardware device within the computer system.
Identify the relevant design documentation for the device.
[Completeness, Consistency]
B.1.3 Predeveloped Software
For functional subsystems allocated to predeveloped software, the CSD
shall identify the types of predeveloped software to be used in the target
system. This includes operating systems, database managers, libraries, compilers, etc. [Understandability, Completeness, Verifiability]
B.2 COMPUTER HARDWARE DESIGN (CHD)
The CHD documentation describes the computer hardware. There may be a
single computer hardware design document for the whole computer system,
or the information may be divided into several documents. References to
the manufacturer's technical specifications may be used.
The requirements provided below address only computer hardware design
issues directly related to software engineering. Other information may be
required to engineer the hardware. Each of the requirements addresses the
correctness and completeness quality attributes.
The CHD shall, for each computer in the computer system, describe the
computer in detail, including items such as:
(a)
Bus type and speed,
(b)
Hardware-designated address mapping,
(c)
Direct Memory Access (DMA) capabilities,
(d)
Hardware interrupts and interrupt controller,
(e)
CPU type, instruction set, and speed,
(f)
Coprocessor types and speeds,
(g)
Memory types, sizes, and speeds,
(h)
Memory management facilities, including caching,
(i)
Mass storage device types, sizes, and speeds,
(j)
I/O devices, including controller types, specifications, addressing,
protocols, etc.,
(k)
Required initialization logic,
(l)
Required diagnostic facilities,
(m)
Power up, power fail, and restart logic,
(n)
Other information as required to completely characterize the
software interface to the computer hardware.
B.3 SOFTWARE REQUIREMENTS SPECIFICATION (SRS)
The SRS contains all requirements from the CSD which are relevant to the
software as well as all other software specific requirements which arise due
to the environment. Normally there will be one SRS for each functional
subsystem. Alternative organizations are possible as long as the SRS is
arranged to facilitate mapping to the functional subsystems defined in the
CSD.
B.3.1 Requirements
The SRS shall:
(a)
Describe all requirements from the CSD which are relevant to the
software. [Completeness]
(b)
Describe all additional requirements. [Completeness]
(c)
Describe any software implementation design constraints. This
might include the processors on which the software is required to
execute, and the predeveloped software required to be used. [Completeness]
(d)
Describe the logical inputs to and the logical outputs from the
functional subsystem. [Completeness]
(e)
Describe the required behaviour of the software in terms of what
the software must do. [Completeness, Correctness, Verifiability]
(f)
Describe the timing tolerances and the accuracy requirements by
specifying the allowable deviation from the specified behaviour of
the output variables. [Completeness]
(g)
Describe the characteristics of the logical and physical inputs and
outputs addressing such issues as types, formats, units, and valid
ranges. [Completeness]
(h)
Specify the response to exception or error conditions.
[Predictability]
(i)
Identify all those requirements for which future changes are
anticipated. [Modifiability]
(j)
Describe requirements for fault tolerance and graceful degradation.
[Robustness]
B.3.2 Constraints on the SRS
The SRS shall:
(a)
Be written strictly in terms of requirements and design constraints;
it shall limit the range of valid solutions, but shall not insist on a
particular design. [Completeness]
(b)
Be written according to SPH guidelines. [Correctness,
Completeness, Consistency]
(c)
Use standard terminology and definitions throughout the document.
[Consistency]
B.4 SOFTWARE DESIGN DESCRIPTION (SDD)
The SDD document is a representation of the software design. Normally the
SDD will be organized into an overview of the software architecture and
sections or sub-documents corresponding to each of the functional
subsystems. Alternative organizations are possible as long as the SDD is
arranged to facilitate mapping to the SRS. The SDD will be the primary
document issued to the client to allow understanding, use and maintenance
of the software.
which are not included. [Understandability, Maintainability]
(d)
Provide a mapping between functional subsystems, tasks,
predeveloped software and computers. [Understandability,
Verifiability, Maintainability]
(e)
Describe each task. [Understandability, Verifiability,
Maintainability]
(f)
Describe each data structure. [Understandability, Verifiability,
Maintainability]
(g)
Describe each intertask communications service.
[Understandability, Verifiability, Maintainability]
(h)
Describe each input/output device. [Understandability, Verifiability,
Maintainability]
(i)
Describe each library to be shared between tasks.
[Understandability, Verifiability, Maintainability]
(j)
Describe how the executable software for each functional
subsystem including predeveloped software and computer is
organized in the target computer system, for example, directory
structures or memory maps. [Understandability, Verifiability]
(k)
Identify the language(s) and translator(s) to be used. [Verifiability]
B.4.2 Software Detailed Design
This section of the SDD shall:
(a)
For each task or library identified in the software architecture
section, describe a software design which meets the relevant
requirements of the SRS to a level of detail that requires no further
refinement in the code of the module structure, module functions,
module interfaces, libraries, and data structures.
(b)
Provide the decomposition of the task or library into design
entities. Provide diagrams showing the program calling
hierarchies. Where object-oriented programming techniques are
used, provide diagrams of the class inheritance hierarchies.
[Understandability, Verifiability, Maintainability]
Provide a "module guide" containing the following information for
each module: [Understandability. Verifiability, Maintainability]
(I)
(2)
(3)
(4)
(5)
(6)
unique identifying name,
the purpose of the module,
the data structure(s) maintained by the module.
the list of programs contained in the module.
any required resources (elements external to the design,
e.g.. predeveloped software, devices, software services
(libraries). operating system services. processing resource
requirements),
the programming language@)to be used.
(d)
Provide the following information for each program:
[Understandability, Verifiability, Maintainability]
(1) unique identifying name,
(2) program type (e.g., main program, subroutine, function,
macro, interrupt handler),
(3) the purpose of the program (what requirement it
satisfies),
(4) program interface (calling sequence and passed
parameters).
(e)
Provide information for each data structure and data element
(possibly in the form of a data dictionary):
(1) unique identifying name,
(2) data type (e.g., integer, Boolean, floating point, structure,
file),
(3) the purpose of the data structure (what data it contains),
(4) the range of validity.
(f)
Describe how the software for each task or library is built or
generated. [Understandability, Verifiability]
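
To make the module guide, program, and data dictionary items above more concrete, the
following C sketch shows one possible layout for that information as header comments kept
with the code itself; the module, program, and data names are hypothetical, and an SDD could
equally well record this information outside the source.

    /* MODULE GUIDE ENTRY (illustrative, hypothetical names)
     * (1) Module name .......: flow_ctrl
     * (2) Purpose ...........: closed-loop control of coolant flow
     * (3) Data structures ...: flow_state_t (private)
     * (4) Programs ..........: flow_ctrl_step
     * (5) Required resources : OS periodic timer service
     * (6) Language ..........: C
     */

    /* PROGRAM ENTRY (illustrative)
     * Name ....: flow_ctrl_step      Type: subroutine (periodic task step)
     * Purpose .: satisfies a hypothetical SRS flow-setpoint requirement
     * Interface: flow_ctrl_step(setpoint, measured) -> valve demand (percent)
     */

    /* DATA DICTIONARY ENTRY (illustrative)
     * Name .....: flow_state_t       Type: structure
     * Purpose ..: integrator state of the flow controller
     * Validity .: integral term bounded to [-100.0, +100.0]
     */
    typedef struct {
        double integral;   /* accumulated error, percent; clamped to +/-100 */
    } flow_state_t;

    static flow_state_t state = { 0.0 };

    double flow_ctrl_step(double setpoint, double measured)
    {
        double error = setpoint - measured;
        state.integral += 0.1 * error;
        if (state.integral >  100.0) state.integral =  100.0;   /* enforce range */
        if (state.integral < -100.0) state.integral = -100.0;
        return 2.0 * error + state.integral;                    /* valve demand */
    }

Such header blocks are only one way of carrying the required information; any equally
complete and uniquely identified SDD representation would satisfy the items above.
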
B.4.3 Other Requirements To Be Met By The SDD
The SDD shall:
(a)
Describe error handling which is not specified in the SRS.
[Robustness]
(b)
Describe how relevant experience from previous systems defined
in the CSD has been factored into the design. [Correctness]
(c)
Identify all functions contained in the SDD which are outside the
scope of the SRS.
(d)
Identify each function in the design so that it can be uniquely
referenced by the code. The identifiers shall be consistent with
those used in the code.
(e)
Contain or reference a revision history which identifies all
modifications to the design and the rationale for these changes.
(f)
Use standard terminology and definitions throughout the document.
[Consistency]
B.5 SOURCE CODE
The source code is a complete and accurate translation of the design
described in the SDD. It includes both executable statements and data
declarations. The source code shall precisely implement the design as
described in the SDD. [Verifiability, Understandability, Correctness]
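
As one hedged illustration of how source code can be kept traceable to the SDD, and of the
requirement in B.4.3 (d) that design identifiers be referenceable from the code, the fragment
below tags a hypothetical program with the design-entity and requirement identifiers it
implements; the identifiers, naming scheme, and control law are assumptions made only for
this sketch.

    /* Illustrative traceability tags (hypothetical identifiers):
     * SDD design entity : FS-02 / TASK-FLOW / PRG-07 "flow_ctrl_step"
     * SRS requirement   : SRS-FLOW-012 (maintain flow setpoint within tolerance)
     * Any revision to the SDD entity should be reflected here, and vice versa.
     */
    #include <math.h>

    double flow_ctrl_step(double setpoint, double measured)   /* PRG-07 */
    {
        /* Proportional-only control as (hypothetically) described in the SDD,
         * with the output clamped to the 0..100 percent valve demand range. */
        return fmin(100.0, fmax(0.0, 2.0 * (setpoint - measured)));
    }
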
COG-95-179
Appendix G
Software Engineering of Category III Software
Requirements for the Verification Process Outputs
Product:
Company:
Company Representative(s):
Date:
SOFTWARE ENGINEERING OF CATEGORY III SOFTWARE
REQUIREMENTS
FOR THE VERIFICATION PROCESS OUTPUTS
REQUIREMENT
This appendix defines the requirements which the outputs from the
verification processes must meet to be of acceptable quality. All references
herein refer to sections or tables in the Category III standard. The
outputs are:
Computer System Design Review Report,
Software Requirements Review Report,
Software Design Review Report,
Subsystem Test Procedures,
Subsystem Test Report,
System Integration Test Procedures,
System Integration Test Report,
Validation Test Procedures,
Validation Test Report.
Many of the requirements on the review and verification reports are generic
and are listed in Subsection C.1. Those requirements which are specific to
the various review and verification reports are dealt with in separate
subsections. The requirements on the test procedure outputs are presented
individually in the following subsections. The requirements for all test
reports are presented in Subsection C.8 as a common set of requirements.
COMPLIES
REFERENCE
COMMENT
C.1 GENERIC VERIFICATION REPORT
REQUIREMENTS
The Verification Reports have a common set of requirements. To be of
acceptable quality, a report shall:
(a)
Identify the versions of the relevant documents.
(b)
Summarize the review or verification activities performed and the
methods and tools used.
(c)
Summarize the discrepancies.
(d)
Summarize the positive findings.
(e)
Describe the conclusions and recommendations.
(f)
Identify the review or verification participants.
(g)
Identify and comply with the applicable SPH requirements for
Verification Reports.
(h)
Identify or reference corrective action lists resulting from the
review or test.
C.2 COMPUTER SYSTEM DESIGN REVIEW
REPORT (CSDRR)
To be of acceptable quality, the CSDRR shall:
(a)
Provide evidence that a review of the design has been performed.
(b)
Provide evidence that the review has covered all requirements and
design constraints in the CSR and all functional subsystems,
computers, and interfaces in the CSD.
(c)
Provide evidence that the review has covered all standards and
procedures in the SPH applicable to the CSD.
C.3 SOFTWARE REQUIREMENTS REVIEW
REPORT (RRR)
To be of acceptable quality, the RRR shall:
(a)
Provide evidence that a review of the requirements has been
performed.
(b)
Provide evidence that the review has covered all requirements and
design constraints in the CSD and SRS.
(c)
Provide evidence that the review has covered all requirements and
design constraints appearing in the SRS which are not derived
from the CSD.
(d)
Provide evidence that the review has covered all standards and
procedures in the SPH applicable to the SRS.
C.4 SOFTWARE DESIGN REVIEW
REPORT (DRR)
To be of acceptable quality, the DRR shall:
(a)
Provide evidence that a review of the design has been performed.
(b)
Provide evidence that the review has covered all requirements and
design constraints in the SRS and all programs, data structures, and
data bases in the SDD.
(c)
Provide evidence that the review has covered all standards and
procedures in the SPH applicable to the SDD.
C.5 SUBSYSTEM TEST PROCEDURES (STP)
The Subsystem Test Procedures shall:
(a)
Require all tests to be done with predeveloped software and test
hardware that accurately represents the capabilities of the target
that are required for the functional subsystem.
(b)
Define test cases to test each functional requirement in the SRS.
(c)
Define tests to test the performance requirements as described in
the SRS.
(d)
Identify all resources used by the functional sub-system and define
test cases which test the functional subsystem under conditions that
attempt to overload these resources in order to determine if the
functional and performance requirements defined in the SRS are
met.
(e)
Define tests to exercise any interfaces between the software and
the hardware, and the software and the predeveloped software.
(f)
Define tests to show that the functional subsystem meets its
requirements under each hardware configuration and operational
option.
(g)
Define tests to test the ability of the functional sub-system to
respond as specified in the SRS to software, hardware, and data
errors.
(h)
Define test cases which attempt to subvert any existing safety or
security mechanisms.
(i)
Describe expected results of each test case so that a pass/fail
determination can be made as to the outcome of the test (see the
illustrative sketch following this list). The expected results shall be
based on the information contained in the SRS.
(j)
Identify all equipment (and its required calibration), tools, and
support software required to perform the test and provide adequate
setup and test execution instructions so that the test can be repeated by personnel who did not perform the original test. Each test
procedure shall describe or reference how the executable code to
be tested is built.
(k)
Identify the SRS name and version.
(l)
Provide a cross-reference between the sections of the SRS and the
test procedures to document the test coverage.
(m)
Comply with the applicable sub-tier standards and procedures in
the SPH.
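
For illustration only, the sketch below shows one simple way a subsystem test case could
record an expected result and tolerance taken from the SRS and make an automatic pass/fail
determination; the requirement identifier, tolerance, and function under test are hypothetical
and are not part of this guide.

    #include <math.h>
    #include <stdio.h>

    /* Hypothetical function under test (in practice, part of the subsystem). */
    static double flow_ctrl_step(double setpoint, double measured)
    {
        return 2.0 * (setpoint - measured);
    }

    /* Test case for hypothetical requirement SRS-FLOW-012: the expected result
     * and allowable deviation would, in a real test, be taken from the SRS. */
    int main(void)
    {
        double expected  = 10.0;   /* from SRS-FLOW-012 (hypothetical) */
        double tolerance = 0.5;    /* allowable deviation from the SRS  */
        double actual    = flow_ctrl_step(55.0, 50.0);

        int pass = fabs(actual - expected) <= tolerance;
        printf("SRS-FLOW-012: expected %.2f +/- %.2f, actual %.2f -> %s\n",
               expected, tolerance, actual, pass ? "PASS" : "FAIL");
        return pass ? 0 : 1;
    }
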
C.6 SYSTEM INTEGRATION TEST
PROCEDURES (SITP)
The System Integration Test Procedures shall:
(a)
Require all tests to be done with predeveloped software and
hardware that is closely representative of the capabilities of the
complete target system architecture described in the CSD.
(b)
Define hardware and software components to be integrated and
tested.
(c)
Define the system integration testing strategy and plan to be used,
e.g., Bottom-Up Integration, Top-Down Integration, Incremental
Builds, etc.
(d)
Define test cases to test that the interfaces between functional
subsystems behave as specified in the CSD.
(e)
Define test cases to test that the physical interfaces between
computers behave as specified in the CSD.
(f)
Define test cases to test that the functional subsystems perform in
an integrated environment as specified in the CSD.
(g)
Define test cases to test that the system architectural features
perform as specified in the CSD.
(h)
Describe expected results of each test case so that a pass/fail
determination can be made as to the outcome of each test from
information provided in the CSD.
(i)
Identify all equipment (and its required calibration), tools, and
support software required to perform the test and provide adequate
set-up and test execution instructions so that the test can be
repeated by personnel who did not perform the original test. Each
test procedure shall describe or reference how the executable code
to be tested is built.
(j)
Identify the CSD name and revision.
(k)
Provide a cross-reference between the sections of the CSD
components and the test procedures to document the test coverage.
(l)
Comply with the applicable sub-tier standards and procedures in
the SPH.
C.7 VALIDATION TEST PROCEDURES (VTP)
The Validation Test Procedures shall:
(a)
Require all tests to be done with predeveloped software and
hardware that duplicates to the greatest practical extent the
complete target system architecture.
(b)
Define test cases to test each functional requirement in the CSR
that is applicable to the system.
(c)
Define test cases to test each performance requirement in the CSR
that is applicable to the system.
(d)
Define test cases, using dynamic simulation of input signals, to
cover normal operation, anticipated operational occurrences, abnormal incidents, and accident conditions requiring system action as
indicated in the CSR.
(e)
Describe expected results of each test case so that a pass/fail
determination can be made as to the outcome of the test from
information provided in the CSR.
(f)
Identify all equipment (and its required calibration), tools, and
support software required to perform the test and provide adequate
setup and test execution instructions so that the test can be
repeated by personnel who did not perform the original test. Each
test procedure shall describe or reference how the executable
code to be tested is built.
(g)
Identify the CSR components and versions.
(h)
Provide a cross-reference between the sections of the CSR
components and the test procedures to document the test coverage.
(i)
Comply with the applicable sub-tier standards and procedures in
the SPH.
C.8 TEST REPORTS
A test report must document the results of each test activity.
The test report output from each test activity shall:
(a)
Identify or describe the tests performed.
(b)
Identify the test procedures.
(c)
Include the comparison of actual and expected test results as
defined in the referenced test procedures.
(d)
Summarize the discrepancies.
(e)
Summarize the positive findings.
(f)
Describe the conclusions and recommendations.
(g)
Identify the date and time of the performance of each test.
(h)
Identify the program, functional subsystem, or system, and the
versions being tested.
(i)
Identify the testers involved.
(j)
Reference the detailed test results.
(k)
Identify compliance with the applicable sub-tier standards and
procedures in the SPH.
COG-95-179
Appendix H
Software Engineering of Category III Software
Requirements for the Support Process Outputs
Product:
Company:
Company Representative(s):
Date:
SOFTWARE ENGINEERING OF CATEGORY III SOFTWARE
REQUIREMENTS FOR THE SUPPORT PROCESS OUTPUTS
REQUIREMENT
This appendix contains the requirements which the outputs from the support
process shall meet to be of acceptable quality. All references herein refer
to sections or tables in the Category III standard. The outputs are:
(a)
Software Development Plan (SDP)
(b)
Standards and Procedures Handbook (SPH)
COMPLIES
REFERENCE
COMMENT
D.l SOFTWARE DEVELOPMENT PLAN (SDP)
The SDP provides the comprehensive plan for the management of the
software engineering process.
The requirements of the SDP are subdivided into requirements for the
portions which make up the SDP:
(a) Project Plan Portion
(b) Development Plan Portion
(c) Verification Plan Portion
(d) Documentation Plan Portion
(e) Configuration Management Plan Portion
The SDP shall not duplicate information contained in overall project
management documentation, but rather shall reference it. The SDP shall be
revised when there are major changes to either the software scope of work
or to the project organizational structure.
In the following requirements, the term "provide" allows developers to use
either descriptions or references to supply the required information.
D.1.1 Project Plan Portion of SDP
The project plan portion of the SDP describes the scope of the software
engineering effort, the organization and responsibilities, and the key milestones and dates based on external detailed budget and schedule documentation used for monitoring the work effort.
The project plan portion of the SDP shall:
(a)
Identify the CSR
(b)
Identify any overall project management plans, and describe the
relationship of the SDP to them.
(c)
Adopt a specific software life cycle model, such as waterfall,
iterative waterfall, evolutionary builds, pre-planned product
improvement, etc., as a focus for planning so that a systematic
software engineering process is followed over the entire life of the
software. If iterations are planned, then the model shall show the
intended number of iterations. In addition, it shall treat
development and maintenance as an integral, continuous, and interactive process.
(d)
Describe the organizational structure for the project and
relationships to external organizational units (including the user),
including the scope, authority and responsibility of each. Show all
relevant job positions and the number of resources required within
the project and list the independence requirements of each job
position.
(e)
Define the following independence roles:
- Developer
- Verifier
- Validator
Responsibility for the development, verification, and
support processes shall be assigned to the defined roles
as described in Table D-I.
Verifiers have primary responsibility for directing
verification processes but developers may be involved in
the performance.
Verification shall not be performed by the same person
who developed the item being verified.
Developers shall not be involved in roles assigned to
Validators.
(g)
Include a policy statement that the SDP shall be followed by all
personnel who are responsible for the production of an output
required by the SDP.
(h)
Include a policy statement that the personnel who produce each
output required by the SDP have primary responsibility for the
output's quality.
(i)
Mandate the project-specific standards and procedures in the SPH
based on the requirements in Appendix D.2.
(j)
Identify training requirements for the project.
D.1.2 Development Plan Portion of SDP
The development plan portion of the SDP describes the approach to
development, key design and implementation issues, and identifies the
support tools and facilities.
The development plan portion of the SDP shall:
(a)
Partition the development effort into uniquely identifiable
development processes with well defined inputs and outputs (see
Table D-I).
(b)
Specify the approach and methodologies to be used for the
development processes.
(c)
Specify facilities, tools, and aids to be used during software
development.
(d)
Identify any key design and implementation issues and preliminary
studies, simulation modelling, and/or prototyping required to
resolve them.
D.1.3 Verification Plan Portion of SDP
The verification plan portion of the SDP describes the approach to the
verification of the software product. It identifies the verification activities
and support tools and facilities required to perform the verification.
The verification plan portion of the SDP shall:
(a)
Partition the verification effort into uniquely identifiable
verification processes with well defined inputs and outputs (see
Table D-I).
(b)
Specify the approach and methodologies to be used for the
verification processes.
(c)
Specify facilities, tools, and aids to be used during verification.
(d)
Require that verification activities shall not start until the outputs
being verified are completed and placed under configuration
control.
(e)
Identify the qualification requirements for predeveloped software
for the target system and software tools.
D.1.4 Documentation Plan Portion of SDP
The documentation plan portion of the SDP lists and provides summary
descriptions of all project documentation related to software. It also
identifies responsibility for producing, reviewing, approving, and issuing
documents. At the very least, the plan covers all document outputs
identified in Table D-I.
The documentation plan portion of the SDP shall:
(a)
List and briefly describe all project documents and records that are
a result of the software project covered by the SDP. It shall
distinguish between deliverable end-user documentation and non-deliverable documentation.
(b)
Provide a list of all deliverable software, test software, and support
software.
(c)
Provide a list of all deliverable documents obtained from external
suppliers as they become known.
(d)
Identify responsibility for producing, reviewing, approving, filing,
and issuing documents.
(e)
Identify facilities, tools and aids used to produce, manage, and
publish documentation.
COG-95-179
Appendix I
Product:
Company:
Company Representative(s):
Date:
GOODNESS OF DESIGN REQUIREMENTS
COMPLIES
REQUIREMENT
The following requirements address attributes related to goodness of design.
(a)
understandability - Refers to the extent to which the meaning of the
design is clear to the reader.
(b)
verifiability - Refers to the extent to which the design facilitates
testing.
(c)
robustness - Refers to the extent to which the design implements
the ability to continue to perform despite some subsystem failure.
REFERENCE
COMMENT
(d)
predictability - Refers to the extent to which the functionality and
performance of the software are deterministic for a specified set of
inputs.
(e)
modifiability - Refers to the characteristics of the design which
facilitate the incorporation of changes.
(f)
modularity - Refers to the extent to which the design is composed
of discrete components such that a change to one component has
minimal impact on the others.
(g)
structuredness - Refers to the extent to which the code possesses a
definite pattern in its interdependent parts. This implies that the
design has proceeded in an orderly and systematic manner (e.g.,
top-down design), has minimized coupling between modules, and
that standard control structures have been followed during coding,
resulting in well structured software.
(h)
consistency - Refers to the extent to which the code contains
uniform notations, terminology, comments, symbology, and
implementation techniques.
COG-95-179
Appendix J
Product:
Company:
Company Representative(s):
Date:
SAFETY NET REQUIREMENTS
REQUIREMENT
The following requirements address a variety of design decisions that can
help ensure the safe operation of a software product.
(a)
Program memory, i.e. that memory in which the executable
program resides, shall be protected against CPU writing.
tolerance) shall be guaranteed by the software, in both normal and
abnormal (e.g. power loss, system degradation) operating modes.
(g)
Data integrity over communication pathways shall be upheld with
the use of error detection and/or correction techniques (a minimal
sketch of one such technique follows).
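
As a minimal sketch of one error-detection technique that could support item (g), the C
fragment below computes and checks a CRC-16 (CCITT polynomial 0x1021) over a message
buffer; the frame contents, polynomial choice, and check procedure are assumptions made for
this example and are not requirements of this guide.

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* CRC-16/CCITT (polynomial 0x1021, initial value 0xFFFF), bit-by-bit. */
    static uint16_t crc16_ccitt(const uint8_t *data, size_t len)
    {
        uint16_t crc = 0xFFFF;
        for (size_t i = 0; i < len; i++) {
            crc ^= (uint16_t)data[i] << 8;
            for (int bit = 0; bit < 8; bit++)
                crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021)
                                     : (uint16_t)(crc << 1);
        }
        return crc;
    }

    int main(void)
    {
        uint8_t frame[32] = "SET FLOW 55.0";     /* hypothetical message        */
        size_t  len = strlen((char *)frame);

        uint16_t sent = crc16_ccitt(frame, len); /* sender appends this CRC     */
        frame[3] ^= 0x01;                        /* simulate a one-bit line error */
        uint16_t recv = crc16_ccitt(frame, len); /* receiver recomputes the CRC  */

        printf("%s\n", (sent == recv) ? "frame accepted"
                                      : "CRC mismatch: frame rejected");
        return 0;
    }
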
COMPLIES
REFERENCE
COMMENT
COG-95-179
Appendix K
FAILURE MODE ANALYSIS QUESTIONNAIRE
Product:
Company:
Company Representative:
Date:
Failure modes are examined with the intent of ensuring that all potential system faults are
handled either by the software product or by other defense mechanisms.
1.
What are the possible modes of failure of the system containing the software product?
2.
What are the possible modes of failure of the software product?
3.
Is the software product susceptible to any failures of the application or person using it
(e.g. errors in interfacing with the product)? Are these susceptibilities described in the
product documentation? What are they?
4.
Can any failure place the system in an indeterminate state requiring special recovery
procedures? Are those recovery procedures described in the product documentation?
What are those recovery procedures?
5.
Can the failure of the system containing the software product cause the software to
behave erratically? Under what circumstances?
6.
What fault detection mechanisms are provided with the software product?
7.
What methods are provided to prevent or contain the effect of failure?
8.
Does the software stabilize itself after a fault?
9.
Does the software product have any fault tolerant capabilities? Are they documented
in the product documentation? What are they?
10.
Does the software product provide a "fail safe" indication/output?
11.
For all failures that can occur, does the user receive a clear indication as to the
nature/cause of the failure?
12.
Is redundancy provided in the system which contains the software?
13.
How many hardware faults can the system tolerate?
14.
Can the system tolerate any software fault?
15.
Can the system have multiple failures attributed to a common cause?
16.
Are common mode failures considered in the design of the system?
17.
What abnormal symptoms/indications should a user be aware of during the operation
of the system? Are these included in the product documentation?
18.
If the software or system in which the software resides is a replacement for another
system, are the modes of failure identical to that system? If not, how do they differ?
19.
If the software or system in which the software resides is a replacement for another
system, are the results or external behaviour identical to that system? If not, how do
they differ?
COG-95-179
Appendix L
OPERATING HISTORY QUESTIONNAIRE
Product:
Company:
Company Representative:
Date:
This questionnaire is used to obtain raw data for the evaluation of product reliability based on
the analysis of operating history.
1.
What revisions have been released?
2.
What are all of the revision release dates?
3.
Are there significant differences between revisions for various platforms? What are
they and which sub-components or modules are affected?
4.
How many copies of each revision of the product have been delivered to customers,
broken down by year, since the beginning of the product's life?
5.
What percentage of the copies delivered have been installed in representative
applications? (A "representative" application can be characterised as process
monitoring, process control, turnkey, mission critical.)
6.
What amount of operating history is relevant to the platform being used (<platform>)?
7.
What process do you use that allows customers to report errors they encounter using
the product?
8.
How many software errors were reported on each revision of the product?
9.
How many patches (bug fixes) were made on each revision of the product?
10.
What are the major differences between the revisions by major component?
11.
What major components are most affected by the revisions?
12.
What major components are most error prone?
13.
Is there a record of outstanding errors for each revision? Is this record provided to the
customer? Can a copy of the record be obtained for review?
14.
Is there a workaround for outstanding errors for each revision?
15.
What documented evidence was used to arrive at the above operating history data?
16.
Can superseded revisions (i.e. older than the most recent revision) of the product be
purchased? Which revisions can not be purchased?
17.
Have any customers purchased a superseded revision of the product?
18.
How are revisions of the software components of the product maintained to ensure
reliable recreation of any given revision?
19.
What is the process for accepting and disposing product deficiency reports received
from customers?
20.
What process do you use for making changes and reviewing and testing fixes to
software components of the product?
21.
How do you classify errors? What do you consider to be a serious error?
22.
How many of the errors are serious enough to prevent a component from completing
its function?
23.
What revision(s) of the product are most reliable, why?
24.
How many hours of internal alpha and beta testing are done for the software product?
25.
What metrics are used for benchmarking the software product? Are the metrics
calculated for failures, bugs, availability statistics etc.? Are they made available to the
customers? Are they collected by the vendor or an independent organization?
26.
Can the software be selectively disabled or certain portions left out to eliminate error
prone parts of the software product?
COG-95-179
Appendix M
Product:
Company:
Company Representative(s):
Date:
Maintenance Requirements
REQUIREMENT
This appendix defines the requirements to be met by the vendor to ensure that they
are able to maintain the software product.
1.
The vendor shall be willing to sell the software design documentation
and sources or provide them via an escrow agreement.
2.
The product shall have a complete set of administration and maintenance
documentation, where required, apart from the reference and user
information. This documentation shall form the basis of performing all
the in-house maintenance and support tasks.
3.
The vendor's administration and maintenance documentation shall (where
applicable):
(a)
Describe software installation and configuration/customization
procedures.
(b)
Describe software update notification procedures.
(c)
Describe the procedure for the installation of updates and patches.
(d)
Describe the runtime and emergency back-up and restoration
procedures.
COMPLIES
REFERENCE
COMMENT
(e)
Describe the software troubleshooting and bug reporting procedures (e.g.
means of communicating with technical support to ensure a timely
response to software problems).
(f)
Describe the procedures for actively monitoring the status of the software
product while in an on-line state.
(g)
Identify the likelihood and frequency of product reconfiguration of
hardware, peripherals, operating system, functions, interfaces,
communication, etc.
(h)
Specify the time and effort required to replace peripherals or circuit
boards or other hardware parts etc. for the system.
(i)
Identify the serviceable parts (e.g. power supplies, firmware etc.) and
effort needed in servicing them.
(j)
Specify the requirements and product features needed to support backups,
hot standby and swap out systems.
(k)
Specify the need for initializing, pre-testing and controlling the
configuration of the various libraries or modules before rebuilding.
4.
Once the software engineering of the product has been completed (i.e.
after first customer ship of a revision), the product shall be placed under
"production" configuration management by the vendor. The "production"
configuration management shall:
(a)
Uniquely identify all categories of configuration items that form a
deliverable baseline for the software product: application software
(source, object, listing, load images), software media, all deliverable and
support documents, support tools (compilers, linkers, operating systems),
data, test software, command files or scripts, and pre-developed software.
(b)
Specify when and under what conditions all configuration items come
under change control. As a minimum, every document and piece of
source code shall go under change control before the product is shipped
to any customer.
(c)
Require that each change be either implemented starting at the highest
affected output and proceeding to subsequent outputs (e.g., CSR to
CSD to SRS to SDD to source code) or that each change be
documented so that all pertinent documentation shall be updated before a
subsequent major release of the product.
(d)
Ensure that each change to the software product shall undergo
verification and validation consistent with those performed during
software engineering before being released to the customer. This shall
include appropriate regression testing.
(e)
Define a mechanism for the review. classification, and adjudication of all
change requests by appropriate vendor personnel.
(f)
Identify a software, documentation and firmware library for the storage
and control of all deliverable revisions of the software product including
those that are special issues to any individual customer. Such a library
may be distinct from the one used for the original software engineering
of the product.
(g)
Provide procedures for the maintenance of this library to address
security, disaster recovery, archive maintenance, protection, access
control, retention period, change history, withdrawal, submission, issue,
and approval issues.
(h)
Identify a mechanism for analysis of the errors detected after first
customer ship as an aid for improving the vendor's processes and
procedures.
(i)
Define a mechanism for the collection, identification, dispositioning,
correction and verification of errors reported by the product
users/customers.
(j)
Define a procedure for the integration of any changes made at a
customer site into the vendor's configuration management system.
(k)
Provide a means for the notification of reported errors to the product
users/customers, their impact and possible workarounds.
(l)
Identify a procedure for ensuring that customers are delivered the correct
revision including a consistent set of product documentation.
(m)
Identify procedures for unique identification of:
- software changes
- software and software components
- release, version and update designations
- documentation
- media
(n)
Define procedures for change request documentation, error analysis
summary reports, and configuration management status reports in order
to identify the most current configuration and to provide a traceable
history of software changes.
(o)
Define procedures for approving and issuing software releases to the
user.
COG-95-179
Appendix N
REFERENCE SITE QUESTIONNAIRE
Product:
Company:
Site Name:
Site Representative:
Date:
This questionnaire is used to obtain raw data for the evaluation of product reliability and
maintainability based on the analysis of anecdotal evidence.
Application Profile
1.
Please characterize the context of use of the product, i.e. describe the application
system configuration at a high level.
2.
Please provide an indication of the purpose of the software product in the system in
which it is used. Also, identify any resultant application specific functional or
performance demands put on the product.
3.
What operating environment is the product being used in, i.e. what hardware platform,
operating system, other third party software, special configurations is the software
product used in?
4.
Please identify any site specific configuration or usage issues (i.e. whether only a few
key features or functions are used, or is the configuration unique? etc.).
5.
Characterize the usage demands on the product, i.e. continual heavy use (explain),
periodic use or occasional but heavy load demands, etc.
6.
Have you used all of the functions and configuration options of the software product?
7.
Can you please summarize the hours of service of the product at your site?
Reliability
8.
If you had your choice now, would you still be using this product? Why or why not?
9.
What version of the software product are you currently running in the application?
10.
What other versions did you run previously in the application?
11.
How long have you run each of the versions of the software product?
12.
How many errors have you encountered in each of the versions of the software
product?
13.
Are any of these errors serious enough to impact your application? What are the
recovery actions? Do you have any workarounds for these errors?
14.
Are there any significant product functionality deficiencies or bugs outstanding that
have affected the performance or reliability of your system?
15.
Are there any internal procedures or practices undertaken in-house with respect to the
product's operation, servicing, upgrading, or maintenance which were needed to ensure
system or product reliability or availability (eg. procedures for re-start, swap-out,
re-configuration, diagnostics or monitoring, or servicing?)
16.
What parts of the product are particularly error-prone?
17.
What errors can a user make which will bring the system/application down?
Maintainability
18.
How reliable and accessible is the supplier?
19.
Do you have a maintenance and support contract for the software product? Have you
had to rely on it and under what circumstances? How often? Did you receive
satisfactory support?
20.
Do you have a regular preventative maintenance contract? Has this impacted
reliability or availability of the product or the system as a whole? Are you satisfied
with service?
21.
Are supplier personnel competent and helpful in answering questions and solving
problems?
22.
Is it relatively easy to install upgrades to the software product? What issues, problems,
or concerns have you encountered while upgrading the product? Have you had to put
in place any internal procedures or steps to deal with upgrades (i.e. off-line test suites
etc.)?
23.
How much time does it take to install upgrades to your production systems?
24.
Is the backup procedure supplied with the software product adequate?
25.
How long does it take to execute the backup procedure?
26.
Has the supplier installed any custom modifications/enhancements to the software
product for you?
27.
Were the custom modifications/enhancements satisfactory?
28.
Were the custom modifications/enhancements completed in a reasonable time?
29.
Was the installation of the custom modifications/enhancements done in a reasonable
time?
Failure Modes
30.
Have you encountered any failure modes that are not clearly documented in the
documentation provided with the software product? What are they?
31.
For all failures that occur, do you receive a clear indication from the software product
as to the nature/cause of the failure?
32.
What abnormal operating modes (low power, loss of power, etc.) do you expect the
software product to operate in? Does it handle all of these abnormal operating modes
adequately?
33.
Has the software product ever gone into an indeterminate state requiring special
recovery procedures? Are those recovery procedures described in software product's
documentation? What are those recovery procedures?
34.
Have you incorporated any special features into your application to deal with failure
modes of the software product? What are they?
35.
Is the software product susceptible to any failures of the application/person using it
(e.g. errors in interfacing with the software product)? Are these susceptibilities
described in the software product's documentation? What are they?
36.
Does the software product have any fault tolerant capabilities that you rely on? Are
they documented in the software product's documentation? What are they?
37.
Do the fault tolerant capabilities of the software product work as expected? If not,
what are the limits of their capabilities?
38.
Does the software product provide a "fail safe" indication/output to the application?
Has it proven to be truly fail safe in every instance?
39.
Has the vendor's process for reporting and handling of bugs been adequate? Has the
response time and the reliability of fixed problems been good?
COG-95-179
Appendix O
MAINTENANCE QUESTIONNAIRE
Product:
Company:
Site Name:
Site Representative:
Date:
This questionnaire is used to obtain information from the product's vendor regarding
maintenance issues.
1.
What is the estimated maximum time required to install and configure the product?
2.
What special knowledge is required in order to install and configure the product?
3.
What is the estimated maximum time required to install upgrades (or patches)?
4.
What special knowledge is required in order to install upgrades (or patches)?
5.
Is it possible to selectively upgrade components of the software product?
6.
Is it possible to upgrade a component when the product is in an online state (e.g.
loadable drivers or modules)?
7.
What are the required or recommended day-to-day maintenance tasks involved in using
the product (e.g. administration, backup, etc)?
8.
What is the expected effort involved in performing day-to-day maintenance of the
product?
9.
What special expertise is required to perform day-to-day maintenance of the product?
10.
Is training/assistance available for each of installation and configuration, installing
upgrades and day-to-day maintenance?
11.
What special maintenance tools, software and equipment should the project team
consider purchasing to assist in day-to-day maintenance of the product?
12.
What level and type of support is available from the vendor for the installation,
configuration, upgrading and maintenance of the software product?
13.
What forms of access does the vendor provide for customer maintenance and support
team (e.g. technical support hot-lines, on-line archives, World Wide Web (WWW)
homepages, on-line conferencing facilities, remote diagnostics, etc.)?
14.
What is the expected product life expectancy and support life expectancy of the
product?
15.
What operating environment issues may have an impact on maintenance and support
of the product?
16.
Are the vendor's ongoing product enhancement activities taking into account complete
backward compatibility with existing and previous product revisions?
17.
What industry and de facto standards were adhered to while designing the product?
What certification has the product received for this adherence?
18.
Are there any special practices, procedures or test functions with regard to the product
which may impact operation of the product (e.g. should diagnostics be run off-line or
on-line)?
19.
Does the vendor provide onsite engineering support for the product? Does this onsite
support extend to all software portions of the product as well?
20.
What resources, software and equipment are needed to make changes to the product on
site?
21.
What are the issues or concerns with respect to the vendor supporting a product that
has been modified by the customer or by the vendor on behalf of the customer?
22.
What test suites (on-line and off-line) and diagnostic tools are available for product
problem identification?
23.
What are the likely enhancements or new features planned for the product and what is
their estimated timeframe for first customer ship?
24.
What tools are available for the online or automatic monitoring, reporting and reaction
to software product malfunction in a networked environment? What standard
protocols does such a feature support?
COG-95-179
Appendix P
DISTRIBUTION LIST FOR COG-95-179
D.S. Ahluwalia
AECL - CRL Stn. 30
A. Hepburn
AECL - Saskatoon
J. Pauksens
AECL - SP2 F4
D. Arapakota
AECL - Saskatoon
G.J. Hinton
AECL - SP2 F3
G. Raiskums
AECL - SP1 F3
G.H. Archinoff
AECL - SP1 F1
R.J. Hohendorf
Ontario Hydro - H12 F27
C. Royce
Ontario Hydro
S. Basu
D.R. Huget
Ontario Hydro
S. Russomanno
- Bruce A
Ontario Hydro
W.C. Bowman
Ontario Hydro - H12 A26
E. Hung
Ontario Hydro
A. Campbell
Ontario Hydro
N. Ichiyen
- Pickering
AECL
- Bruce B
- SP
- H12 B25
AECL
- Darlington
G.R. Schneider
Ontario Hydro - Pickering
- SP1 F3
R.R. Shah
AECL - CRL Stn. 30
D. Chan
AECL - SP1 F2
P.K. Joannou
Ontario Hydro
A.T. Chen
Ontario Hydro
R.A. Judd
AECL - CRL Stn. 91
H. Storey
NBEPC - Point Lepreau
G.D. Cleghorn
A.M. Kozak
A. Stretch
AECL - Saskatoon
C. Cosgrove
AECL - CRL Stn. E6
R. Landry
Ontario Hydro
- Darlington
D.R. Tremaine
Ontario Hydro
J. deGrosbois
AECL - CRL Stn. 91
D.K. Lau
Ontario Hydro
- H12 A26
R. Trueman
AECL - Saskatoon
- Pickering I
- H12 F27
T.E. Smart
Ontario Hydro
- Pickering
- H12 F25
E.G. Echlin
AECL - CRL Stn. 30
L.R. Lupton
AECL - CRL Stn. 91
N.J. Webb
Ontario Hydro
- Bruce A
N. Gour
Hydro Quebec
D. Myers
AECL - Saskatoon
R. Zemdegs
Ontario Hydro
- H12 F21
- Gentilly-2
COG-95-179-1
Notice:
This report is not to be listed in abstract journals. If it is cited as a
reference, the source from which copies may be obtained should be
given as:
Document Centre
AECL
Chalk River, Ontario
Canada K0J 1J0
Fax: (613) 584-8144
Tel.: (613) 584-3311
ext. 4623
Copyright © Atomic Energy of Canada Limited, 1995