Software Inspection: Eliminating Software Defects1
Bill Brykczynski
Reginald Meeson
David A. Wheeler
Institute for Defense Analyses
1801 N. Beauregard St.
Alexandria, VA 22311-1773
703-845-6641
bryk@ida.org
Introduction
Software inspection is an industry-proven process for eliminating defects and reducing
development costs in complex systems. Software inspections can identify and eliminate approximately 80 percent of all software defects during development. When inspections are combined
with normal testing practices, defects in fielded software can be reduced by a factor of 10. By
reducing the amount of rework typically experienced in development, inspections increase productivity and reduce costs and schedules. Cost and schedule reductions for typical applications are on
the order of 30 percent.
This paper identifies rework as a significant software development cost driver, describes
how inspection reduces rework, explores the costs and benefits of inspection, and examines the
current state of inspection practice in industry. Even though inspection is practiced by commercial
software industry leaders and its benefits have been verified and documented, the process is still
not in common use across the industry.
Cost of Software Testing
Figure 1 shows a pie chart that reflects the high cost of testing in software development
[Boehm 1987]. What is striking about this figure is the huge amount of time spent on testing—nearly
the entire bottom half of the pie. Upon further investigation, however, we found that a large part
of the testing time is not actually used for testing. The extra testing time is spent locating and
correcting defects that are found by tests—rework. The amount of effort spent in rework is shown
in Figure 2.
1. The work reported in this paper was conducted as part of Institute for Defense Analyses project T-R2.597.21 under contract number MDA903-89-C-0003. The publication of this paper does not indicate endorsement by the Department of Defense or IDA, nor should the contents be construed as reflecting the official positions of those organizations. A similar version of this paper was presented at the 6th Annual Software Technology Conference, April 10-15, 1994.
Figure 1. Cost of Software Development (pie chart: Requirements 7%, Preliminary Design 16%, Detailed Design 24%, Code & Unit Test 24%, Integration & System Test 29%)

Figure 2. Hidden Cost of Defect Rework (pie chart: Requirements 6%, Preliminary Design 12%, Detailed Design 16%, Code & Unit Test 12%, Integration & System Test 10%, Rework 44%)
Boehm reports that 40 to 50 percent of typical software development effort is devoted to correcting
defects [Boehm 1987]. Such a high level of rework does not reflect positively on developers and,
hence, is not often openly reported. Figure 3 shows how the time spent correcting defects is
distributed over project phases [Boehm 1987]. Rework grows steadily and, by the final integration
and test phase, typically consumes two-thirds of the effort. Clearly, rework is an aspect of software
development that, if reduced, can lead to significant cost, schedule, and risk reductions.
Figure 3. Distribution of Rework During Development (stacked bars per phase, with the rework share shaded: Requirements 6% + 1% rework, Preliminary Design 12% + 4%, Detailed Design 16% + 8%, Code & Unit Test 12% + 12%, Integration & System Test 10% + 19%; total rework 44%)
Many defects found in testing are directly traceable to requirements and design flaws that
could have been detected earlier. Defects that are detected soon after they are introduced are relatively easy and inexpensive to correct. When they are not detected until later in development, the
costs of correcting these same defects are compounded by having to undo work that was based on
the incorrect foundations. Finding that a requirement was not correctly understood in the last stages
of testing can easily lead to cost and schedule overruns. The cost of correcting the same problem
in the requirements phase is usually negligible. The most effective process known for finding
defects across all stages of software development is inspection.
Software Inspection Process
The inspection process was developed by Michael Fagan in 1972 while at IBM [Fagan
1976]. The process involves detailed, formalized examinations of work products. The objective of
inspection is to identify defects. No time is spent during an inspection discussing how to correct a
defect. Corrections are left for the author to make later.
Work products are small but complete chunks of work on the order of 200 to 250 lines of
code for code inspections. Requirements, designs, and other work products are inspected in similar-sized chunks. Work products are considered work in progress until the inspection and any necessary corrections are completed.
Inspection teams consist of 4 to 5 coworkers. Each inspector will typically spend 1 to
4 hours reviewing the work product and related information before an inspection meeting, depending on how familiar they are with the material. Inspection meetings are generally limited to a maximum of 2 hours; after 2 hours, the number of defects found drops off significantly.
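As a rough illustration of these rules of thumb, the following Python sketch estimates the number of meetings and staff-hours needed to inspect a body of code. The helper function, its defaults, and the 10,000-line example are purely illustrative assumptions, not part of any published inspection procedure.

    def plan_inspections(total_lines, chunk_size=225, team_size=5,
                         prep_hours_per_inspector=2.0, meeting_hours=2.0):
        """Estimate meetings and staff-hours to inspect total_lines of code."""
        if not 200 <= chunk_size <= 250:
            raise ValueError("code work products are typically 200 to 250 lines")
        if not 4 <= team_size <= 5:
            raise ValueError("inspection teams are typically 4 to 5 coworkers")
        if meeting_hours > 2.0:
            raise ValueError("meetings over 2 hours find few additional defects")
        meetings = -(-total_lines // chunk_size)  # ceiling division
        staff_hours = meetings * team_size * (prep_hours_per_inspector + meeting_hours)
        return meetings, staff_hours

    meetings, hours = plan_inspections(10_000)
    print(f"{meetings} meetings, roughly {hours:.0f} staff-hours")
    # -> 45 meetings, roughly 900 staff-hours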
Managers are not permitted to participate in inspection meetings. When managers participate, the process tends to identify only superficial defects: inspectors report respectable numbers of defects, and authors are not embarrassed by serious defects in their work. The result is that the inspections are ineffective and serious defects
remain in work products. To avoid this phenomenon, management must be excluded from inspections. The performance of developers and inspection teams can be measured more accurately in
terms of the defects that remain in finished work products.
Responsibilities for several roles are assigned to inspection team members. The most
important is the role of moderator, who must keep the inspection on track. The reader paraphrases
the work product while the author and other inspectors read along and comment on discrepancies.
The recorder or scribe records the locations and a brief description of any defects discovered.
The two principal outputs from an inspection are a list of defects for the author to correct
and an inspection summary for management that describes what was inspected, who the inspectors
were, and the number and severity of defects found. In addition, any systemic defects that are identified are reported for consideration in general process improvement.
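The content of these outputs can be pictured as simple records. Below is a minimal Python sketch; the field names are hypothetical, chosen only to mirror the items listed above.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Defect:
        location: str           # where the defect was found
        description: str        # brief description recorded by the scribe
        severity: str           # e.g., "major" or "minor"
        systemic: bool = False  # flagged for general process improvement

    @dataclass
    class InspectionSummary:
        work_product: str       # what was inspected
        inspectors: List[str]   # who the inspectors were
        defects: List[Defect] = field(default_factory=list)

        def severity_counts(self):
            """Number of defects found, broken out by severity."""
            counts = {}
            for d in self.defects:
                counts[d.severity] = counts.get(d.severity, 0) + 1
            return counts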
For a more detailed description of the inspection process, see [Ackerman 1989, Fagan
1976, Fagan 1986, IEEE 1989].
Other Types of Reviews
It is important to distinguish between inspections and less formal reviews. Walkthroughs
and informal peer reviews are similar in many ways to inspections but are less rigorous: for
example, they may skip the preparation time, eliminate or merge roles (in particular, the author
may serve as moderator), eliminate the follow-up on corrections, and eliminate the data collection
used to measure effectiveness and improve the process. These deviations weaken the process,
making walkthroughs and informal peer reviews significantly less effective than inspections.
Most of the published “success stories” involving review techniques concern the inspection
process, not walkthroughs or informal peer reviews. The IEEE standard for software reviews and
audits provides descriptions of and objectives for inspections, technical peer reviews, and walkthroughs [IEEE 1989].
Benefits from Software Inspections
Figure 4 shows the conventional sequence of software development phases. The numbers
beside the boxes show the number of defects passed on from one development phase to the next.
The number of defects at the completion of unit testing and the number delivered to the field are
industry averages [Jones 1986, Jones 1991].
The size of the arrows pointing back from the testing phases to the construction phases
reflects the cost of correcting defects that were not detected earlier. For example, a misunderstood
requirement that is not recognized until the final system testing phase will typically have the
highest cost to correct. It may also delay system delivery.
Figure 5 depicts the introduction of inspections for requirements, designs, and code. The
objective of these inspections is to find all the defects at each phase and to proceed to the next phase
with a completely correct basis. Even though the ideal of completely eliminating defects is rarely
achieved, the number of defects passed on to succeeding phases is reduced significantly. Inspections can also be used to ensure that test procedures and data are correct.
Figure 4. Typical Software Defect Profile (defects per KLOC passed from phase to phase: Requirements 20, Design 40, Code 100, Unit Test 50, Integration Test 20, System Test 10 fielded; arrows back from the testing phases to the construction phases represent rework costs)
Figure 5. Defect Profile with Inspections (defects per KLOC with inspections after each phase, numbers without inspections in parentheses: Requirements 5 (20), Design 8 (40), Code 15 (100), Unit Test 7 (50), Integration Test 3 (20), System Test 1 (10) fielded; rework costs are correspondingly reduced)
Figure 5 also illustrates that the number of defects passed from phase to phase is dramatically
reduced when inspections are used. (The numbers in parentheses show the number of defects
when inspections are not used.) The cumulative effect of requirements, design, and code inspections is an order-of-magnitude reduction in the number of defects in fielded products. In addition
to this gain in quality, there is a corresponding gain in productivity because the amount of rework
needed to correct defects during testing is significantly reduced.
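The arithmetic behind Figures 4 and 5 can be captured in a few lines of Python. The per-phase injection counts and test removal fractions below are inferred from the figures rather than taken from a published model; an inspection effectiveness of about 75 percent, consistent with the 80 percent claim in the introduction, roughly reproduces the inspection numbers.

    # Defects injected per phase and fractions removed by each test phase,
    # inferred from Figures 4 and 5 (illustrative, not industry constants).
    INJECTED = {"requirements": 20, "design": 20, "code": 60}  # defects/KLOC
    TEST_REMOVAL = {"unit test": 0.50, "integration test": 0.60, "system test": 0.50}

    def defect_profile(inspection_effectiveness=0.0):
        """Defects/KLOC passed on after each phase; the effectiveness argument
        is the fraction of defects an end-of-phase inspection removes."""
        remaining, profile = 0.0, {}
        for phase, injected in INJECTED.items():
            remaining = (remaining + injected) * (1.0 - inspection_effectiveness)
            profile[phase] = round(remaining, 1)
        for phase, removed in TEST_REMOVAL.items():
            remaining *= 1.0 - removed
            profile[phase] = round(remaining, 1)
        return profile

    print(defect_profile(0.0))   # 20, 40, 100, 50, 20, 10 -- Figure 4
    print(defect_profile(0.75))  # about 5, 6, 17, 8, 3, 2 -- close to Figure 5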
Inspections reduce the number of defects in work products throughout the development
process. More defects are found earlier, when they are easier and much less expensive to correct.
Inspections are able to uncover defects that may not be discovered by testing. Examples of this
include identifying special cases or unusual conditions where an algorithm would produce incorrect results. In addition to finding defects, inspections serve as a training process where inspectors
(who are also authors of similar work products) learn to avoid introducing certain types of defects.
Inspection Costs
Inspections require an up-front investment of approximately 15 percent of total development costs. This investment pays off in a 25 to 35 percent overall increase in productivity. This
productivity increase, as demonstrated by Fagan in industry studies, can be translated into a 25 to
35 percent schedule reduction [Fagan 1986].
Figure 6 shows typical spending-rate profiles for development projects with and without
inspections. The taller curve shows increased spending early in the project, reflecting the time
devoted to inspections. This curve then drops quickly through the testing phases. The broader
curve, for projects that do not use inspections, shows lower initial spending but much higher spending through testing, reflecting the 44 percent rework being done. The area under the inspection
curve, representing total development cost, is approximately 30 percent less than the area under the
non-inspection curve.
Figure 6. Software Development Spending Profiles (development expenditure rate in $/month over a 24-month schedule; the curve with inspections is 15% higher up front and 25-35% lower overall [Fagan 1986]; the curve without inspections reflects 44% rework [Boehm 1987])
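One rough way to reconcile these numbers is the back-of-the-envelope calculation below. The effort shares come from [Boehm 1987]; the fraction of rework avoided is an assumption chosen for illustration, and the result lands near the lower end of the reported 25 to 35 percent range.

    # Relative development cost with and without inspections.
    construction_share = 0.56  # non-rework effort [Boehm 1987]
    rework_share = 0.44        # rework effort [Boehm 1987]
    inspection_cost = 0.15     # up-front investment, share of total cost
    rework_avoided = 0.90      # assumed fraction of rework prevented

    cost_without = construction_share + rework_share  # = 1.00
    cost_with = (construction_share + inspection_cost
                 + rework_share * (1.0 - rework_avoided))
    print(f"relative cost with inspections: {cost_with:.2f}")       # ~0.75
    print(f"overall saving: {1.0 - cost_with / cost_without:.0%}")  # ~25%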
Recent Usage Examples
Russell gives the following illustration of the cost effectiveness of inspections on a project
at Bell Northern Research that produced 2.5 million lines of code [Russell 1991]. It took approximately one staff hour of inspection time for each defect found by inspection. It took 2 to 4 staff
hours of testing time for each defect found by testing. Inspections, therefore, were 2 to 4 times more
efficient than testing. Later they found that it took, on average, 33 staff hours to correct each defect
found after the product was released to customers. Inspections reduced the number of these defects
by a factor of 10. For commercial software developers, all of these costs are paid out of profits.
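The economics follow directly. In the sketch below, the per-defect costs are the BNR figures above (taking the midpoint of the 2-to-4-hour testing range), and the defect count is hypothetical, chosen only for illustration.

    # Staff-hours to remove the same number of defects at each stage.
    HOURS_PER_DEFECT = {"inspection": 1.0, "testing": 3.0, "the field": 33.0}
    defects = 1_000  # hypothetical defect count

    for stage, hours in HOURS_PER_DEFECT.items():
        print(f"{defects} defects found in {stage}: {defects * hours:,.0f} staff-hours")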
In describing software process improvement activities at Raytheon, Dion uses Figure 7 to
illustrate that they have been able to reduce the cost of software rework by a factor of four [Dion
1993]. The “Cost of Rework” curve in this figure has decreased steadily since the start of their process improvement initiative. He attributes this success to software inspections this way: “In our
case, the cost of design and coding rose slightly because formal inspections replaced informal
reviews. However, it was this very change that enabled us to achieve rework savings in uncovering
source-code problems before software integration and eliminating unnecessary retesting.”
Although Raytheon spent more time fixing defects during design and coding, those small increases
were completely overshadowed by the savings achieved by not having to fix them later during
integration and testing. The cost of fixing coding defects during integration, for example, was
reduced by a factor of five.
Figure 7. Raytheon Cost Savings (© 1993 IEEE)
Inspections are widely used in the commercial software industry, where quality and productivity
are critical to a company's survival. There are many published reports from companies such
as IBM [Fagan 1986], Bull HN Information Systems [Weller 1993], and Bell Northern Research
(BNR) [Russell 1991] on the use and benefits of inspections. NASA's Space Shuttle Program and
the Jet Propulsion Laboratory [Kelly 1992] have also published positive results on inspections.
Inspections and the Capability Maturity Model
The Software Engineering Institute has developed a Capability Maturity Model (CMM)
that can be used to assess or evaluate the maturity of a contractor’s software development process
[Paulk 1993a, Paulk 1993b]. The model characterizes an organization in terms of a maturity level.
There are five levels, each comprising a set of process goals that, when satisfied, stabilize an important component of the software process. Except for Level 1, each maturity level is decomposed into
several Key Process Areas (KPAs) that indicate the areas an organization should focus on to
improve its software process. KPAs identify the issues that must be addressed to achieve a maturity
level.
Within maturity level 3 there is a KPA named “Peer Reviews” that closely resembles the
inspection process. A question arises as to whether review processes less formal than inspection
satisfy the criteria of the Peer Review KPA. The current description of the CMM states:
“The purpose of Peer Reviews is to remove defects from the software work products early and efficiently. An important corollary effect is to develop a better understanding of the software work products and of the defects that can be prevented. The
peer review is an important and effective engineering method that is called out in
Software Product Engineering and that can be implemented via Fagan-style inspections [Fagan86], structured walkthroughs, or a number of other collegial review
methods [Freedman90].” [Paulk 1993a, pp. 35-36]
The inclusion of “structured walkthroughs, or a number of other collegial review methods” is
unfortunate, as these are widely practiced but often not nearly as effective as the inspection process.
For example, implementations of structured walkthroughs vary widely across industry, so it is
impossible to make a blanket statement such as “structured walkthroughs satisfy the Peer Review
criteria.” It is certainly possible to implement a rigorous structured walkthrough process that
satisfies the Peer Review criteria; such a process, focused on defect detection, would look very
similar to the inspection process. Figure 8 suggests a number of factors that can be used to
distinguish between the two.
Most inspection processes, including the Fagan inspection process specifically mentioned,
closely map to the Peer Review criteria. Figure 9 suggests the state of DoD software review practice compared to the CMM Peer Review KPA.
Figure 8. Review Effectiveness Factors (review types arranged from ad hoc to rigorous: no reviews, informal reviews, structured walkthroughs, inspections; defect detection effectiveness rises from low/later to high/early as practice shifts from a focus on information, author control, no preparation time, no follow-up on corrections, and no data collection toward a focus on finding defects, moderator control, preparation time, follow-up on corrections, and collecting and using data)
Figure 9. Effectiveness of Software Reviews (on the same ad hoc-to-rigorous scale, general DoD contractor practice sits between superficial informal reviews and structured walkthroughs, well below the defect detection effectiveness of CMM Level-3 Peer Reviews and inspections)
Why Isn’t Inspection Common Practice?
Despite the variety of positive inspection experience reports, the process is not widely
used.2 We present a number of possible reasons for this state of current practice:
• Technology transition/improvement is not easy. Transitioning any “new” process is
difficult and can take 10 to 15 years before it becomes commonplace throughout the software
industry [Redwine 1984]. Inspections were first introduced in 1976; perhaps they are simply a
technology transition outlier.
• Upfront cost. Some organizations may be reluctant to make the upfront investment in
inspections. The investment entails building inspection infrastructure (e.g., planning, training,
developing forms) and paying the cost of each inspection (e.g., preparation, meetings, filling out
and analyzing forms). Another possibility is that the return on investment from transitioning to
inspections from a less formal review process is not considered sufficiently high.
• Confusion with other review processes. Many people do not distinguish between informal
peer reviews, walkthroughs, and inspections. When inspection is discussed, they associate it with
what they are already practicing and “turn off.” Of course, some people may simply not be aware
that the inspection process exists. Others may feel that less formal review processes are sufficient,
and that inspections add a level of nit-picking to a review process that already works.
• The alligator syndrome. An ongoing project that has many problems (e.g., volatile
requirements, missed deadlines, budget overruns) may not be receptive to introducing a new
process. If many projects are in this state, it becomes a general barrier to introducing inspections.
• Bad prior experience. The organization may have tried inspections and had a bad
experience. Perhaps the type of software development being performed isn't suited to the
inspection process (e.g., rapid prototyping), proper training wasn't provided, or the moderator
allowed author bashing. For whatever reason, the organization is reluctant to try again.
• Improved quality not beneficial to the bottom line. For a particular product or set of
products, quality is desired but is traded against other goals (e.g., profit, schedule). Inspections
are perceived as improving quality at the expense of goals more important to the software
developer or organization.
2. This conclusion is based on a variety of professional contacts through email, conferences, participation on Government software evaluation teams, and discussions with people who provide inspection training.
How to Determine Whether You Are Effectively Using Inspections
Inspection data provides valuable information that should be used to further improve the
development process. The benefits of inspection discussed in this paper may not be realized if the
process is not closely followed. One can ask several questions about process data to help determine
whether inspections are being applied effectively:
1. How many defects per 1000 lines of code are found by inspection?
2. How has this defect rate changed over time?
3. What is the average rate of code inspection?
4. What process changes have resulted from causal analysis of defects detected by
inspection?
In some respects, the specific answers to these questions are not important; the understanding
behind them is much more revealing. If the answers to these types of questions are not readily
available, or if the questions are not being asked by the inspection process leaders, one must
question whether the full benefits of inspection are being realized. The sketch below shows how
the first and third metrics might be computed from routine inspection records.
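As an illustration, the following computes the first and third metrics from simple per-inspection records. The record layout and the data are hypothetical; in practice the numbers would come from the recorder's inspection forms, and answering question 2 would additionally require dating each record.

    from statistics import mean

    # (lines_inspected, defects_found, meeting_hours) -- illustrative data
    records = [(240, 6, 2.0), (210, 4, 1.5), (250, 9, 2.0)]

    defects_per_kloc = 1000 * sum(d for _, d, _ in records) / sum(
        lines for lines, _, _ in records)                         # question 1
    inspection_rate = mean(lines / h for lines, _, h in records)  # question 3

    print(f"defects found per KLOC: {defects_per_kloc:.0f}")      # ~27
    print(f"average inspection rate: {inspection_rate:.0f} lines per meeting hour")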
Suggestions on Getting Started on Inspections
Here are suggestions for how to begin using inspections in an organization. Like many
activities, effective insertion of inspections requires hard work.
• Buy a book! Two excellent books on software inspection were recently published
[Strauss 1994, Gilb 1993]. These are the first two books wholly dedicated to the topic of
software inspection and provide many useful pointers on how to implement inspections.
• Get your software process leaders interested. Ask if they think the inspection process
has something to offer. If they are not familiar with inspections, or associate the process with
walkthroughs, explain the process. One way is to provide them with a few short articles; we
suggest [Ackerman 1989], which provides a good description of the inspection process, and
[Russell 1991], which presents an excellent experience report. Other ideas for motivating process
leaders include preparing a short presentation and obtaining an outside expert.
• Get your management interested. As with the process leaders, you need to talk to
management and sell them on inspections. Have them read the same two articles. Ask them if they
are satisfied with product quality. If they are not, suggest that inspections may be a promising
method for increasing quality (while at the same time reducing cost).
• Develop an inspection insertion plan. As a prelude to getting management interested,
develop a point paper describing an approach for implementing inspections. Some of the data and
graphics from this paper might help describe why inspections are beneficial. A pilot project could
be used to determine whether inspections are desirable. Identify how staff will be trained (e.g.,
outside consultants, in-house staff). Describe how success will be measured: how will you
determine whether the inspection process is beneficial? What infrastructure is required (e.g.,
forms, checklists, procedure descriptions)?
Summary
The summary of this paper is very simple. The inspection process is a proven industry
“best practice” for detecting defects and reducing costs. Unfortunately, the inspection process isn’t
routinely practiced. Inspection isn’t a panacea and it isn’t appropriate for all software development,
but it can effectively increase product quality in most software development efforts. If you are not
using inspections, investigate whether the process may improve your software development effort.
References
[Ackerman 1989]
Ackerman, A. Frank, Lynne S. Buchwald, and Frank H. Lewski. “Software Inspections: An
Effective Verification Process,” IEEE Software, Vol. 6, No. 3, May 1989, pp. 31-36.
[Boehm 1987]
Boehm, Barry W. “Improving Software Productivity,” IEEE Computer, Vol. 20, No. 9, Sep.
1987, pp. 43-57.
[Dion 1993]
Dion, Ray. “Process Improvement and the Corporate Balance Sheet,” IEEE Software, Vol. 10,
No. 4, July 1993, pp. 28-35.
[Fagan 1976]
Fagan, Michael E. “Design and Code Inspections to Reduce Errors in Program Development,”
IBM Systems Journal, Vol. 15, No. 3, 1976, pp. 182-211.
[Fagan 1986]
Fagan, Michael E. “Advances in Software Inspections,” IEEE Transactions on Software Engineering, Vol. 12, No. 7, Jul. 1986, pp. 744-751.
[Gilb 1993]
Gilb, Tom, and Dorothy Graham. 1993. Software Inspection. Reading, MA: Addison-Wesley
Publishing Co.
[IEEE 1989]
IEEE Standard for Software Reviews and Audits, ANSI/IEEE STD 1028-1988, IEEE Computer
Society, Jun. 30, 1989.
[Jones 1986]
Jones, Capers. 1986. Programming Productivity. NY: McGraw-Hill Book Co.
[Jones 1991]
Jones, Capers. 1991. Applied Software Measurement. NY: McGraw-Hill Book Co.
[Kelly 1992]
Kelly, John C., Joseph S. Sherif, and Jonathan Hops. “An Analysis of Defect Densities
Found During Software Inspections,” Journal of Systems and Software, Vol. 17, No. 2,
Feb. 1992, pp. 111-117.
[Paulk 1993a]
Paulk, M.C., B. Curtis, M.B. Chrissis, and C.V. Weber. “Capability Maturity Model for
Software, Version 1.1,” Software Engineering Institute, CMU/SEI-93-TR-24, February 1993.
[Paulk 1993b]
Paulk, M.C., C.V. Weber, S. Garcia, M.B. Chrissis, and M. Bush. “Key Practices of the
Capability Maturity Model, Version 1.1,” Software Engineering Institute, CMU/SEI-93-TR-25,
February 1993.
[Redwine 1984]
Redwine, Samuel T., et al. “DoD Related Software Technology Requirements, Practices,
and Prospects for the Future,” IDA Paper P-1788, June 1984.
[Russell 1991]
Russell, Glen W. “Experience with Inspection in Ultralarge-Scale Developments,” IEEE
Software, Vol. 8, No. 1, Jan. 1991, pp. 25-31.
[Strauss 1994]
Strauss, Susan H. and Robert G. Ebenau. 1994. Software Inspection Process. NY:
McGraw-Hill Book Co.
[Weller 1993]
Weller, Edward F. “Lessons from Three Years of Inspection Data,” IEEE Software, Vol.
10, No. 5, Sep. 1993, pp. 38-45.