Formal Technical Reviews

Technical Reviews
A Presentation By Manuel Richey
for EECS 814
November 17th, 2005
Overview Of This Presentation

- There are numerous types of technical reviews, and numerous methods for implementing them.
- This presentation briefly describes several types and techniques of reviews.
- It goes into greater detail on one type and method of technical review.
- Finally, we will perform a technical review in class.
Purpose of Reviews

- Oversight
- To create buy-in or a sense of ownership
- To disseminate or transfer knowledge
- To improve the quality of a document or work product
- Other purposes?
Types Of Reviews

- Customer Reviews
- Management Reviews
- Peer Reviews
- Personal Reviews
- Other Types of Reviews?
Customer Reviews

- Often specified as milestones in a contract (e.g., preliminary and critical design reviews).
- Formal documentation is submitted prior to the review.
- A long walkthrough/review is conducted with the customer.
- Often leads up to customer sign-off on a milestone.
Management Reviews

- Each company has its own format for these.
- Typically a PowerPoint presentation to management or technical leadership, followed by a Q&A session.
- The usual objective is either to approve a project or to monitor project status.
- Inputs are typically project plans and schedules, or status reports.
- The management team makes decisions and charts or approves a course of action to ensure progress and properly allocate resources.
Technical Peer Review

A structured encounter in which a group of technical personnel analyze a work product, with two primary objectives:

- Improve the quality of the original work product.
- Improve the quality of the review process.
Why Hold Technical Peer Reviews?

- Software development is a very error-prone process.
- Early detection of defects is cost-effective, and peer reviews find errors early.
- Peer reviews find many of the same errors as testing, but earlier and with less effort.
- They serve to educate the participants and provide training.
- They raise a team's core competence by setting standards of excellence.
A Generic Review Process

- Plan Review: assess the readiness of the work product, assign team members, send out the announcement and review package.
- Detect Defects: each reviewer looks for defects in the work product.
- Collect Defects: in a meeting, via email, etc.
- Correct Defects: the author corrects the work product.
- Follow Up: verify rework and document the review. (A code sketch of this five-step flow follows.)
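As an illustration only (not part of the original presentation), the five steps above can be modeled as an explicit workflow; every name here (Stage, Review, advance) is hypothetical:

    # Hypothetical sketch of the generic review process as a workflow.
    from dataclasses import dataclass, field
    from enum import Enum, auto

    class Stage(Enum):
        PLAN = auto()       # assess readiness, assign team, announce
        DETECT = auto()     # reviewers look for defects individually
        COLLECT = auto()    # gather defects in a meeting or via email
        CORRECT = auto()    # author fixes the work product
        FOLLOW_UP = auto()  # verify rework, document the review

    @dataclass
    class Review:
        work_product: str
        reviewers: list[str]
        defects: list[str] = field(default_factory=list)
        stage: Stage = Stage.PLAN

        def advance(self) -> None:
            """Move to the next step of the generic process."""
            steps = list(Stage)
            i = steps.index(self.stage)
            if i < len(steps) - 1:
                self.stage = steps[i + 1]

    review = Review("parser.c", reviewers=["alice", "bob"])
    review.advance()  # PLAN -> DETECT
    review.defects.append("off-by-one in line buffer")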
Methods For Technical Reviews

- Ad-Hoc
- Personal
- Walkthrough
- Fagan Style Inspection
- Asynchronous Review
- N-Fold Inspection
- Many other techniques
Ad-Hoc Reviews

- Provide no process or instructions on how to detect defects.
- Defect detection depends on the inspector's skill and experience.
- Still valuable for:
  - Enforcing standards
  - Project status evaluations
  - Improved communication
  - Training and knowledge dissemination
Personal Reviews

- An integral part of the Personal Software Process (PSP) by Watts Humphrey.
- Involves only the author of a work.
- Employs checklists and metrics (if following PSP).
- In PSP, code review occurs before the first compile.
- May be performed by the author prior to a formal technical review.
Walkthroughs

- A meeting in which the author presents a work product in a sequential manner and clarifies the product as necessary.
- No preparation by meeting attendees.
- May be held as part of another type of review.
  - For example, a walkthrough may be held for a work product prior to distributing the review packet to the reviewers for a Fagan style software inspection.
Fagan Style Software Inspections

- A method of technical review that involves a meeting-based process and specific roles.
- Process: reviewers detect defects separately, but hold a meeting to collect, classify, and discuss defects.
- Defined roles: Moderator, Author, Presenter, Recorder, etc.
- We will examine this technique in detail.
Asynchronous Inspections

- No meeting, so the review can be distributed in space and time.
- Doesn't involve the author.
- Process is as follows (the moderator's final merge step is sketched in code after this list):
  - Moderator sends out material via email.
  - Individual reviewers create lists of defects.
  - Defect lists are circulated to all inspectors and discussed via email.
  - Individual reviewers update their defect lists and send them to the moderator.
  - Moderator compiles the final defect list, sends it to the author, and follows up. This eliminates group approval.
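As a hedged illustration (not from the slides), the moderator's final step of merging the reviewers' emailed lists into one de-duplicated defect list might look like this; the names and the crude duplicate check are assumptions:

    # Hypothetical sketch: merge reviewers' defect lists, dropping duplicates.
    def compile_final_defect_list(lists_by_reviewer: dict[str, list[str]]) -> list[str]:
        seen: set[str] = set()
        final: list[str] = []
        for reviewer, defects in lists_by_reviewer.items():
            for defect in defects:
                key = defect.strip().lower()  # crude duplicate detection
                if key not in seen:
                    seen.add(key)
                    final.append(defect)
        return final

    final_list = compile_final_defect_list({
        "alice": ["Null check missing in open_file()", "Typo in section 3.2"],
        "bob":   ["Typo in section 3.2", "Buffer size off by one"],
    })
    # -> three unique defects, ready to send to the author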
N-Fold Inspections

- Several independent teams inspect the same work product using traditional inspection methods.
- Many teams find overlapping defects, but each team typically also finds unique defects.
- The moderator collects faults from the independent teams and composes the final defect list.
- This is an expensive process, used when high reliability is desired.
Fagan Inspection Process

A six-step review process:

1. Author submits work products for review.
2. Moderator assesses the product's readiness, assigns the review team, and announces the review.
3. Reviewers prepare for the review.
4. Reviewers hold the review meeting.
5. Author corrects defects.
6. Moderator verifies rework and closes the review.
Standards and Checklists

- Standards:
  - Rules for requirements/design/coding that all work products must adhere to.
  - Typically either project or company specific.
  - Improve software maintenance and quality.
- Checklists (an illustrative example follows this list):
  - A list of questions for the inspectors to answer while reading the document.
  - Should be less than a page long.
  - Should be derived from the most common past defects.
  - Should be periodically updated.
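For illustration only, here is what such a checklist might look like if kept in a review tool; the questions are hypothetical examples of the "derived from common past defects" idea, not a standard list:

    # Hypothetical code-review checklist, kept short per the guidance above.
    CODE_REVIEW_CHECKLIST = [
        "Are all return values and error codes checked?",
        "Can any array index or pointer go out of bounds?",
        "Are all allocated resources released on every exit path?",
        "Do all loops terminate, including on empty input?",
        "Does the code follow the coding standard (naming, layout)?",
    ]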
Inspection Package

- The work product to be inspected (line numbered, if possible).
- Supporting documentation (the requirements or work product from which the work product to be inspected was derived).
- Checklists and standards, made available to the reviewers.
- The inspection meeting notice (often sent by email).
Fagan Inspection Roles

- Producer/Author: creates the product being reviewed and answers questions.
- Moderator: prepares the review package, moderates the meeting, verifies rework.
- Presenter/Reader: presents the product during the meeting.
- Recorder/Scribe: records defects during the meeting.
- Reviewer: everyone is a reviewer, but some reviewers may have no other role.
Reviewer's Responsibilities

- Objectively inspect the work product.
- Track the amount of time spent preparing for the inspection meeting.
- Actively participate in the inspection meeting by reporting defects found during examination of the work product.
Producer's Responsibilities

- Provides required reference material for the inspection.
- Finds defects.
- Provides clarification.
- Answers questions.
- Modifies the inspected work product to correct defects.
Moderator's Responsibilities

- Ensures entry criteria are met.
- Distributes the inspection package to the review team.
- Ensures that all reviewers are prepared prior to the inspection meeting.
- Facilitates the inspection meeting.
- Also participates in the review as a reviewer.
- Assures that all items logged at the meeting are dispositioned.
- Collects the data and completes the inspection record.
Presenter's Responsibilities

- Presents the product in a logical fashion, paraphrasing at a suitable rate.
- Typical review rates are 100-200 LOC/hour (or 10-12 pages/hour for documents); a worked estimate follows this list.
- Rates can vary significantly due to the following factors:
  - Language
  - Comments and readability
  - Type of software
  - Structure of software
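To make the rates concrete, here is a small worked estimate; the 600 LOC figure is hypothetical, and the rate is simply the mid-range of the guideline above:

    # Estimating meeting time from the quoted presentation rates.
    loc = 600                # size of the code under review (hypothetical)
    rate_loc_per_hour = 150  # mid-range of the 100-200 LOC/hour guideline
    hours = loc / rate_loc_per_hour
    print(f"{hours:.1f} meeting hours")  # -> 4.0, likely split across sessions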
Recorder's Responsibilities

- Completes the defect log (one possible log-entry structure is sketched after this list).
- Defects should be classified, based on team consensus, by:
  - Severity (major, minor)
  - Type
  - Class
- Should use techniques to minimize defect-logging time.
- Is not a secretary.
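As an illustrative sketch (not from the presentation), a defect log entry carrying the severity/type/class fields above might be structured like this; the example type and class categories are assumptions:

    # Hypothetical structure for one entry in the recorder's defect log.
    from dataclasses import dataclass
    from enum import Enum

    class Severity(Enum):
        MAJOR = "major"
        MINOR = "minor"

    @dataclass
    class DefectLogEntry:
        location: str      # e.g. "parser.c:142" or "SRS section 3.2"
        description: str
        severity: Severity
        defect_type: str   # e.g. "logic", "interface", "documentation"
        defect_class: str  # e.g. "missing", "wrong", "extra"

    entry = DefectLogEntry("parser.c:142", "Buffer length not validated",
                           Severity.MAJOR, "logic", "missing")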
Review Meeting

- Reviewers fill out and sign the inspection form, indicating time spent reviewing the product.
- Reviewers collectively decide whether they are ready for the review to be held.
- The presenter progresses through the review product, eliciting defects along the way.
- The recorder records defects on the defect form.
- Reviewers collectively disposition the review as "Accept As Is", "Accept with Corrections", or "Re-review".
Review Comprehension Methods

- Front to Back
  - Start at the front of a document or top of a code module, and proceed to the end in sequential order.
  - Use with documents, or if already familiar with the code design.
- Bottom Up
  - Start at the lowest-level routines and work up.
  - Used when the code is new to the inspector.
- Top Down
  - Start at the main SW entry points and review those, then review the routines they call.
  - Used when the inspector is familiar with the code.
- Integrated
  - Use both top-down and bottom-up approaches as appropriate.
What Makes A Good Reviewer

A good reviewer:
- Is thorough.
- Is prepared (most peer review postponements are due to lack of team preparation).
- Reviews the product, not the producer.
- Raises issues rather than resolving them.
- Doesn't give the author the benefit of the doubt.
What Makes A Good Moderator

A good moderator:
- Encourages individuals to prepare and participate.
- Controls the meeting (starts on time, keeps focus on the agenda, eliminates problem solving, etc.).
- Nurtures inexperienced reviewers.
- Is sensitive to the author.
- Feels ownership in the quality of the product.
How To Select Reviewers

- Participants are typically selected by the moderator.
- Important criteria for selecting participants:
  - Ability to detect defects (expertise)
  - Knowledge of source documents
  - Need to understand the work product (recipients of the work product)
  - Motivation and other personal qualities
Review Metrics

- Help measure the effectiveness of reviews.
- Aid in continuous process improvement.
- Provide feedback to management.
- Typical metrics (computed in a short sketch after this list) are:
  - Average preparation effort per unit of material (typically LOC, KLOC, or pages)
  - Average examination effort per unit of material
  - Average explanation rate per unit of material
  - Average number of defects and major defects found per unit of material
  - Average hours per defect and per major defect
  - Percentage of re-inspections
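As a sketch of how these averages fall out of raw inspection records, the following uses made-up data; the field layout and numbers are hypothetical:

    # Computing a few of the metrics above from hypothetical inspection data.
    inspections = [
        # (KLOC inspected, prep hours, meeting hours, defects, major defects)
        (1.2, 6.0, 3.0, 14, 4),
        (0.8, 4.5, 2.0,  9, 2),
    ]

    total_kloc    = sum(i[0] for i in inspections)
    prep_hours    = sum(i[1] for i in inspections)
    meeting_hours = sum(i[2] for i in inspections)
    defects       = sum(i[3] for i in inspections)
    majors        = sum(i[4] for i in inspections)

    print(f"Prep effort:        {prep_hours / total_kloc:.1f} hours/KLOC")
    print(f"Examination effort: {meeting_hours / total_kloc:.1f} hours/KLOC")
    print(f"Defect density:     {defects / total_kloc:.1f} defects/KLOC")
    print(f"Hours/major defect: {(prep_hours + meeting_hours) / majors:.1f}")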
Industry Experience

- Aetna Insurance Company:
  - FTR found 82% of errors, with a 25% cost reduction.
- Bell-Northern Research:
  - Estimated inspection savings (1993): $21,454,000.
- IBM:
  - Inspection cost: 1 hour per defect.
  - Testing cost: 2-4 hours per defect.
  - Post-release cost: 33 hours per defect.
- Hewlett-Packard:
  - Reported 83% defect detection through inspections.
- AT&T:
  - Reported 92% defect detection through inspections.
Personal Case Study: The KDR 510

- A VHF aviation radio and modem.
- A real-time, embedded, safety-critical DSP system.
- Won the Editor's Choice award from Flying Magazine.
- Formal peer reviews were the main QA activity.
Quality Data for the KDR 510

- KDR 510 reviews detected many errors:
  - 72% of SW requirements defects
  - 90.7% of SW design defects
  - 90.6% of SW coding defects
- Total review time was approximately 5% of total project time.
- Only 23% of total project time was spent in integration and test.
- Only one error escaped into the field.
The Real Reasons For Holding Reviews

- Reviews improve schedule performance.
  [Figure: schedule bars comparing a project with reviews (requirements, design, and code phases, each followed by a review, then test) against the same project without reviews.]
- Reviews reduce rework.
  - Rework accounts for 44% of development cost!
Tools For Technical Reviews

- Various tools exist for different inspection methods:
  - ICICLE: for inspection of C and C++ programs
  - Scrutiny and InspeQ: support specific inspection processes
  - ASSIST: supports a generic inspection process
  - For a larger list, see: http://www2.ics.hawaii.edu/~johnson/FTR/
- Home-grown tools (a minimal sketch follows this list):
  - Typically built with an Access database.
  - Reviewers enter defects offline into the database.
  - Eliminates the recorder and reader roles.
  - Gives the author time to consider defects before the meeting.
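The slides describe these home-grown tools as Access databases; as a hedged sketch of the same idea, here is an analogous setup using Python's built-in sqlite3 module (the table and column names are hypothetical):

    # Hypothetical offline defect-entry tool, analogous to the Access approach.
    import sqlite3

    con = sqlite3.connect("review.db")
    con.execute("""
        CREATE TABLE IF NOT EXISTS defect (
            id          INTEGER PRIMARY KEY,
            reviewer    TEXT NOT NULL,
            location    TEXT NOT NULL,   -- e.g. file:line or document section
            severity    TEXT NOT NULL,   -- 'major' or 'minor'
            description TEXT NOT NULL
        )
    """)
    # A reviewer logs a defect offline, before the meeting:
    con.execute(
        "INSERT INTO defect (reviewer, location, severity, description) "
        "VALUES (?, ?, ?, ?)",
        ("alice", "parser.c:142", "major", "Buffer length not validated"),
    )
    con.commit()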
Some References

- A great website for Formal Technical Reviews: http://www2.ics.hawaii.edu/~johnson/FTR/
- Watts S. Humphrey, A Discipline for Software Engineering, Addison-Wesley, January 1995.
- M. E. Fagan, "Design and code inspections to reduce errors in program development," IBM Systems Journal, Vol. 15, No. 3, 1976, pp. 182-211.
- G. M. Schneider, J. Martin, W. T. Tsai, "An Experimental Study of Fault Detection in User Requirements Documents," ACM Transactions on Software Engineering and Methodology, Vol. 1, No. 2, April 1992, pp. 188-204.
- A. Porter, H. Siy, C. Toman, and L. Votta, "An Experiment to Assess the Cost-Benefits of Code Inspections in Large Scale Software Development," IEEE Transactions on Software Engineering, Vol. 23, No. 6, June 1997, pp. 329-346.
Questions?
Review Workshop

- Objective: allow everyone to take a role in a Fagan style code review.
- Combine results to create an N-Fold inspection.
- Break into teams of 4.
- Handouts: source code files, supplementary material, review forms.
- Schedule: 20 minutes to prepare for the review, 20 minutes for the review meeting, a 10-minute break for everyone but the moderators, and 5 minutes to summarize results.
Discussion on Review Workshop

- Results from the N-Fold inspection.
- What did you learn from this code review?
- Was it effective?
- How long would it have taken to detect some of these defects by testing?
- Other comments or conclusions?
Conclusion

- Reviews complement software testing.
- Reviews are cost-effective techniques for improving the quality of a developed product. They pay for themselves.
- Reviews improve the maintainability of a developed product.
- One size doesn't fit all: an organization's size, culture, and industry should be considered when deciding which review methods to use.