Supplementary Slides for Software Engineering: A Practitioner's Approach, 5/e

Supplementary Slides for
Software Engineering:
A Practitioner's Approach, 5/e
copyright © 1996, 2001
R.S. Pressman & Associates, Inc.
For University Use Only
May be reproduced ONLY for student use at the university level
when used in conjunction with Software Engineering: A Practitioner's Approach.
Any other reproduction or use is expressly prohibited.
This presentation, slides, or hardcopy may NOT be used for
short courses, industry seminars, or consulting purposes.
These courseware materials are to be used in conjunction with Software Engineering: A Practitioner’s Approach, 5/e and are
provided with permission by R.S. Pressman & Associates, Inc., copyright © 1996, 2001
Chapter 4
Software Process and Project Metrics
Measurement & Metrics
"... collecting metrics is too hard ... it's too time-consuming ... it's too political ... it won't prove anything ..."

"Anything that you need to quantify can be measured in some way that is superior to not measuring it at all."
Tom Gilb
Why do we Measure?
• to characterize
• to evaluate
• to predict
• to improve
TERMS
• A measure is established when a single data point has been collected
  (e.g., the number of errors uncovered in the review of a single module)
• A software metric relates the individual measures in some way
  (e.g., the average number of errors found per review)
• An indicator is a metric or a combination of metrics that provides insight into the software process, project, or product
  (e.g., formal technical reviews may provide a higher return)
Process Indicators
• Enable a software engineering organization to gain insight into the efficacy of an existing process (i.e., the paradigm, tasks, products, milestones)
• Enable managers and practitioners to assess what works and what does not
Project Indicators
Enable a software project manager to:
• assess the status of an ongoing project
• track potential risks
• uncover problem areas before they go critical
• adjust work flow or tasks
• evaluate the project team's ability to control the quality of software products
A Good Manager Measures
• the process, via process metrics and project metrics
• the product, via product metrics
What do we use as a basis for measurement?
• size?
• function?
Process Metrics
The majority focus on quality achieved as a consequence of a repeatable or managed process:
• statistical SQA data (error categorization & analysis)
• defect removal efficiency (propagation from phase to phase)
• reuse data
Process Metrics
Measuring the outcomes:
• errors
• defects
• productivity
• effort
• time
• schedule conformance
Measuring the characteristics of specific tasks:
• effort and time spent performing the umbrella activities and generic SE activities
Project Metrics
• effort/time per SE task
• errors uncovered per review hour
• scheduled vs. actual milestone dates
• changes (number) and their characteristics
• distribution of effort on SE tasks
Product Metrics
Focus on the quality of deliverables:
• measures of the analysis model
• complexity of the design (internal algorithmic complexity, architectural complexity, data flow complexity)
• code measures (e.g., Halstead)
• measures of process effectiveness (e.g., defect removal efficiency)
Metrics Guidelines
• Use common sense and organizational sensitivity when interpreting metrics data.
• Provide regular feedback to the individuals and teams who have worked to collect measures and metrics.
• Don’t use metrics to appraise individuals.
• Work with practitioners and teams to set clear goals and metrics that will be used to achieve them.
• Never use metrics to threaten individuals or teams.
• Metrics data that indicate a problem area should not be considered “negative.” These data are merely an indicator for process improvement.
• Don’t obsess on a single metric to the exclusion of other important metrics.
Normalization for Metrics
Normalized data are used to evaluate the process and the product (but never individual people):
• size-oriented normalization — the lines-of-code approach
• function-oriented normalization — the function-point approach
Typical Size-Oriented Metrics
• errors per KLOC (thousand lines of code)
• defects per KLOC
• $ per LOC
• pages of documentation per KLOC
• errors per person-month
• LOC per person-month
• $ per page of documentation
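Since every metric in the list above is a simple ratio over size or effort, a small sketch makes the arithmetic concrete. All project numbers below are made-up illustrative values, not data from the text:

```python
# Hypothetical project data -- illustrative values only.
loc = 12_100        # delivered lines of code
effort_pm = 24      # person-months of effort
cost = 168_000      # total cost in dollars
doc_pages = 365     # pages of documentation
errors = 134        # problems found before release

kloc = loc / 1000
print(f"errors per KLOC:         {errors / kloc:.2f}")
print(f"$ per LOC:               {cost / loc:.2f}")
print(f"pages of doc per KLOC:   {doc_pages / kloc:.2f}")
print(f"errors per person-month: {errors / effort_pm:.2f}")
print(f"LOC per person-month:    {loc / effort_pm:.0f}")
```

Because each value is normalized by size or effort, such metrics allow rough comparison across projects of different scale.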
Typical Function-Oriented Metrics
• errors per FP
• defects per FP
• $ per FP
• pages of documentation per FP
• FP per person-month
Why Opt for FP Measures?
• independent of programming language
• uses readily countable characteristics of the "information domain" of the problem
• does not "penalize" inventive implementations that require fewer LOC than others
• makes it easier to accommodate reuse and the trend toward object-oriented approaches
Computing Function Points
1. Analyze the information domain of the application and develop counts: establish counts for the input domain and system interfaces.
2. Weight each count by assessing complexity: assign a level of complexity (weight) to each count.
3. Assess the influence of global factors that affect the application: grade the significance of external factors Fi, such as reuse, concurrency, OS, ...
4. Compute function points:
   function points = count-total × C
   where count-total = Σ (count × weight)
   complexity multiplier: C = 0.65 + 0.01 × N
   degree of influence: N = Σ Fi
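The four-step computation can be sketched in code. The counts and Fi ratings below are illustrative assumptions; the weights are the average-complexity column of the information-domain table:

```python
# Average-complexity weights (from the information-domain table):
# inputs 4, outputs 5, inquiries 4, files 10, interfaces 7.
AVG_WEIGHTS = {"inputs": 4, "outputs": 5, "inquiries": 4,
               "files": 10, "interfaces": 7}

def function_points(counts, f_ratings):
    """function points = count-total x (0.65 + 0.01 x N), N = sum of Fi."""
    count_total = sum(counts[name] * AVG_WEIGHTS[name] for name in counts)
    n = sum(f_ratings)              # degree of influence
    c = 0.65 + 0.01 * n             # complexity multiplier
    return count_total * c

# Hypothetical counts and fourteen Fi ratings (each on the 0..5 scale):
counts = {"inputs": 32, "outputs": 60, "inquiries": 24,
          "files": 8, "interfaces": 2}
f_ratings = [3] * 14                # every factor rated mid-scale

print(round(function_points(counts, f_ratings), 2))
```

A real count would use the simple/average/complex weight appropriate to each individual item rather than a single average column.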
Analyzing the Information Domain

measurement parameter        count       weighting factor          
                                      simple   avg.   complex
number of user inputs        ____  ×    3       4       6      = ____
number of user outputs       ____  ×    4       5       7      = ____
number of user inquiries     ____  ×    3       4       6      = ____
number of files              ____  ×    7      10      15      = ____
number of ext. interfaces    ____  ×    5       7      10      = ____
count-total                                                      ____
× complexity multiplier                                          ____
= function points                                                ____
Taking Complexity into Account
Factors are rated on a scale of 0 (not important) to 5 (very important):
• data communications
• distributed functions
• heavily used configuration
• transaction rate
• on-line data entry
• end-user efficiency
• on-line update
• complex processing
• installation ease
• operational ease
• multiple sites
• facilitate change
• backup and recovery
• reusability
Measuring Quality
• Correctness — the degree to which a program operates according to specification
• Maintainability — the degree to which a program is amenable to change
• Integrity — the degree to which a program is impervious to outside attack
• Usability — the degree to which a program is easy to use
Defect Removal Efficiency
DRE = (errors) / (errors + defects)
where
errors = problems found before release
defects = problems found after release
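As a quick sketch of the formula (the counts here are made-up illustrative values):

```python
def dre(errors, defects):
    """Defect removal efficiency: share of all problems caught before release.

    errors  = problems found before release
    defects = problems found after release
    """
    return errors / (errors + defects)

# Hypothetical counts: 134 problems found before release, 29 after.
print(f"DRE = {dre(134, 29):.2f}")
```

The closer DRE gets to 1.0, the more effective the filtering activities (reviews, testing) performed before release.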
Managing Variation – Why?
• Metrics collected for one project or product may not be the same as similar metrics collected for another project:
  – How can we tell whether improved (or degraded) metric values that occur as a consequence of improvement activities are having a quantitative impact?
  – How do we know whether we’re looking at a statistically valid trend?
  – When are changes to a particular software metric meaningful?
• A graphical technique, the control chart, is available for determining whether changes and variation in metrics data are meaningful:
  – It enables individuals to determine whether the dispersion (variability) and location (moving average) of process metrics are stable or unstable.
  – Two different types: the moving range (mR) control chart and the individual control chart.
Managing Variation
[Figure: the mR control chart, plotting Er (errors found per review hour) for projects 1–19]
Moving range control chart
[Figure: moving range control chart, plotting the differences in successive Er values for projects 1–19; e.g., at project 1: 4.3 – 3.1 = 1.2]
Individual control chart
[Figure: individual control chart, plotting errors found per review hour for projects 1–20, with standard-deviation control limits]
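The stability test behind these two charts can be sketched as follows. The Er series is a made-up example; 2.660 and 3.268 are the standard constants for individuals/moving-range (XmR) control charts:

```python
# Errors found per review hour (Er) across successive projects -- hypothetical.
er = [3.1, 4.3, 2.8, 3.6, 4.0, 3.3, 2.9, 3.7, 3.4, 3.0]

# Moving range: absolute difference between successive Er values.
mr = [abs(b - a) for a, b in zip(er, er[1:])]
mr_bar = sum(mr) / len(mr)          # mean moving range
er_bar = sum(er) / len(er)          # mean Er

# Moving range chart: a range above the UCL signals unstable variability.
ucl_mr = 3.268 * mr_bar

# Individual control chart: natural process limits around the mean.
unpl = er_bar + 2.660 * mr_bar      # upper natural process limit
lnpl = er_bar - 2.660 * mr_bar      # lower natural process limit

stable = (all(m <= ucl_mr for m in mr)
          and all(lnpl <= x <= unpl for x in er))
print(f"mR-bar={mr_bar:.2f}  UCL(mR)={ucl_mr:.2f}  "
      f"limits=({lnpl:.2f}, {unpl:.2f})  stable={stable}")
```

If every moving range stays below UCL(mR) and every Er value stays inside the natural process limits, the process metric can be treated as stable; points outside the limits are the "meaningful" changes the slide asks about.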