Software Engineering: A Practitioner’s Approach, 6/e
Chapter 22
Process and Project Metrics
copyright © 1996, 2001, 2005
R.S. Pressman & Associates, Inc.
For University Use Only
May be reproduced ONLY for student use at the university level
when used in conjunction with Software Engineering: A Practitioner's Approach.
Any other reproduction or use is expressly prohibited.
These courseware materials are to be used in conjunction with Software Engineering: A Practitioner’s Approach, 6/e and are provided
with permission by R.S. Pressman & Associates, Inc., copyright © 1996, 2001, 2005
Until you can measure something and express
it in numbers, you have only the beginning of
understanding.
- Lord Kelvin
Non-software metrics you use every day:
- Gas tank gauge
- Speedometer
- Thermostat in your house
- Battery monitor in your laptop

What would happen if these were not numeric? What other examples do you have?

Imagine a gas gauge that read only: ☐ A Lot   ☐ Some   ☐ A Little
The problem is that non-numeric measurements are subjective… they mean different things to different people. Numbers are objective… they mean the same thing to everyone!

When your friend says "oh yeah, John/Jane Doe is super attractive, you should go out with him/her"… that is a subjective measurement… so you ask "Send me a photo." Why? Because the subjective term "super attractive" has vastly different meanings for different people!
Metrics for software

When asked to measure something, always try to determine an objective measurement. If that is not possible, try to get as close as you can!
A Good Manager Measures
process measurement → process metrics, project metrics
product measurement → product metrics

What do we use as a basis?
- size?
- function?
We need a basis to say 20 defects per X lines of code. Why is this important?

A. Because lines of code equals cost
B. We want our metrics to be valid across projects of many sizes
C. Because you just caused me to die in Halo 3… stop asking these questions!
D. Because this helps us understand how big our program is
Why Do We Measure?

- to assess the status of an ongoing project
- to track potential risks
- to uncover problem areas before they go "critical"
- to adjust work flow or tasks
- to evaluate the project team's ability to control the quality of software work products
Process versus Project Metrics

- Process metrics measure the process itself, to help update and change the process as needed across many projects.
- Project metrics measure specific aspects of a single project, to improve the decisions made on that project.

Frequently the same measurements can be used for both purposes.
Process Measurement

We measure the efficacy of a software process indirectly. That is, we derive a set of metrics based on the outcomes of the process. Outcomes include:
- measures of errors uncovered before release of the software
- defects delivered to and reported by end-users
- work products delivered (productivity)
- human effort expended
- calendar time expended
- schedule conformance
- many others…

We also derive process metrics by measuring the characteristics of specific software engineering tasks.
Process Metrics Guidelines

- Use common sense and organizational sensitivity when interpreting metrics data.
- Provide regular feedback to the individuals and teams who collect measures and metrics.
- Don't use metrics to appraise individuals.
- Work with practitioners and teams to set clear goals and the metrics that will be used to achieve them.
- Never use metrics to threaten individuals or teams.
- Metrics data that indicate a problem area should not be considered "negative." These data are merely an indicator for process improvement.
- Don't obsess on a single metric to the exclusion of other important metrics.
Suppose I calculate the number of defects per developer, rank the developers, and then assign salary raises based on that ranking.

A. This is good
B. This is bad
C. Helloo…. Halo 3? Stop it.
Software Process Improvement
[SPI diagram] The software process model is measured to yield process metrics; the metrics feed improvement goals, and the goals drive process improvement recommendations that are applied back to the process (SPI).

Make your metrics actionable!
Typical Process Metrics

- Quality-related: focus on the quality of work products and deliverables
  (e.g., correctness, maintainability, integrity, usability, MTTF (mean time to failure), MTTR (mean time to repair))
- Productivity-related: production of work products related to effort expended
  (e.g., earned value analysis)
- Statistical SQA data: error categorization & analysis (e.g., severity of errors, rated 1-5)
- Defect removal efficiency: propagation of errors from process activity to activity
  (defects found this stage / (defects found this stage + defects found next stage))
- Reuse data: the number of components produced and their degree of reusability

Within a single project these can also be "project metrics"; across projects they are "process metrics".
Can you calculate a metric that records the number of times the letter 'e' appears in a program?
A. Yes
B. No

Should you calculate the number of times the letter 'e' appears in a program?
A. Yes
B. No
Effective Metrics (ch 15)

- Simple and computable
- Empirically and intuitively persuasive
- Consistent and objective
- Consistent in its use of units and dimensions
- Programming-language independent
- Should be actionable
Actionable Metrics
Actionable metrics (or actionable information in general) are metrics that guide change or decisions about something.

Actionable: measure the amount of human effort versus use cases completed.
- Too high: more training, more design, etc.
- Very low: maybe we can shorten the schedule

Not actionable: measure the number of times the letter "e" appears in the code.

Think before you measure. Don't waste people's time!
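As a small illustration of the actionable example above, here is a minimal sketch (Python) that computes person-hours per completed use case and flags the result against team-chosen thresholds. The threshold values and the sample figures are invented for illustration, not taken from the slides.

    # Hypothetical sketch: person-hours of effort per completed use case.
    # The low/high cut-offs are invented examples; a real team would calibrate them.
    def effort_per_use_case(person_hours, use_cases_completed):
        if use_cases_completed == 0:
            raise ValueError("no completed use cases yet")
        return person_hours / use_cases_completed

    def interpret(ratio, low=20.0, high=60.0):
        if ratio > high:
            return "too high: consider more training, more design, etc."
        if ratio < low:
            return "very low: maybe the schedule can be shortened"
        return "within the expected range"

    ratio = effort_per_use_case(person_hours=540, use_cases_completed=12)
    print(round(ratio, 1), "->", interpret(ratio))   # 45.0 -> within the expected range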
Project Metrics

- used to minimize the development schedule by making the adjustments necessary to avoid delays and mitigate potential problems and risks
- used to assess product quality on an ongoing basis and, when necessary, to modify the technical approach to improve quality
- every project should measure:
  - Inputs — measures of the resources (e.g., people, tools) required to do the work
  - Outputs — measures of the deliverables or work products created during the software engineering process
  - Results — measures that indicate the effectiveness of the deliverables
Typical Project Metrics

- Effort/time per software engineering task
- Errors uncovered per review hour
- Scheduled vs. actual milestone dates
- Changes (number) and their characteristics
- Distribution of effort on software engineering tasks
Metrics Guidelines

Same as the process metrics guidelines above: they apply equally when working with project metrics.
Typical Size-Oriented Metrics

- errors per KLOC (thousand lines of code)
- defects per KLOC
- $ per LOC
- pages of documentation per KLOC
- errors per person-month
- errors per review hour
- LOC per person-month
- $ per page of documentation
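To make the normalization concrete, here is a minimal sketch (Python) that turns raw project totals into several of the size-oriented metrics above. The sample totals are invented for illustration.

    # Illustrative only: derive size-oriented metrics from raw project totals.
    def size_oriented_metrics(loc, errors, defects, cost_dollars, doc_pages, person_months):
        kloc = loc / 1000.0
        return {
            "errors per KLOC": errors / kloc,
            "defects per KLOC": defects / kloc,
            "$ per LOC": cost_dollars / loc,
            "pages of documentation per KLOC": doc_pages / kloc,
            "errors per person-month": errors / person_months,
            "LOC per person-month": loc / person_months,
        }

    # Made-up totals for a single project
    metrics = size_oriented_metrics(loc=12_100, errors=134, defects=29,
                                    cost_dollars=168_000, doc_pages=365, person_months=24)
    for name, value in metrics.items():
        print(f"{name}: {value:.2f}")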
Typical Function-Oriented Metrics

- errors per function point (FP)
- defects per FP
- $ per FP
- pages of documentation per FP
- FP per person-month
But.. What is a Function Point?

- Function points (FP) are a unit of measure for software size, developed at IBM by Allan Albrecht in 1979.
- To determine your number of FPs, you classify a system's features into five classes:
  - Transactions: External Inputs, External Outputs, External Inquiries
  - Data storage: Internal Logical Files and External Interface Files
- Each class is then weighted by complexity as low/average/high.
- The weighted total is multiplied by a value adjustment factor (determined by answering questions about 14 system characteristics).
But.. What is a Function Point?
                             Count   Low   Average   High
External Inputs               ___    x3      x4       x6
External Outputs              ___    x4      x5       x7
External Inquiries            ___    x3      x4       x6
Internal Logical Files        ___    x7      x10      x15
External Interface Files      ___    x5      x7       x10

Unadjusted Total:        ___
Value Adjustment Factor: ___
Total Adjusted Value:    ___
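A minimal sketch (Python) of the counting procedure the table implies. The class weights are the low/average/high values from the table; the value adjustment factor formula (0.65 + 0.01 × the sum of the 14 characteristic ratings, each rated 0-5) is the commonly used form, and the sample counts and ratings are invented.

    # Sketch of an unadjusted and adjusted function-point calculation.
    WEIGHTS = {                                # (low, average, high) from the table
        "external_inputs": (3, 4, 6),
        "external_outputs": (4, 5, 7),
        "external_inquiries": (3, 4, 6),
        "internal_logical_files": (7, 10, 15),
        "external_interface_files": (5, 7, 10),
    }
    COMPLEXITY = {"low": 0, "average": 1, "high": 2}

    def unadjusted_fp(counts):
        """counts: {feature_class: {complexity: how_many}}"""
        return sum(n * WEIGHTS[cls][COMPLEXITY[cx]]
                   for cls, by_cx in counts.items()
                   for cx, n in by_cx.items())

    def adjusted_fp(ufp, characteristic_ratings):
        """characteristic_ratings: the 14 ratings, each 0..5."""
        vaf = 0.65 + 0.01 * sum(characteristic_ratings)
        return ufp * vaf

    # Invented example counts and ratings
    counts = {
        "external_inputs": {"low": 3, "average": 2},
        "external_outputs": {"average": 4},
        "external_inquiries": {"low": 2},
        "internal_logical_files": {"average": 1, "high": 1},
        "external_interface_files": {"low": 1},
    }
    ufp = unadjusted_fp(counts)                        # 73
    print(ufp, round(adjusted_fp(ufp, [3] * 14), 1))   # 73 78.1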
Function Point Example
http://www.his.sunderland.ac.uk/~cs0mel/Alb_Example.doc
Comparing LOC and FP
Programming Language      LOC per Function Point
                          avg.   median   low    high
Ada                        154     -      104     205
Assembler                  337    315      91     694
C                          162    109      33     704
C++                         66     53      29     178
COBOL                       77     77      14     400
Java                        63     53      77      -
JavaScript                  58     63      42      75
Perl                        60     -        -      -
PL/1                        78     67      22     263
Powerbuilder                32     31      11     105
SAS                         40     41      33      49
Smalltalk                   26     19      10      55
SQL                         40     37       7     110
Visual Basic                47     42      16     158
Representative values developed by QSM
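One common use of the table is to convert a function-point estimate into a rough LOC estimate for a chosen language ("backfiring"). A minimal sketch (Python) using the average column above; the 120 FP project size is invented.

    # Rough LOC estimate from function points, using the average LOC/FP values above.
    AVG_LOC_PER_FP = {"C": 162, "C++": 66, "Java": 63, "Smalltalk": 26, "Visual Basic": 47}

    def estimate_loc(function_points, language):
        return function_points * AVG_LOC_PER_FP[language]

    for lang in ("C++", "Java", "Smalltalk"):
        print(lang, estimate_loc(120, lang))   # C++ 7920, Java 7560, Smalltalk 3120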
At IBM in the 70s or 80s (I don't remember) they paid people per line of code they wrote. What happened?

A. The best programmers got paid the most
B. The worst programmers got paid the most
C. The sneakiest programmers got paid the most
D. The lawyers got paid the most
Why Opt for FP?

- Programming-language independent
- Uses readily countable characteristics that are determined early in the software process
- Does not "penalize" inventive (short) implementations that use fewer LOC than other, clumsier versions
- Makes it easier to measure the impact of reusable components
- Other options: COCOMO, Planning Poker, SLIM, Story Points (remember Scrum?), many others…
Object-Oriented Metrics

- Number of scenario scripts (use cases)
- Number of support classes (required to implement the system but not immediately related to the problem domain)
- Average number of support classes per key class (analysis class)
- Number of subsystems (an aggregation of classes that support a function visible to the end-user of a system)
Web Engineering Project Metrics

- Number of static Web pages (the end-user has no control over the content displayed on the page)
- Number of dynamic Web pages (end-user actions result in customized content displayed on the page)
- Number of internal page links (pointers that provide a hyperlink to some other Web page within the WebApp)
- Number of persistent data objects
- Number of external systems interfaced
- Number of static content objects
- Number of dynamic content objects
- Number of executable functions
Measuring Quality

- Correctness — the degree to which a program operates according to specification
  (commonly measured as verified non-conformance with requirements per KLOC)
- Maintainability — the degree to which a program is amenable to change
  (e.g., MTTC, mean time to change: the time to analyze, design, implement, and deploy a change)
- Integrity — the degree to which a program is impervious to outside attack
  threat = probability of an attack; security = likelihood of repelling the attack
  Integrity = Σ [1 - (threat * (1 - security))]
  E.g. t = 0.25, s = 0.95 --> I ≈ 0.99
- Usability — the degree to which a program is easy to use (many options; see ch 12)
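To check the arithmetic in the integrity example above, here is a minimal sketch (Python) of the slide's formula for a single threat type; the second threat/security pair is an invented extra case.

    # Integrity per the slide's formula: 1 - (threat * (1 - security)),
    # where threat = probability of an attack and security = likelihood of repelling it.
    def integrity(threat, security):
        return 1 - threat * (1 - security)

    print(round(integrity(0.25, 0.95), 4))   # 0.9875, i.e. ~0.99 as above
    print(round(integrity(0.50, 0.80), 4))   # 0.9 (invented second case)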
Defect Removal Efficiency
DRE = E / (E + D)

where E is the number of errors found before delivery of the software to the end-user, and D is the number of defects found after delivery.
Defect Removal Efficiency
DRE = E / (E + D), computed here per phase: E is the count found in a phase and D is the count that escapes to the next phase.

Defects found during each phase:
- Requirements (10)
- Design (20)
- Construction: Implementation (5), Unit Testing (50)
- Testing: Integration Testing (100), System Testing (250), Acceptance Testing (5)
- By Customer (10)

Requirements: 10 / (10 + 20) = 33%. What are the rest?
- Design: 20 / (20 + 5) = 80%
- Implementation: 5 / (5 + 50) = 9%
- Unit Testing: 50 / (50 + 100) = 33%
- Integration Testing: 100 / (100 + 250) = 29%
- System Testing: 250 / (250 + 5) = 98%
- Acceptance Testing: 5 / (5 + 10) = 33%
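The per-phase arithmetic above is easy to script. A minimal sketch (Python) that reproduces the table by treating each phase's DRE as E_i / (E_i + E_i+1), where E_i+1 is the count found in the following phase:

    # Per-phase defect removal efficiency: defects found in a phase divided by
    # that count plus the defects that escape to the next phase.
    phases = [
        ("Requirements", 10), ("Design", 20), ("Implementation", 5),
        ("Unit Testing", 50), ("Integration Testing", 100),
        ("System Testing", 250), ("Acceptance Testing", 5), ("By Customer", 10),
    ]

    for (name, found), (_, found_next) in zip(phases, phases[1:]):
        print(f"{name}: {found / (found + found_next):.0%}")
    # Requirements: 33%, Design: 80%, Implementation: 9%, Unit Testing: 33%,
    # Integration Testing: 29%, System Testing: 98%, Acceptance Testing: 33%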
Metrics for Small Organizations

- time (hours or days) elapsed from the time a request is made until evaluation is complete, t_queue
- effort (person-hours) to perform the evaluation, W_eval
- time (hours or days) elapsed from completion of evaluation to assignment of the change order to personnel, t_eval
- effort (person-hours) required to make the change, W_change
- time (hours or days) required to make the change, t_change
- errors uncovered during the work to make the change, E_change
- defects uncovered after the change is released to the customer base, D_change
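A minimal sketch (Python) of how a small organization might derive a few indicators from these raw measures. The change-request records are invented, and the per-change DRE simply reuses E_change and D_change in the DRE formula from earlier.

    # Illustrative change-request records; keys mirror the measures listed above.
    changes = [
        {"t_queue": 3, "W_eval": 4, "t_eval": 1, "W_change": 12, "t_change": 4, "E_change": 3, "D_change": 1},
        {"t_queue": 1, "W_eval": 2, "t_eval": 2, "W_change": 20, "t_change": 6, "E_change": 5, "D_change": 0},
        {"t_queue": 5, "W_eval": 6, "t_eval": 1, "W_change": 8,  "t_change": 3, "E_change": 2, "D_change": 2},
    ]

    def average(key):
        return sum(c[key] for c in changes) / len(changes)

    print("avg days queued before evaluation:", round(average("t_queue"), 1))   # 3.0
    print("avg person-hours per change:", round(average("W_change"), 1))        # 13.3

    errors = sum(c["E_change"] for c in changes)    # found before release of the change
    defects = sum(c["D_change"] for c in changes)   # found after release
    print(f"DRE for changes: {errors / (errors + defects):.0%}")                # 77%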
Establishing a Metrics Program

- Set goals
  - Identify your business goals.
  - Identify what you want to know or learn.
  - Identify your subgoals.
  - Identify the entities and attributes related to your subgoals.
  - Formalize your measurement goals.
- Determine indicators for the goals
  - Identify quantifiable questions and the related indicators that you will use to help you achieve your measurement goals.
  - Identify the data elements that you will collect to construct the indicators that help answer your questions.
- Define measurements
  - Define the measures to be used, and make these definitions operational.
  - Identify the actions that you will take to implement the measures.
  - Prepare a plan for implementing the measures.
Metrics give you information!

- Metrics about your process help you determine whether you need to make changes or whether your process is working.
- Metrics about your project do the same thing.
- Metrics about your software can help you understand it better and see where possible problems may lurk. Let's look at the complexity measurement (after a few questions…).
questions…)
These courseware materials are to be used in conjunction with Software Engineering: A Practitioner’s Approach, 6/e and are provided
with permission by R.S. Pressman & Associates, Inc., copyright © 1996, 2001, 2005
36
Questions





What are some reasons NOT to use lines of code to
measure size?
What do you expect the DRE rate will be for the
implementation (or construction) phase of the software
lifecycle?
What about for testing?
Give an example of a usability metric?
According to the chart, Smalltalk is much more efficient
than Java and C++. Why don’t we use it for everything?