The Mystery Behind Metrics

Metrics
Workshop
ReedShell@Yahoo.com
Contents
Topic 1: Basics of IT Metrics
Topic 2: The 10 Step Process
Topic 3: Project Management Metrics – Planning
Topic 4: Project Life Cycle Metrics
Measurements

“When you can measure what you are speaking about and express it in numbers, you know something about it.”

“Without the right information, you are just another person with an opinion.”
Topic 1: Basics of IT Metrics

What is Data?
• A collection of facts.
• Example: Portland is 307 miles away.

What is Information?
• Meaningful data — an inference drawn from the facts that the data represents.
• Example: “It’s a long drive to Portland” is information inferred from the data that Portland is 307 miles away.

What are Metrics?
• Metrics establish the relationship between individual sets of data.
• Example: At a speed of 75 mph, it will take roughly 4 hours to drive 307 miles.
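The slide's drive-time metric is just an arithmetic relationship between two data points; a minimal sketch in Python using the example's values:

```python
# Data: individual facts from the slide's example
distance_miles = 307  # distance to Portland
speed_mph = 75        # assumed average driving speed

# Metric: a relationship between the two pieces of data
drive_hours = distance_miles / speed_mph
print(f"Estimated drive time: {drive_hours:.1f} hours")  # roughly 4 hours
```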
IT Metrics

“You cannot manage what you don’t measure.”
‐ Drucker

Measurement-based techniques applied to IT processes, products, and services to supply and improve engineering and management information.
Where are IT metrics found?
• Project cost and effort
• ROI and cost of operations
• Performance evaluation
• Quality control and evaluation
• Reliability predictions
• Specific measures and algorithms

Metrics enable an organization to constantly improve its quality standards to match global standards and benchmarks. They also help achieve project-specific goals, improve existing processes, and determine corrective or preventive measures.
Metrics Used at Different Levels
Organizational Level
• Track: performance against industry benchmarks; relative growth of the organization; sequential growth of the organization
• Output: new business goals; revised organizational norms

Business Unit / Vertical Level
• Track: performance against industry benchmarks; relative growth of the organization; sequential growth of the organization
• Output: new business goals; revised organizational norms

Project Level
• Track: project progress against schedule; costs incurred against budget; adherence to quality standards
• Output: tactical decisions to improve the health of the project
Why Measure?
To Understand
• Gain an understanding of processes, products, and resources, and establish baselines for comparison

To Evaluate
• Determine status with respect to plans

To Predict
• The success of any organization depends on being able to make predictions and commitments about the deliverables it produces

To Control
• Identify roadblocks, root causes, inefficiencies, and other opportunities for improving product quality and process performance

The Law of Uncertainty
The very act of measuring alters the results for the better.
Roles of Measurement
This information can be used to:
• Establish baselines, standards, and goals
• Derive models of our IT processes
• Examine relationships among process parameters
• Target process, product, and service improvement efforts
• Better estimate project effort, costs, and schedules

Understand – Evaluate – Control – Predict: processes, products, and services
Understand
Gather metrics to learn about our current products, processes, and capabilities:
• How much are we spending in development?
  o Testing, maintenance, and operations
• How much does it cost us to fix an operations-reported defect?
• Where do we allocate and use resources throughout the life cycle?
• How many errors, and what type of errors, are typical in our products and services?
Evaluate
Examine and analyze metrics information as part of decision making:
• Perform a reliability analysis of defect rates during system test to determine if the product is ready to ship
• Evaluate how much we can compress the schedule if we add two more resources
• Perform a cost-benefit analysis on the purchase of a new tool
• Determine the impacts of requirements changes
• Analyze and prioritize identified risks
Control
Use metrics to control processes and product quality:
• Red flags
  ◦ Variances
  ◦ Thresholds
  ◦ Control limits
  ◦ Standards and performance requirements
• Contingency plan triggers for risk management
Predict
Use metrics to estimate future values:
• How much will it cost? (Budget)
• How long will it take? (Schedule)
• How much effort will it take? (Staffing)
• What other resources will it take? (Resources)
• What is the probability that something will go wrong? (Risk)
• How many errors will it have? (Quality)
• How will those errors impact operations? (Reliability)
Effective IT Metrics
Provide objective information to help:
• Make day-to-day decisions
• Identify project issues
• Correct existing problems
• Manage risks
• Evaluate performance levels
• Assess the impact of changes
• Accurately estimate and track effort, costs, and schedules
• Improve processes, products, and services
Measurement Defined
Measurement maps an entity’s attributes (features and properties) onto numbers and symbols.
• Internal attribute: measured in terms of the entity itself
• External attribute: measured in terms of how the entity relates to its environment
Entities & Attributes: Resources
Entity     | Internal Attributes       | External Attributes
Individual | Age; Intelligence; Salary | Productivity; Experience
Team       | Size; Structure           | Cohesion; Ease of communication
Computers  | MIPS; Memory size         | Usability; Performance
Network    | Bandwidth                 | Utilization; Availability; Performance
Entities & Attributes: Product
Entity        | Internal Attributes                                      | External Attributes
Document      | Length; Functionality; Modularity; Syntactic correctness | Understandability; Usability; Maintainability
Code          | Size; Cohesion; Structural complexity                    | Reliability; Portability; Usability; Maintainability
Defect Report | Severity; Priority; Age                                  | Impact on the customer
Entities & Attributes: Process
Entity                 | Internal Attributes                                            | External Attributes
Process or Time Period | Length of time; Effort; Number of incidents of a specific type | Controllability; Stability; Effectiveness; Efficiency
Measurement Error
Errors in measurement can result from:
• Imperfect definition of what we are measuring
• Imperfect understanding of how to convert what we are measuring into the measurement signal
• Failure to establish the correct conditions for taking the measurement
• Human error
• Defective or deficient measurement instruments (poor calibration)

Validity vs. Reliability
• Data should be both valid and reliable
Measurement Models
SEI SW-CMM – Software Engineering Institute (SEI) Capability Maturity Model (CMM)
• Purpose is to develop and refine an organization’s processes

CMMI – Capability Maturity Model Integration

IFPUG – International Function Point Users Group
• Standardized function point counting criteria
• Certified Function Point Analyst
The Balanced Scorecard
A report used by managers to track the execution of activities and monitor their consequences.

Two objectives:
• Track business performance over time in relation to strategic goals
• Assess performance in relation to world-class organizations performing similar activities
The Balanced Scorecard
The challenge for management: achieving a balance of internal controls and diagnostic indicators to ensure accountability by
• Specifying goals and desired outcomes
• Identifying key performance measures
• Continually measuring and assessing progress
• Creating a culture of “evidence,” where decisions are fact based
The Balanced Scorecard
The four perspectives of the Balanced Scorecard:
1. Financial
2. Customer
3. Internal Business Processes
4. Learning and Growth
The Balanced Scorecard
Internal Business Processes
Identifying the market’s need → Creation of the product/service offering → Building of products/services → Delivery of products/services → Support and operations

Standard measures:
• Number of new applications developed
• ROI from new applications
• Time to deliver vs. competitors
• Time to develop next-generation products and services
Strategy of Metrics Practices
Org Level
• Focus: strategy, policy, decisions
• Measured by: business results; strategy plan; industry benchmarks; customer satisfaction; SEE index; process compliance; tools effectiveness

Business Unit Level
• Focus: manage, improve
• Measured by: process capability; process improvement; resource utilization; customer satisfaction; SEE index; process compliance; tools effectiveness

Project Level
• Focus: tactical execution
• Measured by: resources and cost; schedule and progress; risk management; quality; customer satisfaction
Topic 2: 10 Step Process
Workshop
Topic 2: 10 Step Process
10 Steps for Effective Metrics
1. Identify Metrics Customer
2. Identify Goals
3. Define Metrics
4. Identify Data to Collect
5. Define Data Collection Methods
6. Data Analysis and a Metrics Database
7. Define Reporting and Feedback Mechanism
8. Document the Metrics Process
9. Human Aspects of Metrics
10. Continuously Improve the Process
1. Identify Metrics Customer
Challenges
• What data to collect, and how much?
• Who needs the information?
• Classifying customer types
1. Identify Metrics Customer
Challenges
Two ways to collect data:
1. Collect data on everything, then find the meaning
2. Implement a random selection of metrics

Problems
Collect data on everything:
• Metrics are expensive
• Too many possible measures
• Easy to drown in the enormity of the task
Random selection of metrics:
• Incomplete
• Inconsistent
• Inadequate
1. Identify Metrics Customer
Types of Customers
• Functional management
• Project management
• Practitioners
• Specialists
• Customers/users
2. Identify Goals
GQM – the Goal/Question/Metric method:
1. Identify the goal
2. Identify the questions that need answering to determine if the goal is being met
3. Identify the metrics that provide the information necessary to answer the questions
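The GQM hierarchy lends itself to a simple nested structure; a minimal sketch in Python, populated with the defect-tracking example used later in this topic:

```python
# GQM: a goal maps to questions, and each question to the metrics that answer it
gqm = {
    "goal": "Ensure all known defects are corrected before release",
    "questions": [
        {
            "question": "How many defects were discovered during testing?",
            "metrics": ["Number of defects discovered",
                        "Defect arrival rate by status"],
        },
    ],
}

# Walk the hierarchy top-down, the same direction GQM is derived
for q in gqm["questions"]:
    print(f"Q: {q['question']}")
    for m in q["metrics"]:
        print(f"   Metric: {m}")
```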
2. Identify Goals
Asking Questions
The goal is to maintain a high level of customer satisfaction. Appropriate questions:
• What is our current level of customer satisfaction?
• What attributes of our products and services are most important to our customers?
• How do we compare with our competition?
• How do problems with our services affect our customers?
2. Identify Goals
Example: select metrics that provide the information needed to answer the questions.

Goal: Ensure all known defects are corrected before release
Question: How many defects were discovered during testing?
Metrics:
• Number of defects discovered
• Defect arrival rate by status
3. Define Metrics
When selecting metrics:
• Be practical, realistic, pragmatic
• Avoid the “ivory tower” approach
• Consider the current engineering environment
• Start with the possible
3. Define Metrics
Process Stages for questions and metrics
1. New process
2. Implemented process
3. Improving the process
3. Define Metrics
1. New Process (Low Maturity)
Goal: Perform peer reviews

Question                  | Metric
Are we doing reviews?     | Number of reviews; total effort
What are we reviewing?    | Coverage %
When are we reviewing?    | Number of reviews by phase
3. Define Metrics
2. Implemented Process (High Maturity)
Goal: Effectively perform reviews

Question                                  | Metric
What are the optimal review parameters?   | Detected defect density vs. preparation rate, review rate, review size, and number of attendees
How well are we performing reviews?       | Defect discovery rate; control charts (prep rate, review rate, defect density); phase containment
How well are reviews working?             | Phase containment; defect removal effectiveness; field review defect trends; ROI
3. Define Metrics
3. Improving the Process
Goal: Use the data from reviews to drive process improvement

Question                                     | Metric
During what phase are defects introduced?    | Phase containment effectiveness
What type of defects are being introduced?   | Pareto analysis (80% of effects come from 20% of the causes), by reason and category
Are the defects being removed?               | High-defect programs; defect removal efficiency
3. Define Metrics
Metrics Requirements Statement
The purpose of the metric is…

To [understand | evaluate | control | predict] the [attribute] of the [entity] in order to [goal(s)].

Example: To evaluate the number, by status, of the known defects in order to ensure all known defects are corrected before the next release.
3. Define Metrics
Metrics Requirements Statement
Benefits of a clearly defined metrics requirement:
• Ensures a well-defined metric based on goals
• Eliminates misunderstandings
• Communicates needs
• Provides a basis for metrics design
4. Identify Data to Collect
Challenges
• Establishing counting criteria
  o Standardized units of measure
  o Interpreting the measure the same way
• Determining additional qualifiers
  o Data classification for meaningful interpretation
4. Identify Data to Collect
Establish Counting Criteria
• Define rules for mapping numbers and symbols onto attributes
• Help everyone interpret the measure the same way
• Define the first level of data to collect (metrics primitives)

Types of counting criteria:
• Standardized units of measure (hours, days, $)
• Counts of items with certain characteristics
• Check sheets
5. Define Data Collection Methods
Challenges
• What data to collect?
• Who should collect it?
• How to collect the data?
• What should be done with inaccurate and incomplete data?
5. Define Data Collection Methods
What data to collect?
• Metric primitives and their counting criteria
• Additional qualifiers

Who should collect it?
• The performer (direct access to the source data; responsible for generating the data)

How to collect the data?
• Automate
5. Define Data Collection Methods
Inaccurate Data
• If the data is accurate enough for the requirements, keep it
• Reporting inaccurate data can help make it accurate

Incomplete Data
• Estimate the entire dataset from the existing data using multiplication factors in the metric’s algorithm
6. Data Analysis
Challenges
• Establishing criteria for metrics decisions based on:
  1. Thresholds
  2. Variances
  3. Control limits
• Establishing entry/exit criteria for making decisions

Criteria for Metrics Decisions
• Thresholds, targets, or patterns should be used to:
  o Determine the need for action or further investigation
  o Describe the level of confidence in a given result
• Provide guidance for interpreting the metrics results
6. Data Analysis
1. Thresholds
Thresholds are established boundaries that, when crossed, indicate action is needed. Thresholds may be established on:
• Historic data
• Future predictions
• Customer requirements
• Industry standards and/or benchmarks
6. Data Analysis
2. Variances
Variances compare actual values with expected values. Decisions are based on the magnitude of the difference. Variances are typically calculated as:
• A ratio
• An absolute delta
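As a minimal sketch, here are the two variance forms applied to an effort metric (the planned and actual values are hypothetical):

```python
planned_effort = 120.0  # staff-days (hypothetical plan)
actual_effort = 138.0   # staff-days (hypothetical actuals)

ratio_variance = actual_effort / planned_effort  # 1.15 -> 15% over plan
absolute_delta = actual_effort - planned_effort  # 18 staff-days over plan

print(f"Ratio variance: {ratio_variance:.2f}")
print(f"Absolute delta: {absolute_delta:.0f} staff-days")
```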
6. Data Analysis
3. Control Limits
Purpose: to control an attribute of a process over time.
6. Data Analysis
Entry/Exit Criteria
Entry/exit criteria are specific, measurable conditions that must be met before a process can be started/completed.

System test example:
• System test effort variance
• System test activity status
• Problem report arrival rate
• Non-closed problem report counts
6. Data Analysis
Example: Decision Criteria
Metric: cumulative defects by status
Decision criteria for non-closed defects at “Ready to Ship”:
• Zero non-closed critical defects
• Fewer than 3 non-closed major defects, all with customer-approved workarounds
• Fewer than 10 non-closed minor defects
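Decision criteria like these can be checked mechanically; a minimal sketch (the defect counts passed in are hypothetical):

```python
def ready_to_ship(critical, major, minor, majors_have_workarounds):
    """Apply the slide's ship-readiness criteria for non-closed defects."""
    return (critical == 0
            and major < 3 and majors_have_workarounds
            and minor < 10)

# Hypothetical status at the ship-readiness review
print(ready_to_ship(critical=0, major=2, minor=7,
                    majors_have_workarounds=True))   # criteria met
print(ready_to_ship(critical=1, major=0, minor=0,
                    majors_have_workarounds=True))   # a critical defect remains
```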
7. Define Reporting & Feedback Mechanism
Challenges
• Importance of report format
• Timing
• Methods of delivery
7. Define Reporting & Feedback Mechanism
Data Table
• Compact and structured
• Many details in a small area
• Only values – must be evaluated

           Sep 11 | Oct 11 | Nov 11 | Dec 11
Open         20   |   13   |   31   |   13
Fixed        21   |   24   |   22   |   18
Resolved     13   |   31   |   27   |   31

Pie Chart
• Shows how a total breaks down into parts

Line Graph
• Shows trends over time
7. Define Reporting & Feedback Mechanism
Bar Chart
• Shows numbers, ratios, or proportions
• Quantitative or discrete variables

Grouped Bar Chart
• Comparison of two or more categories
• Easy to pick out differences

Stacked Bar Chart
• Shows how a total breaks into parts
• Height of the bar corresponds to frequency
7. Define Reporting & Feedback Mechanism
Bar Chart
• Shows numbers, ratios, or proportions
• Quantitative or discrete variables

Box Plot
• Shows variance
• Differences within a group and differences between groups
7. Define Reporting & Feedback Mechanism
Delivery
• Mechanism – hard copy or electronic
• Distribution – target audience
• Availability – who can access? Are there any restrictions?
8. Document the Metrics Process
Challenges
• Understanding the process
• Mapping the process
• Documenting the process
• Training and implementation
8. Document the Metrics Process
Documenting the metrics process steps:
1. Develop the process maps
2. Identify metrics
3. Provide standard definitions of the metrics
4. Identify places to measure
5. Define data collection methods
6. Define analysis methods and decision criteria
7. Define the reporting and feedback mechanism
9. Human Aspects of Metrics
The people side of the metrics equation:
1. How measures affect people
2. How people affect measures

“Tell me how I will be measured, and I’ll tell you how I will behave.”
- David Packard
9. Human Aspects of Metrics
Human Factors: Do’s and Don’ts

Do’s                                       | Don’ts
Select metrics based on goals              | Measure individuals
Provide feedback                           | Use metrics as a stick
Focus on processes, products, and services | Use only one metric
Obtain “buy-in”                            | Ignore the data

Introduce Metrics Incrementally
1. Minimizes the impact of change
2. Ensures staff are not overwhelmed
3. Small successes create a foundation for acceptance of future metrics
4. Lessons are learned one step at a time
10. Metrics in Process Improvements
Product and Process Improvement Metrics

Goals:
• Identify areas for continuous improvement
• Monitor the impact of those improvements

Questions:
• What are our defect-prone products and processes?
• How well are we detecting defects?
• How well are we preventing defects?

Metrics:
• Defect-prone components
• Phase containment effectiveness
• Defect detection efficiency
• First pass yield
10. Metrics in Process Improvements
Phase Containment Effectiveness
Used to evaluate the effectiveness of the defect detection techniques used during each phase of the development life cycle, in order to identify areas for continuous improvement and to monitor the impact of those improvements.

Example:
Phase Detected | Introduced in Requirements | Introduced in Design | Introduced in Code
Requirements   | 15                         | -                    | -
Design         | 5                          | 29                   | -
Code           | 1                          | 7                    | 86
Test           | 3                          | 3                    | 19
Field          | 1                          | 2                    | 7
Total          | 25                         | 41                   | 112

Requirements: 15 / 25 = 60%
Design: 29 / 41 = 71%
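The phase containment calculation is the diagonal of the table divided by its column total; a minimal sketch using the slide's example counts:

```python
# Defects keyed by the phase they were introduced in, then by the
# phase in which they were detected (the slide's example counts)
detected = {
    "Requirements": {"Requirements": 15, "Design": 5, "Code": 1, "Test": 3, "Field": 1},
    "Design":       {"Design": 29, "Code": 7, "Test": 3, "Field": 2},
    "Code":         {"Code": 86, "Test": 19, "Field": 7},
}

def phase_containment(phase):
    """Fraction of defects introduced in a phase that were caught in that same phase."""
    counts = detected[phase]
    return counts[phase] / sum(counts.values())

print(f"Requirements: {phase_containment('Requirements'):.0%}")  # 60%
print(f"Design: {phase_containment('Design'):.0%}")              # 71%
```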
10. Metrics in Process Improvements
First Pass Yield
Used to evaluate the effectiveness of the prevention techniques used during each phase of the life cycle, in order to identify areas for continuous improvement and to monitor the impact of those improvements.

Example – requirements modified, by phase:
Requirements: 18 | Design: 4 | Code: 5 | Test: 3 | Post-Implementation: 1

Total number of requirements: 100
Requirements first pass yield: 87/100 = 87%
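A sketch of the slide's first pass yield arithmetic, assuming (as the example's 87% implies) that only modifications made after the requirements phase count against the yield:

```python
# Requirements modified in each phase (the slide's example counts)
modified_by_phase = {
    "Requirements": 18,  # modified while still in the requirements phase
    "Design": 4, "Code": 5, "Test": 3, "Post-Implementation": 1,
}
total_requirements = 100

# Modifications made downstream of the requirements phase
modified_later = sum(n for phase, n in modified_by_phase.items()
                     if phase != "Requirements")

first_pass_yield = (total_requirements - modified_later) / total_requirements
print(f"Requirements first pass yield: {first_pass_yield:.0%}")  # 87%
```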
Topic 3: Project
Management Metrics:
Planning
Workshop
Topic 3: Project Management Metrics: Planning
Estimation
• Size
  o Lines of code
  o Function points
  o Other size metrics
• Effort
• Cost
• Calendar time
• Critical computer resources

Goal: Deliver the product on time and within budget
Question: How big is the product we are producing?
Metrics: Lines of code; function points; other size metrics
Topic 3: Project Management Metrics: Planning
Estimation: Size – Lines of Code
Measuring size in lines of code:
• Problems and anomalies
• SEI report CMU/SEI-92-TR-20
  o Check sheets for counting criteria
  o Physical and logical lines of code
• Recommendation: use a tool
Topic 3: Project Management Metrics: Planning
Estimation: Size – Function Points
5 steps:
1. Decide what to count
2. Identify the application boundary
3. Count the function points
4. Calculate the Value Adjustment Factor
5. Calculate the Adjusted Function Point Count

Function point components cross the application boundary between the project’s software and other software or hardware:
• External Input
• External Output
• External Inquiry
• External Interface File
• Internal Logical File
Topic 3: Project Management Metrics: Planning
Estimation: Size – Function Points
Step 3: Count Function Points

User Function Type      | Low | Average | High
External Input          | x3  | x4      | x6
External Output         | x4  | x5      | x7
External Inquiry        | x3  | x4      | x6
Logical Internal File   | x7  | x10     | x15
External Interface File | x5  | x7      | x10

Weighting External Inputs
File Types Referenced | 1-4 Data Elements | 5-15 Data Elements | 16+ Data Elements
<2                    | Low – 3 FP        | Low – 3 FP         | Avg – 4 FP
2                     | Low – 3 FP        | Avg – 4 FP         | High – 6 FP
>2                    | Avg – 4 FP        | High – 6 FP        | High – 6 FP
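The External Input weighting table is a straightforward two-key lookup; a minimal sketch of that table as code:

```python
def external_input_fp(file_types_referenced, data_elements):
    """Return the FP weight for one External Input per the slide's table."""
    # Column: band of data elements (1-4, 5-15, 16+)
    if data_elements <= 4:
        col = 0
    elif data_elements <= 15:
        col = 1
    else:
        col = 2
    # Row: number of file types referenced (<2, 2, >2)
    if file_types_referenced < 2:
        row = [3, 3, 4]
    elif file_types_referenced == 2:
        row = [3, 4, 6]
    else:
        row = [4, 6, 6]
    return row[col]

print(external_input_fp(1, 3))   # Low: 3 FP
print(external_input_fp(3, 20))  # High: 6 FP
```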
Topic 3: Project Management Metrics: Planning
Estimation: Size – Function Points
Step 4: Calculate the Value Adjustment Factor
14 factors, each rated on a 0-5 scale of degrees of influence:
1. Data communications
2. Distributed data or processing
3. Performance objectives
4. Heavily used configuration
5. Transaction rate
6. On-line data entry
7. End-user efficiency
8. On-line update
9. Complex processing
10. Reusability
11. Conversion and installation ease
12. Operational ease
13. Multiple-site use
14. Facilitate change
Topic 3: Project Management Metrics: Planning
Estimation: Size – Function Points
Step 5: Calculate the Adjusted Function Point Count
• Value Adjustment Factor: VAF = 0.65 + (0.01 × Σ degrees of influence)
• Adjusts the raw function point count by up to ±35%
• Adjusted function points = raw function point count × VAF
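The VAF formula above can be sketched in a few lines; the 14 factor scores and the raw count below are hypothetical:

```python
# One 0-5 score per general system characteristic (hypothetical scores)
degrees_of_influence = [3, 2, 4, 1, 3, 5, 4, 3, 2, 1, 0, 3, 2, 4]
assert len(degrees_of_influence) == 14

# VAF = 0.65 + 0.01 * (sum of degrees of influence)
vaf = 0.65 + 0.01 * sum(degrees_of_influence)

raw_fp = 120  # hypothetical unadjusted function point count
adjusted_fp = raw_fp * vaf
print(f"VAF = {vaf:.2f}, adjusted FP = {adjusted_fp:.1f}")
```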
Topic 3: Project Management Metrics: Planning
Estimation: Size – Other Size Metrics
• Object-oriented development size:
  o Number of objects
  o Number of classes
  o Number of methods
• Requirements size: number of unique requirements
• Design size: number of unique design elements (configuration items, subsystems, modules)
• Documentation size:
  o Number of pages or words
  o Number of figures (tables, charts, graphs)
• Test size: number of test cases
Topic 3: Project Management Metrics: Planning
Estimation: Effort, Cost, and Schedule Metrics
Goal: Deliver the product on time and within budget
Questions: How long will the project take? How much will the project cost?
Metrics: Effort estimates; cost estimates; schedule estimates
Topic 3: Project Management Metrics: Planning
PERT Method
Three estimates define a beta probability distribution:
• a = optimistic estimate
• m = most likely estimate
• b = pessimistic estimate

e = (a + 4m + b) / 6   (mean, the expected value)
σ = (b − a) / 6        (standard deviation)
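The PERT formulas applied to a single task; a minimal sketch with hypothetical three-point estimates:

```python
# Hypothetical three-point estimates for one task, in staff-days
a, m, b = 4.0, 7.0, 16.0  # optimistic, most likely, pessimistic

expected = (a + 4 * m + b) / 6  # mean of the beta distribution
sigma = (b - a) / 6             # standard deviation

print(f"Expected duration: {expected:.1f} days, sigma = {sigma:.1f}")
```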
Topic 3: Project Management Metrics: Planning
Risk/Reward Balance
Perceived Risk
• Probability of the problem
• Loss associated with the problem
Perceived Reward
• Probability of the reward
• Benefit associated with the reward
Topic 3: Project Management Metrics: Planning
Risk Management Process
Identify risks → Analyze (information, prioritized risks) → Plan (risk management plan, containment actions) → Track (status, new/closed risks, trigger contingency actions)
Topic 3: Project Management Metrics: Planning
Risk Management Metrics
Goal: Deliver the product on time and within budget
Questions:
• What is the probability of something going wrong?
• What is the impact if something does go wrong?
• Which risk mitigation actions should be implemented?
Metrics: Risk probability; risk loss; risk exposure; risk reduction leverage
Topic 3: Project Management Metrics: Planning
Risk Exposure (RE) = Probability × Loss

Risk # | Probability | Loss  | Risk Exposure (RE)
1      | 10%         | $700K | $70K
2      | 25%         | $300K | $75K
3      | 3%          | $100K | $3K

Risk Reduction Leverage (RRL) = (RE before − RE after) / cost of mitigation

Before mitigation: Prob (UO) = 25%, Loss (UO) = $300K, RE = $75K

Alternative | Prob (UO) after | Loss (UO) after | RE after | Cost | RRL
1           | 5%              | $300K           | $15K     | $65K | 0.9
2           | 25%             | $160K           | $40K     | $25K | 1.4
3           | 20%             | $300K           | $60K     | $2K  | 7.5
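The exposure and leverage formulas, reproduced with the slide's example values:

```python
# Risk exposure: probability of the unsatisfactory outcome times its loss
def risk_exposure(probability, loss):
    return probability * loss

re_before = risk_exposure(0.25, 300_000)  # $75K before mitigation

# Risk reduction leverage: exposure reduction bought per dollar of mitigation cost
def rrl(re_after, cost):
    return (re_before - re_after) / cost

print(rrl(re_after=15_000, cost=65_000))  # alternative 1, roughly 0.9
print(rrl(re_after=60_000, cost=2_000))   # alternative 3, 7.5
```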
Topic 3: Project Management Metrics: Planning
Project Performance Metrics
Examples of model-based estimation methods:
• Earned value
• Estimates vs. actuals:
  o Size
  o Cost
  o Schedule
  o Productivity
• Resource utilization and staff turnover
Topic 3: Project Management Metrics: Planning
Budget and Schedule Tracking Methods
Goal: Deliver the product on time and within budget
Questions and metrics:
• What is the status of the project? → Earned value
• Are we meeting the budget? → Cost performance index
• Are we on schedule? → Schedule performance index; schedule Gantt
Topic 3: Project Management Metrics: Planning
Earned Value Tracking Procedure
1. Determine the critical resources to be tracked (dollars, staff-months)
2. Allocate the resource budget to individual tasks in the work breakdown structure
3. When a task is complete, the associated budget has been earned back
4. Compare the earned value for all completed tasks to the total actual expenditures to date
• If earned value < actual expenditures, the project is over budget
• If earned value < budgeted expenditures scheduled to date, the project is behind schedule
Topic 3: Project Management Metrics: Planning
Earned Value Metrics
• ACWP = Actual cost of work performed
• BCWS = Budgeted cost of work scheduled
• BCWP = Budgeted cost of work performed (earned value)
• Cost variance = BCWP − ACWP
• Cost performance index (CPI) = BCWP / ACWP
• Schedule variance = BCWP − BCWS
• Schedule performance index (SPI) = BCWP / BCWS
Topic 3: Project Management Metrics: Planning
Task   | Status      | Schedule | Budget  | Actual
Task A | Done        | Done     | 3 days  | 4 days
Task B | Started     | Done     | 12 days | 2 days
Task C | Done        | Done     | 9 days  | 9 days
Task D | Not Started | Not Done | 5 days  | -
Task E | Done        | Not Done | 8 days  | 6 days

ACWP = 4 + 9 + 6 = 19 days (actuals for completed tasks)
BCWS = 3 + 12 + 9 = 24 days (budgets for tasks scheduled to be done)
BCWP = 3 + 9 + 8 = 20 days (budgets earned for completed tasks)
Cost variance = 20 − 19 = 1 day
CPI = 20/19 = 105%
Schedule variance = 20 − 24 = −4 days
SPI = 20/24 = 83%
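The worked example above can be sketched as code, following the slide's convention that only completed tasks earn value and contribute to ACWP:

```python
# (budget, actual, done, scheduled_to_be_done) per task; actual is None
# for tasks not yet started (the slide's five-task example)
tasks = {
    "A": (3, 4, True, True),
    "B": (12, 2, False, True),   # started but not complete
    "C": (9, 9, True, True),
    "D": (5, None, False, False),
    "E": (8, 6, True, False),    # done ahead of schedule
}

acwp = sum(a for _, a, done, _ in tasks.values() if done)    # actual cost of completed work
bcws = sum(b for b, _, _, sched in tasks.values() if sched)  # budget for scheduled work
bcwp = sum(b for b, _, done, _ in tasks.values() if done)    # earned value

print(f"ACWP={acwp}, BCWS={bcws}, BCWP={bcwp}")     # 19, 24, 20
print(f"CPI={bcwp/acwp:.0%}, SPI={bcwp/bcws:.0%}")  # 105%, 83%
```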
Topic 3: Project Management Metrics: Planning
Resource & Staff Tracking Metrics
Goal: Deliver the product on time and within budget
Questions and metrics:
• Do we have the resources we need? → Resource utilization
• Do we have the staff we need? → Staff turnover
• Are we meeting our productivity projections? → Productivity
Topic 3: Project Management Metrics: Planning
[Chart: number of resources over time – initial plan vs. actuals, with the variance at the current date]
Topic 3: Project Management Metrics: Planning
Productivity Metrics
Examples of productivity metrics based on the type of work being done:

Type of Work                                      | Productivity Metric
Eliciting, analyzing, and specifying requirements | # of requirements / effort
Writing user documentation                        | # of pages / effort
Developing a software component                   | Function points / effort; LOC / effort
Testing                                           | Test cases executed / effort; test cases passed / effort; defects found / effort
Topic 4: Project Lifecycle
Metrics
Workshop
Project Life Cycle
1.
2.
3.
4.
Requirements
Design
Coding
Testing
Topic 4: Project Life Cycle Metrics
Requirements Metrics
Goals, questions, and metrics:
• To have well-defined, stable requirements
  o How many requirements are there? → Requirements size
  o How stable are the requirements? → Scope creep
• To have high-quality requirements
  o What is the quality of the requirements? → Requirements defect density
• To keep the requirements-writing process on schedule
  o What is the current status of the activities? → Requirements activity status
Topic 4: Project Life Cycle Metrics
Requirement Size
Metric: current requirements size
Model: count of unique requirements at the current time

Metric: baselined requirements size
Model: count of unique requirements when the requirements specification was baselined

• Requires unique identification of each requirement
• Used to monitor scope creep and its effect on the project
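Scope creep follows directly from the two size metrics; a minimal sketch with hypothetical counts:

```python
# Hypothetical unique-requirement counts
baselined_size = 180  # when the specification was baselined
current_size = 207    # now

# Scope creep: growth relative to the baseline
scope_creep = (current_size - baselined_size) / baselined_size
print(f"Scope creep: {scope_creep:.0%}")  # 15%
```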
Topic 4: Project Life Cycle Metrics
Requirements Churn Metric – Example
[Chart: percent change per week over the 12 weeks since the start of the General Design phase, plotted against an upper control limit (UCL)]
Topic 4: Project Life Cycle Metrics
Design Metrics
Goals, questions, and metrics:
• Ensure that all requirements are addressed in the design
  o Can requirements be traced? Were all requirements traced? → Percent of requirements traced
• Improve design quality
  o What is the quality of our current design? → Design defect density
• To keep the software design process on schedule
  o What is the current status of the activities? → Design activity status
Topic 4: Project Life Cycle Metrics
Requirement Traceability
Traceability: the ability to identify the relationships between originating requirements and their resulting system features and tests.

Requirements → Design → Code → Tests
Topic 4: Project Life Cycle Metrics
Requirement Traceability – Design Completeness
Metrics:
• Percent of functional requirements traced
• Percent of design elements traced

Models:
• Number of functional requirements traced / total number of functional requirements
• Number of design elements traced / total number of design elements
Topic 4: Project Life Cycle Metrics
Metric: Design Defect Density
Model: number of design defects / number of function points
[Chart: design defects per function point over the weeks since the start of the General Design phase]
Topic 4: Project Life Cycle Metrics
Code Metrics
Goals, questions, and metrics:
• To have a quality product
  o What is the product quality? → Code defect density; document defect density
• To minimize unnecessary effort
  o Has code been re-used where possible? → Reuse
• To keep the software coding process on schedule
  o What is the current status of the activities? → Code activity status
Topic 4: Project Life Cycle Metrics
Defect Density Indicator
[Chart: defect density (defects/kLOC) for modules 1–20]

Decision criteria:
• Modules with defect density > 10 are re-reviewed
• Modules with defect density > 20 are investigated as candidates for reengineering
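Applying these decision criteria to per-module data is a simple filter; a minimal sketch (the defects/kLOC values are hypothetical):

```python
# Hypothetical defect density (defects/kLOC) by module number
density_by_module = {1: 4.2, 2: 12.5, 3: 8.0, 4: 23.1, 5: 6.7}

# The slide's two thresholds
re_review = [m for m, d in density_by_module.items() if d > 10]
reengineer = [m for m, d in density_by_module.items() if d > 20]

print(f"Re-review: {re_review}")    # modules over 10 defects/kLOC
print(f"Reengineer: {reengineer}")  # modules over 20 defects/kLOC
```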
Topic 4: Project Life Cycle Metrics
Metric: Reuse
Model: size of design (code) reused / total size of design (code)
Topic 4: Project Life Cycle Metrics
Metric: Code Activity Status (planned vs. actual)
[Chart: number of units coded, inspected, and tested per week since the start of the coding phase]
Topic 4: Project Life Cycle Metrics
Test Metrics: Test Sufficiency
Perceived risk:
• Probability of undiscovered problems
• Loss associated with the problem
Costs of more testing:
• Cost of continued testing
• Benefits of more testing

Exit criteria: measurable conditions that must be met before we consider testing complete. Decision criteria for multiple metrics might be used.

Example – system test sufficiency:
• System test effort variance
• System test activity status
• Problem report arrival rate
• Non-closed problem report counts
• System performance metrics
• Reliability predictions
Topic 4: Project Life Cycle Metrics
Test Metrics
Metric: problem report arrival rate by severity (critical, major, minor) for each week since the start of system test
Topic 4: Project Life Cycle Metrics
Test Metrics
Metric: test activity status (planned vs. actual)
[Chart: problem reports per week since testing started – passed, blocked, and goal]
Conclusion
Workshop
Conclusion
Remember…
• Metrics can only show problems and give ideas about what can be done
• Actions taken as a result of analyzing the data bring the results
• Measurement and metrics collection is not the goal
• The goal is improvement through measurement, analysis, and feedback
Conclusion
Success of Metrics – Typical Challenges
• Focus on collection of data rather than applying the results
• “My organization is changing too fast”
• “Each project is unique”
• “Technology is changing too fast”
• “Project results merely reflect the characteristics of the people on the projects”
• “I don’t care about future projects; I care only about current results”

Each of these objections may have some merit, but on the whole they turn out to be inaccurate perceptions that form a roadblock to improvement.
Conclusion
Success Factors
• Standardize the what, why, and how of metrics:
  o Metrics definitions
  o What data to collect and how to collect it
  o Interpretation and use of the data, with clear responsibility at various organizational levels
• Build metrics dashboards for easy tracking
• Spread awareness of and training on metrics – the rationale, philosophy, necessity, and usage
• Keep a balanced focus on schedule, defects, and productivity
• Give due diligence to the “soft” factors
• Share knowledge and experiences
Conclusion
Success Factors
The Improvement Paradigm:
• Know where you are
• Develop a vision
• Establish improvement actions
• Prioritize and plan
• Mobilize resources and implement
• Analyze for improvement
• Start all over again