Estimation Perspectives

Richard D. Stutzke (SAIC)
Richard.D.Stutzke@saic.com

Presented at the 20th International COCOMO and Software Cost Modeling Forum,
Los Angeles, 25-28 October 2005

© 2005 by Richard D. Stutzke
Topics

• Measurement and Estimation
• Product Size
• Project Cost and Schedule
• Product Performance
• Product Quality
• Process Performance
• Summary
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
2
Reasons to Estimate and Measure

• Product Size, Performance, and Quality
  - Evaluate feasibility of requirements
  - Analyze alternate product designs
  - Determine the required capacity and speed of hardware components
  - Evaluate product performance (accuracy, speed, reliability, availability)
  - Quantify resources needed to develop, deploy, and support a product
  - Identify and assess technical risks
  - Provide technical baselines for tracking and controlling
• Project Effort, Cost and Schedule
  - Determine project feasibility in terms of cost and time
  - Identify and assess project risks
  - Negotiate achievable commitments
  - Prepare realistic plans and budgets
  - Evaluate business value (cost versus benefit)
  - Provide cost and schedule baselines for tracking and controlling
• Process Capability and Performance
  - Predict resource consumption and efficiency
  - Establish norms for expected performance
  - Identify opportunities for improvement
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
3
Estimation Uncertainty Typically Decreases with Time

[Figure: estimation uncertainty (as a multiple of the estimated value, 0.0 to 5.0)
versus elapsed time (0.0 to 1.0). The upper and lower 80% bounds start wide and
converge toward the zero-error line as the project proceeds.]
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
4
Total Estimation Life Cycle

[Diagram: the estimation life cycle spans three stages. BEFORE the project: the
project definition (products, process), requirements, the work breakdown
structure, environmental and business factors, and staff skill and availability
feed risk identification and evaluation and the estimation of cost and schedule,
producing the project estimate (Loop 2 iterates this step). DURING the project:
the budget and schedule and the project plan drive the planned activities;
measured project actuals, together with changes in requirements, design, and
environment, supply revised inputs for re-estimating cost and schedule (Loop 1)
using documented estimation models, yielding status and the estimate to
complete. AFTER the project: the close-out report supports comparing planned
and actual values, calibrating the estimation models, and improving the
process, producing enhancements, updated procedures and checklists, and entries
in the organization's historical data.]
Ways to Improve Estimation Accuracy

• Before starting the project
  - Understand product requirements and architecture
  - Choose a suitable production process and supporting tools
  - Use a mix of estimating techniques, models, and historical data
  - Produce the estimate in a disciplined way
• During the project
  - Collect actual measurements (costs, performance, progress)
  - Watch for violations of the estimating assumptions
  - Watch for changes in key factors (software size, turnover)
  - Update the estimate with the latest information
• After the project
  - Collect final measurements
  - Analyze data and lessons learned
  - Capture the new information (calibrate models, update checklists)
  - Improve the estimating process (procedures, new tools, training)
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
6
Quantities of Interest

• Products and Services Delivered
  - Size or amount (created, modified, purchased)
  - Performance (capacity, accuracy, speed, response time)
  - Quality (conformance to requirements, dependability)
  - Price and total ownership cost
• Project
  - Effort (activities)
  - Staff (number, skill and experience, turnover)
  - Time (phases, schedule milestones)
  - Effort (direct and indirect)
  - Costs (labor and non-labor)
  - Computer resources used for development and test
• Process
  - Effectiveness
  - Efficiency
  - Flexibility
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
7
Common Challenges

• Identifying the key factors that affect the quantity of interest
• Defining precise measures
  – Operational definitions
  – Affordable
  – Appropriate scale ([Stevens, 1948], [Zuse, 1997])
• Quantifying the influence of the key factors (valid models)
• The “object” evolves during development
  – “Object” can be the product, project, or process
  – The final object is not the one initially conceived
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
8
Product Size
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
9
Labor Dominates Software Project Cost
• The concept:
Effort [phrs] = Size [SLOC] / Productivity [SLOC/phr]*
• The challenges
– Choice and definition of the size measure
– Defining productivity
– Identifying all the pieces
*Productivity ≡ (product size) / (resources expended)
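A minimal sketch of this concept, under illustrative assumptions (the 50 KSLOC size, 4 SLOC/person-hour productivity, and the COCOMO II convention of 152 hours per person-month are examples, not data from this presentation):

```python
# Effort = Size / Productivity, converted to person-months.
# All numeric values here are illustrative assumptions.

def effort_person_months(size_sloc: float,
                         productivity_sloc_per_phr: float,
                         hours_per_person_month: float = 152.0) -> float:
    """Convert product size and productivity into person-months."""
    effort_phrs = size_sloc / productivity_sloc_per_phr
    return effort_phrs / hours_per_person_month

# 50 KSLOC at 4 SLOC/person-hour is 12,500 person-hours, ~82 person-months.
print(f"{effort_person_months(50_000, 4.0):.1f} person-months")
```

Note how the hours-per-person-month convention alone changes the answer, which is exactly the "person-hours per person-month" issue raised on the "Productivity Is Slippery" slide.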
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
10
Physical Architecture for a Monitoring Application
[Diagram: a pipeline of four hardware elements: Sensors feed a Front End, which
feeds a Mainframe, which drives a Display. The Front End contains Device
Drivers, Polling Logic, and Comm. Mgt. The Mainframe contains the Executive
(Control Logic), Comm. Mgt., Algorithms (Data Validation, Data Smoothing, Trend
Analysis), Data Management, a Report Generator, and GUI Services. The Display
contains Display Gen., GUI Services, and Comm. Mgt.]
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
11
Software Architecture for a Monitoring Application
Software Product
• Data Acquisition
  - Device Drivers
  - Polling Logic(?)
  - Data Validation(?)
  - Units Conversion (scaling, clipping)
• Data Analysis
  - Data Validation(?)
  - Data Smoothing
  - Trend Analysis (threshold detect)
• Control
  - Executive
  - Polling Logic(?)
  - Self-test
  - Data Archiving
• Report Generation
  - Report Generation
  - Display Generation
• System Services
  - Communications Mgt.
  - Math and Statistics (e.g., LSQ fitting)
  - GUI Services
  - Data management (RDBMS)
• Tooling
  - Emulators
  - Stubs
  - Test Drivers
  - Test Data Preparation
  - Test Data Analysis
  - Scripts

*The items followed by “(?)” may be allocated to different areas depending on various considerations.
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
12
Productivity Is Slippery

• Common Issues
  - What is being counted? (new, modified, prototypes, tooling, breakage)
  - How is size defined? (Example: a line of source code)
  - What activities and phases are covered? (CM? QA? RA? I&T?)
  - How formal is the process? (reviews, documentation)
  - How skilled is the staff?
  - How many person-hours per person-month? (overtime, days off)
• Specific Challenges
  - Software (all components unique, “exceptional” functions overlooked)
  - Hardware (components and assemblies)
  - Services (installation, training)
  - Documents (originals, copies)
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
13
Migration of Reused Code to the “Dark Side”
[Figure: estimated size (KSLOC, 0 to 300) versus elapsed time (relative units,
0.0 to 1.0). As the project proceeds, the original total of unmodified and
copied code shrinks while the modified and new categories grow, and additional
new code raises the total size of all code above the original estimate.]
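One reason this migration matters: adapted code is normally converted to equivalent new size before estimating effort. Here is a simplified sketch in the spirit of the COCOMO II reuse model, using only the adaptation adjustment factor AAF = 0.4·DM + 0.3·CM + 0.3·IM and omitting the assessment, understanding, and unfamiliarity terms; the percentages are invented:

```python
# Equivalent-size calculation for adapted code, simplified from the
# COCOMO II reuse model: AAF = 0.4*DM + 0.3*CM + 0.3*IM (in percent).
# The full model adds AA, SU, and UNFM terms, omitted here.

def equivalent_sloc(adapted_sloc: float, dm: float, cm: float,
                    im: float) -> float:
    """dm/cm/im: percent design modified, code modified, integration redone."""
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im
    return adapted_sloc * aaf / 100.0

# 100 KSLOC of 'reused' code with 20% design change, 30% code change,
# and 50% of the integration redone counts as ~32 KSLOC of new code.
print(equivalent_sloc(100_000, dm=20, cm=30, im=50))
```

As code migrates from unmodified toward modified, DM and CM rise, and so does the equivalent size driving the effort estimate.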
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
14
Project Cost and Schedule
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
15
The Precursors of Estimation

• Customer’s needs and operational environment
• Products and services to be delivered
• Production process to be used
• Project goals and constraints
• Estimate’s purpose and constraints
• Applicable estimating techniques and tools
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
16
Project Information Flow

[Diagram: planning flows into tracking. The contract (SOW, CLINs) and the
chosen production process drive the master schedule (analyze, design, code,
test, review, deliver). The specification and product architecture define the
technical baseline (TPMs). The WBS and the resource-loaded network define the
cost baseline (work packages) and the schedule baseline. These baselines
combine in the earned value plan (BCWS, BCWP, ACWP), against which the actuals
(project data, engineering data, procurement data) are tracked.]
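The earned value quantities in the diagram combine into the standard variances and indices; a small sketch (the dollar figures are invented):

```python
# Standard earned value formulas from the baselines in the diagram.
# BCWS/BCWP/ACWP are also known as PV/EV/AC. Amounts are invented.

bcws = 100_000.0   # budgeted cost of work scheduled (the plan)
bcwp = 90_000.0    # budgeted cost of work performed (earned value)
acwp = 110_000.0   # actual cost of work performed (actuals)

sv = bcwp - bcws   # schedule variance: -10,000 -> behind schedule
cv = bcwp - acwp   # cost variance:     -20,000 -> over cost
spi = bcwp / bcws  # schedule performance index: 0.90
cpi = bcwp / acwp  # cost performance index:     ~0.82
print(f"SV={sv:+,.0f}  CV={cv:+,.0f}  SPI={spi:.2f}  CPI={cpi:.2f}")
```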
Estimating Techniques*

• Expert Judgment (Delphi or PERT)
• Analogy (“nearest neighbors”)
• Scaling (additive, multiplicative)
• Top-Down
• Bottom-Up
• Parametric Models

*See comparison in Chapter 22 of [Boehm, 1981].
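As one concrete example of the expert judgment entry, a PERT three-point estimate combines optimistic, most likely, and pessimistic values using the standard beta-distribution approximation (the task numbers below are invented):

```python
# PERT three-point estimate: standard formulas for combining expert
# judgment. The effort values below are invented for illustration.

def pert(optimistic: float, most_likely: float,
         pessimistic: float) -> tuple[float, float]:
    """Return (mean, standard deviation) of the PERT estimate."""
    mean = (optimistic + 4.0 * most_likely + pessimistic) / 6.0
    sigma = (pessimistic - optimistic) / 6.0
    return mean, sigma

mean, sigma = pert(optimistic=200, most_likely=300, pessimistic=600)
print(f"Expected effort: {mean:.0f} phrs (std dev {sigma:.0f} phrs)")
```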
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
18
Usage of Estimating Techniques Changes

[Figure: method use (%) versus phase of the product's life cycle (Conception,
Elaboration, Construction, Transition, Operation). Experts and analogies
dominate during conception; parametric models carry the most weight during
elaboration; bottom-up estimates (using the WBS) grow through construction; and
extrapolation of actual data dominates during transition and operation.]
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
19
Product Performance
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
20
Reasons to Estimate Product Performance
• Evaluate feasibility of specified performance
• Establish bounds on expected performance
• Identify type and amount of computer resources needed
– “Capacity planning”
– System sizing
• Identify significant design parameters
• Provide information when measurement is not possible
– Product not yet built
– Product is inaccessible (e.g., 24 x 7 x 52)
– Measurement is too dangerous or expensive
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
21
Typical TPMs for Computer Systems

• Accuracy
  - Correctness (corresponds to real world)
  - Adequate precision (resolution)
• Dependability
  - Reliability
  - Availability
  - Probability of failure on demand
• Speed
  - Response time (GUI)
  - Execution time (per function, transaction, etc.)
  - Data transfer rates
• Resource Consumption
  - CPU usage
  - Storage (memory, disk)
  - Transmission channel usage (I/O, LAN)
  - Peripherals
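For instance, the availability TPM above follows from mean time between failures and mean time to repair via the standard steady-state formula (the hours below are invented):

```python
# Steady-state availability: A = MTBF / (MTBF + MTTR).
# The failure and repair times below are invented for illustration.

def availability(mtbf_hours: float, mttr_hours: float) -> float:
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Failures every 500 hours with 2-hour repairs -> ~99.6% availability.
print(f"{availability(500.0, 2.0):.4f}")
```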
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
22
Resource Usage versus System Performance
[Diagram: the assumed operational profile, the software algorithms, and the
platform capacity and configuration determine the estimated resource usage and
the estimated system performance, which drive hardware costs. The actual
operational profile determines the actual resource usage and the actual system
performance, which determine mission success and user acceptance.]
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
23
Estimation and Measurement Challenges

• Platform characteristics
  - Multiple processors
  - Memory cacheing
  - Configurable COTS components (e.g., buffer size)
  - “Hidden” factors (operating system’s scheduler)
• External environment and workload
  - Number of users
  - External stimuli (probabilistic events, multiple scenarios)
  - Internal workload (other concurrent processes)
• Algorithms
  - Use of loops, iteration, searches, or sorts
  - Suitability for assumed workload (scalability)
  - Choice of parameters (e.g., step size and convergence criteria)
  - Poorly understood or not yet identified
• Relating component behavior to system behavior
  - Complex (and possibly unknown) interactions of components
  - Many states and modes (dynamic load balancing, fault tolerance)
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
24
Possible Performance Analysis Techniques

• Order of magnitude calculations
• Analytic models
• Simple queueing results
• Bottleneck analysis
• Semi-analytic models (e.g., PDQ by Gunther)
• Queueing network models
• Simulation models
• Smart stubs
• Benchmark runs on similar system
• Measurement of actual system
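As a taste of the “simple queueing results” entry, the textbook M/M/1 formulas (see, e.g., [Jain, 1991]) predict utilization and mean response time from arrival and service rates; the rates here are invented:

```python
# M/M/1 single-server queue: utilization and mean response time.
# Arrival and service rates below are invented for illustration.

def mm1_response_time(arrival_rate: float, service_rate: float) -> float:
    """Mean response time R = 1 / (mu - lambda); requires lambda < mu."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable: arrival rate >= service rate")
    return 1.0 / (service_rate - arrival_rate)

lam, mu = 40.0, 50.0  # transactions per second
print(f"Utilization: {lam / mu:.0%}")                                  # 80%
print(f"Mean response time: {mm1_response_time(lam, mu) * 1e3:.0f} ms")  # 100 ms
```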
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
25
Comparison of Options

(Analytic models and simulation estimate performance; smart stubs and benchmark runs measure it.)

Characteristic | Analytic Models | Simulation | Smart Stubs | Benchmark Runs
Choice of Performance Measures | Limited | Many | Limited | Limited
Level of Detail | Limited | Arbitrary | Varies | Varies
Accuracy (Realism and Fidelity) | Approximate | Good | Good | Perfect
Steady State Behavior | Yes | Yes | Yes | Yes
Transient Response | Limited | Yes | Yes | Yes
Range of Off-Nominal Conditions | Some | Wide | Some | Limited
Effort to Develop and Validate the Model or Tool | High to Extra High | Low to High | Low | Not Applicable
Effort to Use the Model or Tool (set up, configure, calculate, measure) | Low | Low to Very High | Nominal to Very High | Medium to Very High
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
26
Performance Estimation is a Cyclic Process*

[Diagram: starting from an assessment of performance risk, the cycle is:
identify critical use cases; select key performance scenarios; define
quantitative performance objectives; construct performance model(s); verify and
validate the models; then run and evaluate the performance model(s). If
performance is acceptable, the next iteration begins (modify or create
scenarios). If performance is unacceptable but the objectives are feasible,
modify the product architecture and design; if they are infeasible, revise the
performance objectives.]

*Adapted from Figure 2-1 in [Smith, 2002]. Used with permission.
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
27
Product Quality
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
28
Quality Factors Vary by System and Stakeholder
[Figure: Kiviat (radar) chart with axes for Correctness, Performance,
Dependability, Usability, Efficiency, Supportability, Maintainability,
Interoperability, and Portability, each scaled 0 to 100. Profiles for the User
and the Developer are plotted against a Perfect product, showing that different
stakeholders weight the quality factors differently.]
Example: Product Dependability
[Diagram: functional test cases, the integrated software, and the assumed
operational profile drive testing on the test platform, yielding observed
correctness and an estimated dependability. The installed and configured
software, the actual operational profile, and the operational platform yield
the actual dependability, which determines user acceptance.]
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
30
Estimating Challenges
• Distinguishing faults, failures, and problem reports
• Understanding fault generation and detection
– Product requirements and architecture
– Process (methods, reviews, tests, tools)
• Relating faults to operational failures
– Failures often arise from the occurrence of multiple events
– Assumed operational profile
– Predicting the actions of users and abusers
• The “product” evolves during development
– Add new features
– Revise the design
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
31
Testing (Measurement) Challenges
• Test profile differs from the operational profile
• Detection is flawed (Type 1 and Type 2 errors)
– Test case (or peer review) fails to expose a real fault
– Test case reports a defect when there is actually no fault
• Comprehensive testing is seldom possible
– Many combinations to cover (inputs, modes, internal states)
– Limited time for testing
– Never trigger some faults (so never see resulting failure)
• The product changes during testing
  - Remove defects (“reliability growth”)
  - Add defects (“bad fixes”)
  - May even add features
  - Every change gives a new product to sample
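To illustrate “reliability growth”, a simple exponential (Goel-Okumoto-style) software reliability model predicts cumulative defects found as testing proceeds. The parameters here are invented; in practice a and b would be fit to the project's own failure data:

```python
# Exponential (Goel-Okumoto-style) reliability growth model:
# expected cumulative defects found by time t is mu(t) = a * (1 - exp(-b*t)),
# where a = total latent defects and b = the detection rate.
# Parameter values are invented for illustration.
import math

A, B = 120.0, 0.15  # latent defects, detection rate per week (assumed)

def defects_found(t_weeks: float) -> float:
    return A * (1.0 - math.exp(-B * t_weeks))

for week in (4, 8, 16):
    found = defects_found(week)
    print(f"Week {week:2d}: ~{found:.0f} found, ~{A - found:.0f} still latent")
```

Note that “bad fixes” violate the model's assumption that repairs introduce no new faults, one reason such fitted curves must be re-fit as the product changes.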
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
32
Comparison of Options

(Analytic models and simulation estimate quality; stress tests and operational use measure it.)

Characteristic | Analytic Models | Simulation | Stress Tests | Operational Use
Choice of Quality Measures* | Limited | Limited | Limited | Limited
Level of Detail | Limited | Varies | Limited to dependability | Varies
Accuracy (Realism and Fidelity) | Approximate | Varies | Perfect (if product is complete) | Good
Effort to Develop and Validate the Model or Tool | High to Extra High | Low to High | Low | Not Applicable
Effort to Use the Model or Tool (set up, configure, calculate, measure) | Low | Low to Very High | Low to Medium | Nominal to Medium

*Dependability is most objective and mature. Customer preference is subjective.
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
33
Determining Product Dependability
[Diagram: dependability activities mapped onto the life cycle. The dependability
goal is specified during concept definition and system definition. Application,
requirement, design, and code metrics gathered during software requirements
analysis, product design, detailed design, and coding and unit testing feed the
estimate of dependability ("prediction"). Test data from integration testing,
dependability (stress) testing, and factory acceptance testing feed the
measurement of dependability ("estimation"). Actual performance during site
operational testing and operations allows dependability to be observed.]
Process Performance
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
35
Process Control: Measurements + Models
[Diagram: a feedback loop. Executing the process yields measurements of the
process; the measurements feed predictions of process performance; the
predictions support controlling the process and defining and improving the
process, which in turn governs how the process is executed. Legend: data,
information, and measurements; predictions; the defined (and improved) process.]
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
36
Measuring Process Performance

• Key Questions
  - What is the current performance?
  - Is this value "good"?
  - Is it changing?
  - How can I make the value "better"?
• Candidate Attributes*
  - Definition (completeness, compatibility)
  - Usage (compliance, consistency)
  - Stability (repeatability, variability)
  - Effectiveness (capability)
  - Efficiency (productivity, affordability)
  - Predictive Ability (accuracy, effects of tailoring and improvements)

*Motivated by [Florac, 1999, Section 2.4]
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
37
Some Examples

Goal | Measure
Completeness | The number of process elements added, changed, and deleted during tailoring
Compliance | Number of discrepancy reports generated by Quality Assurance audits
Stability (volatility) | The number of process elements changed within a specified time interval
Effectiveness | Product quality
Effectiveness | Defect leakage to subsequent phases
Efficiency | Productivity (or production coefficient)
Efficiency | Rework as a fraction of total effort
Predictability | Probability distribution for an estimated quantity or related statistics
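As a worked example for the defect leakage row, leakage from a phase can be computed as the fraction of that phase's defects that escape to later phases (a common definition, though the slide does not fix one; the counts below are invented):

```python
# Defect leakage: fraction of a phase's defects found only in later
# phases. Definition and counts are illustrative assumptions.

def leakage(found_in_phase: int, found_later: int) -> float:
    total = found_in_phase + found_later
    return found_later / total if total else 0.0

# Design reviews caught 45 design defects; 15 more design defects
# surfaced during coding and test -> 25% leakage from design.
print(f"{leakage(45, 15):.0%}")
```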
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
38
Types of Process Performance Models

Type | Handles Unstable Processes? | Representation of Process Mechanisms | Examples
Statistical | No | None | Statistical process control
Functional | No | Explicit | Parametric models (algorithms based on causal mechanisms); COQUALMO and staged models
Dynamic | Yes | Implicit (via propagation) | System dynamics models (coupled equations embody the causal mechanisms; solving numerically gives predicted behavior)
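To make the Statistical row concrete, here is a minimal XmR (individuals and moving range) control chart computation, the basic SPC technique that [Florac, 1999] applies to software processes; the data points are invented:

```python
# XmR control chart limits for individual observations (basic SPC).
# The 'review hours per inspection' data below are invented.

data = [12.0, 14.5, 11.0, 13.2, 15.1, 12.8, 14.0, 13.5]

center = sum(data) / len(data)
moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

# 2.66 is the standard XmR constant (3 / d2, where d2 = 1.128 for n = 2).
ucl = center + 2.66 * mr_bar
lcl = center - 2.66 * mr_bar
print(f"Center: {center:.2f}  UCL: {ucl:.2f}  LCL: {lcl:.2f}")
```

Points outside these limits signal assignable causes; an unstable process (the first column of the table) invalidates this analysis.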
Comparison of Options

(Statistical analysis measures process performance; functional models and dynamic simulation estimate it.)

Characteristic | Statistical | Functional Models | Dynamic Simulation
Choice of Performance Measures | Limited | Limited | Many
Level of Detail | Limited | Varies | Arbitrary
Accuracy (Realism and Fidelity) | Perfect (for specified scope) | Approximate | Varies
Steady State Behavior | Yes | Yes | Yes
Transient Response | No | Limited | Yes
Range of Off-Nominal Conditions | Limited | Some | Wide
Effort to Develop and Validate the Model or Tool | Low to High | Low to High | Low to Very High
Effort to Use the Model or Tool (set up, configure, calculate, measure) | Medium to High | Low | Low to High
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
40
Factors Affecting Predictive Accuracy
• The process definition
– Detail
– Stability
– Tailoring
• The process execution
– Compliance
– Consistency
• The model’s scope and validity
– Relevant factors and interactions
– Fidelity
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
41
Summary
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
42
Estimation and Measurement Are Coupled

• Estimates provide predicted information to:
  - Assess product and project feasibility
  - Evaluate tradeoffs in performance, quality, cost, and schedule
  - Identify and obtain resources
  - Prepare plans to coordinate and track work
• Measurements provide actual information to:
  - Verify expectations and detect deviations
  - Control processes (preserve estimating assumptions, adapt to changes)
  - Prepare revised estimates
  - Formulate, validate, and calibrate predictive models
  - Improve decision making for projects and businesses
  - Help the business adapt to changing conditions
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
43
Recurring Themes

• Estimation is a continuing process
  - Accuracy usually increases with time (“learning curve”)
  - You get what you pay for at any particular time
  - Perfectly accurate estimates are impossible (and are not needed!)
• A little good data goes a long way (90/10 rule)
  - There are never enough resources (time, money)
  - Use Goal-Question-Metric (GQM) to choose what to measure
  - Precise definitions are essential (e.g., “line of code” and “defect”)
• Some quantities are harder to estimate than others
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
44
Relative Estimation Difficulty

(Listed in order of increasing difficulty.)

1. Project Cost
2. Product Size
3. Project Effort
4. Project Schedule
5. Product Performance
6. Product Quality
7. Process Performance
References (1 of 2)

General
[Stutzke, 2005] Estimating Software-Intensive Systems: Projects, Products, and Processes, Richard D. Stutzke, Addison-Wesley, 2005, ISBN 0-201-70312-2.

Project Effort, Cost, and Schedule
[Boehm, 2000] Software Cost Estimation with COCOMO II, Barry W. Boehm, Chris Abts, A. Winsor Brown, Sunita Chulani, Bradford K. Clark, Ellis Horowitz, Ray Madachy, Donald Reifer, and Bert Steece, Prentice-Hall, 2000, ISBN 0-13-026692-2.
[Boehm, 1981] Software Engineering Economics, Barry W. Boehm, Prentice-Hall, 1981, ISBN 0-13-822122-7.

Product Performance
[Jain, 1991] The Art of Computer Systems Performance Analysis: Techniques for Experimental Design, Measurement, Simulation, and Modeling, Raj Jain, John Wiley and Sons, 1991, ISBN 0-471-50336-3.
[Gunther, 2000] The Practical Performance Analyst: Performance-By-Design Techniques for Distributed Systems, Neil J. Gunther, Authors Choice Press (an imprint of iUniverse.com, Inc.), 2000, ISBN 0-595-12674-X. Originally published in 1998 by McGraw-Hill, ISBN 0-07-912946-3.
[Smith, 2002] Performance Solutions: A Practical Guide to Creating Responsive, Scalable Software, Connie U. Smith and Lloyd G. Williams, Addison-Wesley, 2002, ISBN 0-201-72229-1.
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
46
References (2 of 2)

Product Quality
[Kan, 2003] Metrics and Models in Software Quality Engineering, 2nd edition, Stephen H. Kan, Addison-Wesley, 2003, ISBN 0-201-72915-6.
[Lyu, 1996] Handbook of Software Reliability Engineering, Michael R. Lyu (editor), IEEE Computer Society Press (McGraw-Hill), 1996, ISBN 0-07-039400-8. An excellent, up-to-date coverage of the subject.

Process Performance
[Florac, 1999] Measuring the Software Process: Statistical Process Control for Software Process Improvement, William A. Florac and Anita D. Carlton, Addison-Wesley, 1999, ISBN 0-201-60444-2.
[Madachy, 2005] Software Process Modeling with System Dynamics, Raymond J. Madachy, John Wiley & Sons, 2005, ISBN 0-471-27555-0.
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
47
 2005 by Richard D. Stutzke
Estimation Perspectives (v5)
48
Download