Project Estimation
Kostas Kavoussanakis, EPCC
Overview
• Aim:
  – To raise awareness of the importance of estimation to project welfare
  – To discuss techniques and methods
  – To link estimation with the other process activities
2
Estimation
• Want to start a project?
• It is important that you know in advance:
  – How long it will take
  – How many people it will need
  – How much effort it will require
• You can then discuss cost
  – …and shake hands
• Estimation is hard
  – Projects overrun
  – Projects go over budget
3
Motivation
• Two thirds of all projects substantially overrun their estimates.
• The average large project misses its delivery date by 25 to 50 percent.
• The size of the average schedule slip increases with the size of the project.
• Table from B. Boehm
4
Disasters

Project                         Schedule (months)     Cost (M$)             Status at end
                                First;Last Estimate   First;Last Estimate
PROMS (Royalty Collection)      12;21+                22;46                 Cancelled, Month 28
London Ambulance                7;17+                 1.5;6+                Cancelled, Month 17
London Stock Exchange           19;70                 60-75;150             Cancelled, Month 36
Confirm (Travel Reservations)   45;60+                56;160+               Cancelled, Month 48
Master Net (Banking)            9;48+                 22;80+                Cancelled, Month 48

5
Why so hard?
• Estimates are needed and relied upon early
• The functional requirements do not provide a solid background
• It is not immediately known how long it will take to develop the features
  – Particularly if the desired outcomes are genuinely novel
• Feature creep is a killer
  – It is the unpredictable yet near-certain change of the functionality as the project progresses
6
Why so hard? (cont)
• Staff ability
  – Estimators
  – Programmers
• Code reuse
  – Is code reused?
  – Is code to be reused?
• Programming language used
7
Effects of underestimation
• Understaffing
• Underscoping the quality assurance effort
• Setting too short a schedule
• These in turn could lead to:
  – staff burnout
  – low quality
  – loss of credibility as deadlines are missed
8
Effects of overestimation
• Parkinson's Law: work expands to fill the available time
• The project will take at least as long as estimated, even if it was originally overestimated.
9
When to estimate
• The first estimate is necessary before the start of the project.
• Estimation is a process of gradual refinement.
• It does not finish until the project finishes.
10
Cone of uncertainty
11
Relevance
• The effort is concentrated where the money is
  – MIS
  – real-time systems
• What is wrong with academic projects?
  – Complex computations
  – Challenging algorithms
• Good estimation practices can keep the project on track and even earn some time for the tricky, interesting areas.
• We should standardise the estimation process
  – First inside the individual research centres
  – Then (quite a big step) across the centres in the UK
12
Projects revisited
• Remember the project triangle: Cost – Quality – Time
• Quality for software projects
  – Functionality
  – Correctness of the code
• Cost and time depend on Effort and Schedule
13
Effort and Schedule
• Effort is the total number of time units (e.g. weeks or months) needed to complete the task.
• This may break down to effort from more than one person, so as to take advantage of certain skills and parallelise the work to gain overall time.
• Caveat:
  – the more people one adds to a project, the more one needs to work to coordinate them and the more they must communicate to interact successfully, thus yielding overheads.
• COCOMO:
  – For "large" projects: Effort = a × size^b
  – For projects that can be achieved with 2-3-person teams: Effort = a × size + b
14
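The two COCOMO-style formulas above can be sketched as follows. The coefficients a and b below are illustrative placeholders, not calibrated COCOMO II values; in practice both must be calibrated against local historical data:

```python
def effort_large(size_kloc, a=3.0, b=1.12):
    """Nonlinear model for large projects: Effort = a * size^b.
    With b > 1, effort grows faster than size (diseconomy of scale)."""
    return a * size_kloc ** b

def effort_small(size_kloc, a=2.4, b=1.5):
    """Linear model for projects a 2-3-person team can do: Effort = a * size + b."""
    return a * size_kloc + b

# Diseconomy of scale: doubling size more than doubles effort in the large model.
print(effort_large(50))    # person-months for a 50 KLOC project
print(effort_large(100))   # more than twice the 50 KLOC figure
```

The contrast between the two shapes is the point: small projects scale roughly linearly with size, while large ones pay a growing coordination overhead.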
Effort and Schedule (cont)
• Effort division may cause gaps in personnel utilisation.
• Schedule is the breakdown of effort per person at any given time.
• It is sometimes defined as the total time for the project.
• Schedule can be derived from effort.
• Rule of thumb: optimal schedule = 3 × effort^(1/3) (McConnell)
15
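The rule of thumb above can be applied directly; effort is in person-months and the result in calendar months. This is a heuristic sketch, not a substitute for proper scheduling:

```python
def optimal_schedule(effort_pm):
    """McConnell's rule of thumb: schedule (months) = 3 * effort^(1/3)."""
    return 3 * effort_pm ** (1 / 3)

def average_team_size(effort_pm):
    """Implied average headcount once the schedule is fixed."""
    return effort_pm / optimal_schedule(effort_pm)

print(round(optimal_schedule(27), 1))   # 27 person-months -> 9.0 months
print(round(average_team_size(27), 1))  # -> 3.0 people on average
```

Note the cube root: an eightfold increase in effort only doubles the recommended schedule, which is why large projects need bigger teams rather than proportionally longer calendars.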
Estimation Methods
• Expert Opinion
• Estimation by Analogy
• Metrics
16
Expert Opinion
• The classic method
• Prior experience is the key to estimation
  – This is true in all cases
  – Good practice: only use documented data
• Estimation is left to a senior member of staff, possibly the Project Leader
  – Good practice: involve the developers
• Techniques to mitigate a single point of failure
  – Work Breakdown Structure
  – Delphi
17
Work Breakdown Structure (WBS)
• Two hierarchies:
  – Software product
  – Process
18
Product Hierarchy
19
Process Hierarchy
20
Work Breakdown Structure (cont)
• Lay out what you want to do
  – Requirements at an early stage
  – Architecture document at a later one
  – Other documents (e.g. priorities list)
• Break the problem into components (workpackages)
• Then break the workpackages into tasks
• Perhaps go another level down
  – Good practice: the more you break down and estimate, the better
    • "The sum of the errors is less than the error of sums"
21
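The quoted maxim can be illustrated numerically. A minimal Monte Carlo sketch with made-up numbers (ten tasks of 10 days each, every estimate off by up to ±30%): because independent per-task errors partially cancel, the sum of ten separate estimates is on average closer to the truth than one lump estimate of the whole project with the same ±30% error.

```python
import random

random.seed(1)
TRIALS = 10_000
TRUE_TASKS = [10.0] * 10          # ten tasks, 10 days each (hypothetical)

def noisy(value, spread=0.3):
    """An estimate off by up to +/-30% of the true value."""
    return value * (1 + random.uniform(-spread, spread))

# Average absolute error of summed per-task estimates vs. one lump estimate.
split_err = sum(abs(sum(noisy(t) for t in TRUE_TASKS) - 100)
                for _ in range(TRIALS)) / TRIALS
lump_err = sum(abs(noisy(100.0) - 100) for _ in range(TRIALS)) / TRIALS

print(split_err < lump_err)  # True: independent task errors partially cancel
```

The cancellation only helps if the per-task errors really are independent; a systematic bias (everyone optimistic by 20%) survives the breakdown intact.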
Work Breakdown Structure (cont)
• Appreciate the workload of each component
  – Good practice: how many working days to the week?
  – Good practice: check documented experience of similar tasks
22
Gantt Charts
• Turn back to the workpackages and seek dependencies
  – E.g. "WP1 and WP2 can run in parallel, but we can't start WP3 until WP1 is finished"
• How to create a Gantt chart
  – Get a biiig piece of paper and loads of post-its
  – Each post-it is a task
  – Try to parallelise tasks as much as dependencies let you
    • horizontal axis represents time, vertical axis resources
  – Length of chart is time
  – Width of chart is effort
    • Gaps represent underutilisation of staff
23
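The dependency-then-parallelise procedure above can be sketched in code. A toy scheduler (hypothetical workpackage names and durations) that computes each task's earliest start from its prerequisites, which is exactly the information a Gantt chart lays out along its time axis:

```python
# Workpackages: (duration in weeks, list of prerequisites) -- illustrative data.
tasks = {
    "WP1": (4, []),
    "WP2": (3, []),          # can run in parallel with WP1
    "WP3": (2, ["WP1"]),     # can't start until WP1 is finished
    "WP4": (3, ["WP2", "WP3"]),
}

def earliest_start(name, memo={}):
    """Earliest start = latest finish time among all prerequisites."""
    if name not in memo:
        deps = tasks[name][1]
        memo[name] = max((earliest_start(d) + tasks[d][0] for d in deps),
                         default=0)
    return memo[name]

for t in tasks:
    start = earliest_start(t)
    print(f"{t}: weeks {start}-{start + tasks[t][0]}")

# Chart length (total time) = latest finish across all tasks.
print(max(earliest_start(t) + tasks[t][0] for t in tasks))  # 9
```

Here WP1 and WP2 overlap, WP3 waits for WP1, and the chart is 9 weeks long even though the effort totals 12 person-weeks; the difference is the parallelism the post-its bought.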
Gantt Chart Example
24
Delphi
• Participants are asked to make assessments individually
• Results are collected, tabulated, and then returned
  – Group discussion optional
• Second round of individual assessments, taking into account the previous results
25
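One Delphi cycle can be sketched as a simple collect-tabulate-return loop (hypothetical estimates in person-weeks; the real value of the technique lies in the discussion between rounds, which code cannot capture):

```python
from statistics import median

# Round 1: anonymous individual estimates, person-weeks (illustrative numbers).
round1 = {"Ann": 12, "Bob": 30, "Cho": 18}
print(f"Returned to the group: median {median(round1.values())}, "
      f"range {min(round1.values())}-{max(round1.values())}")

# Round 2: after seeing the tabulated results, outliers move toward the group.
round2 = {"Ann": 15, "Bob": 22, "Cho": 18}

spread1 = max(round1.values()) - min(round1.values())
spread2 = max(round2.values()) - min(round2.values())
print(spread2 < spread1)  # True: the estimates converge across rounds
```

Rounds repeat until the spread is acceptable; the moderator reports the converged range, not a single "winning" number.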
Expert Judgment evaluation
• Pros:
  – Applicable to very original projects.
  – Inherent local calibration.
  – WBS can lead to a well-documented process.
• Cons:
  – Big dependency on experts' abilities.
  – Big dependency on experts' presence; how easily does Delphi cope with high staff turnover in a small organisation?
  – There is so much information outside your establishment
26
Estimation by Analogy
• The cost of a new project is estimated by analogy to similar completed projects
• Estimates based on historical data from within an organisation are more accurate than estimates based on rules of thumb or educated guesswork.
• The International Software Benchmarking Standards Group (ISBSG) maintains and exploits a repository of international software project metrics to help software and IT business customers with project estimation, risk analysis, productivity and benchmarking.
27
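Estimation by analogy can be sketched as a nearest-neighbour lookup over an organisation's own completed projects. The history below is made up for illustration; a real repository such as ISBSG's records many more attributes than size:

```python
# Completed projects: (size in function points, actual effort in person-months).
history = [(120, 14), (300, 40), (450, 70), (800, 150)]

def estimate_by_analogy(size_fp, k=2):
    """Average the actual effort of the k most similar past projects,
    scaled by the size ratio to the new project."""
    nearest = sorted(history, key=lambda p: abs(p[0] - size_fp))[:k]
    scaled = [effort * size_fp / size for size, effort in nearest]
    return sum(scaled) / len(scaled)

print(round(estimate_by_analogy(400)))  # person-months, from the two nearest
```

The hard part is not the arithmetic but judging similarity: team, technology and domain differences between the analogue and the new project dominate the error, which is the slide's warning about external data.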
Estimation by Analogy (cont)
• Pros:
  – Can be accurate.
  – Simple if the organisation repeats similar work.
  – Estimates are immediately available.
  – Encourages detailed documentation.
• Cons:
  – Can be very unreliable.
  – It is very hard to assess the differences between the environments and thus assess the accuracy of the ISBSG data.
28
Estimation Metrics
• No definition for project size
  – "Something that encompasses all 3 aspects of a project"
  – "The area of the project triangle"
• A metric makes estimation:
  – Transparent: the abilities of the practitioner are irrelevant
  – Repeatable: exercised by various people at different times, it yields the same results
  – Reliable: the end results, although never accurate, can be closer to the Truth
• Additionally:
  – The productivity of staff and organisations can be monitored.
  – The results can be shared across the globe.
29
Lines of Code
• How many lines in your files?
• Pros:
  – Very natural
  – Easily countable
• Cons:
  – Available too late
  – What is a line?
    • Comments?
    • Compare
        d = c++;
      with
        d = c;
        c++;
30
Function Points
• Function Points provide a more objective and reliable estimate of the size of a project
• "Measure of the size of computer applications and the projects that build them"
• With Function Points one can:
  – Measure productivity (100 FPs produced this month)
  – Estimate development and support (100 FPs required, thus XXX man-months)
  – Monitor outsourcing agreements (this library requires 100 FPs, says the contract; these are only XXX)
  – Normalise other measures (100 defects in 100 FPs is far worse than in 10000 FPs)
31
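A minimal sketch of an unadjusted function-point count, using the standard IFPUG average weights for the five function types. Real counting also classifies each function as simple, average or complex (with different weights) and then applies a value adjustment factor, both omitted here:

```python
# IFPUG average weights per function type.
WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_logical_files": 10,
    "external_interface_files": 7,
}

def unadjusted_fp(counts):
    """Sum of (count x weight) over the five function types."""
    return sum(WEIGHTS[kind] * n for kind, n in counts.items())

# Hypothetical small application, counted from the user's point of view.
app = {
    "external_inputs": 10,
    "external_outputs": 6,
    "external_inquiries": 4,
    "internal_logical_files": 5,
    "external_interface_files": 2,
}
print(unadjusted_fp(app))  # 150
```

Because the count is taken from the user's view of inputs, outputs, inquiries and files, the same application scores the same regardless of implementation language, which is the independence property the next slide claims.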
Function Points (cont)
• Started in IBM in the late seventies (Allan Albrecht)
• The International Function Point Users Group was founded in the late eighties
• An active area of research (COSMIC-FFP)
• Measurements are taken from the user's point of view
  – Independent of computer language, development methodology, technology or capability of the project team
• http://www.ifpug.org/home/docs/ifpughome.html
32
Function Points Evaluation
• Pros:
  – Prescriptive method for sizing.
  – Can be applied reasonably early in the project lifetime.
  – Immune to language and platform idiosyncrasies.
  – Large user base; active effort.
• Cons:
  – Manual, fiddly process.
  – Disagreement about its applicability across the various types of modern projects.
  – Not ideal in the requirements-capture period; although the method can be applied, it is too much work for such a volatile description.
33
Estimation Tools
• Approaches depending on estimation method:
  – Expert Judgement
  – Algorithmic Models
    • FPs, for size
    • COCOMO, or SLIM, or tables based on size estimation, for effort
    • SLIM, or tables based on size or effort estimation, for schedule
• Most products are hybrids:
  – Organising expert judgement
  – Exploiting machine learning
  – Refining algorithmic models
34
Estimation Tools (cont)
• With estimation tools you can:
  – Estimate size using Function Points or other metrics
  – Derive effort and schedule using various algorithms and techniques
  – Perform "what if" analyses with staffing, duration etc. and appreciate how realistic they are
  – Produce and update Gantt and other charts easily
  – Maintain and exploit a database of historic data
  – Import data from other projects run in organisations with which you have no connection
35
Estimation Tools (cont)
• Tools are important for estimation
  – They make estimation feel like an algorithm
  – They prevent you from skipping necessary tasks
  – They help organise, update and archive the results
• Additionally, they provoke you to think about the lifetime of the project and at the same time do the dirty work for you
• Do your own research on tools and their efficiency
  – Compare them against already finished projects
  – Use them in parallel with your current estimation technique
36
Estimation is a part of Process
• The most important issue about estimation is frame of mind
• It takes more than a good estimate to keep a project in good shape
• Important things to note:
  – The silver bullet syndrome
  – Choice of development model
  – Track the project progress
  – Estimate in ranges
  – Think before you quote
  – Bring in the right people
37
The silver bullet syndrome
• None of the estimation methods is perfect
• None of the estimation tools is perfect
• Evaluate and calibrate methods and tools
38
Development model
• Many models are flexible enough to accommodate changes.
• Design to schedule is great for fixed-time, fixed-cost projects.
• Iterative models are better for projects with many unknowns.
  – Staged delivery
  – Design to schedule
  – Evolutionary delivery
  – Spiral
39
Keep track of your project
• A software project is a live entity with complex behaviour
• Monitor and update estimates
  – Even after the design stage you can expect to be off by 25%
• Complement your estimates with a good tracking policy
  – Various tools available
• Always update the risks list and the priorities list.
40
Estimate in ranges
• The tools and methods may give you absolute numbers
• Add some padding, i.e. slack time, to make up for errors
  – Do it and the customer does not trust your estimate
  – Don't do it (or don't pad enough) and you may overrun
  – Good practice: estimation range
  – Good practice: buffers for specific risks
• Risk analyses will show things that can go wrong
• Evaluate their impact on the schedule
• Provide conditional estimates
  – Start with raw numbers (from tools and methods)
  – Consider risks
  – Consider what could help
41
Estimate in ranges (cont)
• Example:
  – "The GUI will take 6 months, +2 if the tool-generated code is useless, +1 if Jo goes on holiday, -1 if we can reuse the 'File' menu functionality from project X: 6 +3 -1."
• Alternatively: [5-9]
• Very different from "8"
  – 6+2+1-1=8.
• Incorporation of risks into the estimate
42
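The worked example above can be sketched as: start from the raw figure and accumulate each risk's impact into the pessimistic or optimistic bound, rather than netting everything into one number. Task names and months are the slide's hypothetical GUI example:

```python
base = 6  # months, raw estimate for the GUI

# (description, schedule impact in months) -- conditional, not certain.
risks = [
    ("tool-generated code is useless", +2),
    ("Jo goes on holiday", +1),
    ("reuse 'File' menu from project X", -1),
]

# Range: add only the bad outcomes to the top, only the good to the bottom.
worst = base + sum(m for _, m in risks if m > 0)
best = base + sum(m for _, m in risks if m < 0)
print(f"[{best}-{worst}] months")       # [5-9] months

# Contrast with the misleading single number that nets everything out.
print(base + sum(m for _, m in risks))  # 8
```

The netted "8" silently assumes every risk fires at once, good and bad; the range keeps the conditions visible so the customer can see what moves the date.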
Think before you quote
• When tempted to provide an estimate thinking on your feet... DON'T!
• People remember it, pass it on and hold you accountable.
  – Although it is normal for the estimate at the feasibility stage to be off by 400%.
43
Bring in the right people
42 groups of colleagues one cannot ignore.
4Group 1: those experienced in estimation.
– They know how to appreciate the workload of components
• Trust them in conjunction with documented past estimates
– They know how to do it.
4Group 2: those who will do the work
– They will know how long it will take them
• Do not judge the effort from your own standards
– They can see things from a different angle
• And foresee the technical challenge
– They get a feeling of ownership
• Persistence and application to the project
44
22
References
• Steve McConnell. Rapid Development: Taming Wild Software Schedules. Microsoft Press, 1996.
• B W Boehm, C Abts, A W Brown, S Chulani, B K Clark, E Horowitz, R Madachy, D Reifer, and B Steece. Software Cost Estimation with COCOMO II. Prentice Hall PTR, 2000.
• Construx Software Inc. http://www.construx.com/estimate.
• I Sommerville. Software Engineering, Sixth Edition. Addison-Wesley Publishers Limited, 2001.
• Steve McConnell. Software Project Survival Guide. Microsoft Press, 1998.
• W S Humphrey. Your Date or Mine. In The Watts New Collection. Software Engineering Institute, Carnegie Mellon University, http://interactive.sei.cmu.edu/, 2001.
45
References (cont)
• B Boehm, C Abts, and S Chulani. Software Development Cost Estimation Approaches - A Survey. Technical Report USC-CSE-2000-505, University of Southern California - Center for Software Engineering, USA, 2000.
• E Nelson. Management Handbook for the Estimation of Computer Programming Costs. Technical report, Systems Development Corporation, USA, Oct 1966.
• L Putnam and W Myers. Measures for Excellence. Yourdon Press Computing Series, 1992.
• C Jones. Applied Software Measurement. McGraw Hill, 1997.
• R Park. The Central Equations of the PRICE Software Cost Model. In 4th COCOMO Users' Group Meeting, November 1988.
• R Jensen. An Improved Macrolevel Software Development Resource Estimation Model. In Proceedings 5th ISPA Conference, April 1983, pages 88-92.
46
References (cont)
• B Boehm. Software Engineering Economics. Prentice Hall, 1981.
• L Briand, V Basili, and W Thomas. A Pattern Recognition Approach for Software Engineering Data Analysis. IEEE Transactions on Software Engineering, 18(11), 1992.
• T Khoshgoftaar, A Pandya, and D Lanning. Application of Neural Networks for Predicting Program Faults. Annals of Software Engineering, 1, 1995.
• H Zuse. History of Software Measurement, Chapter 5 "Cost Estimation" at http://irb.cs.tu-berlin.de/zuse/metrics/History_05.html.
• B Boehm, B Clark, E Horowitz, C Westland, R Madachy, and R Selby. Cost Models for Future Software Life Cycle Processes: COCOMO 2.0 at http://sunset.usc.edu/publications/TECHRPTS/1995/usccse95508/usccse95-508.pdf.
47
References (cont)
• International Software Benchmarking Standards Group (ISBSG). Software estimation, benchmarking, productivity, risk analysis, and cost information for software developers and business at http://www.isbsg.org.au.
• O Helmer. Social Technology. Basic Books, NY, USA, 1966.
• B Baird. Managerial Decisions Under Uncertainty. John Wiley & Sons, 1989.
• R Boehm. Function Point FAQ at http://ourworld.compuserve.com/homepages/softcomp/fpfaq.htm.
• Edmond VanDoren. Software Technology Review: Function Point Analysis at http://www.sei.cmu.edu/activities/str/descriptions/fpa_body.html.
48
References (cont)
• The Common Software Measurement Metric International Consortium. COSMIC-FFP Measurement Manual. Technical report, The Common Software Measurement Metric International Consortium, available from http://www.lrgl.uqam.ca/publications/private/446.pdf, 2001.
• A Abran, C Symons, and S Oligny. An Overview of COSMIC-FFP Field Trial Results, also available from http://www.lrgl.uqam.ca/publications/pdf/614.pdf. In ESCOM 2001, London, England, April 2-4, 2001.
• R D Banker, R J Kauffman, and R Kumar. An Empirical Test of Object-Based Output Measurement Metrics in a Computer Aided Software Engineering (CASE) Environment. Management Information Systems, 8:127-150, 1991.
• E Yourdon. Death March: Managing "Mission Impossible" Projects. Prentice Hall PTR, 1997.
49