Information Systems Project Management—David Olson
© McGraw-Hill/Irwin 2004
Chapter 7: Estimation
• project planning - what to do
• project control - make sure it’s done right
• estimation of detailed system design
Planning Process
• determine requirements from objectives
• specify work activities
• plan project organization
• develop schedule
• develop resource plan and budget
• establish control mechanisms
each project unique
Determine Requirements
Specify work activities
STATEMENT OF WORK:
– product descriptions
– constraints
– schedule requirements
– budget limits
– roles & responsibilities
WORK DEFINITION
• once objectives set,
TRANSLATE INTO WORK ELEMENTS
• what needs to be done
– easy to overlook some, or duplicate
• WORK BREAKDOWN STRUCTURE
– divide project into major work categories
• subdivide
Work Breakdown Structure
(example: write paper)

write paper (PROJECT)
  research (CATEGORY)
    library (TASK)
      what's been done (PACKAGE)
      needs doing (PACKAGE)
    internet (TASK)
      search (PACKAGE)
  writing (CATEGORY)
    write (PACKAGE)
  printing (CATEGORY)
    run off (PACKAGE)
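Since a WBS is just a tree, it can be sketched directly as a nested structure. A minimal Python sketch of the "write paper" example above (the helper name work_packages is ours, not from the text):

    # Minimal sketch: the "write paper" WBS above as a nested dict.
    # Keys are WBS elements; empty dicts mark the bottom-level work packages.
    wbs = {
        "write paper": {                          # level 1: project
            "research": {                         # level 2: category
                "library": {                      # level 3: task
                    "what's been done": {},       # work package
                    "needs doing": {},            # work package
                },
                "internet": {"search": {}},
            },
            "writing":  {"write": {}},
            "printing": {"run off": {}},
        },
    }

    def work_packages(node):
        """Collect the leaves (work packages) of a WBS subtree."""
        leaves = []
        for name, children in node.items():
            leaves += work_packages(children) if children else [name]
        return leaves

    print(work_packages(wbs))
    # ["what's been done", 'needs doing', 'search', 'write', 'run off']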
Work Breakdown Structure
• level 1 - overall project
• level 2 - category
– major project subelement
• level 3 - task
– subelement of category
• level x - subtask
• level bottom - WORK PACKAGE
– specific activity
Detailed Task List
• WBS can be focused on
– PRODUCT
– FUNCTION
– etc.
• when work packages identified,
– estimate requirements by resource
– WHAT IS NEEDED
– WHEN
– WHAT MUST PRECEDE
Work Breakdown Structure
• needs to be checked, approved
• provides
– good definition of work
– how long it will take
– resources required
– estimated costs
• planning & control
– assignments, budget, basis for control
Work Packages
• chunk of required work
• relatively small cost and short duration
• includes
– summary of work
– inputs required (predecessors)
– manager responsible
– product specifications
– resources required (including budget, dates)
– deliverables
list of work packages (activities): system design

activity                  predecessor   time
A2  identify req'd info   A1            10 days
A31 basic software        A2            3 days
A32 data access req'd     A2            1 day
A33 vendor software       A2            1 day
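A sketch of how such an activity list can be carried in code and walked with a simple forward pass to get earliest finish times. A1 and its 5-day duration are invented placeholders, since the table above only lists A1 as a predecessor:

    from functools import lru_cache

    # Sketch: work packages as (duration in days, predecessors).
    # A1 and its 5-day duration are invented; the table above starts at A2.
    activities = {
        "A1":  (5,  []),          # (hypothetical) preceding design work
        "A2":  (10, ["A1"]),      # identify req'd info
        "A31": (3,  ["A2"]),      # basic software
        "A32": (1,  ["A2"]),      # data access req'd
        "A33": (1,  ["A2"]),      # vendor software
    }

    @lru_cache(maxsize=None)
    def earliest_finish(name):
        """Forward pass: earliest finish = max(predecessor finishes) + duration."""
        duration, preds = activities[name]
        return max((earliest_finish(p) for p in preds), default=0) + duration

    for act in activities:
        print(act, "can finish by day", earliest_finish(act))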
Work Packages
• need to identify start and finish events for each work package
• related tasks without definable end results (overhead & management; inspection; maintenance) should be included as task-oriented work packages for COST purposes
Project Organization
• identify resources required by work
package
• RESPONSIBILITY MATRIX
– which functions do what work packages
– cost account structure
• start & finish date
• budget
• responsibilities
Project Management System
• lists activities on one axis
• lists people on other axis
• shows who
– is primarily responsible
– is also involved
– has approval authority
– must be notified
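A minimal sketch of such a matrix as a lookup table; the people, activities, and single-letter role codes are illustrative, mirroring the four roles listed above:

    # Sketch: responsibility matrix as {activity: {person: role}}.
    # Role codes are illustrative: R = primarily responsible, I = also involved,
    # A = approval authority, N = must be notified.
    matrix = {
        "system design":   {"Ann": "R", "Bob": "I", "Carol": "A", "Dave": "N"},
        "data conversion": {"Bob": "R", "Ann": "I", "Carol": "A"},
    }

    def who(activity, role):
        """People holding a given role code on an activity."""
        return [p for p, r in matrix[activity].items() if r == role]

    print(who("system design", "R"))   # ['Ann']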
Scheduling
• BASIS for
– RESOURCE ALLOCATION
– ESTIMATED COST
– plan for monitoring & control
• EVENTS or MILESTONES
– when activity completed (or started)
– INTERFACE EVENT
• when responsibility passes
Kinds of Schedules
• project schedules
– project master schedule - top management
– overview rather than detail
• task schedules
– specific activities required
– more detail
Resource Plans & Budgets
• Activities often compete for the same
resources
– hire more
– reschedule
• Resource plans show critical resource
schedules
– bottlenecks around which schedule is built
Charts
visual aids
• Gantt Charts
– plan - activities by time (work in outline)
– implement - fill in as work done
– doesn't show relationships well
– very good at seeing where things are (IF ACCURATE)
• Expense Charts - cumulatively graph $ spent
Recap
• Planning - key to accurate bidding
• need to know what it will cost in order to know how to price
• need to know resources required; complex projects take a long time
• MIS projects
– activities, predecessor relations, resource use
Software Estimation
The Mythical Man-Month:
Essays on Software Engineering
Frederick P. Brooks, Jr. (University of North Carolina)
Addison-Wesley, 1975
Programming Products
• Program
– usable by author
• Programming System
– usable by anyone
• Programming Product
– tested, documented, maintained
– 3 times the effort of a program
• Programming System Product
– both: about 9 times the effort of a program
causes of project failure
• LACK OF CALENDAR TIME is the most common cause
– estimating techniques are poor
– assume that effort = progress
• you can’t just throw people at a problem
– poor monitoring of progress
• SCHEDULING
– tendency to assume all will go well
impact of adding people
• partitionable project
– marginal contribution declines
• non-partitionable project
– no benefit at all from adding people
• complex interactions
– must separately coordinate each task with all others
– first few have declining marginal contribution
– after some number, adding people slows down the project
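Brooks makes the coordination cost concrete: if each of n people must coordinate with every other, there are n(n-1)/2 communication channels. A quick illustration:

    # Brooks: if each of n workers must coordinate with every other,
    # the number of communication channels is n(n-1)/2.
    def channels(n):
        return n * (n - 1) // 2

    for n in (3, 10, 50, 200):
        print(f"{n:4d} people -> {channels(n):6d} channels")
    # 3 -> 3, 10 -> 45, 50 -> 1225, 200 -> 19900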
software project activities
• testing is the activity most difficult to predict (see the sketch below)
– planning: 1/3 of project
– coding: 1/6 of project
– component testing: 1/4 of project
– system testing: 1/4 of project
• most projects are on schedule UNTIL TESTING
• Brooks’s Law: Adding manpower to a late
project makes it later
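A small sketch applying the rule-of-thumb split above to a hypothetical 12-month schedule (the 12-month total is invented for illustration):

    from fractions import Fraction

    # Brooks's rule-of-thumb schedule split; the 12-month total is invented.
    split = {
        "planning":          Fraction(1, 3),
        "coding":            Fraction(1, 6),
        "component testing": Fraction(1, 4),
        "system testing":    Fraction(1, 4),
    }
    assert sum(split.values()) == 1   # the shares cover the whole schedule

    total_months = 12
    for phase, share in split.items():
        print(f"{phase:18} {float(share * total_months):4.1f} months")
    # planning 4.0, coding 2.0, component testing 3.0, system testing 3.0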
programmer productivity
• there is wide variation in productivity between
good and fair programmers
• Brute force failures
– costly
– slow
– inefficient
– nonintegrated systems
(OS/360, TSS, Exec 8, SAGE, Scope 6600, Multics)
impact of adding programmers
• if a 200 person project has its best 25 people as
managers
– fire the 175
– make the 25 managers programmers
• shouldn’t have more than 10 people on a team
• OS/360 had 1000 working on it, 5000 man-years
• small teams infeasible; use surgical teams
surgical team
• surgeon - chief programmer
• copilot - share thinking, evaluation
• administrator - boring details
• editor - references, documentation
• secretaries (2)
• program clerk - technical records
• toolsmith - editing, debugging
• tester - develop test cases
• language lawyer - expert on language
conceptual integrity
• better to reflect one set of design ideas than to
add independent and uncoordinated features &
improvements
• purpose of programming system is to make
computer easy to use
• simplicity & straightforwardness come from
conceptual integrity
conceptual integrity example
• OS/360
– architect manager: said his 10 person team could write
specifications in 10 months (3 months late)
– control program manager: his 150 people could get it done in
7 months, & if his people didn’t do this, they would have
nothing to do
– architect manager: control program people would take 10
months, do a poor job
– Brooks gave the job to the control program group
– it took 10 months, plus added a year to debugging
estimating programming time
• effort grows more than proportionally with size (bigger jobs take disproportionately longer)
– analogous to the difference between sprinting 100 yards and running a mile
effort = K × (number of instructions)^1.5
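The exponent is the point: doubling the size multiplies effort by about 2.8, not 2. A quick check (K and the instruction counts are arbitrary; only the ratio matters):

    # effort = K * (number of instructions) ** 1.5
    # K and the sizes are arbitrary; only the ratio matters here.
    K = 1.0
    small, large = 10_000, 20_000
    ratio = (K * large ** 1.5) / (K * small ** 1.5)
    print(f"doubling the size multiplies effort by {ratio:.2f}")   # ~2.83 (= 2**1.5)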
• one manager noted programming taking twice as long
as estimate
– only getting 20 hours of work/week
– machine down, divert to emergencies, meetings,
paperwork, sick
programming estimation
• the more interactions, the less productivity
– interaction = coordination with others
• high level languages increase productivity
– now tools should almost eliminate the
programming component, but there are other
activities (the more unpredictable ones)
(prototyping)
• in most projects, the first system built is barely
usable
• PLAN THE SYSTEM FOR CHANGE
– modularization
– subroutining
– interfaces
– documentation of interfaces
– high-level languages
software estimation
Charles R. Symons
Software Sizing and Estimating: Mk II FPA
Wiley, 1991
software production cycle
• DESIGN
• DEVELOPMENT
• production - not hard
• MAINTENANCE
estimating these is recognized as a difficult task
software production cycle
• SYSTEM SIZE a variable in
– DESIGN
– DEVELOPMENT
– MAINTENANCE
• components of system size
– amount of information processed
– technical requirements
– performance drivers (objectives)
objectives
• COST - minimization
• TIME - minimization
• QUALITY - assure that product performs to specifications
size measures
• source lines of code
+ concrete measure
- what lines?
- logical or physical?
- housekeeping?
- different across languages
• most commonly used
• some economy of scale
Albrecht’s Function Point Analysis Method
AIMS
– consistent measure
– meaningful to end user
• function points should be easier to understand than
lines of code
– rules easy to apply
– Can estimate from requirements specification
– independent of technology used
Albrecht’s system
• count
– external user inputs
– enquiries
– outputs
– master files delivered (internal & external)
• get points for every useful activity

function points = information processing size × technical complexity adjustment
Albrecht’s System
complexity tables
– data elements referenced
• 1-4
• 5-15
• 16 or more
– file types referenced
• 0 or 1
• 2
• 3 or more
table of SIMPLE, AVERAGE, COMPLEX
Albrecht complexity
                   1-4 data   5-15 data   16+ data
0 or 1 file types  SIMPLE     SIMPLE      AVERAGE
2 file types       SIMPLE     AVERAGE     COMPLEX
3+ file types      AVERAGE    COMPLEX     COMPLEX
Albrecht functional multipliers

                       SIMPLE   AVERAGE   COMPLEX
external input         x 3      x 4       x 6
external output        x 4      x 5       x 7
logical internal file  x 7      x 10      x 15
ext interface file     x 5      x 7       x 10
external inquiry       x 3      x 4       x 6

add up, get total unadjusted function points
Albrecht Technical Complexity Adjustment

14 general application characteristics:
data communications, complex processing, heavily used configuration, operational ease, end user efficiency, on-line update, distributed functions, performance, re-useability, installation ease, transaction rate, on-line data entry, multiple sites, facilitate change

degree of influence:
not present = 0
insignificant influence = 1
moderate influence = 2
average influence = 3
significant influence = 4
strong influence = 5

TCA = 0.65 + (0.01 × (total degree of influence))
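A minimal sketch of the adjustment; the degree-of-influence ratings below are invented, only the 14 characteristics, the 0-5 scale, and the formula come from the slide above:

    # Degree-of-influence ratings (0-5) for the 14 general application
    # characteristics; the ratings below are invented examples.
    influence = {
        "data communications": 4, "complex processing": 2,
        "heavily used configuration": 1, "operational ease": 2,
        "end user efficiency": 4, "on-line update": 3,
        "distributed functions": 2, "performance": 3,
        "re-useability": 1, "installation ease": 0,
        "transaction rate": 3, "on-line data entry": 5,
        "multiple sites": 0, "facilitate change": 3,
    }
    total_degree_of_influence = sum(influence.values())   # 33 with these ratings
    TCA = 0.65 + 0.01 * total_degree_of_influence
    print(total_degree_of_influence, round(TCA, 2))       # 33 0.98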
Symons’ complaints
• Albrecht method
– alternative counting practices
– weights used are questionable
– large systems under-weighted
• (XEROX 1985: rapid drop in productivity with
increasing system size)
– range of points too narrow
• but MUCH BETTER THAN SLOC
Mk II Function Point Analysis
• modification of Albrecht
• use same Technical Complexity Adjustment
• extend general application characteristics to 19 or more
• weights adjusted
• Information Processing Size changed the most
logical transactions
logical transaction =
– unique input/process/output combination triggered by a unique event of interest to the user
– or a need to retrieve information
• create a customer
• update an account
• enquiry
• produce monthly summary report
unadjusted function points
UFPs = WI × (# input data element types)
     + WE × (# entity types referenced)
     + WO × (# output data element types)

weights determined by calibration
determine UFPs for the system by adding UFPs for all system logical transactions

assumes:
– work directly proportional to # of data elements
– size of process proportional to # data entries
– weights meaningful, obtainable
complexity adjustment
Albrecht’s method with 2 modifications:
– extend general application list to 19:
• interfaces to other applications
• special security features
• direct access requirement for third parties
• special user training facilities
• documentation requirements

TCA = 0.65 + C × (total degree of influence)
where C is obtained by calibration
calibration
by CALIBRATION Symons means fitting the weights to the company's own data (regression)
industry averages:
WI = 0.58
WE = 1.66
WO = 0.26
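With those industry-average weights, the Mk II size calculation is a short sum over logical transactions. A sketch with invented transaction counts:

    # Mk II FPA: UFPs summed over logical transactions, using the
    # industry-average weights quoted above. Transactions and counts are invented.
    W_I, W_E, W_O = 0.58, 1.66, 0.26

    transactions = [
        # (name, input data element types, entity types referenced, output data element types)
        ("create a customer", 12, 2,  3),
        ("update an account",  6, 3,  2),
        ("monthly summary",     2, 4, 25),
    ]

    ufps = sum(W_I * i + W_E * e + W_O * o for _, i, e, o in transactions)
    print(round(ufps, 2))   # about 34.3 for these made-up counts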
Mk II FPA summary
• obtain general understanding of the system
• construct a model of primary entities
• identify logical transactions
• score degree of influence of all 19 general application characteristics (plus client specific)
• obtain total project work-hours, calibrate
• calculate function points
comparison

                     SLOC          Albrecht          Mk II FPA
accepted standard    no            yes               yes
clarity              potentially   some subjective   objective
structured?          no            no                yes
easy to use?         yes           no                no
automatable?         yes           no                yes
use for estimating?  sometimes     yes               yes
Estimation Example
SLOC
Function Point
Source Lines of Code
• NEED DATABASE of past experience

AVERAGES:
effort          33 months
cost            $361 (thousand)
documentation   1194 pages
errors          201
defects         52
people          4
KLOC            20.543
Implementing LOC
• Estimate structured lines of code: 10,000
• scale averages in proportion to LOC: 10/20.543 = 0.487

effort          33 months    × 0.487 = 16 months
cost            $361 thou    × 0.487 = $177,000
documentation   1194 pages   × 0.487 = 581 pp.
errors          201          × 0.487 = 98
defects         52           × 0.487 = 25
people          4            × 0.487 = 2 people
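The proportional scaling above is easy to reproduce; a short sketch using the historical averages and the 10,000-line estimate (rounding accounts for small differences from the slide's figures):

    # Scale historical per-project averages by the ratio of the estimated
    # size (10 KLOC) to the historical average size (20.543 KLOC).
    historical = {
        "effort (months)":       33,
        "cost ($ thousand)":     361,
        "documentation (pages)": 1194,
        "errors":                201,
        "defects":               52,
        "people":                4,
    }
    ratio = 10 / 20.543            # about 0.487

    for measure, avg in historical.items():
        print(f"{measure:22} {avg * ratio:8.1f}")
    # effort ~16 months, documentation ~581 pages, people ~2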
Function Point Calculation
1 - get count-total
    number of features times complexity
2 - get Σ Fi
    rate 14 factors (0-5), total
3 - FP = count-total × [0.65 + 0.01 × Σ Fi]
4 - scale historical averages (based on 623 FP) by this FP
1 - get count-total (Complexity Weighting)

                       simple       average        complex        product
# user inputs          __ × 3   +   __ × 4    +    __ × 6     =   ___
# user outputs         __ × 4   +   __ × 5    +    __ × 7     =   ___
# user inquiries       __ × 3   +   __ × 4    +    __ × 6     =   ___
# files                __ × 7   +   __ × 10   +    __ × 15    =   ___
# external interfaces  __ × 5   +   __ × 7    +    __ × 10    =   ___
1 - get count-total

Bank accounts record system involving:
36 user inputs           simple complexity
5 user outputs           average complexity
20 user inquiries        simple complexity
40 files accessed        simple complexity
3 external interfaces    average complexity
1 - get count-total

                        simple      average      product
36 user inputs          36 × 3                 = 108
5 user outputs                      5 × 5      = 25
20 user inquiries       20 × 3                 = 60
40 files                40 × 7                 = 280
3 external interfaces               3 × 7      = 21
TOTAL                                            494
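The same arithmetic as a short sketch, using the weight table and the bank example above:

    # Albrecht weights {feature type: {complexity: weight}} from the table above.
    weights = {
        "user inputs":         {"simple": 3, "average": 4,  "complex": 6},
        "user outputs":        {"simple": 4, "average": 5,  "complex": 7},
        "user inquiries":      {"simple": 3, "average": 4,  "complex": 6},
        "files":               {"simple": 7, "average": 10, "complex": 15},
        "external interfaces": {"simple": 5, "average": 7,  "complex": 10},
    }

    # Bank accounts record system: (count, complexity) per feature type.
    counts = {
        "user inputs":         (36, "simple"),
        "user outputs":        (5,  "average"),
        "user inquiries":      (20, "simple"),
        "files":               (40, "simple"),
        "external interfaces": (3,  "average"),
    }

    count_total = sum(n * weights[f][c] for f, (n, c) in counts.items())
    print(count_total)   # 494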
2 - get Σ Fi

F1  require reliable backup & recovery?                             Significant    4
F2  data communications required?                                   Moderate       2
F3  distributed processing functions?                               Significant    4
F4  performance critical?                                           Average        3
F5  run on existing, heavily utilized environment?                  Essential      5
F6  require on-line data entry?                                     Essential      5
F7  on-line data entry from multiple operations?                    Incidental     1
F8  master files updated on-line?                                   No influence   0
F9  inputs, outputs, files, or inquiries complex?                   Incidental     1
F10 internal processing complex?                                    Incidental     1
F11 code designed to be reusable?                                   Average        3
F12 conversion and installation included in the design?             Average        3
F13 system designed for multiple installations in different orgs?   No influence   0
F14 application designed to facilitate change and ease of use?      No influence   0
                                                                     Σ Fi        = 32
3 - Calculate FP
FP = count-total × [0.65 + 0.01 × Σ Fi]
   = 494 × [0.65 + 0.01 × 32] = 479.18
4 - Multiply by Historical Averages
• Estimated FP: 479.18
• scale averages in proportion to the 623-FP historical average: 479.18/623 = 0.77

effort          33 months    × 0.77 = 25.4 months
cost            $361 thou    × 0.77 = $278,000
documentation   1194 pages   × 0.77 = 918 pp.
errors          201          × 0.77 = 155
defects         52           × 0.77 = 40
people          4            × 0.77 = 3 people
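Steps 3 and 4 together as a short sketch, using the count-total of 494, Σ Fi = 32, and the 623-FP historical baseline:

    # Steps 3 and 4: adjust the count-total, then scale the historical
    # averages by FP relative to the 623-FP historical baseline.
    count_total, sum_Fi = 494, 32

    FP = count_total * (0.65 + 0.01 * sum_Fi)   # 479.18
    ratio = FP / 623                            # about 0.77

    historical = {
        "effort (months)": 33, "cost ($ thousand)": 361,
        "documentation (pages)": 1194, "errors": 201,
        "defects": 52, "people": 4,
    }

    print(f"FP = {FP:.2f}, ratio = {ratio:.2f}")
    for measure, avg in historical.items():
        print(f"{measure:22} {avg * ratio:7.1f}")
    # effort ~25 months, cost ~$278K, people ~3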
Scheduling
• “Coding is 90% finished” for half of the total coding time
• “Debugging is 99% complete” most of the time
• MILESTONES: concrete events
• studies of government projects:
– estimates carefully updated every 2 weeks before an activity starts rarely change
– during the activity, overestimates drop
– underestimates don't change until well into the activity
Control
• When delay is first noticed,
the tendency is to not report it
• STATUS INFORMATION
– what is going on
• ACTION INFORMATION
– learning this causes something to be done
• KEY: know when which case applies
Summary
• Estimation of duration & cost key to sound project
decision making
• Estimating software development very difficult
– Can improve by
• Keeping records
• Using productivity-enhancing methods
• Using more off-the-shelf software
– Estimation methods can become accurate if
systematically applied