Information Technology Project Management – Third Edition
By Jack T. Marchewka
Northern Illinois University
Copyright 2009 John Wiley & Sons, Inc. All rights reserved. Reproduction or translation of this work beyond that permitted in Section 117 of the 1976 United States Copyright Act without the express permission of the copyright owner is unlawful. Requests for further information should be addressed to the Permissions Department, John Wiley & Sons, Inc. The purchaser may make back-up copies for his/her own use only and not for distribution or resale. The Publisher assumes no responsibility for errors, omissions, or damages caused by the use of these programs or from the use of the information contained herein.
The Work Breakdown Structure and Project Estimation
Chapter 6
Project Time Management
PMBOK® processes:

- Activity definition: identifying what activities must be completed to produce the project scope deliverables
- Activity sequencing: determining whether activities can be completed sequentially or in parallel and any dependencies that may exist among them
- Activity resource estimation: identifying the type of resources (people, technology, facilities, etc.) and the quantity of resources needed to carry out project activities
- Activity duration estimation: estimating the time to complete each activity
- Schedule development: based on the availability of resources, the activities, their sequence, and the time estimates, a schedule for the entire project can be developed
- Schedule control: ensuring that proper processes and procedures are in place in order to control changes to the project schedule
Work Breakdown Structure (WBS)

- The WBS represents a logical decomposition of the work to be performed and focuses on how the product, service, or result is naturally subdivided
- It is an outline of what work is to be performed
- Provides a link between the project's scope and the detailed project plan
- Once the activities are defined, the next step is to estimate the duration of each activity
  - Estimation is not an exact science, but the estimates improve as more details about the project are uncovered
Work Package

- The WBS subdivides the project into smaller components and more manageable units of work called work packages
  - Enables the development of the project plan, schedule, and budget, and subsequent monitoring of the project's progress
- Each phase should provide at least one specific deliverable (a tangible and verifiable piece of work)
- Activities or tasks are identified in order to produce the project's deliverable

Work Package
[Figure: a sample work package]
Deliverables versus Milestones

- Deliverables
  - Tangible, verifiable work products
  - Reports, presentations, prototypes, etc.
- Milestones
  - Significant events or achievements that provide evidence that the deliverable has been completed or that a phase is formally over
  - Focus on the achievement, not the deliverable
  - Smaller, shorter-term deliverables keep the team focused
- Cruxes (proofs of concept)
  - Deliverable: prototype
  - Milestone: stakeholder's formal acceptance of the user interface
  - Successfully using a piece of software for the first time on a small set of data validates the proof of concept for expanding to a full-blown database
- Quality control
  - No user acceptance, no moving forward: the deliverable must be done right
Developing the WBS

- A work package is developed for each of the phases and deliverables defined in the Deliverable Structure Chart (DSC)
- Focus on the Testing box, with its deliverables of a Test Plan and Test Results
Deliverable: Test Results Report

Logical activities to produce the test results document:
1. Review the test plan with the client so that key stakeholders are clear as to what will be tested, how the tests will be conducted, and when the tests will be carried out.
2. Carry out the tests as outlined in the plan.
3. Once the test results are collected, analyze them.
4. Summarize the results in the form of a report and presentation to the client.
5. If all goes well, the client will sign off on (approve) the test results, and the project can move on to the implementation phase. If not, address and fix any problems.
Example Work Breakdown Schedule
[Figure: example WBS with schedule]
The WBS Should Follow the Work Package Concept
[Figure]
The WBS…

- Should be "deliverable-oriented"
  - 100% rule: the next-level decomposition of a WBS element (child level) must represent 100% of the work applicable to the next-higher (parent) element
- Should support the project's MOV
  - May require a number of iterations to accomplish
- Should have enough detail to support planning and control
  - The level of detail should support the development of the project plan and allow the PM to monitor and compare the project's actual progress to the original plan's schedule and budget: not too much detail and not too little
The WBS…

- Should involve those who will be doing the work
  - The experience and expertise of those involved will ensure that the level of detail is appropriate
- Learning cycles and lessons learned can support the development of a WBS
  - Focus on what the team knows (facts), what they think they know (assumptions), and what they need to find out (research) in order to develop a more useful WBS
  - Lessons learned help keep the project plan realistic and complete
Estimation Questions

- What are you going to estimate?
- Where do you start?
- How do you estimate?
Estimation Techniques - Traditional Project Management Approaches

- Estimation is the most difficult and critical activity in project management
  - The time to complete an activity drives the cost of the resources needed, so the project budget is impacted as well
- Guesstimating
- Delphi Technique
- Time Boxing
- Top-Down
- Bottom-Up
- Analogous Estimates (past experience)
- Parametric Modeling (statistical)
Guesstimating

- Estimation by guessing or just picking numbers out of the air is not the best way to derive a project's schedule and budget
- Unfortunately, many inexperienced project managers tend to guesstimate, or guess at the estimates, because it is quick and easy
- When put on the spot for an estimate, give a range of time and cost and explain that more research will enable a more confident estimate
Delphi Technique

- Involves multiple, anonymous experts
- Each expert makes an estimate
- Estimates are compared
  - If close, they can be averaged
  - If not, another iteration is done until consensus is reached
- Can take longer than most other estimation methods, but can be very effective and provide reasonable assurance
Time Boxing

- A "box" of time is allocated for a specific activity, task, or deliverable
  - May be based on a requirement that can't be changed
- Can focus a team if used effectively
- Can demoralize a team if not used effectively
  - May result in long hours and pressure
  - May not succeed
Top-Down

- A mandate from above has predetermined the time and cost of the project (e.g., to respond to a competitor or satisfy a client)
- Top and middle managers determine the overall project schedule and/or cost
- Lower-level managers are expected to break the schedule and budget estimates down into specific activities (the WBS)
- Previous project experience can help in allocating time to activities
Top-Down

- When top-down estimation is done by people independent of the project, it may be overly optimistic or overly aggressive
- Death March project:
  - The project schedule has been compressed 50% or more
  - Staff has been reduced by 50% or more
  - Budget and resources have been reduced by 50% or more
  - Functionality, features, or other requirements are twice what they should be under typical circumstances
- Can force the PM to examine the project's risks more closely so that a specific budget or schedule target can be achieved
Bottom-Up

- Schedules and budgets are constructed from the WBS
- Starts with the people who will be doing the work
- Schedules and budgets are the aggregate of the detailed activities and costs
- Analogous estimation: use information from previous, similar projects as a basis for estimation
  - BUT: is everything else the same?
Estimates Are Made for Each Activity in the WBS

6.2 Test Results Report
  6.2.1 Review test plan with client                    1 day
  6.2.2 Carry out test plan                             5 days
  6.2.3 Analyze results                                 2 days
  6.2.4 Prepare test results report and presentation    3 days
  6.2.5 Present test results to client                  1 day
  6.2.6 Address any software issues or problems         5 days

How did we come up with these estimates? Using a technique, or combination of techniques - anything except guesstimating! A rollup of these estimates is sketched below.
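A minimal sketch (Python, not from the text) of how these activity estimates roll up, assuming the six activities run strictly in sequence with no overlap:

```python
# Activity names and day estimates taken from the WBS above.
# Assumption: the activities run sequentially, so the work-package
# duration is simply the sum of the activity durations.
activities = {
    "6.2.1 Review test plan with client": 1,
    "6.2.2 Carry out test plan": 5,
    "6.2.3 Analyze results": 2,
    "6.2.4 Prepare test results report and presentation": 3,
    "6.2.5 Present test results to client": 1,
    "6.2.6 Address any software issues or problems": 5,
}

total_days = sum(activities.values())
print(f"6.2 Test Results Report: {total_days} working days")  # 17 working days
```

In practice some of these activities may overlap or repeat (e.g., 6.2.6 can loop back into testing), so a real schedule would come from the project network diagram, not a simple sum.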
Estimation Techniques - Software Engineering Approaches

- Lines of Code (LOC)
- Function Points
- COCOMO
- Heuristics

Software engineering techniques focus on estimating the size of the system to be developed.
Software Engineering Metrics

- Software engineering focuses on the processes, tools, and methods for developing quality software
- Metrics provide the basis for software engineering and refer to a broad range of measurements for objectively evaluating computer software
- The greatest challenge in estimating an IT project is estimating the time and effort for its largest deliverable: the application system
  - True for maintenance projects and installations of packaged software as well
  - We are trying to estimate something that is not well defined until the later stages of the project life cycle
  - The complexity and technical challenges are unknown, or optimistically glossed over, in the early stages of the project
Software Engineering Metrics
[Figure: the application estimate as a function of size, complexity, and constraints & influences]
Lines of Code (LOC)

- Counting lines of code is a widely used software metric, but it is also very controversial
- What counts as a line of code?
  - Comments? They could inflate the count, but they can be important documentation
  - Variable declarations?
- More experienced programmers write more efficient code in fewer lines
- The programming language is an important variable (assembler vs. C++)
- Programmers could write less efficient code if they are being measured by LOC
Function Point Analysis

- Developed to address the inadequacies of LOC
  - Proposed by Allan Albrecht of IBM in 1979
  - It is a synthetic metric, much as hours, kilos, and Celsius are
  - Focuses on the functionality and complexity of the application
- Independent of the technology
  - Avoids the problem of different programming languages or technology platforms
  - FP analysis is reliable in the sense that two analysts trained in FP analysis will obtain the same results within a margin of error
- Two main organizations oversee the rules, guidelines, standards, and certification for FP analysis
  - IFPUG (www.ifpug.org) and UFPUG
Function Point Analysis

- An FP analysis is done early on, based on the project's scope, followed by a more detailed analysis during the analysis and design stages
- FP analysis is based on an evaluation of five data and transactional types that define the application boundary:
  - Inputs
  - Outputs
  - Inquiries
  - Logical files
  - Interfaces
The Application Boundary for Function Point Analysis
[Figure]
Function Point Analysis

- Internal logical file (ILF): an ILF is a logical file that stores data within the application boundary.
  - For example, each entity in an E-R diagram would be considered an ILF. The complexity of an ILF can be classified as low, average, or high based on the number of data elements and subgroups of data elements maintained by the ILF.
    - An example of a subgroup would be new customers for an entity called customer.
    - Examples of data elements would be customer number, name, address, phone number, and so forth. In short, ILFs with fewer data elements and subgroups will be less complex than ILFs with more data elements and subgroups.
- External interface file (EIF): an EIF is similar to an ILF; however, an EIF is a file maintained by another application system. The complexity of an EIF is determined using the same criteria used for an ILF.
- External input (EI): an EI refers to processes or transactional data that originate outside the application and cross the application boundary from outside to inside. The data generally are added, deleted, or updated in one or more files internal to the application (i.e., internal logical files).
  - A common example of an EI would be a screen that allows the user to input information using a keyboard and a mouse.
  - Data can, however, pass through the application boundary from other applications. For example, a sales system may need a customer's current balance from an accounts receivable system.
  - Based on its complexity, in terms of the number of internal files referenced, the number of data elements (i.e., fields) included, and other human factors, each EI is classified as low, average, or high.
Function Point Analysis

- External output (EO): similarly, an EO is a process or transaction that allows data to exit the application boundary.
  - Examples of EOs include reports, confirmation messages, derived or calculated totals, and graphs or charts. This data could go to screens, printers, or other applications. After the number of EOs is counted, they are rated based on their complexity, like the external inputs (EI).
- External inquiry (EQ): an EQ is a process or transaction that includes a combination of inputs and outputs for retrieving data from either the internal files or from files external to the application.
  - EQs do not update or change any data stored in a file; they only read this information.
  - Queries with different processing logic or a different input or output format are each counted as a single EQ. Once the EQs are identified, they are classified based on their complexity as low, average, or high, according to the number of files referenced and the number of data elements included in the query.
Function Point Analysis

- Once all of the data and transactional types are counted and their relative complexities rated, an unadjusted function point (UAF) count is determined
  - The following was determined after reviewing the application system:
    - ILF: 3 low, 2 average, and 1 complex
    - EIF: 2 average
    - EI: 3 low, 5 average, and 4 complex
    - EO: 4 low, 2 average, and 1 complex
    - EQ: 2 low, 5 average, and 3 complex
- The next step is to compute the Value Adjustment Factor (VAF)
  - It is based on the total Degrees of Influence (DI), often called the Processing Complexity Adjustment (PCA)
  - Derived from the 14 General Systems Characteristics (GSCs)
  - To determine the total DI, each GSC is rated on the following scale:
    - 0 = not present or no influence
    - 1 = incidental influence
    - 2 = moderate influence
    - 3 = average influence
    - 4 = significant influence
    - 5 = strong influence
Function Point Analysis

- After reviewing the application, the Total Adjusted Function Points (TAFP) figure is computed to be 210 (see the worksheet and the sketch below)
- That number can be transformed into development estimates
  - Productivity: how many function points can a programmer produce in a given period of time?
    - Can be based on lessons learned from prior project experience
  - LOC: convert function points into lines of code based on the programming language
    - Backfiring: a technique that allows direct conversion from source code to a function point count
    - Its accuracy is not high, because of individual programming styles, but it can be used to create an FP inventory of an organization's project portfolio
Function Point Worksheet

Complexity:                      Low           Average        High           Total
Internal Logical Files (ILF)     3 x 7  = 21   2 x 10 = 20    1 x 15 = 15     56
External Interface Files (EIF)   _ x 5  = __   2 x 7  = 14    _ x 10 = __     14
External Input (EI)              3 x 3  =  9   5 x 4  = 20    4 x 6  = 24     53
External Output (EO)             4 x 4  = 16   2 x 5  = 10    1 x 7  =  7     33
External Inquiry (EQ)            2 x 3  =  6   5 x 4  = 20    3 x 6  = 18     44

Total Unadjusted Function Points (UAF) = 200

General System Characteristic      Degree of Influence
Data Communications                3
Distributed Data Processing        2
Performance                        4
Heavily Used Configuration         3
Transaction Rate                   3
On-line Data Entry                 4
End User Efficiency                4
Online Update                      3
Complex Processing                 3
Reusability                        2
Installation Ease                  3
Operational Ease                   3
Multiple Sites                     1
Facilitate Change                  2
Total Degrees of Influence (TDI)   40

Value Adjustment Factor: VAF = (TDI x 0.01) + 0.65 = (40 x 0.01) + 0.65 = 1.05
Total Adjusted Function Points: FP = UAF x VAF = 200 x 1.05 = 210
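The worksheet arithmetic can be reproduced with a short script. This is an illustrative sketch, not part of the text; the complexity weights are the standard IFPUG values shown in the table, and the counts and GSC ratings are the ones from this example.

```python
# Standard IFPUG complexity weights: (low, average, high)
WEIGHTS = {
    "ILF": (7, 10, 15),
    "EIF": (5, 7, 10),
    "EI":  (3, 4, 6),
    "EO":  (4, 5, 7),
    "EQ":  (3, 4, 6),
}

# Counts from the application review: (low, average, high)
counts = {
    "ILF": (3, 2, 1),
    "EIF": (0, 2, 0),
    "EI":  (3, 5, 4),
    "EO":  (4, 2, 1),
    "EQ":  (2, 5, 3),
}

# Unadjusted function points: sum of count x weight across every cell
uaf = sum(c * w for t in WEIGHTS for c, w in zip(counts[t], WEIGHTS[t]))

# The 14 General System Characteristics, each rated 0-5
gsc_ratings = [3, 2, 4, 3, 3, 4, 4, 3, 3, 2, 3, 3, 1, 2]
tdi = sum(gsc_ratings)          # total degrees of influence

vaf = tdi * 0.01 + 0.65         # value adjustment factor
fp = uaf * vaf                  # total adjusted function points

print(f"UAF={uaf}, TDI={tdi}, VAF={vaf:.2f}, FP={fp:.0f}")
# -> UAF=200, TDI=40, VAF=1.05, FP=210
```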
COCOMO (COnstructive COst MOdel)

- Developed by Barry Boehm in 1981 in his book Software Engineering Economics
- Based on LOC estimates; used to estimate cost, effort, and schedule
- An open model (all equations, assumptions, definitions, etc. are freely available)
- A parametric model: it predicts dependent variables, such as cost or duration, based upon one or more independent variables that are quantitative indices of performance and/or physical attributes of the system
- Has been extended to COCOMO II
  http://sunset.usc.edu/csse/research/COCOMOII/cocomo_main.html
COCOMO Models (Effort)

First determine the type of the project:

- Organic (routine)
  - Technology, processes, and people are all expected to work together smoothly
  - Person Months = 2.4 x KDSI^1.05
- Semi-Detached (middle)
  - Not simple or straightforward, but the organization feels confident that its processes, people, and technology are adequate to meet the challenge
  - Person Months = 3.0 x KDSI^1.12
- Embedded (challenging)
  - The system supports a new business process or breaks new ground for the organization
  - People may be less experienced and the processes and technology less mature
  - Person Months = 3.6 x KDSI^1.20

Person month = 152 hours
KDSI = thousands of delivered source instructions (i.e., LOC)
COCOMO – Effort Example

- Semi-Detached
  - 200 total adjusted function points
  - Using Table 6.3, Java averages 53 SLOC per FP
  - 200 FP x 53 = 10,600 Java LOC = 10.6 KDSI
  - Assuming the project is of medium difficulty, we use the semi-detached equation to compute the effort:
    Person Months = 3.0 x KDSI^1.12 = 3.0 x (10.6)^1.12 = 42.21
- Having computed effort, we can now determine duration using the following formulas:
  - Organic:       Duration = 2.5 x Effort^0.38
  - Semi-Detached: Duration = 2.5 x Effort^0.35
  - Embedded:      Duration = 2.5 x Effort^0.32
COCOMO Duration Example

Duration = 2.5 x Effort^0.35
         = 2.5 x (42.21)^0.35
         = 9.26 months

People Required = Effort / Duration = 42.21 / 9.26 = 4.55

A sketch pulling the effort, duration, and staffing calculations together follows.
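A small Python sketch of the basic COCOMO equations from the previous slides; the mode coefficients are Boehm's published values, and the 10.6 KDSI input is the Java LOC figure derived above.

```python
MODES = {
    # mode: (a, b, c), where effort = a * KDSI**b and duration = 2.5 * effort**c
    "organic":       (2.4, 1.05, 0.38),
    "semi-detached": (3.0, 1.12, 0.35),
    "embedded":      (3.6, 1.20, 0.32),
}

def cocomo(kdsi: float, mode: str = "semi-detached") -> tuple[float, float, float]:
    """Return (effort in person-months, duration in months, average staff)."""
    a, b, c = MODES[mode]
    effort = a * kdsi ** b        # person-months (152 hours each)
    duration = 2.5 * effort ** c  # calendar months
    return effort, duration, effort / duration

effort, duration, staff = cocomo(10.6)  # 200 FP x 53 Java SLOC/FP = 10.6 KDSI
print(f"{effort:.2f} person-months, {duration:.2f} months, {staff:.2f} people")
# -> roughly 42.2 person-months, 9.3 months, 4.6 people, matching the example
```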
COCOMO

- Intermediate COCOMO
  - Estimates the software development effort as a function of size and a set of 15 subjective cost drivers that include attributes of the end product, the computer used, the personnel staffing, and the project environment
- Advanced COCOMO
  - Includes all the characteristics of Intermediate COCOMO, but with an assessment of the cost drivers' impact over four phases of development: product design, detailed design, coding/testing, and integration/testing
- COCOMO II
  - More suited to projects developed using 4GLs (VB, Delphi, PowerBuilder)
- SLIM
  - Uses LOC to estimate the project's size and 22 questions to calibrate the model
Heuristics (Rules of Thumb)

- The same base activities will be required for a typical software development project, and these activities will require a predictable percentage of the overall effort
- Use knowledge gained from previous project experience when scheduling a software task:
  - 30% Planning
  - 20% Coding
  - 25% Component test and early system test
  - 25% System test, all components in hand
Heuristics (Rules of Thumb)

T. Capers Jones provides these heuristics (several are expressed as simple formulas in the sketch below):

- Creeping user requirements will grow at an average rate of 2% per month from the design through coding phases
- FPs raised to the 1.25 power predict the approximate defect potential for new software projects
- Each formal design inspection will find and remove 65% of the bugs present
- Maintenance programmers can repair 8 bugs per staff month
- FPs raised to the 0.4 power predict the approximate development schedule in calendar months
- FPs divided by 150 predict the approximate number of personnel required for the application
- FPs divided by 750 predict the approximate number of maintenance personnel required to keep the application updated

Rules of thumb are easy, but they are not always accurate.
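A sketch of the formula-style rules above (illustrative only; treat the outputs as sanity checks, not commitments):

```python
def jones_rules(fp: float) -> dict[str, float]:
    """T. Capers Jones's function-point rules of thumb as formulas."""
    return {
        "defect_potential":  fp ** 1.25,  # approximate defects seeded in new projects
        "schedule_months":   fp ** 0.4,   # approximate development schedule
        "development_staff": fp / 150,    # approximate personnel required
        "maintenance_staff": fp / 750,    # approximate staff to keep it updated
    }

for rule, value in jones_rules(210).items():  # 210 FP from the earlier example
    print(f"{rule}: {value:.2f}")
# -> about 800 potential defects, an 8.5-month schedule,
#    1.4 developers, and 0.28 maintenance staff
```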
"The seeds of major software disasters are usually sown in the first three months of commencing the software project. Hasty scheduling, irrational commitments, unprofessional estimating techniques, and carelessness of the project management function are the factors that tend to introduce terminal problems. Once a project blindly lurches forward toward an impossible delivery date, the rest of the disaster will occur almost inevitably."
- T. Capers Jones, 1988, page 120
Brooks' Law

"Adding manpower to a late software project makes it later."
The Man Month

[Figure: time versus number of workers for a perfectly partitionable task, i.e., one requiring no communication among workers, e.g., reaping wheat]

[Figure: when a task cannot be partitioned because of sequential constraints, the application of more effort has no effect on the schedule]
Adding People

- Increases the total effort necessary
  - The work and disruption of repartitioning
  - Training new people
  - Added intercommunication (see the sketch below)
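A small sketch of why intercommunication dominates: if everyone on a shared task must coordinate with everyone else, the number of pairwise communication paths grows as n(n-1)/2, far faster than headcount itself.

```python
def communication_paths(n: int) -> int:
    """Pairwise communication channels among n people: n(n-1)/2."""
    return n * (n - 1) // 2

for n in (3, 5, 10, 20):
    print(f"{n:>2} people -> {communication_paths(n):>3} paths")
# 3 -> 3, 5 -> 10, 10 -> 45, 20 -> 190
```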
What Can Cause Inaccurate Estimates?

- Scope changes
- Overlooked tasks
- Poor developer-user communication
- Poor understanding of project goals
- Insufficient analysis
- No (or poor) methodology
- Changes in team
- Red tape
- Lack of project control
- Not identifying or understanding the impact of risks
Other Factors to Consider When Estimating

- Rate at which requirements may change
- Experience and capabilities of the project team
- Process or methods used in development
- Specific activities to be performed
- Programming languages or development tools to be used
- Probable number of bugs or defects and removal methods
- Environment or ergonomics of the work space
- Geographic separation of the team across locations
- Schedule pressure placed on the team
How Can Estimates Be Improved?

- Experience!
  - Lessons learned
  - Best practices
- Revision
- Monitoring
- Focus on deliverables
- Control