PPT03_4

PLANNING AND MANAGING THE
PROJECT
CODY STANISH
3.1 TRACKING PROGRESS
• Do you understand the customer’s needs?
• Can you design a system that solves the customer’s problems or satisfies their needs?
• How long will it take to develop the system?
• How much will it cost to develop such a system?
3.1 TRACKING PROGRESS
• The project schedule describes the software development cycle for a particular project
- enumerates the phases or stages of the project
- breaks each phase into discrete tasks to be completed
• Portrays interactions among the activities and estimates the time for each task
3.1 TRACKING PROGRESS
• Activity: takes place over a period of time
• Milestone: completion of an activity
• Precursor: event or set of events that must occur for an activity to start
• Duration: length of time needed to complete an activity
• Due date: date by which an activity must be completed
3.1 TRACKING PROGRESS
• Project development can be separated into phases, which are made up of steps, which are in turn made up of activities
3.1 TRACKING PROGRESS
• Activity graphs depict dependencies
among the activities
• The nodes represent the project milestones
• The lines represent the activities
involved
3.1 TRACKING PROGRESS
• Estimated time can be added to an
activity graph to give a better
understanding of the project’s
schedule
3.1 TRACKING PROGRESS
• The critical path method (CPM) reveals the minimum amount of time it will take to complete the project and identifies the most critical activities
• Real time (actual time): the estimated amount of time required for the activity to be completed
• Available time: the amount of time available in the schedule for the activity’s completion
• Slack time: the difference between the available time and the real time
3.1 TRACKING PROGRESS
• Critical path: the path through the activity graph along which the slack at every node is zero
• Slack time = available time – real time = latest start time – earliest start time
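The slack computation can be automated from the activity graph. Below is a minimal Python sketch, assuming the graph is acyclic; the activity names, durations, and precursors are hypothetical and only illustrate the forward/backward pass that yields slack and the critical path.

```python
from functools import cache

# Hypothetical activities: name -> (duration in days, list of precursor activities)
activities = {
    "design":  (3, []),
    "code":    (5, ["design"]),
    "test":    (2, ["code"]),
    "docs":    (4, ["design"]),
    "release": (1, ["test", "docs"]),
}

@cache
def earliest_start(name):
    # Forward pass: an activity can start once all of its precursors have finished.
    return max((earliest_start(p) + activities[p][0] for p in activities[name][1]),
               default=0)

# Minimum project duration = latest finish over all activities
project_end = max(earliest_start(a) + d for a, (d, _) in activities.items())

@cache
def latest_start(name):
    # Backward pass: latest start that still lets every successor meet the end date.
    successors = [s for s, (_, pre) in activities.items() if name in pre]
    if not successors:
        return project_end - activities[name][0]
    return min(latest_start(s) for s in successors) - activities[name][0]

for a in activities:
    slack = latest_start(a) - earliest_start(a)   # slack = latest start - earliest start
    print(f"{a:8s} slack={slack}" + ("  <- critical path" if slack == 0 else ""))
```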
3.1 TRACKING PROGRESS
• Includes information about early and
late start dates
3.1 TRACKING PROGRESS
• Activities are shown in parallel which
helps understand which activities can
be performed together
3.1 TRACKING PROGRESS
• Displays who is assigned to the
project and those needed for each
stage
3.1 TRACKING PROGRESS
• This is an example of how to monitor
expenditures
3.2 PROJECT PERSONNEL
• Key activities which need personnel
- requirements analysis
- system design
- program design
- program implementation
- testing
- training
- maintenance
- quality assurance
• It is advantageous to assign different people to different responsibilities
3.2 PROJECT PERSONNEL
• When assigning responsibilities to personnel, take into account their experience and expertise
- Ability to perform work
- Interest in work
- Experience with similar applications, tools, languages, techniques, and development environments
- Training
- Ability to communicate with others
- Ability to share responsibility
- Management skills
3.2 PROJECT PERSONNEL
• A project’s progress can be affected by the degree of communication and by individuals’ ability to communicate their ideas
• Breakdowns in communication and understanding can result in software failures
• For example, if there are n workers on the project, then there are n(n-1)/2 communication pairs; a five-person team already has 10 distinct pairs to keep in sync
3.2 PROJECT PERSONNEL
• Certain work styles include
- Extroverts: tell their thoughts
- Introverts: ask for suggestions
- Intuitives: base their decisions on feelings
- Rationals: base decisions on facts
3.2 PROJECT PERSONNEL
• Work styles determine communication styles
• Understanding work styles helps you stay flexible and present information according to each person’s priorities
• Work styles affect interactions among customers, developers, and users
3.2 PROJECT PERSONNEL
• Examples of project organization
- Chief programmer team: one person is responsible for a system’s
design and development
- Egoless approach: everyone is equally responsible, democratic
3.2 PROJECT PERSONNEL
• Team members must communicate
with the chief
3.2 PROJECT PERSONNEL
• Characteristics of the project suggest the appropriate organizational structure
- Highly structured organization: high certainty, repetition, large projects
- Loosely structured organization: uncertainty, new techniques or technology, small projects
3.3 EFFORT ESTIMATION
• Estimating costs is crucial for project
planning and management
• Estimating costs must be done as early as
possible
• Types of cost
- facilities: hardware, space, furniture, telephones
- software tools for designing the software
- staff
3.3 EFFORT ESTIMATION
• Types of estimation techniques include
• Expert judgement
- Top-down or bottom-up
- Delphi technique: based on the average of secret expert judgements
• Algorithmic methods
- Walston and Felix model: E = 5.25 S^0.91
- Bailey and Basili model: E = 5.5 + 0.73 S^1.16
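A minimal sketch of the two size-based equations on this slide, assuming (as these models are usually presented) that S is size in thousands of delivered source lines and E is effort in person-months; the slide itself states only the equations.

```python
# Size-based effort equations from the slide (units assumed: KLOC -> person-months)
def walston_felix(S: float) -> float:
    return 5.25 * S ** 0.91

def bailey_basili(S: float) -> float:
    return 5.5 + 0.73 * S ** 1.16

for S in (10, 50, 100):   # hypothetical project sizes in KLOC
    print(f"S={S:>3} KLOC  Walston-Felix={walston_felix(S):6.1f}  "
          f"Bailey-Basili={bailey_basili(S):6.1f}")
```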
3.3 EFFORT ESTIMATION
• An example of expert judgement is the Wolverton model
• Two factors influence the difficulty rating
- the problem is old (O) or new (N)
- it is easy (E), moderate (M), or hard (H)

| Type of software | OE | OM | OH | NE | NM | NH |
| Control | 21 | 27 | 30 | 33 | 40 | 49 |
| Input/output | 17 | 24 | 27 | 28 | 35 | 43 |
| Pre/post processor | 16 | 23 | 26 | 28 | 34 | 42 |
| Algorithm | 15 | 20 | 22 | 25 | 30 | 35 |
| Data management | 24 | 31 | 35 | 37 | 46 | 57 |
| Time-critical | 75 | 75 | 75 | 75 | 75 | 75 |
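A small sketch of using the matrix above, interpreting each entry as a cost rate per line of code (the usual reading of the Wolverton matrix; the slide does not state the units). The module in the example is hypothetical.

```python
# Wolverton cost matrix from the table above, indexed by (type, difficulty)
COST_MATRIX = {
    #                  OE  OM  OH  NE  NM  NH
    "control":        (21, 27, 30, 33, 40, 49),
    "input/output":   (17, 24, 27, 28, 35, 43),
    "pre/post":       (16, 23, 26, 28, 34, 42),
    "algorithm":      (15, 20, 22, 25, 30, 35),
    "data mgmt":      (24, 31, 35, 37, 46, 57),
    "time-critical":  (75, 75, 75, 75, 75, 75),
}
COLUMNS = {"OE": 0, "OM": 1, "OH": 2, "NE": 3, "NM": 4, "NH": 5}

def module_cost(kind: str, difficulty: str, lines: int) -> int:
    """Estimate = matrix rate for (type, difficulty) x number of lines."""
    return COST_MATRIX[kind][COLUMNS[difficulty]] * lines

# Hypothetical 500-line new, moderate-difficulty algorithmic module
print(module_cost("algorithm", "NM", 500))
```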
3.3 EFFORT ESTIMATION
• An example of an algorithmic method is the Walston and Felix model
• The productivity index used in the equation is built from 29 factors that affect productivity
- 1 for an increase in productivity
- 0 for a decrease in productivity
• The 29 factors are:
1. Customer interface complexity
2. User participation in requirements definition
3. Customer-originated program design changes
4. Customer experience with the application area
5. Overall personnel experience
6. Percentage of development programmers who participated in the design of functional specifications
7. Previous experience with the operational computer
8. Previous experience with the programming language
9. Previous experience with applications of similar size and complexity
10. Ratio of average staff size to project duration (people per month)
11. Hardware under concurrent development
12. Access to development computer open under special request
13. Access to development computer closed
14. Classified security environment for computer and at least 25% of programs and data
15. Use of structured programming
16. Use of design and code inspections
17. Use of top-down development
18. Use of a chief programmer team
19. Overall complexity of code
20. Complexity of application processing
21. Complexity of program flow
22. Overall constraints on program’s design
23. Design constraints on the program’s main storage
24. Design constraints on the program’s timing
25. Code for real-time or interactive operation, or for execution under severe time constraints
26. Percentage of code for delivery
27. Code classified as nonmathematical application and input/output formatting programs
28. Number of classes of items in the database per 1000 lines of code
29. Number of pages of delivered documentation per 1000 lines of code
3.3 EFFORT ESTIMATION
• The Bailey-Basili model is another example
• Minimizing the standard error of the estimate produces the baseline equation E = 5.5 + 0.73 S^1.16
• Adjust the initial estimate based on the difference ratio
- If R is the ratio between the actual effort, E, and the predicted effort, E’, then the effort adjustment is
  ERadj = R – 1 if R > 1
  ERadj = 1 – 1/R if R < 1
• Adjustment factors fall into three categories
- Total methodology (METH): tree charts, top-down design, design formalisms, code reading, chief programmer teams, formal documentation, formal training, formal test plans, unit development folders
- Cumulative complexity (CPLX): customer interface complexity, application complexity, program flow complexity, internal communication complexity, database complexity, external communication complexity, customer-initiated program design changes
- Cumulative experience (EXP): programmer qualifications, programmer machine experience, programmer language experience, programmer application experience, team experience
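A short sketch of the adjustment step described above, using the slide’s baseline equation and ERadj definition; the calibration project (20 KLOC, 40 person-months) is hypothetical.

```python
def base_effort(S: float) -> float:
    # Baseline Bailey-Basili equation from the slide (S in KLOC, E in person-months assumed)
    return 5.5 + 0.73 * S ** 1.16

def effort_adjustment(actual: float, predicted: float) -> float:
    """ERadj = R - 1 when R > 1, 1 - 1/R when R < 1, where R = actual / predicted."""
    R = actual / predicted
    if R > 1:
        return R - 1
    if R < 1:
        return 1 - 1 / R
    return 0.0

# Hypothetical past project: 20 KLOC took 40 person-months
pred = base_effort(20)
print(round(pred, 1), round(effort_adjustment(40, pred), 2))
```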
3.3 EFFORT ESTIMATION
• The COCOMO model was introduced by Boehm
• The most current version is COCOMO II
• Basic model form
- E = bS^c m(X)
- bS^c is the initial effort estimate based on size S
- m(X) is a vector of cost-driver information that adjusts the estimate
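A minimal sketch of the generic form E = b S^c m(X), treating m(X) as a product of effort multipliers, which is how cost drivers are normally applied in COCOMO-style models. The constants and driver values below are illustrative only, not calibrated COCOMO II numbers.

```python
from math import prod

def cocomo_effort(size_kloc: float, b: float, c: float, drivers: list[float]) -> float:
    # E = b * S**c adjusted by the product of the cost-driver multipliers m(X)
    return b * size_kloc ** c * prod(drivers)

# Hypothetical 50-KLOC project with three illustrative driver multipliers
print(round(cocomo_effort(50, b=2.94, c=1.10, drivers=[1.0, 1.15, 0.9]), 1))
```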
3.3 EFFORT ESTIMATION
• COCOMO II stages of development
• Application composition
- prototyping to resolve high-risk user interface issues
- size estimates in object points
• Early design
- explore alternative architectures
- size estimates in function points
• Post-architecture
- development begins
- size estimates in lines of code

| Model aspect | Stage 1: Application Composition | Stage 2: Early Design | Stage 3: Post-architecture |
| Size | Application points | Function points (FP) and language | FP and language or source lines of code (SLOC) |
| Reuse | Implicit in model | Equivalent SLOC as function of other variables | Equivalent SLOC as function of other variables |
| Requirements change | Implicit in model | % change expressed as a cost factor | % change expressed as a cost factor |
| Maintenance | Application point annual change traffic (ACT) | Function of ACT, software understanding, unfamiliarity | Function of ACT, software understanding, unfamiliarity |
| Scale (c) in nominal effort equation | 1.0 | 0.91 to 1.23, depending on precedentedness, conformity, early architecture, risk resolution, team cohesion, and SEI process maturity | 0.91 to 1.23, depending on precedentedness, conformity, early architecture, risk resolution, team cohesion, and SEI process maturity |
| Product cost drivers | None | Complexity, required reusability | Reliability, database size, documentation needs, required reuse, and product complexity |
| Platform cost drivers | None | Platform difficulty | Execution time constraints, main storage constraints, and virtual machine volatility |
| Personnel cost drivers | None | Personnel capability and experience | Analyst capability, applications experience, programmer capability, programmer experience, language and tool experience, and personnel continuity |
| Project cost drivers | None | Required development schedule, development environment | Use of software tools, required development schedule, and multisite development |
3.3 EFFORT ESTIMATION
• Machine learning techniques
• Case-based reasoning (CBR)
- The user identifies the new problem as a case
- The system retrieves similar cases
- The system uses knowledge from previous cases
- The system suggests a solution
• Neural network
- A cause-and-effect network trained with data from past projects
3.3 EFFORT ESTIMATION
• Which model fits your situation?
• Mean magnitude of relative error (MMRE)
- the mean of |actual – estimate| / actual
- should be 0.25 or less
• PRED(x/100): the percentage of projects whose estimate is within x% of the actual
- should be 0.75 or greater for PRED(0.25)
| Model | PRED(0.25) | MMRE |
| Walston-Felix | 0.30 | 0.48 |
| Basic COCOMO | 0.27 | 0.60 |
| Intermediate COCOMO | 0.63 | 0.22 |
| Intermediate COCOMO (variation) | 0.76 | 0.19 |
| Bailey-Basili | 0.78 | 0.18 |
| Pfleeger | 0.50 | 0.29 |
| SLIM | 0.06-0.24 | 0.78-1.04 |
| Jensen | 0.06-0.33 | 0.70-1.01 |
| COPMO | 0.38-0.63 | 0.23-5.7 |
| General COPMO | 0.78 | 0.25 |
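A small sketch of computing the two accuracy measures defined above for a set of completed projects; the actual and estimated effort values are hypothetical.

```python
def mmre(actuals, estimates):
    """Mean magnitude of relative error: mean of |actual - estimate| / actual."""
    return sum(abs(a - e) / a for a, e in zip(actuals, estimates)) / len(actuals)

def pred(actuals, estimates, x=0.25):
    """PRED(x): fraction of projects whose estimate is within x of the actual."""
    hits = sum(abs(a - e) / a <= x for a, e in zip(actuals, estimates))
    return hits / len(actuals)

# Hypothetical effort data in person-months
actual   = [100, 60, 45, 200]
estimate = [ 90, 75, 40, 260]
print(round(mmre(actual, estimate), 2),     # want <= 0.25
      round(pred(actual, estimate), 2))     # want >= 0.75
```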
3.3 EFFORT ESTIMATION
• Understanding which types of effort
are needed during development is
important
• Two reports of effort distribution
from different researchers
3.4 RISK MANAGEMENT
• A risk is an unwanted event that has negative consequences
• Two characteristics distinguish a risk from other project events
- Risk impact: the loss associated with the event
- Risk probability: the likelihood that the event will occur
• Quantifying risk
- Risk exposure = risk probability x risk impact
• Sources of risk
- Generic
- Project specific
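A tiny sketch of quantifying and ranking risks by exposure = probability × impact, as defined above; the risk names, probabilities, and impacts are hypothetical.

```python
# (risk, probability, impact in dollars) -- hypothetical project-specific risks
risks = [
    ("key developer leaves",      0.3, 40_000),
    ("requirements churn",        0.6, 25_000),
    ("vendor library unsuitable", 0.1, 80_000),
]

# Rank risks by exposure so the highest-exposure items get attention first
for name, p, impact in sorted(risks, key=lambda r: r[1] * r[2], reverse=True):
    print(f"{name:28s} exposure = {p * impact:>10,.0f}")
```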
3.4 RISK MANAGEMENT
• Risk exposure calculation
3.4 RISK MANAGEMENT
• Risk reduction
- Avoiding the risk: change the requirements for performance or functionality
- Transferring the risk: allocate the risk to another system
- Assuming the risk: accept and control the risk
• Cost of reducing risk
- Risk leverage = (risk exposure before reduction – risk exposure after reduction)/cost of
risk reduction
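A one-function sketch of the risk-leverage formula above, used to judge whether a reduction step is worth its cost; the figures in the example are hypothetical.

```python
def risk_leverage(exposure_before: float, exposure_after: float, cost: float) -> float:
    # (exposure before reduction - exposure after reduction) / cost of reduction
    return (exposure_before - exposure_after) / cost

# Hypothetical: an extra design review drops exposure from $24k to $6k and costs $5k
print(risk_leverage(24_000, 6_000, 5_000))   # leverage > 1 suggests the step pays off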
3.4 RISK MANAGEMENT
• Personnel shortfalls
• Unrealistic schedules or budgets
• Developing wrong functions
• Developing wrong user interfaces
• Gold plating
• Continuing stream of requirements changes
• Shortfalls in externally performed tasks
• Shortfalls in externally furnished components
• Real-time performance shortfalls
• Straining computer science capabilities
3.5 PROJECT PLAN
• Items to be included in a project plan
- Project scope
- Project schedule
- Project team organization
- Technical description of system
- Project standards and procedures
- Quality assurance plan
- Configuration management plan
- Documentation plan
- Data management plan
- Resource management plan
- Test plan
- Training plan
- Security plan
- Risk management plan
- Maintenance plan
3.5 PROJECT PLAN
• List of people in the development team
• List of hardware and software
• Standards and methods
- Algorithms
- Tools
- Review/inspection techniques
- Design language
- Coding languages
- Testing techniques
3.6 PROCESS MODELS AND MANAGEMENT
• Establish an appropriately large shared
vision
• Delegate and get commitments from
participants
• Inspect and provide supportive feedback
• Acknowledge every advance and learn
throughout the program
• Organize to allow technical focus and
project focus
3.6 PROCESS MODELS AND MANAGEMENT
• Accountability modeling consists of
- Matrix organization: engineers belong to a functional unit based primarily on their type of skill
- Integrated product development team: combines engineers from different functional units
• Activity is tracked using cost estimation, critical-path analysis, and schedule tracking
- Earned value is a common measure of progress
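A hedged sketch of earned value as a progress measure, assuming credit is taken only when an activity is fully complete (a 0/100 crediting rule; other rules are common). The activities and budgeted effort are hypothetical.

```python
# (activity, budgeted effort in person-days, complete?)
activities = [
    ("requirements", 20, True),
    ("design",       30, True),
    ("coding",       50, False),
    ("testing",      25, False),
]

total_budget = sum(budget for _, budget, _ in activities)
earned_value = sum(budget for _, budget, done in activities if done)
print(f"earned value: {earned_value}/{total_budget} person-days "
      f"({100 * earned_value / total_budget:.0f}% of planned effort)")
```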
3.6 PROCESS MODELS AND MANAGEMENT
• In this example teams have
overlapping activities
• This activity map shows progress of
each activity
3.6 PROCESS MODELS AND MANAGEMENT
• Progress on activities is shown using this earned value chart
3.6 PROCESS MODELS AND MANAGEMENT
• Life-cycle objectives
- Objectives
- Milestones and schedules
- Responsibilities
- Approach
- Resources
- Feasibility
• Life-cycle architecture
- define the system and software architectures and address architectural choices and risks
• Operational capability
- Readiness of the software
- Deployment site
- User training
3.6 PROCESS MODELS AND MANAGEMENT
• The Win-Win spiral model is used as a supplement to these milestones