Incremental Development Productivity Decline (IDPD)

Incremental Development Productivity Decline (IDPD) in Software Development

Center for Systems and Software Engineering, University of Southern California
Annual Research Review, March 2012

Ramin Moazeni, Ph.D. Student
Computer Science Department, Viterbi School of Engineering
Outline
• Incremental Development Productivity Decline (IDPD)
  – Overview
  – Relevant Examples
  – Cost Drivers
  – Effect on number of increments
Incremental Development
Productivity Decline (IDPD)
• Overview
  – The Incremental Development Productivity Decline (IDPD) factor represents the
    percentage decline in software productivity from one increment to the next.
  – The decline is due to factors such as previous-increment breakage, usage
    feedback, and increased integration and testing effort.
  – Another source of productivity decline is that maintenance of reused
    previous-build software is based not on the equivalent lines of code credited
    during the previous build, but on the full amount of reused software.
    • Build 1: 200 KSLOC new plus 200 KSLOC reused at 20% yields a
      240 KESLOC "count" for estimation models.
    • Build 2: there are 400 KSLOC of Build 1 code to maintain and integrate.
  – Such phenomena may cause the IDPD factor to be higher for some builds and
    lower for others.
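A minimal sketch of the reuse-credit arithmetic behind the Build 1 / Build 2 example above (the helper name and the way the 20% reuse factor is applied are illustrative, not taken from a specific estimation tool):

```python
def esloc(new_ksloc, reused_ksloc, reuse_factor):
    """Equivalent KSLOC credited to a build: new code plus a fraction of reused code."""
    return new_ksloc + reused_ksloc * reuse_factor

# Build 1: 200 KSLOC new + 200 KSLOC reused at 20% -> 240 KESLOC for estimation models
build1_esloc = esloc(200, 200, 0.20)   # 240.0

# Build 2 maintenance base: the full 400 KSLOC of Build 1 must be maintained and
# integrated, not just the 240 KESLOC credited to Build 1
build2_maintenance_base = 200 + 200    # 400 KSLOC
```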
Incremental Development
Productivity Decline (IDPD)
• Example: Site Defense BMD Software
  – 5 builds, 7 years, $100M
  – Build 1 productivity over 300 SLOC/person-month
  – Build 5 productivity under 150 SLOC/person-month
    • Including Builds 1-4 breakage, integration, and rework
    • 318% change in requirements across all builds
    • A factor-of-2 decrease across 4 builds corresponds to an average
      build-to-build IDPD factor of 20% productivity decrease per build
  – Similar IDPD factors have been found for large commercial software, such as
    the multi-year slippage in delivery of MS Word for Windows and Windows Vista.
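A small sketch of how an average build-to-build decline can be backed out of a productivity ratio, assuming the decline compounds across build transitions (the function name and the choice of how many transitions to count are illustrative):

```python
def avg_build_to_build_decline(p_first, p_last, transitions):
    """Average per-build productivity decline, assuming compounding across transitions."""
    return 1 - (p_last / p_first) ** (1.0 / transitions)

# Site Defense BMD example: roughly 300 -> 150 SLOC/person-month.
# Counted over 3 transitions this gives ~0.21; over 4 transitions, ~0.16;
# both are in the neighborhood of the ~20% IDPD factor cited above.
print(avg_build_to_build_decline(300, 150, 3))  # ~0.206
print(avg_build_to_build_decline(300, 150, 4))  # ~0.159
```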
Quality Management Platform (QMP)
Project
• QMP Project Information:
  – Web-based application
  – The system facilitates process improvement initiatives in many small and
    medium software organizations
  – 6 builds, 6 years, varying increment durations
  – Size after the 6th build: 548 KSLOC, mostly in Java
  – Average staff on project: ~20
Quality Management Platform (QMP)
Project
• Data Collection
  – Most data come from release documentation, build reports, and project member
    interviews/surveys
  – Data include product size, effort by engineering phase, effort by engineering
    activity, defects by phase, requirements changes, project schedule, and
    COCOMO II driver ratings (rated by project developers and organization experts)
  – Data collection challenges:
    • Incomplete and inconsistent data
    • Data formats differ depending on who filled out the report
    • No system architecture documents available
Quality Management Platform (QMP) Project
Data Analysis – Productivity Trends
Table 1. Productivity trend by build

Build   Size (KSLOC)   Effort (PH)   Productivity (SLOC/PH)   Productivity Variance
1       69.12          7195          9.61                     0.0%
2       64             9080          7.05                     -26.6%
3       19             6647          2.86                     -59.4%
4       39.2           8787.5        4.46                     56.1%
5       207            31684.5       6.53                     46.5%
6       65             16215.1       4.01                     -38.6%

[Chart: productivity (SLOC/PH) per build with a linear trend line]
• The slope of the trend line is -0.76 SLOC/PH per build
• Across the five build-to-build transitions, this corresponds to a 14% average
  decline in productivity per build, smaller than the 20% Incremental Development
  Productivity Decline (IDPD) factor observed for a large defense program
• Most likely because the project is considerably smaller in system size and
  complexity
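A minimal sketch of how the productivity figures and the trend line in Table 1 can be reproduced (the variable names are illustrative; numpy's least-squares fit is used for the trend line, and small differences from the quoted slope come from rounding):

```python
import numpy as np

# Size (KSLOC) and effort (person-hours) per build, from Table 1
size_ksloc = np.array([69.12, 64, 19, 39.2, 207, 65])
effort_ph = np.array([7195, 9080, 6647, 8787.5, 31684.5, 16215.1])

productivity = size_ksloc * 1000 / effort_ph   # SLOC per person-hour
builds = np.arange(1, 7)

# Least-squares linear trend of productivity vs. build number
slope, intercept = np.polyfit(builds, productivity, 1)
print(productivity.round(2))  # [9.61 7.05 2.86 4.46 6.53 4.01]
print(round(slope, 2))        # about -0.8 SLOC/PH per build, close to the -0.76 quoted above
```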
Causes of IDPD Factor
From explorations of the IDPD factor on several projects, the following sources of
variation were identified:
Higher IDPD (Less Productive):
  – Effort to maintain the previous increment: bug fixing, COTS upgrades, interface changes
  – Next increment requires previous-increment modification
  – Next increment spun out to more platforms
  – Next increment has more previous increments to integrate/interact with
  – Staff turnover reduces experience
  – Next increment software more complex
  – Previous increment incompletely developed, tested, integrated

Lower IDPD (More Productive):
  – Next increment touches less of the previous increment
  – Current staff more experienced and productive
  – Next increment software less complex
  – Previous increment did much of the next increment's requirements and architecture
Software Categories & IDPD Factor
• Different categories of software follow different productivity curves across
  multiple build cycles, and therefore require different IDPD factors for each
  class of software.
Software Category         Impact on IDPD Factor
Non-Deployable            Throw-away code. Low build-to-build integration. High reuse.
                          IDPD factor lower than for any type of deployable/operational software.
Infrastructure Software   Often the most difficult software. Developed early in the program.
                          IDPD factor likely to be the highest.
Application Software      Builds upon infrastructure software. Productivity can be increasing,
                          or at least "flat".
Platform Software         Developed by HW manufacturers. Single vendor, experienced staff, and a
                          single controlled environment; integration effort is primarily with HW.
                          IDPD will be lower due to these benefits.
Firmware (Single Build)   IDPD factor not applicable; single build increment.
IDPD Model & Data Definition
• Based on analysis of the IDPD effects, the next increment's ESLOC, IDPD factor,
  and software effort can be estimated:
  – KSLOC(I): size of the existing increments 1 through I
    • A fraction F of this code will be modified
  – The previous increments (1..I) would then be considered as reused software and
    their equivalent lines of code calculated using the reuse formula on the next slide
• EAF(I+1): the effort adjustment factor for the next increment's development.
  EAF(I+1) would be EAF(I) adjusted by the product of Delphi-TBD effort multipliers
  for the following IDPD drivers:
IDPD Driver   Description
RCPLX         Complexity of Increment I+1 relative to the complexity of existing software
IPCON         Staff turnover from the previous increment's staff
PRELY         Previous increments' required reliability
IRESL         Architecture and risk resolution (RESL) rating of I+1 relative to the RESL of 1..I
IDPD Model & Data Definition - 2
• The total EKSLOC(I+1) = EKSLOC(I) + EKSLOC(I+1 Added), and together with the
  EAF(I+1) factor it would be used in the standard COCOMO II equations to compute
  PM(I+1), where
    EKSLOC(I) = KSLOC(I) * (0.4*DM + 0.3*CM + 0.3*IM)
• The standalone PM for Increment I+1, SPM(I+1), would be calculated from
  EKSLOC(I+1 Added) and EAF(I), using the standard COCOMO II equations.
• The IDPD(I+1) factor would then be
    IDPD(I+1) = SPM(I+1) / PM(I+1)
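A minimal sketch of this calculation, assuming the nominal COCOMO II effort equation PM = A * Size^E * EAF with illustrative values A = 2.94 and E = 1.10 (the actual exponent depends on the scale factors), and with DM, CM, and IM expressed as fractions between 0 and 1:

```python
A, E = 2.94, 1.10  # illustrative nominal COCOMO II calibration values (assumptions)

def ekslok_reused(ksloc_prev, dm, cm, im):
    # Equivalent KSLOC of the reused previous increments (adaptation formula above)
    return ksloc_prev * (0.4 * dm + 0.3 * cm + 0.3 * im)

def cocomo_pm(ekslok, eaf):
    # Standard COCOMO II effort equation (person-months)
    return A * ekslok ** E * eaf

def idpd_factor(ksloc_prev, dm, cm, im, ekslok_added, eaf_prev, eaf_next):
    ekslok_total = ekslok_reused(ksloc_prev, dm, cm, im) + ekslok_added
    pm_next = cocomo_pm(ekslok_total, eaf_next)    # PM(I+1): total equivalent size, EAF(I+1)
    spm_next = cocomo_pm(ekslok_added, eaf_prev)   # SPM(I+1): added code only, EAF(I)
    return spm_next / pm_next                      # IDPD(I+1) = SPM(I+1) / PM(I+1)
```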
Data Collection
• Collect data from completed incremental projects in industry
• Inclusion Criteria:
– Starting and ending dates are clear
– Has at least two increments of significant capability that have been
integrated with other software (from other sources) and shown to work in
operationally-relevant situations
– Has well-substantiated sizing and effort data by increment
– Less than an order of magnitude difference in size or effort per increment
– Realistic reuse factors for COTS and modified code
– Uniformly code-counted source code
– Effort pertaining just to increment deliverables
Data Collection - 2
• Metrics

Metric         Description
Product        Name of the software product
Build          Build number; a software product may have multiple builds (increments)
Effort         Total time in person-hours
New SLOC       SLOC count of new modules
Reused SLOC    SLOC count of reused modules
COTS SLOC      SLOC count of COTS modules
CM             Percentage of code modified
DM             Percentage of design modified
IM             Percentage of implementation and test needed for the reused/COTS modules
SU             Software Understanding
UNFM           Programmer Unfamiliarity
Cost drivers   Rating levels for the 22 cost drivers
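A small sketch of how one per-build record of these metrics might be represented for analysis (field names follow the metrics above; the class itself is illustrative, not part of the study's tooling):

```python
from dataclasses import dataclass, field

@dataclass
class BuildRecord:
    product: str        # name of the software product
    build: int          # build (increment) number
    effort_ph: float    # total effort in person-hours
    new_sloc: int       # SLOC count of new modules
    reused_sloc: int    # SLOC count of reused modules
    cots_sloc: int      # SLOC count of COTS modules
    cm: float           # fraction of code modified
    dm: float           # fraction of design modified
    im: float           # fraction of implementation/test needed for reused/COTS modules
    su: float           # software understanding
    unfm: float         # programmer unfamiliarity
    cost_drivers: dict = field(default_factory=dict)  # rating levels for the 22 cost drivers
```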
Preliminary Results - 1
• Delphi Results
  – 2 rounds of Delphi at CSSE and ISCAS

IDPD Factor   Productivity Range   Variance
RCPLX         1.79                 0.098
IPCON         1.77                 0.066
PRELY         1.49                 0.029
IRESL         1.53                 0.029
Preliminary Results - 2
• IDPD Driver ratings

          VL      L       N   H       VH
PRELY     0.972   0.986   1   1.008   1.025
RCPLX     0.985   0.904   1   0.783   0.732
IPCON     0.841   0.900   1   0.951   1.052
IRESL     1.065   1.046   1   0.988   0.933
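A minimal sketch of how these driver ratings could adjust the next increment's effort adjustment factor, following the EAF(I+1) definition on the IDPD Model slide (the multiplier table is copied from the ratings above; the function itself is illustrative):

```python
# Effort multipliers for the IDPD drivers by rating level (from the table above)
IDPD_MULTIPLIERS = {
    "PRELY": {"VL": 0.972, "L": 0.986, "N": 1.0, "H": 1.008, "VH": 1.025},
    "RCPLX": {"VL": 0.985, "L": 0.904, "N": 1.0, "H": 0.783, "VH": 0.732},
    "IPCON": {"VL": 0.841, "L": 0.900, "N": 1.0, "H": 0.951, "VH": 1.052},
    "IRESL": {"VL": 1.065, "L": 1.046, "N": 1.0, "H": 0.988, "VH": 0.933},
}

def eaf_next(eaf_prev, ratings):
    """EAF(I+1): EAF(I) adjusted by the product of the IDPD driver multipliers."""
    eaf = eaf_prev
    for driver, level in ratings.items():
        eaf *= IDPD_MULTIPLIERS[driver][level]
    return eaf

# Example: all drivers nominal except IPCON rated High
print(eaf_next(1.0, {"RCPLX": "N", "IPCON": "H", "PRELY": "N", "IRESL": "N"}))  # 0.951
```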
Preliminary Results - 3
• Relative Impact of Cost Drivers on Effort: Productivity Ranges
  [Bar chart: productivity ranges for the CPLX, PCON, RESL, and RELY cost drivers]
Questions?