Future Cost Estimation Challenges

University of Southern California
Center for Systems and Software Engineering
Future Challenges for
Systems and Software Cost Estimation
and Measurement
Barry Boehm, USC-CSSE
CS 510, Fall 2015
Summary
• Current and future trends create challenges for
systems and software data collection and analysis
– Metrics and “productivity:” “equivalent” size;
requirements/design/product/value metrics; productivity
growth and decline phenomena
– Cost drivers: effects of complexity, volatility, architecture
– Alternative processes: rapid/agile; systems of systems;
evolutionary development
– Model integration: systems and software; cost, schedule, and
quality; costs and benefits
• Updated systems and software data definitions and
estimation methods needed for good management
– COCOMO III effort
Metrics and “Productivity”
• “Equivalent” size
• Requirements/design/product/value metrics
• Productivity growth phenomena
• Incremental development productivity decline
Size Issues and Definitions
• An accurate size estimate is the most important input to
parametric cost models.
• Desire consistent size definitions and measurements
across different models and programming languages
• The upcoming Software Metrics Guide addresses these:
– Common size measures defined and interpreted for all the models
– Guidelines for estimating software size
– Guidelines to convert size inputs between models so projects can be
represented in a consistent manner
• Using Source Lines of Code (SLOC) as common measure
– Logical source statements consisting of data declarations and executables
– Rules for considering statement type, how produced, origin, build, etc.
– Providing automated code counting tools adhering to the definition
– Providing conversion guidelines for physical statements
• Addressing other size units such as requirements, use
cases, etc.
Equivalent SLOC – A User Perspective *
• “Equivalent” – A way of accounting for the work done to generate
software relative to the code-counted size of the delivered software
• “Source” lines of code: The number of logical statements prepared by
the developer and used to generate the executing code
– Usual Third Generation Language (C, Java): count logical 3GL statements
– For Model-driven, Very High Level Language, or Macro-based development:
count statements that generate customary 3GL code
– For maintenance above the 3GL level: count the generator statements
– For maintenance at the 3GL level: count the generated 3GL statements
• Two primary effects: Volatility and Reuse
– Volatility: % of ESLOC reworked or deleted due to requirements volatility
– Reuse: either with modification (modified) or without modification (adopted)
*Stutzke, Richard D, Estimating Software-Intensive Systems,
Upper Saddle River, N.J.: Addison Wesley, 2005
“Number of Requirements”
– Early estimation availability at kite level
– Data collection and model calibration at clam level
Cockburn, Writing Effective Use Cases, 2001
IBM-UK Expansion Factor Experience
Business Objectives               5              Cloud
Business Events/Subsystems        35             Kite
Use Cases/Components              250            Sea level
Main Steps/Main Operations        2,000          Fish
Alt. Steps/Detailed Operations    15,000         Clam
SLOC*                             1,000K-1,500K  Lava

*(70 - 100 SLOC/Detailed Operation)
(Hopkins & Jenkins, Eating the IT Elephant, 2008)
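Read as a chain, each level expands into the next by roughly a factor of 7 (35/5, 250/35, 2,000/250, 15,000/2,000), and each detailed operation expands into 70-100 SLOC. A minimal sketch of that arithmetic, assuming the rough 7x per-level factor; the function name and structure are illustrative:

```python
# Sketch: project a SLOC range from a count at a given Cockburn level,
# assuming the rough IBM-UK expansion factors above. The ~7x per-level
# factor and 70-100 SLOC/detailed operation come from this slide;
# everything else is illustrative.
LEVELS = ["cloud", "kite", "sea level", "fish", "clam"]
EXPANSION_PER_LEVEL = 7           # approximate ratio between adjacent levels
SLOC_PER_DETAILED_OP = (70, 100)  # clam-level operations to SLOC

def sloc_range(count, level):
    """Estimate (low, high) SLOC from a count at the given level."""
    steps = LEVELS.index("clam") - LEVELS.index(level)
    detailed_ops = count * EXPANSION_PER_LEVEL ** steps
    return (detailed_ops * SLOC_PER_DETAILED_OP[0],
            detailed_ops * SLOC_PER_DETAILED_OP[1])

print(sloc_range(15_000, "clam"))  # (1050000, 1500000)
```

The result is consistent with the slide's 1,000K-1,500K SLOC figure.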
SLOC/Requirement Data (Selby, 2009)
Estimation Challenges: A Dual Cone of Uncertainty
– Need early systems engineering, evolutionary development
[Figure: dual cone of uncertainty. The relative cost range narrows from 4x and 0.25x at the concept-of-operation/feasibility stage, through 2x and 0.5x, 1.5x and 0.67x, and 1.25x and 0.8x at the plans-and-requirements, product-design, and detail-design milestones, to x at accepted software. An inner cone covers uncertainties in scope, COTS, reuse, and services; an outer cone covers uncertainties in competition, technology, organizations, and mission priorities. Phases and milestones: Feasibility (Concept of Operation); Plans and Rqts. (Rqts. Spec.); Product Design (Product Design Spec.); Detail Design (Detail Design Spec.); Devel. and Test (Accepted Software).]
Incremental Development Productivity Decline (IDPD)
• Example: Site Defense BMD Software
– 5 builds, 7 years, $100M; operational and support software
– Build 1 productivity over 200 LOC/person month
– Build 5 productivity under 100 LOC/PM
• Including Build 1-4 breakage, integration, rework
• 318% change in requirements across all builds
• IDPD factor = 20% productivity decrease per build
– Similar trends in later unprecedented systems
– Not unique to DoD: key source of Windows Vista delays
• Maintenance of full non-COTS SLOC, not ESLOC
– Build 1: 200 KSLOC new; 200K reused@20% = 240K ESLOC
– Build 2: 400 KSLOC of Build 1 software to maintain, integrate
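The Build 1 arithmetic above can be sketched directly; the function name is illustrative, and the 20% reuse factor is the slide's:

```python
# Equivalent SLOC (ESLOC) sketch: reused code counts at a reuse factor
# for effort estimation, but once delivered, the full SLOC must be
# maintained in later builds. Values are the slide's Build 1 example.
def esloc(new_ksloc, reused_ksloc, reuse_factor):
    """Equivalent size for effort estimation (KSLOC)."""
    return new_ksloc + reuse_factor * reused_ksloc

print(esloc(new_ksloc=200, reused_ksloc=200, reuse_factor=0.20))  # 240.0
# Build 2 nevertheless inherits the full 400 KSLOC (200 new + 200 reused)
# of Build 1 software to maintain and integrate.
print(200 + 200)  # 400
```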
IDPD Cost Drivers:
Conservative 4-Increment Example
• Some savings: more experienced personnel (5-20%)
– Depending on personnel turnover rates
• Some increases: code base growth, diseconomies of scale,
requirements volatility, user requests
– Breakage, maintenance of full code base (20-40%)
– Diseconomies of scale in development, integration (10-25%)
– Requirements volatility; user requests (10-25%)
• Best case: 20% more effort (IDPD = 6%)
• Worst case: 85% more effort (IDPD = 23%)
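One consistent reading of the best/worst-case figures is that the stated extra effort compares the final build with the first: compounding the per-build IDPD over the three later increments of the 4-increment example reproduces both numbers. A sketch under that assumption:

```python
# Sketch relating the per-build IDPD factor to the extra effort on the
# final build of the conservative 4-increment example. The reading that
# "20% more effort" compares build 4 with build 1 is an assumption.
def final_build_effort_growth(idpd, builds=4):
    """Relative extra effort on the last build vs. the first."""
    return (1 + idpd) ** (builds - 1) - 1

print(round(final_build_effort_growth(0.06), 2))  # 0.19 -> ~20% (best case)
print(round(final_build_effort_growth(0.23), 2))  # 0.86 -> ~85% (worst case)
```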
Effects of IDPD on Number of Increments
• Model relating productivity decline to number of builds needed to
reach 8M SLOC Full Operational Capability
• Assumes Build 1 production of 2M SLOC @ 100 SLOC/PM
– 20,000 PM / 24 mo. = 833 developers
– Constant staff size for all builds
• Analysis varies the productivity decline per build
– Extremely important to determine the incremental development
productivity decline (IDPD) factor per build
[Chart: cumulative KSLOC (0-20,000) vs. build (1-8) for 0%, 10%, 15%, and 20% productivity decline per build]
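The build model above can be sketched as follows, assuming each build keeps the same staff and schedule, so its output shrinks by the decline factor relative to the previous build; the resulting build counts are consistent with the chart's 1-8 build range:

```python
# Sketch: builds needed to reach 8M SLOC when Build 1 produces 2M SLOC
# and each later build's output shrinks by `decline` (constant staff and
# schedule per build, as the slide assumes).
def builds_to_foc(target_m=8.0, first_build_m=2.0, decline=0.0, max_builds=50):
    """Number of builds to accumulate target_m million SLOC."""
    total, output, builds = 0.0, first_build_m, 0
    while total < target_m and builds < max_builds:
        total += output
        output *= (1 - decline)
        builds += 1
    return builds

for d in (0.0, 0.10, 0.15, 0.20):
    print(d, builds_to_foc(decline=d))  # 4, 5, 6, 8 builds
```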
Incremental Development Data Challenges
• Breakage effects on previous increments
– Modified, added, deleted SLOC: need Code Count with diff tool
• Accounting for breakage effort
– Charged to current increment or I&T budget (IDPD)
• IDPD effects may differ by type of software
– “Breakage ESLOC” added to next increment
– Hard to track phase and activity distributions
• Hard to spread initial requirements and architecture effort
• Size and effort reporting
– Often reported cumulatively
– Subtracting previous increment size may miss deleted code
• Time-certain development
– Which features completed? (Fully? Partly? Deferred?)
“Equivalent SLOC” Paradoxes
• Not a measure of software size
• Not a measure of software effort
• Not a measure of delivered software capability
• A quantity derived from software component sizes
and reuse factors that helps estimate effort
• Once a product or increment is developed, its
ESLOC loses its identity
– Its size expands into full SLOC
– Can apply reuse factors to this to determine an ESLOC
quantity for the next increment
• But this has no relation to the product’s size
COCOMO II Database Productivity Increases
• 1970-1999 productivity trends largely explained by cost drivers
and scale factors
• Post-2000 productivity trends not explained by cost drivers and
scale factors
• Two productivity-increasing trends exist: 1970-1994 and 1995-2009
[Chart: SLOC per PM by five-year period, 1970-1974 through 2005-2009]

Constant A Decreases Over Post-2000 Period
• Calibrate the constant A while holding B = 0.91:

  PM = A * Size^(B + 0.01*ΣSF) * ΠEM

• Constant A is the inverse of adjusted productivity: it adjusts the
productivity with the SFs and EMs
• Constant A decreases over the periods: about a 50% decrease over
the post-2000 period
• Productivity is not fully characterized by SFs and EMs. What
factors can explain the phenomenon?
[Chart: calibrated constant A by five-year period, 1970-1974 through 2005-2009; y-axis 0.0-3.5]
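The calibration described above can be sketched with the effort equation; the driver values in the example are illustrative, not calibrated:

```python
# Sketch of the COCOMO II effort equation above, and of backing out the
# constant A from an observed project (A is the inverse of adjusted
# productivity). Driver values below are illustrative only.
def cocomo_effort(a, size_ksloc, scale_factors, effort_multipliers, b=0.91):
    """PM = A * Size^(B + 0.01*sum(SF)) * prod(EM)."""
    exponent = b + 0.01 * sum(scale_factors)
    prod_em = 1.0
    for em in effort_multipliers:
        prod_em *= em
    return a * size_ksloc ** exponent * prod_em

def calibrate_a(observed_pm, size_ksloc, scale_factors, effort_multipliers, b=0.91):
    """Back out A given observed effort and rated drivers."""
    return observed_pm / cocomo_effort(1.0, size_ksloc, scale_factors,
                                       effort_multipliers, b)

# Illustrative project: 100 KSLOC, sum(SF) = 20, prod(EM) = 1.0, 600 PM observed
print(calibrate_a(600, 100, [20], [1.0]))
```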
University of Southern California
Center for Systems and Software Engineering
Candidate Explanation Hypotheses
• Productivity has doubled over the last 40 years
– But scale factors and effort multipliers did not fully
characterize this increase
• Hypotheses/questions for explanation
– Is standard for rating personnel factors being raised?
• E.g., relative to “national average”
– Was generated code counted as new code?
• E.g., model-driven development
– Was reused code counted as new code?
– Are the ranges of some cost drivers not large enough?
• Improvement in tools (TOOL) contributes only about a 20%
reduction in effort
– Are more lightweight projects being reported?
• Documentation relative to life-cycle needs
Cost Driver Rating Scales and Effects
• Application Complexity
– Difficulty and Constraints scales
• Architecture, Criticality, and Volatility
Effects
– Architecture effects as function of product size
– Added effects of criticality and volatility
Candidate AFCAA Difficulty Scale
Difficulty would be described in terms of required software reliability,
database size, product complexity, integration complexity, information
assurance, real-time requirements, different levels of developmental risks, etc.
[Rating-scale graphic: example applications arranged from Very Easy (CASE tools/compilers, website automation, business systems) to Very Challenging (guidance/navigation, attitude control, radar/sonar/telemetry processing, process control, seismic processing, factory automation, digital switches & PBX, firmware (ROM)). Very Challenging traits: autonomous operation; extremely high reliability; complex algorithms; micro-second control loop; micro-code programming. Very Easy traits: simple handling of events/inputs; relies on O/S or middleware for control; can be restarted with minor inconvenience.]

Candidate AFCAA Constraints Scale
Dimensions of constraints include electrical power, computing capacity,
storage capacity, repair capability, platform volatility, physical environment
accessibility, etc.
[Rating-scale graphic: hardware platforms arranged from Very Unconstrained (land site) through mobile ground, shipboard, and un-manned airborne/airborne, to Very Constrained (manned space vehicle, un-manned space bus, missile). Very Unconstrained traits: virtually unlimited resources; accessible physical environment; stationary conditions; stable platform. Very Constrained traits: limited physical volume, power, and weight; unmanned; inaccessible physical environment; evolving platform; hard real-time.]
Added Cost of Weak Architecting
Calibration of COCOMO II Architecture and Risk Resolution
factor to 161 project data points: Technical Debt
Effect of Size on Software Effort Sweet Spots
Effect of Volatility and Criticality on Sweet Spots
Estimation for Alternative Processes
• Agile Methods
– Planning Poker/Wideband Delphi
– Yesterday’s Weather Adjustment: Agile COCOMO II
• Evolutionary Development
– Schedule/Cost/Quality as Independent Variable
– Incremental Development Productivity Decline
• Systems of Systems
– Hybrid Methods
Planning Poker/Wideband Delphi
• Stakeholders formulate story to be developed
• Developers choose and show cards indicating their
estimated ideal person-weeks to develop story
– Card values: 1, 2, 3, 5, 8, 13, 20, 30, 50, 100
• If card values are about the same, use the median
as the estimated effort
• If card values vary significantly, discuss why some
estimates are high and some low
• Re-vote after discussion
– Generally, values will converge, and the median can be
used as the estimated effort
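The procedure can be sketched as follows; the convergence threshold (votes within two adjacent card positions) is an assumption, since the slide only says the values are "about the same":

```python
# Sketch of the Planning Poker convergence rule described above: if the
# estimates are close, use the median; otherwise discuss and re-vote.
# The max_spread threshold is an assumption; the cards are the slide's.
import statistics

CARDS = [1, 2, 3, 5, 8, 13, 20, 30, 50, 100]

def poker_round(votes, max_spread=2):
    """Median (ideal person-weeks) if votes span at most max_spread
    adjacent card positions; None means discuss outliers and re-vote."""
    positions = [CARDS.index(v) for v in votes]
    if max(positions) - min(positions) <= max_spread:
        return statistics.median(votes)
    return None

print(poker_round([5, 8, 8, 13]))  # 8.0 -> converged, use the median
print(poker_round([2, 8, 50]))     # None -> discuss why high/low, re-vote
```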
Agile COCOMO II
Adjusting agile “yesterday’s weather” estimates
Agile COCOMO II is a web-based software cost estimation tool that
enables you to adjust your estimates by analogy through identifying
the factors that will be changing and by how much.
[Tool form, Step 1: choose Estimate Cost or Estimate Effort; select the analogy parameter (Dollars; Person-Months; Dollars/Function Point; Dollars/Lines of Code; Function Points/Person-Month; Lines of Code/Person-Month; Ideal-Person-Weeks/Iteration); enter project name, baseline value, current project size (function points or SLOC), current labor rate, and current iteration number]
Incremental Development Forms
Type | Examples | Pros | Cons | Cost Estimation
Evolutionary Sequential | Small: Agile; Large: Evolutionary Development | Adaptability to change; rapid fielding | Easiest-first; late, costly breakage | Small: Planning-poker-type; Large: Parametric with IDPD
Prespecified Sequential | Platform base plus PPPIs | Prespecifiable full-capability requirements | Emergent requirements or rapid change | COINCOMO with no increment overlap
Overlapped Evolutionary | Product lines with ultrafast change | Modular product line | Cross-increment breakage | Parametric with IDPD and Requirements Volatility
Rebaselining Evolutionary | Mainstream product lines; Systems of systems | High assurance with rapid change | Highly coupled systems with very rapid change | COINCOMO, IDPD for development; COSYSMO for rebaselining

Time phasing terms: Scoping; Architecting; Developing; Producing; Operating (SADPO)
Prespecified Sequential: SA; DPO1; DPO2; DPO3; …
Evolutionary Sequential: SADPO1; SADPO2; SADPO3; …
Evolutionary Overlapped: SADPO1; SADPO2; SADPO3; … (increments overlap)
Evolutionary Concurrent: SA; D1; PO1… SA2; D2; PO2… SA3; D3; PO3…
Evolutionary Development Implications
• Total Package Procurement doesn’t work
– Can’t determine requirements and cost up front
• Need significant, sustained systems engineering effort
– Need best-effort up-front architecting for evolution
– Can’t dismiss systems engineers after Preliminary Design Review
• Feature set size becomes dependent variable
– Add or drop borderline-priority features to meet schedule or cost
– Implies prioritizing, architecting steps in SAIV process model
– Safer than trying to maintain a risk reserve
Future DoD Challenges: Systems of Systems
[Diagram: SoS-level and constituent-system life cycles. The SoS level runs Exploration and Valuation to an FCR1 milestone, then Architecting to DCR1, with LCO-type proposal and feasibility info from candidate suppliers / strategic partners 1..n supporting source selection; development proceeds through OCR1, OCR2, … with rebaselining/adjustment at FCR1, then Operation. Each constituent system A, B, C, …, x runs its own Exploration, Valuation, Architecting (FCRA/B/C, DCRA/B/C), Develop, and Operation cycle (OCRA1, OCRB1-2, OCRC1-2, OCRx1-5).]

Conceptual SoS SE Effort Profile
[Chart: effort profiles across Inception, Elaboration, Construction, and Transition for three activity bands: Planning, Requirements Management, and Architecting; Source Selection and Supplier Oversight; SoS Integration and Testing]
• SoS SE activities focus on three somewhat independent activities,
performed by relatively independent teams
• A given SoS SE team may be responsible for one, two, or all activity areas
• Some SoS programs may have more than one organization performing
SoS SE activities
SoS SE Cost Model
[Diagram: size drivers and cost factors feed three activity estimates: Planning, Requirements Management, and Architecting; Source Selection and Supplier Oversight; and SoS Integration and Testing, which together with calibration yield the SoS SE effort]
• SoSs supported by the cost model
– Strategically-oriented stakeholders interested in tradeoffs and costs
– Long-range architectural vision for the SoS
– Developed and integrated by an SoS SE team
– System component independence
• Size drivers and cost factors
– Based on product characteristics, processes that impact SoS SE
team effort, and SoS SE personnel experience and capabilities

Comparison of SE and SoSE Cost Model Parameters

Size drivers
– COSYSMO: # of system requirements; # of system interfaces; # operational scenarios; # algorithms
– COSOSIMO: # of SoS requirements; # of SoS interface protocols; # of constituent systems; # of constituent system organizations; # operational scenarios

“Product” characteristics
– COSYSMO: Size/complexity/volatility; Requirements understanding; Architecture understanding; Level of service requirements; # of recursive levels in design; Migration complexity; Technology risk; #/diversity of platforms/installations; Level of documentation
– COSOSIMO: Size/complexity/volatility; Requirements understanding; Architecture understanding; Level of service requirements; Component system maturity and stability; Component system readiness

Process characteristics
– COSYSMO: Process capability; Multi-site coordination; Tool support
– COSOSIMO: Maturity of processes; Tool support; Cost/schedule compatibility; SoS risk resolution

People characteristics
– COSYSMO: Stakeholder team cohesion; Personnel/team capability; Personnel experience/continuity
– COSOSIMO: Stakeholder team cohesion; SoS team capability
COCOMO III Effort
COCOMO III Project Purpose
• Broaden audiences of COCOMO® and address the scope of modern
projects: mobile devices, web/internet, big data, cloud-targeted,
and multi-tenant software
• Improve the accuracy and realism of estimates
– Improve driver definitions
– New and updated software cost drivers, adjusting their ratings as needed
– Quality estimation capability
– Point and range estimates based on risk
• Improve the value of COCOMO® in decision-making

COCOMO III Project Scope
• COCOMO® III will produce estimates for:
– Effort, Schedule, Cost, Defects
• COCOMO® III can be applied at various moments in a project’s lifecycle:
– Early Estimation, Post-Architecture Estimation, Project Re-estimation
• COCOMO® III’s functional vision
– Single and multiple component estimates
– Analysis of alternatives
– Analysis with Size-Effort-Schedule as independent variables
– Support for different lifecycle processes
– Lifecycle cost estimation
– Legacy system transformation
– Alternative size measures
– Include technical debt and its effects on effort and schedule

Historical Overview of COCOMO Suite of Models
[Model genealogy chart covering the COCOMO suite: COCOMO 81; COQUALMO (1998); COPSEMO (1998); COPROMO (1998); CORADMO (1999); COCOMO II (2000); COCOTS (2000); COSYSMO (2002); iDAVE (2003); COPLIMO (2003); DBA COCOMO (2004); COINCOMO (2004); COSoSIMO (2004); Costing Secure Systems (2004); Security Extension (2004); grouped into software cost models, software extensions, and other independent estimation models. Dates indicate the time that the first paper was published for the model. Legend: models calibrated with historical project data; models derived from calibrated models; models calibrated with expert (Delphi) data.]

COQUALMO
[Diagram: COCOMO II takes the software size estimate plus software platform, project, product, and personnel attributes and produces the software development effort, cost, and schedule estimate, which feed COQUALMO. COQUALMO’s Defect Introduction model covers requirements, design, and code defect pipes; its Defect Removal model applies defect removal profile levels (automation, reviews, testing) through defect removal pipes to yield the number of residual software defects and the defect density per unit of size.]

Non-SLOC Size Measures
• Software Requirements
• Function Points
• SNAP Points
• Fast Function Points
• COSMIC Points
• Object / Application Points
– Reports, Screens, 3GL Modules
• Feature Points
• Use Case Points
• Story Points (Agile Development)
• Design Modules
• RICE Objects
– Reports, Interfaces, Conversions, Enhancements
References
Boehm, B., “Some Future Trends and Implications for Systems and Software Engineering Processes”,
Systems Engineering 9(1), pp. 1-19, 2006.
Boehm, B., and Lane, J., “Using the ICM to Integrate System Acquisition, Systems Engineering, and
Software Engineering,” CrossTalk, October 2007, pp. 4-9.
Boehm, B., Brown, A.W., Clark, B., Madachy, R., Reifer, D., et al., Software Cost Estimation with COCOMO II,
Prentice Hall, 2000.
Dahmann, J. (2007); “Systems of Systems Challenges for Systems Engineering”, Systems and Software
Technology Conference, June 2007.
Department of Defense (DoD), Instruction 5000.02, Operation of the Defense Acquisition System,
December 2008.
Galorath, D., and Evans, M., Software Sizing, Estimation, and Risk Management, Auerbach, 2006.
Lane, J. and Boehm, B., “Modern Tools to Support DoD Software-Intensive System of Systems Cost
Estimation,” DACS State of the Art Report, also Tech Report USC-CSSE-2007-716.
Lane, J., Valerdi, R., “Synthesizing System-of-Systems Concepts for Use in Cost Modeling,” Systems
Engineering, Vol. 10, No. 4, December 2007.
Madachy, R., “Cost Model Comparison,” Proceedings 21st, COCOMO/SCM Forum, November, 2006,
http://csse.usc.edu/events/2006/CIIForum/pages/program.html
Northrop, L., et al., Ultra-Large-Scale Systems: The Software Challenge of the Future, Software
Engineering Institute, 2006.
Reifer, D., “Let the Numbers Do the Talking,” CrossTalk, March 2002, pp. 4-8.
Stutzke, R., Estimating Software-Intensive Systems, Addison Wesley, 2005.
Valerdi, R, Systems Engineering Cost Estimation with COSYSMO, Wiley, 2010 (to appear)
USC-CSSE Tech Reports, http://csse.usc.edu/csse/TECHRPTS/by_author.html
Backup Charts
COSYSMO Operational Concept
[Diagram: size drivers (# Requirements, # Interfaces, # Scenarios, # Algorithms, plus a volatility factor) and effort multipliers (application factors: 8 factors; team factors: 6 factors; schedule driver) feed COSYSMO, which produces the effort estimate; calibration and a WBS guided by ISO/IEC 15288 support the model]
4. Rate Cost Drivers Application
COSYSMO Change Impact Analysis – I
– Added SysE Effort for Going to 3 Versions
• Size: Number, complexity, volatility, reuse of system
requirements, interfaces, algorithms, scenarios (elements)
– 1 → 3 Versions:
add 3-6% per increment for number of elements
add 2-4% per increment for volatility
– Exercise Prep.:
add 3-6% per increment for number of elements
add 3-6% per increment for volatility
• Most significant cost drivers (effort multipliers)
– Migration complexity: 1.10 – 1.20 (versions)
– Multisite coordination: 1.10 – 1.20 (versions, exercise prep.)
– Tool support: 0.75 – 0.87 (due to exercise prep.)
– Architecture complexity: 1.05 – 1.10 (multiple baselines)
– Requirements understanding: 1.05 – 1.10 for increments 1, 2;
1.0 for increment 3; 0.9 – 0.95 for increment 4
COSYSMO Change Impact Analysis – II
– Added SysE Effort for Going to 3 Versions
Cost Element      Incr. 1      Incr. 2      Incr. 3      Incr. 4
Size              1.11 – 1.22  1.22 – 1.44  1.33 – 1.66  1.44 – 1.88
Effort Product    1.00 – 1.52  1.00 – 1.52  0.96 – 1.38  0.86 – 1.31
Effort Range      1.11 – 1.85  1.22 – 2.19  1.27 – 2.29  1.23 – 2.46
Arithmetic Mean   1.48         1.70         1.78         1.84
Geometric Mean    1.43         1.63         1.71         1.74
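The Effort Range row is the Size range multiplied endpoint-by-endpoint by the Effort Product range, which a few lines of arithmetic confirm:

```python
# Sketch reproducing the Effort Range row above: each increment's effort
# range is the size range times the effort-product range, endpoint by
# endpoint. Input values are taken from the table.
size = [(1.11, 1.22), (1.22, 1.44), (1.33, 1.66), (1.44, 1.88)]
effort_product = [(1.00, 1.52), (1.00, 1.52), (0.96, 1.38), (0.86, 1.31)]

effort_range = [(round(s_lo * p_lo, 2), round(s_hi * p_hi, 2))
                for (s_lo, s_hi), (p_lo, p_hi) in zip(size, effort_product)]
print(effort_range)
# [(1.11, 1.85), (1.22, 2.19), (1.28, 2.29), (1.24, 2.46)]
```

The computed 1.28 and 1.24 differ slightly from the table's 1.27 and 1.23, which appear to be truncated rather than rounded.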
COSYSMO Requirements Counting Challenge
• Estimates made in early stages
– Relatively few high-level design-to requirements
• Calibration performed on completed projects
– Relatively many low-level test-to requirements
• Need to know expansion factors between levels
– Best model: Cockburn definition levels
• Cloud, kite, sea level, fish, clam
• Expansion factors vary by application area, size
– One large company: Magic Number 7
– Small e-services projects: more like 3:1, fewer lower levels
• Survey form available to capture your experience
Achieving Agility and High Assurance -I
Using timeboxed or time-certain development
Precise costing unnecessary; feature set as dependent variable
[Diagram: foreseeable change (plan) feeds the Increment N baseline into a short, stabilized, high-assurance development of Increment N, followed by Increment N transition/O&M; rapid change is handled in short development increments outside the stable development increments]
Evolutionary Concurrent: Incremental Commitment Model
[Diagram: unforeseeable change (adapt) drives agile rebaselining for future increments, producing future increment baselines; foreseeable change (plan) feeds the Increment N baseline into short, stabilized development of Increment N with high assurance, followed by Increment N transition/O&M; concerns, artifacts, and deferrals flow between the agile and stabilized teams; current and future V&V resources support continuous verification and validation (V&V) of Increment N]

Effect of Unvalidated Requirements
– 15-Month Architecture Rework Delay
[Chart: required cost vs. response time (1-5 sec). Arch. A (custom, many cache processors) at about $100M; Arch. B (modified client-server) at about $50M; a line marks the available budget; markers show the original spec and the spec after prototyping]
Reasoning about the Value of
Dependability – iDAVE
• iDAVE: Information Dependability Attribute Value
Estimator
• Use iDAVE model to estimate and track software
dependability ROI
– Help determine how much dependability is enough
– Help analyze and select the most cost-effective
combination of software dependability techniques
– Use estimates as a basis for tracking performance
– Integrates cost estimation (COCOMO II), quality estimation
(COQUALMO), value estimation relationships
iDAVE Model Framework
[Diagram: time-phased information processing capabilities, time-phased dependability investments, and project attributes feed three relationship sets. Cost estimating relationships (CERs): Cost = f(IP capabilities (size), dependability attribute levels Di, project attributes). Dependability attribute estimating relationships (DERs): Di = gi(dependability investments, project attributes). Value estimating relationships (VERs): Vj = hj(IP capabilities, dependability levels Di). Outputs: time-phased cost, value components Vj, and return on investment.]

Examples of Utility Functions: Response Time
[Figure: four value-vs-time utility curves: Real-Time Control / Event Support (with a critical region); Mission Planning / Competitive Time-to-Market; Event Prediction (weather, software size); Data Archiving / Priced Quality of Service]

Tradeoffs Among Cost, Schedule, and Reliability: COCOMO II
[Chart: cost ($M, 0-9) vs. development time (0-50 months), with points labeled by (RELY rating, MTBF in hours): (VL, 1), (L, 10), (N, 300), (H, 10K), (VH, 300K)]
• For a 100-KSLOC set of features, Cost/Schedule/RELY are “pick any two” points
• Can “pick all three” with a 77-KSLOC set of features
University of Southern California
Center for Systems and Software Engineering
The SAIV* Process Model
1. Shared vision and expectations management
2. Feature prioritization
3. Schedule range estimation and core-capability
determination
- Top-priority features achievable within fixed schedule with 90% confidence
4. Architecting for ease of adding or dropping borderline-priority features
- And for accommodating post-IOC directions of growth
5. Incremental development
- Core capability as increment 1
6. Change and progress monitoring and control
- Add or drop borderline-priority features to meet schedule
*Schedule As Independent Variable; Feature set as dependent variable
– Also works for cost, schedule/cost/quality as independent variable
How Much Testing is Enough?
– Early Startup: Risk due to low dependability
– Commercial: Risk due to low dependability
– High Finance: Risk due to low dependability
– Market Risk: Risk due to market share erosion
[Chart: combined risk exposure RE = P(L) * S(L) vs. RELY rating (VL through VH), with curves for Early Startup, Commercial, High Finance, and Market Share Erosion; the sweet spot is the RELY level that minimizes combined risk exposure]

RELY rating:                    VL     L      N      H      VH
Added % test time (COCOMO II):  0      12     22     34     54
P(L) (COQUALMO):                1.0    .475   .24    .125   .06
S(L), Early Startup:            .33    .19    .11    .06    .03
S(L), Commercial:               1.0    .56    .32    .18    .10
S(L), High Finance:             3.0    1.68   .96    .54    .30
RE, Market Risk:                .008   .027   .09    .30    1.0
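One reading of the combined risk exposure curve is dependability risk P(L) * S(L) plus the market-share-erosion risk; a sketch under that assumption, using the table values:

```python
# Sketch of the sweet-spot calculation, assuming combined risk exposure
# is dependability risk P(L)*S(L) plus market-share-erosion risk (that
# combination is my reading of the chart); values are the table's.
RELY = ["VL", "L", "N", "H", "VH"]
P = [1.0, .475, .24, .125, .06]            # P(L), from COQUALMO
S = {"Early Startup": [.33, .19, .11, .06, .03],
     "Commercial":    [1.0, .56, .32, .18, .10],
     "High Finance":  [3.0, 1.68, .96, .54, .30]}
RE_market = [.008, .027, .09, .30, 1.0]    # market share erosion risk

results = {}
for sector, s in S.items():
    combined = [p * si + m for p, si, m in zip(P, s, RE_market)]
    results[sector] = combined
    print(sector, [round(x, 3) for x in combined],
          "min at RELY =", RELY[combined.index(min(combined))])
```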