Ricardo Valerdi
rvalerdi@mit.edu
IEEE Connecticut Section
May 24, 2007
Roadmap

1. Description of the model
2. Explanation of its size and cost drivers
3. Limitations
4. Recent developments
COSYSMO Development Support

• USC CSSE Corporate Affiliate Program
  – Corporate Advisory Board
  – Measurement Working Group
• LAI Consortium Members
  – Boeing, LMCO, NGC, Raytheon, General Dynamics, SAIC, BAE Systems
• Practical Software & Systems Measurement
• INCOSE
All models are wrong… but some of them are useful.
Estimation Accuracy

[Figure: the "cone of uncertainty." The relative size range of estimates narrows from 4x/0.25x during Feasibility, through 2x/0.5x and tighter across the Plans/Rqts., Design, and Develop and Test phases, converging to x at Initial Operating Capability; milestones shown are Operational Concept, Life Cycle Objectives, Life Cycle Architecture, and Initial Operating Capability.]
Key Definitions & Concepts

Calibration: the tuning of parameters based on project data
CER: a model that represents the cost estimating relationships of factors
Cost Estimation: prediction of both the person-effort and elapsed time of a project
Driver: a factor that is highly correlated with the amount of systems engineering effort
Parametric: an equation or model that is approximated by a set of parameters
Rating Scale: a range of values and definitions for a particular driver
Understanding: an individual's subjective judgment of their level of comprehension
COCOMO II

• COCOMO, the "COnstructive COst MOdel", is the most widely used, thoroughly documented, and calibrated software cost model
• Originally developed by Dr. Barry Boehm and published in the 1981 book Software Engineering Economics
• COCOMO II is the update to COCOMO 1981
  – described in Software Cost Estimation with COCOMO II (Prentice Hall, 2000)
  – ongoing research, with annual calibrations made available
Historical Overview of COCOMO Suite of Models

[Figure: family tree of the COCOMO suite. Dates indicate the time that the first paper was published for the model.]

• Software cost models: COCOMO 81 (1981), COCOMO II (2000)
• Software extensions: COQUALMO (1998), COPSEMO (1998), COPROMO (1998), CORADMO (1999), COCOTS (2000), COPLIMO (2003), iDAVE (2003), DBA COCOMO (2004), COINCOMO (2004), Security Extension (2004), Costing Secure Systems (2004)
• Other independent estimation models: COSYSMO (2002), COSoSIMO (2004)

Legend distinguishes models calibrated with historical project data and expert (Delphi) data, models calibrated with expert (Delphi) data only, and models derived from COCOMO II.
State of the Practice

• Capability to measure systems engineering is unavailable in current cost models
• Possible approaches:
  – Heuristics/rules of thumb (Honour)
  – Analogy
  – % of SW or HW effort (COCOMO II, PRICE-H)
  – % of total effort (Honour)
  – A function of complexity (Ernstoff)
• Systems engineering is evolving:
  – INCOSE (est. 1992)
  – Standards (EIA/ANSI 632, EIA/ANSI 731, ISO/IEC 15288)
  – Academic degrees
• We can start where COCOMO left off…
COSYSMO Scope

• Addresses the first four phases of the systems engineering life cycle (per ISO/IEC 15288): Conceptualize, Develop, Operational Test & Evaluation, and Transition to Operation; the later phases (Operate, Maintain, or Enhance; Replace or Dismantle) are out of scope
• Considers standard Systems Engineering Work Breakdown Structure tasks (per EIA/ANSI 632)
COSYSMO Operational Concept

Inputs:
• Size drivers: # Requirements, # Interfaces, # Scenarios, # Algorithms, each adjusted by 3 adjustment factors
• Effort multipliers: application factors (8), team factors (6), and a schedule driver (future)

These feed COSYSMO, which, together with calibration and a WBS guided by EIA/ANSI 632, produces an estimate of systems engineering Effort.
Visualization of Cost Driver Impacts

[Figure: bar chart of each cost driver's impact on project cost relative to nominal cost as a function of system size, for drivers RELY, CPLX, RUSE, DOCU, BAND, PVOL, SCAP, DCAP, PCON, APEX, PLEX, DTEX, LIFE, SITE, and SCED; impact ratios range from 1.00 up to roughly 2.66.]
Model Form

PM_NS = A · [ Σ_k ( w_e,k Φ_e,k + w_n,k Φ_n,k + w_d,k Φ_d,k ) ]^E · Π_{j=1}^{14} EM_j

Where:
PM_NS = effort in Person Months (Nominal Schedule)
A = calibration constant derived from historical project data
k = {REQ, IF, ALG, SCN}
w_x = weight for "easy", "nominal", or "difficult" size driver
Φ_x = quantity of "k" size driver
E = represents diseconomy of scale
EM_j = effort multiplier for the jth cost driver; the geometric product results in an overall effort adjustment factor to the nominal effort
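The model form above can be sketched in code. In this sketch the size weights come from the deck's Size Driver Weights table, while the calibration constant A, the diseconomy-of-scale exponent E, and the function name are illustrative placeholders, not the official COSYSMO calibration.

```python
from math import prod

# Size driver weights (Easy, Nominal, Difficult) from the deck's
# Size Driver Weights table. A and E defaults below are placeholders,
# not the official COSYSMO calibration.
WEIGHTS = {
    "REQ": (0.5, 1.00, 5.0),   # # of system requirements
    "IF":  (1.7, 4.3, 9.8),    # # of interfaces
    "ALG": (3.4, 6.5, 18.2),   # # of critical algorithms
    "SCN": (9.8, 22.8, 47.4),  # # of operational scenarios
}

def cosysmo_effort(counts, effort_multipliers, A=0.25, E=1.06):
    """PM_NS = A * (weighted size)^E * product(EM_j).

    counts: driver -> (easy, nominal, difficult) quantities.
    effort_multipliers: the 14 cost driver ratings (1.0 = nominal).
    """
    size = 0.0
    for k, (w_e, w_n, w_d) in WEIGHTS.items():
        easy, nominal, difficult = counts.get(k, (0, 0, 0))
        size += w_e * easy + w_n * nominal + w_d * difficult
    return A * size ** E * prod(effort_multipliers)
```

With all 14 multipliers at nominal (1.0) the estimate reduces to A·size^E; rating a single driver above or below nominal scales the whole estimate multiplicatively, which is why the EM product is called an overall effort adjustment factor.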
COSYSMO Data Sources

• Lockheed Martin: Transportation & Security Solutions (Rockville, MD); Integrated Systems & Solutions (Valley Forge, PA); Systems Integration (Owego, NY); Aeronautics (Marietta, GA); Maritime Systems & Sensors (Manassas, VA; Baltimore, MD; Syracuse, NY)
• General Dynamics: Maritime Digital Systems/AIS (Pittsfield, MA); Surveillance & Reconnaissance Systems/AIS (Bloomington, MN)
• BAE Systems: National Security Solutions/ISS (San Diego, CA); Information & Electronic Warfare Systems (Nashua, NH)
• SAIC: Integrated Data Solutions & Analysis (McLean, VA); Army Transformation (Orlando, FL)
• Northrop Grumman: Mission Systems (Redondo Beach, CA)
• Raytheon: Intelligence & Information Systems (Garland, TX)
4 Size Drivers

1. Number of System Requirements
2. Number of System Interfaces
3. Number of System Specific Algorithms
4. Number of Operational Scenarios

Each weighted by complexity, volatility, and degree of reuse.
Counting Rules Example

COSYSMO example for sky, kite, sea, and underwater levels (Source: Cockburn 2001), where:

• Sky level: Build an SE cost model
• Kite level: Adopt EIA 632 as the WBS and ISO 15288 as the life cycle standard
• Sea level: Utilize size and cost drivers, definitions, and counting rules
• Underwater level: Perform statistical analysis of data with software tools and implement model in Excel
14 Cost Drivers

Application Factors (8)
1. Requirements understanding
2. Architecture understanding
3. Level of service requirements
4. Migration complexity
5. Technology risk
6. Documentation match to life cycle needs
7. # and diversity of installations/platforms
8. # of recursive levels in the design
14 Cost Drivers (cont.)

Team Factors (6)
1. Stakeholder team cohesion
2. Personnel/team capability
3. Personnel experience/continuity
4. Process capability
5. Multisite coordination
6. Tool support
Size Driver Weights

Size Driver                | Easy | Nominal | Difficult
# of System Requirements   | 0.5  | 1.00    | 5.0
# of Interfaces            | 1.7  | 4.3     | 9.8
# of Critical Algorithms   | 3.4  | 6.5     | 18.2
# of Operational Scenarios | 9.8  | 22.8    | 47.4
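Read as a lookup table, the size driver weights convert raw counts into weighted size. A minimal sketch, using the driver names and values from the weights table above (the function name is hypothetical):

```python
# Size driver weights (Easy, Nominal, Difficult) from the weights table.
SIZE_WEIGHTS = {
    "# of System Requirements":   (0.5, 1.00, 5.0),
    "# of Interfaces":            (1.7, 4.3, 9.8),
    "# of Critical Algorithms":   (3.4, 6.5, 18.2),
    "# of Operational Scenarios": (9.8, 22.8, 47.4),
}

def size_contributions(counts):
    """counts: driver -> (easy, nominal, difficult) quantities.
    Returns each driver's weighted contribution to total size."""
    out = {}
    for driver, (w_easy, w_nom, w_diff) in SIZE_WEIGHTS.items():
        easy, nom, diff = counts.get(driver, (0, 0, 0))
        out[driver] = w_easy * easy + w_nom * nom + w_diff * diff
    return out
```

One implication of the weights: a single difficult operational scenario (weight 47.4) contributes about as much size as 95 easy requirements (weight 0.5 each).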
Cost Driver Rating Scales

Driver                                    | Very Low | Low  | Nominal | High | Very High | Extra High | EMR
Requirements understanding                | 1.87     | 1.37 | 1.00    | 0.77 | 0.60      |            | 3.12
Architecture understanding                | 1.64     | 1.28 | 1.00    | 0.81 | 0.65      |            | 2.52
Level of service requirements             | 0.62     | 0.79 | 1.00    | 1.36 | 1.85      |            | 2.98
Migration complexity                      |          |      | 1.00    | 1.25 | 1.55      | 1.93       | 1.93
Technology risk                           | 0.67     | 0.82 | 1.00    | 1.32 | 1.75      |            | 2.61
Documentation                             | 0.78     | 0.88 | 1.00    | 1.13 | 1.28      |            | 1.64
# and diversity of installations/platforms|          |      | 1.00    | 1.23 | 1.52      | 1.87       | 1.87
# of recursive levels in the design       | 0.76     | 0.87 | 1.00    | 1.21 | 1.47      |            | 1.93
Stakeholder team cohesion                 | 1.50     | 1.22 | 1.00    | 0.81 | 0.65      |            | 2.31
Personnel/team capability                 | 1.50     | 1.22 | 1.00    | 0.81 | 0.65      |            | 2.31
Personnel experience/continuity           | 1.48     | 1.22 | 1.00    | 0.82 | 0.67      |            | 2.21
Process capability                        | 1.47     | 1.21 | 1.00    | 0.88 | 0.77      | 0.68       | 2.16
Multisite coordination                    | 1.39     | 1.18 | 1.00    | 0.90 | 0.80      | 0.72       | 1.93
Tool support                              | 1.39     | 1.18 | 1.00    | 0.85 | 0.72      |            | 1.93
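The EMR (Effort Multiplier Ratio) column is each driver's largest rating divided by its smallest, i.e. the maximum swing that driver can impose on the estimate. A quick check on three rows, with rating values copied from the table above:

```python
# Rating scales (Very Low ... Very/Extra High) for three drivers,
# copied from the Cost Driver Rating Scales table.
RATINGS = {
    "Requirements understanding":    [1.87, 1.37, 1.00, 0.77, 0.60],
    "Level of service requirements": [0.62, 0.79, 1.00, 1.36, 1.85],
    "Multisite coordination":        [1.39, 1.18, 1.00, 0.90, 0.80, 0.72],
}

def emr(scale):
    """Effort Multiplier Ratio: max rating divided by min rating."""
    return max(scale) / min(scale)
```

Rounding to two places reproduces the table's EMR column: 3.12, 2.98, and 1.93 for the three drivers above.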
Effort Profiling

[Figure: distribution of systems engineering effort across the EIA/ANSI 632 process categories (Acquisition & Supply, Technical Management, System Design, Product Realization, Technical Evaluation) over the ISO/IEC 15288 life cycle phases (Conceptualize; Develop; Operational Test & Evaluation; Transition to Operation; Operate, Maintain, or Enhance; Replace or Dismantle).]
Limitations of the model

1. Mostly qualitative drivers
2. Variance of Delphi responses
3. Small sample size
4. Aerospace-heavy
5. Calibration is biased toward successful projects, because successful projects share data and unsuccessful ones don't
6. Model will not work outside of its calibrated range
7. A fool with a tool is still a fool
Contact

Ricardo Valerdi
MIT
rvalerdi@mit.edu
(617) 253-8583

Download: www.valerdi.com/cosysmo