COSYSMO Workshop
Future Directions and Priorities
23rd International Forum on COCOMO and
Systems/Software Cost Modeling
Los Angeles, CA
Wed Oct 29 & Thurs Oct 30, 2008
Garry Roedler
John Gaffney
Ricardo Valerdi
Gan Wang
Jared Fortune
Agenda
• Context setting
• Discussion on COSYSMO 2.0 improvements
• Recursive levels in the design parameter
• Update on COSYSMO book
• Heuristics
Introductions
8:30 am to 9:00 am    Review of COSYSMO workshop in July – Mystic, CT (Garry Roedler, Lockheed Martin); Results of Reuse Survey (Jared Fortune, USC)
9:00 am to 10:00 am   Harmonization of Software and Systems Engineering Cost Estimation (Garry Roedler, Lockheed Martin)
10:00 am to 10:30 am  Break
10:30 am to 11:00 am  Experience with SEEMaP at BAE Systems: Quantitative Risk Modeling Using Monte Carlo / Crystal Ball (Gan Wang, BAE Systems)
11:00 am to 11:30 am  Experience with COSYSMO-R at Lockheed Martin (John Gaffney, Lockheed Martin)
11:30 am to 12:00 pm  Heuristic Risk Assessment (Ray Madachy, Naval Postgraduate School and USC)
12:00 pm to 1:00 pm   Lunch
1:00 pm to 2:00 pm    Best Practice Guidance (Garry Roedler, Lockheed Martin); Model Usage Heuristics (Ricardo Valerdi, MIT)
2:00 pm to 3:00 pm    Working Session on Harmonization of SW & SE Estimation: WBS Approach (Garry Roedler, Lockheed Martin; Gan Wang, BAE Systems)
3:00 pm to 3:30 pm    Break
3:30 pm to 4:00 pm    Discussion on Reuse Framework (Jared Fortune, USC; Ricardo Valerdi, MIT)
4:00 pm to 5:00 pm    Discussion on Recursive Levels Cost Driver (Ricardo Valerdi, MIT)
5:00 pm to 7:00 pm    Reception
Context setting
How is Systems Engineering Defined?
• Acquisition and Supply
– Supply Process
– Acquisition Process
• Technical Management
– Planning Process
– Assessment Process
– Control Process
• System Design
– Requirements Definition Process
– Solution Definition Process
• Product Realization
– Implementation Process
– Transition to Use Process
• Technical Evaluation
– Systems Analysis Process
– Requirements Validation Process
– System Verification Process
– End Products Validation Process
EIA/ANSI 632, Processes for Engineering a System, 1999.
COSYSMO Origins
[Timeline figure: Systems Engineering (Warfield 1956) and Software Cost Modeling (Boehm 1981), joined by CMMI* (Humphrey 1989), converge in COSYSMO. *Developed at Carnegie Mellon University.]
Warfield, J. N., Systems Engineering, United States Department of Commerce PB111801, 1956.
Boehm, B. W., Software Engineering Economics, Prentice Hall, 1981.
Humphrey, W. Managing the Software Process. Addison-Wesley, 1989.
COSYSMO Data Sources
Boeing
Integrated Defense Systems (Seal Beach, CA)
Raytheon
Intelligence & Information Systems (Garland, TX)
Northrop Grumman
Mission Systems (Redondo Beach, CA)
Lockheed Martin
Transportation & Security Solutions (Rockville, MD)
Integrated Systems & Solutions (Valley Forge, PA)
Systems Integration (Owego, NY)
Aeronautics (Marietta, GA)
Maritime Systems & Sensors (Manassas, VA;
Baltimore, MD; Syracuse, NY)
General Dynamics
Maritime Digital Systems/AIS (Pittsfield, MA)
Surveillance & Reconnaissance Systems/AIS
(Bloomington, MN)
BAE Systems
National Security Solutions/ISS (San Diego, CA)
Information & Electronic Warfare Systems (Nashua, NH)
SAIC
Army Transformation (Orlando, FL)
Integrated Data Solutions & Analysis (McLean, VA)
L-3 Communications
Greenville, TX
Modeling Methodology
• 3 rounds; more than 60 experts
• 62 data points; 8 organizations
COSYSMO Scope
• Addresses the first four phases of the systems engineering life cycle (per ISO/IEC 15288): Conceptualize; Develop; Operational Test & Evaluation; and Transition to Operation. The later phases (Operate, Maintain, or Enhance; Replace or Dismantle) fall outside the model's scope.
• Considers standard Systems Engineering Work Breakdown Structure tasks (per EIA/ANSI 632)
COSYSMO Operational Concept
[Diagram: the four size drivers (# Requirements, # Interfaces, # Scenarios, # Algorithms, plus 3 adjustment factors) and the effort multipliers (8 application factors and 6 team factors) feed the calibrated COSYSMO model, which produces an effort estimate.]
COSYSMO Model Form
$$PM_{NS} = A \cdot \left[ \sum_{k} \left( w_{e,k}\,\Phi_{e,k} + w_{n,k}\,\Phi_{n,k} + w_{d,k}\,\Phi_{d,k} \right) \right]^{E} \cdot \prod_{j=1}^{14} EM_{j}$$
Where:
PM_NS = effort in Person Months (Nominal Schedule)
A = calibration constant derived from historical project data
k = {REQ, IF, ALG, SCN}
w_{x,k} = weight for an "easy" (e), "nominal" (n), or "difficult" (d) instance of size driver k
Φ_{x,k} = quantity of size driver k at difficulty level x
E = exponent representing diseconomies of scale
EM_j = effort multiplier for the jth cost driver; the geometric product yields an overall effort adjustment factor applied to the nominal effort
Size Driver Weights

Size Driver                   Easy   Nominal   Difficult
# of System Requirements      0.5    1.00      5.0
# of Interfaces               1.7    4.3       9.8
# of Critical Algorithms      3.4    6.5       18.2
# of Operational Scenarios    9.8    22.8      47.4
Cost Driver Clusters
UNDERSTANDING FACTORS
– Requirements understanding
– Architecture understanding
– Stakeholder team cohesion
– Personnel experience/continuity
COMPLEXITY FACTORS
– Level of service requirements
– Technology Risk
– # of Recursive Levels in the Design
– Documentation Match to Life Cycle Needs
OPERATIONS FACTORS
– # and Diversity of Installations/Platforms
– Migration complexity
PEOPLE FACTORS
– Personnel/team capability
– Process capability
ENVIRONMENT FACTORS
– Multisite coordination
– Tool support
Cost Driver Rating Scales
Cost Driver                                  Very Low   Low    Nominal   High   Very High   Extra High   EMR
Requirements Understanding                   1.87       1.37   1.00      0.77   0.60        -            3.12
Level of Service Requirements                0.62       0.79   1.00      1.36   1.85        -            2.98
Technology Risk                              0.67       0.82   1.00      1.32   1.75        -            2.61
Architecture Understanding                   1.64       1.28   1.00      0.81   0.65        -            2.52
Stakeholder team cohesion                    1.50       1.22   1.00      0.81   0.65        -            2.31
Personnel/team capability                    1.50       1.22   1.00      0.81   0.65        -            2.31
Personnel experience/continuity              1.48       1.22   1.00      0.82   0.67        -            2.21
Process capability                           1.47       1.21   1.00      0.88   0.77        0.68         2.16
Migration Complexity                         -          -      1.00      1.25   1.55        1.93         1.93
# of recursive levels in the design          0.76       0.87   1.00      1.21   1.47        -            1.93
Multisite coordination                       1.39       1.18   1.00      0.90   0.80        0.72         1.93
Tool support                                 1.39       1.18   1.00      0.85   0.72        -            1.93
# and diversity of installations/platforms   -          -      1.00      1.23   1.52        1.87         1.87
Documentation                                0.78       0.88   1.00      1.13   1.28        -            1.64
Cost Drivers Ordered by Effort Multiplier Ratio (EMR)
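The EMR column is the ratio of a driver's largest multiplier to its smallest, i.e., the total swing a single driver can introduce into an estimate. A short sketch reproduces it from two rows of the table above:

```python
# EMR (Effort Multiplier Ratio) = highest rating multiplier / lowest.
def emr(multipliers):
    return max(multipliers) / min(multipliers)

requirements_understanding = [1.87, 1.37, 1.00, 0.77, 0.60]
process_capability = [1.47, 1.21, 1.00, 0.88, 0.77, 0.68]

print(f"Requirements Understanding EMR: {emr(requirements_understanding):.2f}")  # 3.12
print(f"Process capability EMR:         {emr(process_capability):.2f}")          # 2.16
```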
Effort Profiling
[Matrix figure: the EIA/ANSI 632 process groups (Acquisition & Supply, Technical Management, System Design, Product Realization, Technical Evaluation) profiled across the ISO/IEC 15288 phases (Conceptualize; Develop; Operational Test & Evaluation; Transition to Operation; Operate, Maintain, or Enhance; Replace or Dismantle).]
Impact
[Diagram: COSYSMO's impact spans the model and its estimating relationship, an academic prototype, 10 theses, academic curricula, commercial implementations, policy and contracts (Intelligence Community; Sheppard Mullin, LLC), and proprietary implementations (SEEMaP, COSYSMO-R, SECOST).]
COSYSMO 2.0 Improvements
Recommended Improvements
(from user community)
1. Reuse
2. Integration of SwE & SysE estimation
3. Assumption of linearity in COSYSMO cost drivers
4. Effect of cost drivers and scale factors
5. Number of recursive levels of design
6. Risk modeling
7. Establishing best practice guidance
8. Consideration of SoS scope in COSYSMO
9. Estimation in Operation & Maintenance phase
10. Requirements volatility (deferred)
1. Reuse
• Central question: What is the effect of reuse in
estimating systems engineering size/effort?
• Hypothesis: A COSYSMO reuse submodel will improve
the model’s estimation accuracy
• POC: Jared Fortune
• References
– Valerdi, R., Wang, G., Roedler, G., Rieff, J., Fortune, J.,
“COSYSMO Reuse Extension,” 22nd International Forum on
COCOMO and Systems/Software Cost Modeling, 2007.
2. Integration of SwE & SysE estimation
• Central question: What is the overlap between
COCOMO II and COSYSMO?
• Hypothesis: By identifying the WBS elements in
COSYSMO that overlap with the WBS in COCOMO II,
the systems engineering resource estimation accuracy
increases
• POC: Ricardo Valerdi
• References
– Valerdi, R., The Architect and the Builder: Overlaps Between
Software and Systems Engineering. (working paper)
3. Linearity in COSYSMO cost drivers
• Central question: How do we characterize the nonlinearity of cost drivers across the system life cycle?
• Hypothesis: Not all cost drivers have a constant impact
on systems engineering effort throughout the life cycle.
• POC: Gan Wang
• References
– Wang, G., Valerdi, R., Boehm, B., Shernoff, A., “Proposed
Modification to COSYSMO Estimating Relationship,” 18th
INCOSE Symposium, June 2008.
4. Effect of cost drivers and scale factors
• Central question: Can some of the cost drivers become
scale factors in the cost estimating relationship
calibrated by the new data set?
• Hypothesis: The current set of size and cost drivers is too
sensitive to small variations in rating levels.
• POC: Gan Wang
• References
– Wang, G., Valerdi, R., Boehm, B., Shernoff, A., “Proposed
Modification to COSYSMO Estimating Relationship,” 18th
INCOSE Symposium, June 2008.
5. Number of recursive levels of design
• Central question: How can the integration complexity of
subsystems one layer below the system-of-interest be
operationalized?
• Hypothesis: The integration complexity of subsystems is
a predictor of systems engineering effort.
• POC: John Rieff
• References
– Marksteiner, B., “Recursive Levels and COSYSMO”, October
2007. (working paper)
6. Risk Modeling
• Central question: How can risk associated with the
COSYSMO estimate be quantified?
• Hypothesis: The output generated by COSYSMO can be
quantified using probability distributions for better
assessment of the likelihood of meeting the estimate
• POC: John Gaffney (developer of COSYSMO-R)
• References
– Valerdi, R., Gaffney, J., “Reducing Risk and Uncertainty in
COSYSMO Size and Cost Drivers: Some Techniques for
Enhancing Accuracy,” 5th Conference on Systems Engineering
Research, March 2007, Hoboken, NJ.
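Neither COSYSMO-R nor the Crystal Ball work is reproduced here, but the shared idea behind both is to treat uncertain inputs as probability distributions and read risk off the resulting effort distribution. A minimal Monte Carlo sketch, assuming a triangular distribution on size and placeholder calibration constants (the numbers are illustrative, not taken from either model):

```python
import random

def effort(size, A=1.0, E=1.0, eaf=1.0):
    """Simplified COSYSMO form, PM = A * size^E * EAF (placeholders)."""
    return A * size ** E * eaf

def effort_percentiles(low, likely, high, trials=10_000):
    """Sample size from triangular(low, likely, high) and return the
    P10/P50/P90 of the resulting effort estimates."""
    samples = sorted(
        effort(random.triangular(low, high, likely)) for _ in range(trials)
    )
    pick = lambda p: samples[int(p / 100 * (trials - 1))]
    return pick(10), pick(50), pick(90)

p10, p50, p90 = effort_percentiles(low=300, likely=400, high=650)
print(f"Effort P10 / P50 / P90: {p10:.0f} / {p50:.0f} / {p90:.0f} PM")
```

Reporting percentiles rather than a single point estimate lets a program state the likelihood of meeting a given budget.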
7. Best practice guidance for use of Cost Drivers
• Central question: How can misuse of the COSYSMO
cost drivers be avoided?
• Hypothesis: By developing a best practice guide that
describes common pitfalls associated with COSYSMO
cost drivers, over-estimation can be reduced or avoided
• POC: Garry Roedler
• References
– COSYSMO User Manual
8. Consideration of SoS scope in COSYSMO
• Central question: How can COSYSMO be updated to
address system of systems effort estimation?
• Hypothesis: To be discussed in joint session
• POC: Jo Ann Lane
9. Estimation in Operation & Maintenance Phase
• Central question: How can we estimate systems
engineering effort in the Operate & Maintain phase?
• Hypothesis: Coverage of the Operate & Maintain phase will
broaden the model's life cycle coverage
• POC: Ricardo Valerdi
10. Requirements volatility
• Central question: How do we quantify the effects of
requirements volatility on systems engineering effort
throughout the life cycle?
• Hypothesis: Requirements volatility is a significant factor
for predicting systems engineering effort and can serve
as a leading indicator for project success
• POC: Ricardo Valerdi
• Feb 15, 2007 Workshop led by Rick Selby
– Identified critical success factors in: technical, product, process, people
– http://sunset.usc.edu/events/2007/ARR/presentations/RequirementsVolatilityWorkshopSummaryARR2007.ppt
– Loconsole, A., Borstler, J., “An industrial case study on requirements
volatility measures,” 12th Asia-Pacific Software Engineering
Conference, 2005.
Prioritization Exercise
• Factors to Consider
– Availability of data
– Impact on total cost of ownership
– Frequency of use
– Compatibility with other models (e.g., COCOMO family, PRICE-H)
– Addressal of future trends (volatility, uncertainty, scalability)
– Factor interactions
Recursive Levels in the Design
Number of Recursive Levels in the Design
The number of levels of design related to the
system-of-interest (as defined by ISO/IEC
15288) and the amount of required SE
effort for each level.
Definition Issues
• Clarification of “one deep”
– Integration complexity of subsystems one
layer below the system-of-interest
• Recursive
– of, relating to, or constituting a procedure that
can repeat itself indefinitely
Possible Interpretations
• The largest number of decomposition levels in
any branch of the system’s specification tree
• The average number of decomposition levels in
the branches of the system’s specification tree
• The smallest number of decomposition levels in
any branch of the system’s specification tree
• The number of levels of the system’s Work
Breakdown Structure (WBS)
• The number of levels on the system’s Bill of
Materials (BOM)
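The first three interpretations can disagree on the same system. A small sketch with a hypothetical specification tree (nested dicts, leaves empty) makes the difference concrete:

```python
# Compare the candidate counting rules: the largest, average, and
# smallest number of decomposition levels across branches of a
# specification tree. Depth 1 = one level below the system-of-interest.
spec_tree = {
    "air vehicle": {
        "propulsion": {"engine": {}, "fuel system": {}},
        "avionics": {"navigation": {"gps receiver": {}}, "comms": {}},
    },
    "ground station": {"operator console": {}},
}

def branch_depths(tree, depth=0):
    """Yield the decomposition depth of every branch (leaf) in the tree."""
    if not tree:
        yield depth
        return
    for subtree in tree.values():
        yield from branch_depths(subtree, depth + 1)

depths = list(branch_depths(spec_tree))
print(f"largest:  {max(depths)}")                     # 4
print(f"average:  {sum(depths) / len(depths):.1f}")   # 3.0
print(f"smallest: {min(depths)}")                     # 2
```

The same counting rules apply unchanged to a WBS or BOM represented the same way.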
Discussion
• Form vs. function
– Form is handled by the # of recursive levels in the design cost driver
– Function is handled by the requirements and interfaces size drivers
Update on COSYSMO Book
• 6 chapters/300 pages
• Foreword by Barry Boehm
• Endorsements from
– Bill Rouse (Georgia Tech)
– Paul Nielsen (SEI)
– Dinesh Verma (Stevens)
– Dan Galorath (Galorath)
– Andy Sage (George Mason)
– Rick Selby (Northrop Grumman/USC)
– Wolt Fabrycky (Virginia Tech)
– Marilee Wheaton (Aerospace Corporation/USC)
Objectives
• Quantify systems engineering
• Provide a framework for decision making
• Define opportunities for tailoring & calibration
• Capture lessons learned from development, validation, and implementation
• Complement other estimation methods (heuristics,
analogy, expert-based) and models (COCOMO II)
• Cater to commercial marketplace in support of
– SEER-SEM
– TruePlanning
– SystemStar
• Continue to build repository of systems engineering data
• Provide a platform for future research
First Sentence
“COSYSMO is a model to help you reason about the cost and schedule implications of systems engineering decisions you may need to make.”
Table of Contents
1. Scope of COSYSMO
2. Model Definition
3. Model Validation & Verification
4. Model Usage Strategies
5. Systems Engineering & Program Management
6. Evolution of Systems Engineering Cost Estimation

Each chapter draws on a mix of new material, dissertation content, and previously published papers.

[Diagram: the six chapters mapped onto the COSYSMO operational concept, from size drivers and effort multipliers through calibration to the effort estimate.]
Cost Estimation Heuristics
Criteria for Developing Heuristics
1. Agreement among experts that the
heuristic is useful and correct
2. Heuristic must stand the test of time
3. Heuristic must be resilient across
different scenarios
4. Heuristic must demonstrate value by
– occurring more than once
– not being considered obvious by everybody, particularly people who are new to the field
Model Development Heuristics
• Heuristic #1: More parameters increase
the explanatory power of the model, but
too many parameters make the model too
complex to use and difficult to calibrate.
• Heuristic #2: Break the problem and
analysis into phases over time; the right
amount of granularity is important.
• Heuristic #3: Let available data drive the
application boundaries of the model.
Model Development Heuristics
• Heuristic #4: Design the rating scale
according to the phenomenon being
modeled.
• Heuristic #5: Some system characteristics
are more likely to be cost penalties than
cost savings.
Model Calibration Heuristics
• Heuristic #6: All calibrations are local.
• Heuristic #7: Calibrations fix chronic errors
in over- or underestimation.
• Heuristic #8: Be skeptical of data that you
did not collect.
• Heuristic #9: For every parameter in the
model, 5 data points are required for the
calibration.
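As a rough check, applying this rule to COSYSMO's own 18 parameters (4 size drivers and 14 effort multipliers) would call for on the order of 90 data points, more than the 62 points in the current data set.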
Model Calibration Heuristics
• Heuristic #10: Don’t do more analysis than
the data is worth.
• Heuristic #11: You need less data than you think; you have more data than you think.
Model Usage Heuristics
• Heuristic #12: A model is not reality.
• Heuristic #13: All models are wrong, but
some of them are useful.
• Heuristic #14: Begin with the end in mind.
• Heuristic #15: Requirements are king.
• Heuristic #16: Not all requirements are
created equal.
Model Usage Heuristics
• Heuristic #17: Reuse is not free.
• Heuristic #18: Operational Scenarios may
come first, but requirements will ultimately
describe the system.
• Heuristic #19: Don't double dip.
• Heuristic #20: Find your sea level.
• Heuristic #21: Nominal is the norm.
Model Usage Heuristics
• Heuristic #22: If you're estimating a large
project, personnel capability is Nominal.
• Heuristic #23: Most of your off-Nominal
cost drivers should match your last project.
• Heuristic #24: If you're going to sin, sin
consistently.
• Heuristic #25: Use a combination of
models to estimate total system cost.
Model Usage Heuristics
• Heuristic #26: Avoid overlap between
models.
• Heuristic #27: Estimate using multiple
methods (analogy, parametric, etc.)
Estimation Heuristics
• Heuristic #28: Estimate early and often.
• Heuristic #29: Experts will disagree forever; bound the options they are given to evaluate.
• Heuristic #30: Models are optimistic.
• Heuristic #31: People are generally
optimistic.