Next-Generation Software Sizing and Costing Metrics Workshop Report

University of Southern California
Center for Systems and Software Engineering
Next-Generation Software Sizing and Costing Metrics
Workshop Report
Wilson Rosa, Barry Boehm, Ray Madachy, Brad Clark
USC CSSE Annual Research Review
March 9, 2010
Topics
• AFCAA Study Overview
• Current and Future Challenges for Software Cost Estimation and
Data Collection
• Proposed Metrics Definition Highlights
• Productivity Data Analysis and Issues for Discussion
This work is sponsored by the Air Force Cost Analysis Agency
Project Background
• Goal is to improve the quality and consistency of estimating
methods across cost agencies and program offices through
guidance, standardization, and knowledge sharing.
• Project led by the Air Force Cost Analysis Agency (AFCAA), working with
the service cost agencies and assisted by the University of Southern
California and the Naval Postgraduate School
• We will publish the AFCAA Software Cost Estimation Metrics
Manual to help analysts and decision makers develop accurate
software cost estimates quickly and easily for avionics, space,
ground, and shipboard platforms.
AFCAA Software Cost Estimation
Metrics Manual
Chapter 1: Software Estimation Principles
Chapter 2: Product Sizing
Chapter 3: Product Growth
Chapter 4: Effective SLOC
Chapter 5: Historical Productivity
Chapter 6: Model Calibration
Chapter 7: Calibrated SLIM-ESTIMATE
Chapter 8: Cost Risk and Uncertainty Metrics
Chapter 9: Data Normalization
Chapter 10: Software Resource Data Report
Chapter 11: Software Maintenance
Chapter 12: Lessons Learned
Stakeholder Communities
• Research is collaborative across heterogeneous stakeholder
communities, who have helped us refine our data definition framework
and domain taxonomy and have provided project data:
– Government agencies
– Tool vendors (e.g., SLIM-Estimate™, TruePlanning® by PRICE Systems)
– Industry
– Academia
Current and Future DoD Cost Estimation
Challenges
• Emergent requirements
– Cannot prespecify requirements, cost, schedule, EVMS
– Need to estimate and track early concurrent engineering
• Rapid change
– Long acquisition cycles breed obsolescence
– DoD Inst 5000.02 emphasis on evolutionary acquisition
• Net-centric systems of systems
– Incomplete visibility and control of elements
• Model-, COTS-, service-based, and Brownfield systems
– New phenomenology and counting rules
• Always-on, never-fail systems
– Need to balance agility and high assurance
Rapid Change Creates a Late Cone of Uncertainty
– Need evolutionary/incremental vs. one-shot development
[Chart: relative cost range (0.25x to 4x) versus phases and milestones
(Feasibility; Concept of Operation; Plans and Rqts. / Rqts. Spec.; Product
Design / Product Design Spec.; Detail Design / Detail Design Spec.; Devel.
and Test; Accepted Software). Uncertainties in competition, technology,
organizations, and mission priorities keep the cost range wide late in the
cycle.]
Incremental Development Productivity
Decline (IDPD)
• Example: Site Defense BMD Software
– 5 builds, 7 years, $100M; operational and support software
– Build 1 productivity over 300 LOC/person month
– Build 5 productivity under 150 LOC/PM
• Including Build 1-4 breakage, integration, rework
• 318% change in requirements across all builds
• IDPD factor = 20% productivity decrease per build
– Similar trends in later unprecedented systems
– Not unique to DoD: key source of Windows Vista delays
• Maintenance of full non-COTS SLOC, not ESLOC (see the sketch below)
– Build 1: 200 KSLOC new; 200 KSLOC reused @ 20% = 240 KSLOC ESLOC
– Build 2: 400 KSLOC of Build 1 software to maintain and integrate
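A minimal sketch of the arithmetic behind these bullets (not from the AFCAA manual; the function names, the 20% reuse weight, and the simple exponential productivity decline are assumptions used only to illustrate reuse-adjusted ESLOC and the IDPD factor):

    # Reuse-adjusted equivalent size and IDPD-driven productivity decline (illustrative only)
    def esloc(new_ksloc, reused_ksloc, reuse_weight=0.20):
        """Equivalent 'new' KSLOC: new code plus a weighted fraction of reused code."""
        return new_ksloc + reuse_weight * reused_ksloc

    def productivity(build, base_loc_pm=300.0, idpd=0.20):
        """LOC per person-month for a given build, declining by the IDPD factor each build."""
        return base_loc_pm * (1.0 - idpd) ** (build - 1)

    print(esloc(200, 200))                    # Build 1: 200 + 0.20 * 200 = 240.0 KSLOC ESLOC
    for b in range(1, 6):
        print(b, round(productivity(b), 1))   # Build 5: ~122.9 LOC/PM, i.e., under 150

With a 20% IDPD factor, five builds at a constant team size take noticeably longer than a one-shot estimate would suggest, which is the point of the Site Defense example above.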
Software Resources Data Report: Final Developer Report - Sample
Page 1: Report Context, Project Description and Size
1. Report Context
SRDR Data Source
  1. System/Element Name (version/release):
  2. Report As Of:
  3. Authorizing Vehicle (MOU, contract/amendment, etc.):
  4. Reporting Event: Contract/Release End; Submission # ________ (Supersedes # _______, if applicable)
Description of Actual Development Organization
  5. Development Organization:
  6. Certified CMM Level (or equivalent):
  7. Certification Date:
  8. Lead Evaluator:
  9. Affiliation:
  10. Precedents (list up to five similar systems by the same organization or team):
Comments on Part 1 responses:

2. Product and Development Description
(For each entry: Percent of Product Size; Upgrade or New?)
  1. Primary Application Type:
  2. …
  3. …
  4. …
  17. Primary Language Used:
  18. …
  Actual Development Process:
  21. List COTS/GOTS Applications Used:
  22. Peak staff (maximum team size in FTE) that worked on and charged to this project: __________
  23. Percent of personnel that was: Highly experienced in domain: ___%; Nominally experienced: ___%; Entry level, no experience: ___%
Comments on Part 2 responses:

3. Product Size Reporting (provide actuals at final delivery)
  1. Number of Software Requirements, not including External Interface Requirements (unless noted in associated Data Dictionary)
  2. Number of External Interface Requirements (i.e., not under project control)
  3. Amount of Requirements Volatility encountered during development (1 = Very Low .. 5 = Very High)
  Code Size Measures for items 4 through 6. For each, indicate S for physical SLOC (carriage returns); Snc for non-comment SLOC only; LS for logical statements; or provide abbreviation _________ and explain in associated Data Dictionary.
  4. Amount of New Code developed and delivered (Size in __________)
  5. Amount of Modified Code developed and delivered (Size in __________)
  6. Amount of Unmodified, Reused Code developed and delivered (Size in __________)
Comments on Part 3 responses:

DD Form 2630-3 (Page 1 of 2)
Proposed Metrics Definition Highlights
• Data quality and standardization issues
– No reporting of equivalent “new” code size inputs: Design Modified,
Code Modified, Integration Testing Modified, Software
Understandability, Programmer Unfamiliarity
– No common SLOC reporting – logical, physical, etc.
– No standard definitions – Application Domain, Build, Increment,
Spiral,…
– No common effort reporting – analysis, design, code, test, CM, QA,…
– No reporting of quality measures – defect density, defect
containment, etc.
Proposed Metrics Definition Highlights
• Limited empirical research within DoD on other contributors to
productivity besides effort and size:
– Operating Environment, Application Domain, and Product
Complexity
– Personnel Capability
– Required Reliability
– Quality – Defect Density, Defect Containment
– Integrating code from previous deliveries – Builds, Spirals,
Increments, etc.
– Converting to Equivalent SLOC
• Reported sizes for Modified and Unmodified/Reused code add no
value to a cost estimate unless they are translated into equivalent
“new” SLOC
• This research and the resulting Cost Metrics Manual will discuss
and address these issues
SRDR Data
Records by Application Domain (the original chart also broke these records
out by Operating Environment: Avionics, Fixed Ground, Mobile Ground,
Unmanned, Missile, Shipboard, Space):

Application Domain                 Records
Business Systems                     4
Command & Control                   14
Communications                      39
Controls & Displays                  7
Executive                            3
Information Assurance                1
Infrastructure or Middleware         3
Mission Management                  18
Mission Planning                     5
Process Control                      4
Scientific Systems                   3
Sensor Control and Processing       12
Simulation & Modeling               12
Spacecraft Payload                   1
Test & Evaluation                    1
Tool & Tool Systems                  3
Training                             1
Weapons Delivery and Control        11
Total                              142

Missing Domains: Internet, Maintenance and Diagnostics, Spacecraft bus
Notes:
SRDR: Software Resources Data Report
Data Analysis Issues
Preliminary Results - Do Not Use!
PM = 15 × (EKSLOC)^0.62
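A quick illustration of how a power-law cost estimating relationship of this form would be applied, using the preliminary do-not-use coefficients above purely for illustration (the function name is an assumption):

    # Power-law CER: person-months as a function of equivalent KSLOC (illustrative only)
    def pm_from_eksloc(eksloc, a=15.0, b=0.62):
        """Person-months of effort for a given equivalent size in KSLOC."""
        return a * eksloc ** b

    print(round(pm_from_eksloc(100), 1))   # 100 EKSLOC -> ~260.7 person-months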
Sizing Issues -1
NCSS to Logical SLOC Conversion
– Ada: 45%
– C/C++: 61%
– C#: 61%
– Java: 72%
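A minimal sketch of applying these factors, assuming they express logical SLOC as a percentage of non-comment source statements (NCSS); the dictionary and function names are illustrative, not part of the briefing:

    # Language-specific NCSS -> logical SLOC conversion (factors from the list above)
    NCSS_TO_LOGICAL = {"Ada": 0.45, "C/C++": 0.61, "C#": 0.61, "Java": 0.72}

    def logical_sloc(ncss, language):
        """Estimate logical SLOC from a non-comment physical line count."""
        return ncss * NCSS_TO_LOGICAL[language]

    print(logical_sloc(10_000, "Java"))   # 10,000 NCSS of Java -> 7,200 logical SLOC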
Sizing Issues -2
• No Modified Code parameters
– Percent Design Modified (DM)
– Percent Code Modified (CM)
– Percent Integration and Test Modified (IM)
– Software Understanding (SU)
– Programmer Unfamiliarity (UNFM)
• Program interviews provided these parameters for some records (see the
sketch below for how they combine into equivalent SLOC)
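As a reference point, this is how COCOMO II's reuse model combines these parameters into equivalent "new" SLOC; the AFCAA manual may define its own weights, so treat the constants below as an assumption rather than the manual's method:

    # COCOMO II-style adaptation: DM, CM, IM in percent (0-100), SU on a 10-50 scale,
    # UNFM on a 0-1 scale, AA (assessment & assimilation) on a 0-8 scale.
    def equivalent_sloc(adapted_sloc, dm, cm, im, su, unfm, aa=0.0):
        aaf = 0.4 * dm + 0.3 * cm + 0.3 * im             # adaptation adjustment factor
        if aaf <= 50:
            aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100.0
        else:
            aam = (aa + aaf + su * unfm) / 100.0
        return adapted_sloc * aam                         # equivalent "new" SLOC

    # 100 KSLOC of reused code, lightly modified and moderately unfamiliar:
    print(round(equivalent_sloc(100_000, dm=10, cm=15, im=30, su=30, unfm=0.4)))  # -> 21700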
Effort Issues
• Missing effort reporting for different lifecycle phases
– Software requirements analysis (REQ)
– Software architectural design (ARCH)
– Software coding and testing (CODE)
– Software integration (INT)
– Software qualification testing (QT)
– Software management, CM, QA, etc. (Other – very inconsistent)
Application Domain Issues
Example applications by difficulty, from Very Easy through Easy, Nominal,
and Challenging to Very Challenging:
– Business Systems: large business system
– Internet: simple web pages → web application (shopping) → mega-web
application → trillion-$/day transaction
– Tools and Tool Systems: verification tools → safety critical
– Scientific Systems: offline data reduction → large dataset
– Simulation and Modeling: low-fidelity simulator → physical phenomenon
– Test and Evaluation: usual → distributed debugging
– Training: set of screens → simulation network
– Command and Control: taxi-cab dispatch → SOS (C4ISR)
– Mission Management: usual → multi-level security and safety
– Weapon Delivery and Control: weapon space → safety
– Communications: radio → noise/anomalies handling → frequency-hopping →
safety/security
Proposed SRDR Changes -1
Current SRDR → Proposed Modifications

Application Types (3.7.1 – 17):
• Reorganize around Operating Environments and Application Domains
• Add Mission Criticality (reliability and complexity in a single rating scale)
• Revisit detailed definitions of the Application Domains

Software and External Interface Requirements:
• Add anticipated requirements volatility to 2630-1, -2
• Use percentage of requirements change as the volatility input (SRR
baseline; see the sketch below)

Personnel Experience & Turnover:
• Add to 2630-1
• Expand years of experience rating scale to 12 years

Amount of New (>25%) / Modified (<25% mod) Code:
• Add DM, CM, IM, SU, & UNFM factors for modified code
• Incorporate DM-CM-IM questionnaire
• Add IM for Reused code
• Definitions for code types
• Count at the level it will be maintained
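A minimal sketch of the proposed volatility input, assuming it is computed as the percentage of requirements changed relative to the SRR baseline (the formula and function name are assumptions for illustration):

    # Requirements volatility against the Software Requirements Review (SRR) baseline (illustrative)
    def requirements_volatility(added, modified, deleted, srr_baseline):
        """Percent of requirements change relative to the SRR-baselined count."""
        return 100.0 * (added + modified + deleted) / srr_baseline

    print(round(requirements_volatility(30, 45, 10, 400), 1))   # -> 21.2 (% change)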
Proposed SRDR Changes -2
Current SRDR → Proposed Modifications

Deleted Code:
• Report deleted code counts

Project- or CSCI-level data:
• Specify the level of data reporting

All Other Direct Software Engineering Development Effort (4.7) – Project
Management, IV&V, Configuration Management, Quality Control, Problem
Resolution, Library Management, Process Improvement, Measurement, Training,
Documentation, Data Conversion, Customer-run Acceptance Test, Software
Delivery, Installation & Deployment:
Break into:
• Management functions
• Configuration / Environment functions
• Assessment functions
• Organization functions (e.g. user & maintainer documentation,
measurement, training, process improvement, etc.)
Concluding Remarks
• Goal is to publish a manual to help analysts develop quick
software estimates using empirical metrics from recent programs
• Additional information is crucial for improving data quality across
DoD
• We want your input on Productivity Domains and Data Definitions
• Looking for collaborators
• Looking for peer-reviewers
• Need more data