
Introduction
Wilson Rosa, AFCAA
wilson.rosa@pentagon.af.mil
CSSE Annual Research Review
March 8, 2010
Agenda
• Overview
Project Background
• Goal is to improve the quality and consistency of
estimating methods across cost agencies and program
offices through guidance, standardization, and knowledge
sharing.
• Project led by the Air Force Cost Analysis Agency
(AFCAA), working with the service cost agencies and
assisted by the University of Southern California and the
Naval Postgraduate School
• We will publish the AFCAA Software Cost Estimation
Metrics Manual to help analysts and decision makers
develop accurate software cost estimates quickly and easily
for avionics, space, ground, and shipboard platforms.
Stakeholder Communities
• Research is collaborative across heterogeneous stakeholder
communities, who have helped us refine our data definition
framework and domain taxonomy and have provided project data:
– Government agencies
– Tool vendors (SLIM-Estimate™, TruePlanning® by PRICE Systems)
– Industry
– Academia
Research Objectives
• Establish a robust and cost-effective software
metrics collection process and knowledge base
that supports the data needs of the United
States Department of Defense (DoD)
• Enhance the utility of the collected data to
program oversight and management
• Support academic and commercial research
into improved cost estimation of future DoD
software-intensive systems
Software Cost Model Calibration
• Most program offices and support contractors
rely heavily on software cost models
• These models may not have been calibrated with the
most recent DoD data
• Calibration with recent data (2002-present)
will help increase program office estimating
accuracy, as sketched below
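As a hedged illustration of what that calibration involves (not the manual's actual procedure): the usual local-calibration approach fits the log-linear form Effort = A × Size^B to recent project actuals by least squares in log space. A minimal Python sketch with invented data:

```python
import numpy as np

# Illustrative project actuals (values invented for the sketch):
# delivered size in KSLOC and effort in person-months.
ksloc = np.array([12.0, 45.0, 8.5, 120.0, 33.0])
pm = np.array([55.0, 260.0, 30.0, 900.0, 170.0])

# Fit Effort = A * Size^B via linear regression on the logs:
# ln(PM) = ln(A) + B * ln(KSLOC)
B, lnA = np.polyfit(np.log(ksloc), np.log(pm), 1)
A = np.exp(lnA)
print(f"Calibrated model: PM = {A:.2f} * KSLOC^{B:.2f}")
```

Refitting A and B on 2002-present DoD actuals, rather than relying on vendor defaults, is the essence of the calibration argued for above.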
AFCAA Software Cost Estimation Metrics Manual
Table of Contents
Chapter 1: Software Estimation Principles
Chapter 2: Product Sizing
Chapter 3: Product Growth
Chapter 4: Effective SLOC
Chapter 5: Historical Productivity
Chapter 6: Model Calibration
Chapter 7: Calibrated SLIM-ESTIMATE
Chapter 8: Cost Risk and Uncertainty Metrics
Chapter 9: Data Normalization
Chapter 10: Software Resource Data Report
Chapter 11: Software Maintenance
Chapter 12: Lessons Learned
Manual Special Features
• Augments the NCCA/AFCAA Software Cost Handbook with:
– Default Equivalent Size Inputs (DM, CM, IM, SU, AA, UNFM) (combined as sketched after this list)
– Productivity Benchmarks by Operating Environment, Application
Domain, and Software Size
– Empirical Code, Effort, and Schedule Growth Measures derived from
SRDRs
– Empirically Based Cost Risk and Uncertainty Analysis Metrics
– Calibrated SLIM-Estimate™ using most recent SRDR data
– Mapping between COCOMO, SEER, True S cost drivers
– Empirical Dataset for COCOMO, True S, and SEER Calibration
– Software Maintenance Parameters
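Those equivalent-size inputs combine through the COCOMO II reuse model. A minimal sketch, assuming the manual follows the published COCOMO II equations (it may tailor them); the sample values are invented:

```python
def esloc(adapted_sloc, dm, cm, im, su, aa, unfm):
    """Equivalent SLOC per the published COCOMO II reuse model.

    dm, cm, im: percent of design / code / integration modified (0-100)
    su:   software understanding increment (10-50)
    aa:   assessment and assimilation increment (0-8)
    unfm: programmer unfamiliarity (0.0 = familiar .. 1.0 = unfamiliar)
    """
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im  # adaptation adjustment factor
    if aaf <= 50:
        aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100
    else:
        aam = (aa + aaf + su * unfm) / 100
    return adapted_sloc * aam

# Invented example: 10,000 adapted SLOC, moderately modified
print(round(esloc(10_000, dm=20, cm=30, im=40, su=30, aa=4, unfm=0.6)))
```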
Manual Special Features (Cont.)
• Guidelines for reconciling inconsistent data
• Standard Definitions (Application Domain, SLOC, etc.)
• Guidance on incremental development issues (overlaps,
early-increment breakage, integration complexity growth,
deleted software, relations to maintenance) and version
management (a form of product line development and
evolution)
• Impact of Next Generation Paradigms – Model Driven
Architecture, Net-Centricity, Systems of Systems, etc.
Agenda
• Overview (Barry Boehm)
• Data Analysis
• Software Sizing
• Recent Workshop Results
• Conclusions
DoD Empirical Data
• Data quality and standardization issues
– No reporting of Equivalent Size Inputs – CM, DM, IM, SU, AA, UNFM, Type
– No common SLOC reporting – logical, physical, etc.
– No standard definitions – Application Domain, Build, Increment, Spiral,…
– No common effort reporting – analysis, design, code, test, CM, QA,…
– No common code counting tool
– Product size only reported in lines of code
– No reporting of quality measures – defect density, defect containment, etc.
• Limited empirical research within DoD on contributors to
productivity other than effort and size:
– Operating Environment, Application Domain, and Product Complexity
– Personnel Capability
– Required Reliability
– Quality – Defect Density, Defect Containment
– Integrating code from previous deliveries – Builds, Spirals, Increments, etc.
– Converting to Equivalent SLOC
• Categories like Modified, Reused, Adopted, Managed, and Used add no
value unless they translate into single or unique narrow ranges of DM, CM,
and IM parameter values. We have seen no empirical evidence that they
do…
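To illustrate the point using the esloc sketch from the earlier slide (all parameter values invented): two components that both report only "Modified" can differ by nearly an order of magnitude in equivalent size, so the category label alone tells an analyst very little.

```python
# Both components report 10,000 SLOC of "Modified" code, but their
# adaptation parameters differ, so equivalent sizes diverge sharply.
light_touch = esloc(10_000, dm=5, cm=10, im=20, su=20, aa=2, unfm=0.2)
heavy_rework = esloc(10_000, dm=50, cm=70, im=90, su=40, aa=6, unfm=0.8)
print(round(light_touch), round(heavy_rework))  # ~1,400 vs. ~10,600 ESLOC
```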
SRDR Data Source
Software Resources Data Report: Final Developer Report - Sample
Page 1: Report Context, Project Description and Size

1. Report Context
1. System/Element Name (version/release):
2. Report As Of:
3. Authorizing Vehicle (MOU, contract/amendment, etc.):
4. Reporting Event: Contract/Release End
   Submission # ________ (Supersedes # _______, if applicable)
Description of Actual Development Organization
5. Development Organization:
6. Certified CMM Level (or equivalent):
7. Certification Date:
8. Lead Evaluator:
9. Affiliation:
10. Precedents (list up to five similar systems by the same organization or team):
Comments on Part 1 responses:

2. Product and Development Description
(In the original form this part is a table listing each application with columns Percent of Product Size (%), Upgrade or New?, and Actual Development Process; only the entries below are recoverable here.)
1. Primary Application Type:
17. Primary Language Used:
21. List COTS/GOTS Applications Used:
22. Peak staff (maximum team size in FTE) that worked on and charged to this project: __________
23. Percent of personnel that was: Highly experienced in domain: ___%  Nominally experienced: ___%  Entry level, no experience: ___%
Comments on Part 2 responses:

3. Product Size Reporting (provide actuals at final delivery)
1. Number of Software Requirements, not including External Interface Requirements (unless noted in associated Data Dictionary)
2. Number of External Interface Requirements (i.e., not under project control)
3. Amount of Requirements Volatility encountered during development (1=Very Low .. 5=Very High)
Code Size Measures for items 4 through 6. For each, indicate S for physical SLOC (carriage returns); Snc for noncomment SLOC only; LS for logical statements; or provide abbreviation _________ and explain in associated Data Dictionary.
4. Amount of New Code developed and delivered (Size in __________)
5. Amount of Modified Code developed and delivered (Size in __________)
6. Amount of Unmodified, Reused Code developed and delivered (Size in __________)
Comments on Part 3 responses:
Data Collection and Analysis
• Approach
– Be sensitive to the application domain
– Embrace the full life cycle and Incremental Commitment Model
• Be able to collect data by phase, project and/or build or
increment
• Items to collect
– SLOC reporting – logical, physical, NCSS, etc.
– Requirements Volatility and Reuse
• Modified or Adopted using DM, CM, IM; SU, UNFM as
appropriate
– Definitions for Application Types, Development Phase, Lifecycle
Model,…
– Effort reporting – phase and activity
– Quality measures – defects, MTBF, etc.
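One way to make the items above concrete is a per-build record schema. This dataclass is an assumption about fields for illustration, not the manual's actual layout:

```python
from dataclasses import dataclass, field

@dataclass
class SoftwareDataRecord:
    """Hypothetical per-build/per-increment record for the items above."""
    application_type: str             # from the standard domain taxonomy
    lifecycle_phase: str              # e.g., "design", "code", "test"
    build_id: str                     # build / increment / spiral identifier
    sloc_count: int
    sloc_type: str                    # "logical", "physical", "NCSS", ...
    dm: float = 0.0                   # % design modified (adapted code)
    cm: float = 0.0                   # % code modified
    im: float = 0.0                   # % integration redone
    su: float = 0.0                   # software understanding (10-50)
    unfm: float = 0.0                 # unfamiliarity (0.0-1.0)
    requirements_volatility: int = 1  # 1 = Very Low .. 5 = Very High
    effort_hours: dict = field(default_factory=dict)  # activity -> hours
    defects: dict = field(default_factory=dict)       # severity -> count
```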
Data Normalization Strategy
• Interview program offices and developers to obtain additional
information not captured in SRDRs…
– Modification Type – auto generated, re-hosted, translated,
modified
– Source – in-house, third party, Prior Build, Prior Spiral, etc.
– Degree-of-Modification – %DM, %CM, %IM; SU, UNFM
as appropriate
– Requirements Volatility -- % of ESLOC reworked or deleted
due to requirements volatility
– Method – Model Driven Architecture, Object-Oriented,
Traditional
– Cost Model Parameters – True S, SEER, COCOMO, SLIM
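A minimal sketch of how the interview-derived fields might fold into equivalent size, reusing the esloc function sketched earlier. The discount for auto-generated and translated code is an assumed placeholder, not a published factor:

```python
def normalized_esloc(sloc, modification_type, dm=0, cm=0, im=0,
                     su=0, unfm=0.0, aa=4):
    """Fold interview-derived fields into one equivalent-size figure.

    A sketch only: `esloc` is the reuse-model function sketched earlier,
    and the auto-generated/translated discount is an ASSUMPTION that
    should be calibrated from data, not taken as a standard.
    """
    if modification_type == "new":
        return sloc                    # new code counts at full weight
    if modification_type in ("auto generated", "translated"):
        return 0.3 * sloc              # assumed discount, not a standard
    # re-hosted or modified code goes through the reuse model
    return esloc(sloc, dm, cm, im, su, aa, unfm)
```

However the weights are chosen, the point of the normalization step is that every record reaching the knowledge base carries one comparable equivalent-size figure rather than incompatible raw counts.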