University of Southern California
Center for Systems and Software Engineering
AFCAA Database and Metrics Manual
Ray Madachy, Brad Clark, Barry Boehm, Thomas Tan
Wilson Rosa, Sponsor
USC CSSE Annual Research Review
March 8, 2011
Agenda
• Introduction
• SRDR Overview
– Review of Accepted Changes
– SRDR Baseline Issues
• Data Analysis
• Manual Reviewers Website
Project Overview
• Goal is to improve the quality and consistency of estimating
methods across cost agencies and program offices through
guidance, standardization, and knowledge sharing.
• Project led by the Air Force Cost Analysis Agency (AFCAA), working with service cost agencies and assisted by the University of Southern California and the Naval Postgraduate School
• Metrics Manual will present data analysis from existing final
SRDR data
• Additional information is crucial for improving data quality
– Data homogeneity is important
Software Cost Databases
• Purpose: to derive estimating relationships and benchmarks for
size, cost, productivity and quality
• Previous and current efforts to collect data from multiple projects and organizations:
– Data Analysis Center for Software (DACS)
– Software Engineering Information Repository (SEIR)
– International Software Benchmarking Standards Group (ISBSG)
– Large Aerospace Mergers (attempts to create company-wide databases)
– USAF Mosemann Initiative (Lloyd Mosemann, Asst. Sec. USAF)
– USC CSSE COCOMO II repository
– DoD Software Resources Data Report (SRDR)
• All have faced common challenges such as data definitions,
completeness and integrity
Research Objectives
• Using SRDR data, improve the quality and consistency of estimating methods across cost agencies and program offices through guidance, standardization, and knowledge sharing.
– Characterize different Application Domains and Operating Environments
within DoD
– Analyze collected data for simple Cost Estimating Relationships (CER) within
each domain
– Develop rules-of-thumb for missing data
• Make collected data useful to oversight and management entities

(Diagram: Data Records → Data Analysis → CERs of the form Cost = a * X^b)
Agenda
• Introduction
• SRDR Overview
– Review of Accepted Changes
– SRDR Baseline Issues
• Data Analysis
• Manual Reviewers Website
Software Resources Data Report
• The Software Resources Data Report (SRDR) is used to obtain both the estimated and actual characteristics of new software developments or upgrades. Both the Government program office and, later on after contract award, the software contractor submit this report. For contractors, this report constitutes a contract data deliverable that formalizes the reporting of software metric and resource data.
• All contractors developing or producing any software development element with a projected software effort greater than $20M (then-year dollars) on major contracts and subcontracts within ACAT I and ACAT IA programs, regardless of contract type, must submit SRDRs. The data collection and reporting applies to developments and upgrades whether performed under a commercial contract or internally by a government Central Design Activity (CDA) under the terms of a memorandum of understanding (MOU).
• Reports are mandated for Initial Government, Initial Developer, and Final Developer submissions.
Submittal Process
• Data accessed through the Defense Cost and Resource Center (DCARC), http://dcarc.pae.osd.mil.
• The DCARC's Defense Automated Cost Information Management System (DACIMS) is the database with current and historical cost and software resource data for independent, substantiated estimates.
(Diagram: SRDR submittal process involving the Government Program Office)
Agenda
• Introduction
• SRDR Overview
– Review of Accepted Changes
– SRDR Baseline Issues
• Data Analysis
• Manual Reviewers Website
Proposed SRDR Modifications
• Data analysis problems with project types, size and effort normalization, and multiple builds motivated some of these modifications:
Proposed SRDR Modifications
Proposed SRDR Modifications
…
Proposed SRDR Modifications
…
Modified Final Developer Form (1/3)
Modified Final Developer Form (2/3)
Modified Final Developer Form (3/3)
Product and Development Description
• Our recommendation for one operating type and one application
domain was not incorporated
Staffing
• Our expanded number of experience levels was adopted for
Personnel Experience.
Size (1/2)
• Accepted these recommendations:
– Requirements volatility changed from relative scale to percentage
– Reused code with modifications is our Modified code (report DM, CM, IM)
– Reused code without modifications is our Reused code (DM = CM = 0; report IM)
Size (2/2)
• Our recommendation for deleted code was accepted.
Resource and Schedule
• We recommended the breakout of QA, CM, and PM, which were previously reported under “Other”.
Quality
• We recommended more common defect metrics: Number of Defects Discovered and Number of Defects Removed.
• The Mean Time to Serious or Critical Defect (MTTD) and Computed Reliability metrics were dropped.
Agenda
• Introduction
• SRDR Overview
– Review of Accepted Changes
– SRDR Baseline Issues
• Data Analysis
• Manual Reviewers Website
SRDR Revisions
• Two-year update cycle
• We submitted our recommendations in Spring 2010
• Received draft of updated SRDR DID in Fall 2010 from
committee reflecting our changes and more
• Adoption schedule unclear
• DCARC website shows the 2007 version posted at http://dcarc.pae.osd.mil/Policy/CSDR/csdrReporting.aspx
• Issue: which version(s) to cover in the manual for collection and analysis
– E.g., we previously had guidance to mitigate shortcomings of the 2007 version
Agenda
• Introduction
• SRDR Overview
– Review of Accepted Changes
– SRDR Baseline Issues
• Data Analysis
• Manual Reviewers Website
SRDR Raw Data (520 observations)
PM = 1.67 * KSLOC^0.66
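As a rough illustration of how a CER of this form is applied, here is a minimal Python sketch using the coefficients fitted to the raw data above (the function name is ours):

```python
def cer_effort_pm(ksloc, a=1.67, b=0.66):
    """Effort in person-months from equivalent size, using PM = a * KSLOC^b."""
    return a * ksloc ** b

# Example: a 100 KSLOC component under the raw-data CER above
print(round(cer_effort_pm(100.0), 1))  # 1.67 * 100**0.66 ≈ 34.9 PM
```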
Data Conditioning
• Segregate data
• Normalize sizing data
• Map effort distribution
SRDR Data Segmentation
SRDR Data Fields
• Used
– Contractor (assigned OID)
– Component description (sanitized)
– Development process
– Percentage personnel experience
– Peak staffing
– Amount of relative requirements volatility
– Primary & secondary lang.
– Months of duration by activity
– Hours of effort by activity & comments
– Lines of code: new, modified, unmodified, auto-generated
• Missing
– Our Application and Environment classification
– Build information
– Effort in Person Months
– Adapted code parameters: DM, CM & IM
– Equivalent KSLOC
Derive Equivalent Size
• Normalize the SLOC counting method to Logical SLOC
– Physical SLOC count converted to Logical SLOC count by programming language
– Non-comment SLOC count converted to Logical SLOC count by programming language
• Convert Auto-Generated SLOC to Equivalent SLOC (ESLOC)
– Use AAF formula: (DM% * 0.4) + (CM% * 0.3) + (IM% * 0.3)
– DM = CM = 0; IM = 100
• Convert Reused SLOC to ESLOC with AAF formula
– DM = CM = 0; IM = 50
• Convert Modified SLOC to ESLOC
– Use AAF formula: (DM% * 0.4) + (CM% * 0.3) + (IM% * 0.3)
– Default values: Low – Mean – High based on 90% confidence interval
• Create Equivalent SLOC count and scale to thousands (K) to derive EKSLOC (see the sketch below)
– (New + Auto-Gen + Reused + Modified) / 1000 = EKSLOC
• Remove all records with an EKSLOC below 1.0
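A minimal sketch of this size-normalization step, assuming DM/CM/IM are given as percentages and sizes in SLOC (function and parameter names are ours, not from the SRDR DID):

```python
def aaf(dm_pct, cm_pct, im_pct):
    """Adaptation Adjustment Factor: (DM% * 0.4) + (CM% * 0.3) + (IM% * 0.3), as a fraction."""
    return (0.4 * dm_pct + 0.3 * cm_pct + 0.3 * im_pct) / 100.0

def eksloc(new, auto_gen, reused, modified, dm_pct, cm_pct, im_pct):
    """Equivalent size in KSLOC from the four reported SLOC categories."""
    esloc = (new
             + auto_gen * aaf(0, 0, 100)                 # auto-generated: DM = CM = 0, IM = 100
             + reused * aaf(0, 0, 50)                    # reused:         DM = CM = 0, IM = 50
             + modified * aaf(dm_pct, cm_pct, im_pct))   # modified: reported (or default) DM, CM, IM
    return esloc / 1000.0

# Example with illustrative sizes; records with EKSLOC < 1.0 are then removed
print(round(eksloc(40_000, 10_000, 20_000, 30_000, 25, 30, 50), 1))  # ≈ 56.2
```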
Convert Modified Size to ESLOC
• Use AAF formula: (DM% * 0.4) + (CM% * 0.3) + (IM% * 0.3)
• Problems with missing DM, CM & IM in SRDR data
• Program interviews provided parameters for some records
• For missing data, use the records that have all fields reported to derive recommended default values (as sketched below)
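One simple way to derive such recommended values is to average DM, CM, and IM over the records that report all three; a sketch under that assumption (the averaging choice is ours, and the field names are hypothetical):

```python
from statistics import mean

def default_adaptation_params(records):
    """Mean DM%, CM%, IM% over records that report all three adaptation parameters."""
    complete = [r for r in records if all(r.get(k) is not None for k in ("dm", "cm", "im"))]
    return {k: mean(r[k] for r in complete) for k in ("dm", "cm", "im")}

# The resulting means can stand in for DM/CM/IM on records that omit them.
```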
Map Effort Distribution
• Labor hours are reported for 7 activities:
– Software Requirements
– Software Architecture (including Detailed Design)
– Software Code (including Unit Testing)
– Software Integration and Test
– Software Qualification Test
– Software Developmental Test & Evaluation
– Other (Mgt, QA, CM, PI, etc.)
(Slide callout: currently don’t use Software Requirements and Developmental Test hours)
• Create effort distribution percentages for records that have hours in the requirements, architecture, code, integration, and qualification test activities (developmental test & evaluation and other activities may or may not be blank); a sketch follows below
• Using the activity distribution to backfill missing activities makes results worse
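A minimal sketch of computing those effort distribution percentages, assuming each record is a dict of reported hours by activity (field names are ours):

```python
CORE_ACTIVITIES = ("requirements", "architecture", "code", "integration", "qual_test")

def activity_distribution(record):
    """Percentage of a record's total reported hours spent in each activity."""
    total = sum(h for h in record.values() if h)
    return {act: 100.0 * hours / total for act, hours in record.items() if hours}

def average_distribution(records):
    """Average the per-record percentages over records reporting all five core activities."""
    usable = [r for r in records if all(r.get(a) for a in CORE_ACTIVITIES)]
    if not usable:
        return {}
    dists = [activity_distribution(r) for r in usable]
    activities = {a for d in dists for a in d}
    return {a: sum(d.get(a, 0.0) for d in dists) / len(dists) for a in activities}
```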
Team Experience
• SRDR Data Definition
– Report the percentage of project personnel in each category
– Highly Experienced in the domain (three or more years of
experience)
– Nominally Experienced in the project domain (one to three years of
experience)
– Entry-level Experienced (zero to one year of experience)
• Need to include Team Experience (TXP) in CERs to estimate cost
• After analyzing the data, the following quantitative multiplier values are assigned (see the sketch below):
– Highly experienced: 0.60
– Nominally experienced: 1.00
– Entry-level experienced: 1.30
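A minimal sketch of turning the reported experience percentages into a single TXP multiplier by weighting each category's value by its share of the team (the weighting scheme is our illustration, not necessarily the manual's exact formulation):

```python
# Multiplier values from the analysis above (nominal experience = 1.00)
TXP_VALUES = {"high": 0.60, "nominal": 1.00, "entry": 1.30}

def team_experience_multiplier(pct_high, pct_nominal, pct_entry):
    """Composite TXP multiplier from the percentage of personnel in each category."""
    total = pct_high + pct_nominal + pct_entry
    weighted = (pct_high * TXP_VALUES["high"]
                + pct_nominal * TXP_VALUES["nominal"]
                + pct_entry * TXP_VALUES["entry"])
    return weighted / total

# Example: 50% highly experienced, 40% nominal, 10% entry-level
print(round(team_experience_multiplier(50, 40, 10), 2))  # 0.83
```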
Analysis Steps
Derive EKSLOC
Size - Effort
Effort for small software configuration items (CIs) appears high; for larger CIs, it appears more normal.
Productivity Conundrum
• We can confirm the unusually high effort by comparing productivity to size.
• We see that productivity is lower for smaller sizes (5 to 40 EKSLOC) than for larger sizes.
• This is counter to what we believe to be true: productivity should be lower for larger software CIs (all other factors being held constant). A quick sketch of this check follows below.
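A quick sketch of that productivity check, assuming each conditioned record carries EKSLOC and person-months (the size band follows the 5 to 40 EKSLOC range mentioned above):

```python
def mean_productivity_by_size(records, band=(5.0, 40.0)):
    """Compare mean productivity (EKSLOC per PM) of small CIs against larger CIs."""
    lo, hi = band
    small = [size / pm for size, pm in records if lo <= size <= hi]
    large = [size / pm for size, pm in records if size > hi]
    return (sum(small) / len(small) if small else None,
            sum(large) / len(large) if large else None)

# records = [(eksloc, person_months), ...]; the small-CI mean comes out lower than expected.
```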
Large Project CER
• We can also see that modeling this domain above 50 EKSLOC produces a model showing that more effort is required for larger CIs. This is what we expect. (A sketch of fitting such a model follows below.)
• This model can be used in analysis to help quantify the amount of “extra” effort in projects below 50 EKSLOC.
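One common way to fit the a and b of such a power-law model is least squares on the log-transformed data from the records above 50 EKSLOC; a sketch (the slide does not state the regression method, so this fitting approach is our assumption):

```python
import math

def fit_power_cer(records):
    """Fit PM = a * EKSLOC^b by ordinary least squares on log(PM) vs. log(EKSLOC)."""
    xs = [math.log(size) for size, _ in records]
    ys = [math.log(pm) for _, pm in records]
    n = len(records)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

# Fit on CIs above 50 EKSLOC, then extrapolate downward to quantify the "extra"
# effort observed in the smaller records.
```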
Overhead Function
• If we use the model from the previous slide on the data below 50 EKSLOC, we can express the difference between “what we would expect” (the model) and “what we observe” (the data) as a function of size.
• Yet another model can be created to express this decreasing difference with increasing size.
• Call this model the “Overhead Function”.
Derive Effort
Communications Domain
After applying the overhead function to the observed effort (by subtracting the overhead-function effort from the observed effort), we get an overall Cost Estimating Relationship that seems reasonable. A sketch of this adjustment follows below.
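A minimal sketch of that adjustment, assuming a fitted large-project CER and a fitted overhead function h(size); the functional form and coefficients of h below are placeholders, not the manual's fitted values:

```python
def cer_pm(eksloc, a, b):
    """Power-law CER: effort (PM) = a * EKSLOC^b."""
    return a * eksloc ** b

def adjusted_effort(observed_pm, eksloc, overhead_fn):
    """Subtract the size-dependent overhead effort from the observed effort."""
    return observed_pm - overhead_fn(eksloc)

# Placeholder overhead function: the difference decreases with increasing size
overhead = lambda size: 80.0 / size   # illustrative only; the real h(size) is fitted to the data
print(adjusted_effort(60.0, 10.0, overhead))  # 60 PM observed at 10 EKSLOC -> 52.0 PM adjusted
```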
Command & Control Domain
Agenda
• Introduction
• SRDR Overview
– Review of Accepted Changes
– SRDR Baseline Issues
• Data Analysis
• Manual Reviewers Website
Manual Reviewers Website
• http://csse.usc.edu/afcaa/manual_draft/
• Review and Discussion
Questions?
For more information, contact:
Ray Madachy
rjmadach@nps.edu
Or
Brad Clark
bkclark@csse.usc.edu
Or
Wilson Rosa
Wilson.Rosa@pentagon.af.mil