DoD Software Resource Data Reports (SRDRs) and Cost Data Analysis Workshop Summary


Center for Systems and Software Engineering


Brad Clark

University of Southern California

Raymond Madachy

Naval Postgraduate School

27th International Forum on COCOMO® and Systems/Software Cost Modeling

October 18, 2012

Participants

• Dan Galorath, Galorath Inc.

• Dan Ligett, Softstar

• Arlene Minkiewicz, PRICE Systems

• Linda Esker, Fraunhofer USA

• Rick Selby, Northrop Grumman

• Kathryn Conner, RAND

• Qi Li, USC

• Tony Abolfotouh, Robbins Gioia

• Dave Zubrow, CMU-SEI

• Pete McLoone, Lockheed Martin


SRDR Issues

• What about asking contractors for the parameter ratings of the cost model they used? Deleting some other, less useful fields might make the addition more palatable.

• Can monthly metrics reports be stored with SRDR data?

• Visual programming effort is not represented in lines-of-code counts.

• Contracts could include an SRDR quality-acceptance clause so that reviews of the delivered SRDRs are not so superficial. Independent reviewers could be used.

• There are no definitions of requirements in the SRDR, so they should be added.


Research Directions (1/2)

• Missing activity effort data can be derived in other ways (e.g., regression or sampling approaches) rather than from central tendencies of a few data points, so the imputed values are not so artificially homogeneous; see the imputation sketch after this list.

• A box plot can visualize the activity percentages and show whether the mean is appropriate, since the data may be skewed rather than normally distributed; examine the median, min, max, and quartiles (see the box-plot sketch after this list).

• Still need to do an ANOVA of the productivity types, since they might overlap (see the ANOVA sketch after this list).
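A minimal sketch of the regression idea above, in Python with NumPy. The column names, effort figures, and the linear model form are illustrative assumptions, not SRDR fields or an agreed method.

```python
# Illustrative sketch: impute a missing activity's effort by regressing on
# total effort from the records that do report it, rather than filling with
# the overall mean.  All data here are synthetic stand-ins for SRDR records.
import numpy as np

rng = np.random.default_rng(0)
total_effort = rng.uniform(100, 5000, 40)             # person-hours, synthetic
design_effort = 0.2 * total_effort + rng.normal(0, 50, 40)
design_effort[::5] = np.nan                           # some reports omit the activity

known = ~np.isnan(design_effort)

# Central-tendency fill for comparison: one value for every missing record
mean_fill = np.nanmean(design_effort)

# Regression fill: design effort as a linear function of total effort
slope, intercept = np.polyfit(total_effort[known], design_effort[known], 1)
reg_fill = slope * total_effort[~known] + intercept

print(f"mean fill: {mean_fill:.0f} h for every record")
print("regression fill varies by record:", np.round(reg_fill, 0))
```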
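A companion sketch for the box-plot suggestion; the activity names and the skewed (lognormal) percentages are invented for illustration only.

```python
# Illustrative sketch: box plots of per-activity effort percentages, showing
# median, quartiles, and extremes so a skewed distribution is visible at a
# glance.  Activities and values are made up, not actual SRDR data.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
activities = {
    "Requirements": rng.lognormal(2.0, 0.4, 30),   # skewed on purpose
    "Design":       rng.lognormal(2.5, 0.3, 30),
    "Code & Test":  rng.lognormal(3.2, 0.3, 30),
}

fig, ax = plt.subplots()
ax.boxplot(list(activities.values()))
ax.set_xticklabels(list(activities.keys()))
ax.set_ylabel("Share of total effort (%)")
ax.set_title("Activity effort percentages: median, quartiles, extremes")
plt.show()
```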
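And a sketch of the ANOVA check on productivity types, using SciPy's one-way ANOVA; the three groups are synthetic stand-ins for whatever productivity types the analysis defines.

```python
# Illustrative sketch: one-way ANOVA across productivity types.  A large
# p-value would suggest the types' mean productivities overlap.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
real_time   = rng.normal(100, 30, 25)   # e.g., SLOC per person-month
mission_sys = rng.normal(120, 35, 25)
support_sw  = rng.normal(180, 40, 25)

f_stat, p_value = stats.f_oneway(real_time, mission_sys, support_sw)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A small p says at least one type's mean differs; pairwise follow-up tests
# would show which types actually overlap.
```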

Use complementary approaches:

• Visually assess the productivity distributions and look for natural breakdowns. The distributions may be multimodal, so means may not be the best way to quantify them.

• A top-down analysis has been done, but a bottom-up approach can complement it, e.g., grouping the data points by productivity to see how they align with the existing types. Segment the data into divisions and check how good the PRED is; the homogeneity of the productivity types can then be assessed.

– E.g., sort the rows, group them into deciles, and label them. With labels, the data points may form new groups (see the PRED-by-decile sketch below).
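A sketch of the decile idea, under assumptions the notes do not pin down: a power-law CER (effort = a × size^b) fitted per group, and PRED(30) as the accuracy measure. All data are synthetic.

```python
# Illustrative sketch: sort records by productivity, cut into labeled
# deciles, fit a simple CER per decile, and score each group with PRED(30).
import numpy as np

rng = np.random.default_rng(3)
size = rng.uniform(5, 500, 200)                          # KSLOC, synthetic
effort = 3.0 * size ** 1.1 * rng.lognormal(0, 0.3, 200)  # person-months
productivity = size * 1000 / effort                      # SLOC per person-month

order = np.argsort(productivity)
deciles = np.array_split(order, 10)                      # ten labeled groups

for label, idx in enumerate(deciles):
    # Fit log(effort) = log(a) + b*log(size) within the group
    b, log_a = np.polyfit(np.log(size[idx]), np.log(effort[idx]), 1)
    est = np.exp(log_a) * size[idx] ** b
    pred30 = np.mean(np.abs(est - effort[idx]) / effort[idx] <= 0.30)
    print(f"decile {label}: PRED(30) = {pred30:.2f}")
```

If the within-decile PRED is high while a pooled fit is poor, the labeled groups are behaving like distinct productivity types.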


Research Directions (2/2)

• Add a size category parameter. 2-D histograms of the 320 data points could be used to determine the categories (see the histogram sketch after this list).

• The CERs by productivity type are for a nominal schedule only. Actual schedule constraints can be assessed by comparing reported schedules to the COCOMO result as a proxy (see the schedule sketch after this list).

• Apply a personally identifiable information (PII) approach so that individual projects cannot be identified: combine them into aggregate groups that contain multiple reporting organizations (see the aggregation sketch after this list).
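A sketch of the 2-D histogram idea for choosing size categories; the axes (log size vs. log productivity) and the bin count are assumptions, and the 320 points are simulated here.

```python
# Illustrative sketch: a 2-D histogram of size vs. productivity; sparse rows
# or columns in the counts can suggest where to place size-category breaks.
import numpy as np

rng = np.random.default_rng(4)
size = rng.lognormal(3, 1, 320)            # KSLOC, synthetic 320-point set
prod = rng.lognormal(5, 0.5, 320)          # SLOC per person-month

counts, size_edges, prod_edges = np.histogram2d(
    np.log10(size), np.log10(prod), bins=8)
print(counts.astype(int))                  # gaps hint at natural categories
print("log10(size) bin edges:", np.round(size_edges, 2))
```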
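A sketch of the schedule-proxy comparison using COCOMO II's published nominal-schedule equation, TDEV = C × PM^F, with the COCOMO II.2000 constants; the scale-factor sum and the reported project figures are illustrative assumptions.

```python
# Illustrative sketch: flag possible schedule compression by comparing a
# project's reported duration to the COCOMO II nominal schedule.
def cocomo_nominal_schedule(pm, scale_factor_sum=18.97):
    """Nominal TDEV in months for pm person-months (COCOMO II.2000)."""
    B, C, D = 0.91, 3.67, 0.28            # published calibration constants
    E = B + 0.01 * scale_factor_sum       # effort scale exponent
    F = D + 0.2 * (E - B)                 # schedule scale exponent
    return C * pm ** F

reported_months, reported_pm = 18.0, 400.0   # illustrative SRDR-style values
nominal = cocomo_nominal_schedule(reported_pm)
ratio = reported_months / nominal
print(f"nominal {nominal:.1f} mo, reported {reported_months:.1f} mo, "
      f"ratio {ratio:.2f} (well below 1 suggests schedule compression)")
```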
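Finally, a sketch of the aggregation idea using pandas; the field names, the threshold of three organizations, and the k-anonymity-style release rule are assumptions for illustration, not a prescribed policy.

```python
# Illustrative sketch: release only aggregate statistics for groups that
# contain several reporting organizations, so no project is identifiable.
import pandas as pd

df = pd.DataFrame({
    "org":  ["A", "A", "B", "C", "D", "E", "E"],
    "type": ["RT", "RT", "RT", "IS", "IS", "IS", "IS"],
    "prod": [90, 110, 95, 160, 170, 150, 180],
})

grouped = df.groupby("type").agg(
    orgs=("org", "nunique"), median_prod=("prod", "median"))
releasable = grouped[grouped["orgs"] >= 3]   # publish multi-org groups only
print(releasable)
```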

