Integration of Software Cost Estimates Across COCOMO, SEER-SEM, and PRICE-S Models
Tom Harwick, Engineering Specialist
Northrop Grumman Corporation, Integrated Systems
October 26-29, 2004
19th International Forum on COCOMO and Software Cost Modeling

Outline
• Need for Inter-Model Estimate Integration
• Model Analysis
– Model Comparisons & Sensitivity Analysis
– Economic Properties
• Baseline Example
– COCOMO
– SEER-SEM
– PRICE-S
• Summary

Need for Inter-Model Estimate Integration
• Communication of the overall Software Cost Estimate to the Customer
• Minimize Program Cost Risk
• Integration of Software Estimates Across Vendors
– Various Software Cost Models
– Sizing Definitions
– Selection of Cost Driver Settings
– Vendors have different Statements of Work (SOW)

Model Analysis
• Approach
– Understand Properties of Each Cost Model
– Sizing and Sizing Normalization
– Productivity Cost Drivers
– Team & Process
– Market - Customer(s)
– Complexity, Operating Environment
– Model Scope Differences

Model Analysis - Cost Drivers by Cost Model
Team & Process
– Team: COCOMO II: ACAP, PCAP, APEX, PLEX, LTEX | SEER-SEM: Analyst Capabilities; Analyst Experience; Programmer Capabilities; Programmers Language Experience | PRICE-S: INTEGI, CPLX1
– Process: COCOMO II: TOOL, PCON, PVOL | SEER-SEM: Development Method KnowledgeBase; Practices and Methods Experience | PRICE-S: CPLX1
Market / Customer
– Market: COCOMO II: RUSE, SITE, SECU | SEER-SEM: Requirements Definition Formality; Development Method, Application, and Acquisition Method KnowledgeBases; Multiple Site Development | PRICE-S: CPLXM, CPLX1
– Schedule: COCOMO II: SCED | SEER-SEM: Required Schedule; Start Date | PRICE-S: DSTART, PEND
– Reliability / Certification Requirements: COCOMO II: RELY, DOCU | SEER-SEM: Development Standard KnowledgeBase | PRICE-S: Standard
Complexity
– Complexity: COCOMO II: CPLX, DATA, STOR | SEER-SEM: Application KnowledgeBase | PRICE-S: UTIL, APPL
– Operating Environment & Technology: COCOMO II: TIME | SEER-SEM: Platform and Development Method KnowledgeBases | PRICE-S: PLTFM
Sizing & Exponent
– Size: COCOMO II: Raw KSLOC | SEER-SEM: New Lines of Code; Pre-existing SLOC | PRICE-S: Raw SLOC
– Normalized size: COCOMO II: Equivalent new KSLOC | SEER-SEM: New Lines of Code | PRICE-S: NEWD, SLOC
– Exponent: COCOMO II: PREC, FLEX, RESL, TEAM, PMAT (or SEI rating)
Scope
– Hardware developed in parallel with software: COCOMO II: Not in the default model | PRICE-S: CPLX2
– System Integration: COCOMO II: Not in the default model

Model Analysis - Concept of "Ceteris Paribus"
• Methodology to extract cost driver information: vary one parameter (e.g., Analyst Experience) while all other parameters are held constant
• Cost Slope = delta Cost / delta (Analyst Experience)
• Range = ratio of Cost across the settings of the parameter (e.g., Analyst Experience)
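To make the ceteris-paribus idea concrete, the sketch below probes one driver at a time in a generic multiplicative cost model: sweep a single driver across its ratings with every other driver pinned at its baseline, then report the resulting cost curve and the worst-to-best cost ratio ("range"). The driver names, rating tables, and baseline constants are illustrative placeholders (loosely COCOMO-style), not calibrated settings from the presentation.

```python
# One-at-a-time ("ceteris paribus") sensitivity probe for a generic
# multiplicative cost model. Values below are illustrative only.

# Effort multipliers per rating for two hypothetical drivers.
DRIVER_TABLE = {
    "ACAP": {"VLow": 1.42, "Low": 1.19, "Nom": 1.00, "High": 0.85, "VHigh": 0.71},
    "TOOL": {"VLow": 1.17, "Low": 1.09, "Nom": 1.00, "High": 0.90, "VHigh": 0.78},
}

BASELINE = {"ACAP": "High", "TOOL": "Nom"}  # assumed baseline settings

def effort(settings, ksloc=100.0, a=2.94, e=1.10):
    """Effort (person-months) = A * Size^E * product of driver multipliers."""
    eaf = 1.0
    for driver, rating in settings.items():
        eaf *= DRIVER_TABLE[driver][rating]
    return a * ksloc ** e * eaf

def sensitivity(driver):
    """Sweep one driver across its ratings; hold all other drivers at baseline."""
    results = {}
    for rating in DRIVER_TABLE[driver]:
        settings = dict(BASELINE)
        settings[driver] = rating
        results[rating] = effort(settings)
    costs = list(results.values())
    local_range = max(costs) / min(costs)   # worst-to-best cost ratio
    return results, local_range

if __name__ == "__main__":
    curve, local_range = sensitivity("ACAP")
    print(curve)
    print(f"ACAP local range (max/min cost ratio): {local_range:.2f}")
```

The parenthesized numbers quoted in the driver tables on the following slides can be read as exactly this kind of local range, computed about each model's baseline.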
Model Analysis - Major COCOMO Cost Drivers
[Bar chart: COCOMO II local impact about the Baseline, +/- 1 setting, plotted against environmental factors on a 0 to 1.60 scale.]
Baseline settings: ACAP = High, PCAP = Nom, APEX = High, PLEX = Nom, LTEX = Nom, PVOL = Nom, PCON = Nom, TOOL = Nom, RUSE = Nom, SITE = Low, SECU = High, SCED = Nom, RELY = VHigh, DOCU = Nom, CPLX = XHigh, DATA = High, TIME = XHigh, STOR = Nom

Model Analysis - Major SEER-SEM Cost Drivers
[Bar chart: SEER-SEM 7.0 local impact, in percent (0% to 80%).]
Baseline settings: Security Requirements = "Nom+", Analyst Application Experience = "Nom", Analyst Capabilities = "Nom+", Programmer Capabilities = "Nom", Requirements Volatility = "Hi", Modern Development Practices Use = "Hi-", Automated Tool Use = "Nom+", Process Volatility = "Nom+", Multiple Site Development = "VHi", Programmers Language Experience = "Nom", Language Type = "Nom", Specification Level - Reliability = "Hi"

Model Analysis - Major PRICE-S Cost Drivers
[Bar chart: PRICE-S local range of productivity factors, plotted on a 0 to 2.00 scale.]
Settings: INTEGI-Team = 0.70, INTEGE-Team = 0.70, INTEGI-Timing = 0.7, INTEGE-Timing = 0.7, CPLX1-Personnel = 0.2, CPLX1-Tools = -0.1, CPLX1-Reqmts. + new Language = 0, CPLX1 - Product familiarity, UTIL = 0.5, APPL = 8.46, PROFAC = 6.5, CPLXM = 1.2, PLTFM = 1.8

Model Analysis - Team & Process
(Local range shown in parentheses)
Team
– COCOMO II: ACAP (1.41), PCAP (1.31), APEX (1.23), PLEX (1.20), LTEX (1.20)
– SEER-SEM: Analyst Capabilities (1.20), Analyst Experience (1.21), Programmer Capabilities (1.18), Programmers Language Experience (1.05)
– PRICE-S: INTEGI (crew) (1.58), PROFAC (1.35), CPLX1 - Product familiarity (1.25)
Process & Tools
– COCOMO II: PVOL (1.99), PCON (1.24), TOOL (1.21)
– SEER-SEM: Development Method KnowledgeBase; Practices and Methods Experience; Process & Reqmts. Volatility (1.32)
– PRICE-S: CPLX1 (1.10), CPLX1 - Reqmts. Volatility (1.13), CPLX1 - Tools (1.23)

Model Analysis - Market
Market
– COCOMO II: RUSE (1.31), SITE (1.22), SECU (1.10)
– SEER-SEM: Security Requirements (1.75); Requirements Definition Formality (part of the Platform and Application KnowledgeBases)
– PRICE-S: CPLXM (1.2)
Schedule
– COCOMO II: SCED (1.14)
– SEER-SEM: Required Schedule; Start Date
– PRICE-S: DSTART, PEND
Reliability / Certification requirements
– COCOMO II: RELY (1.15), DOCU (1.22)
– SEER-SEM: Development Standard KnowledgeBase
– PRICE-S: (part of PLTFM)

Model Analysis - Complexity
Complexity
– COCOMO II: CPLX (1.30), DATA (1.28), STOR (1.05)
– SEER-SEM: Application KnowledgeBase
– PRICE-S: APPL, UTIL, INTEGI (Timing, Coupling) (1.55)
Operating Environment & Technology
– COCOMO II: TIME (1.26)
– SEER-SEM: Platform and Development Method KnowledgeBases
– PRICE-S: PLTFM (1.25)
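In COCOMO II the drivers tabulated above act as multiplicative effort multipliers on a size term whose exponent is set by the scale factors, which is why each driver's local range translates directly into a cost ratio. A minimal sketch of that combination follows; the A, B constants and exponent formula follow the published COCOMO II.2000 calibration, while the particular scale-factor and multiplier values are illustrative choices of mine, not the presentation's settings.

```python
# Minimal sketch of COCOMO II's multiplicative driver structure.
# PM = A * Size^E * product(EM_i), with E = B + 0.01 * sum(SF_j).

A, B = 2.94, 0.91  # COCOMO II.2000 calibration constants

def cocomo_effort(ksloc, scale_factors, effort_multipliers):
    """Return estimated effort in person-months."""
    e = B + 0.01 * sum(scale_factors)
    eaf = 1.0
    for em in effort_multipliers:
        eaf *= em          # each cost driver scales effort multiplicatively
    return A * ksloc ** e * eaf

# Example: 825 KSLOC with mid-range scale factors and a handful of
# effort multipliers standing in for the EAF (all values illustrative).
sf = [3.72, 3.04, 4.24, 3.29, 4.68]   # PREC, FLEX, RESL, TEAM, PMAT
em = [1.10, 1.17, 1.11, 1.14, 1.00]   # e.g. RELY, CPLX, TIME, DATA, TOOL
print(f"Effort: {cocomo_effort(825, sf, em):,.0f} person-months")
```

Because the multipliers compound, a handful of unfavorable settings on the high-range drivers (e.g., CPLX, ACAP, or their SEER-SEM and PRICE-S counterparts) dominates the estimate, which is what the local-range comparison above is meant to expose.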
Model Analysis - Scope
Scope differences between models (local range shown in parentheses)
Hardware developed in parallel with software
– COCOMO II: Not included
– SEER-SEM: Hardware Integration Level (1.09)
– PRICE-S: CPLX2 (1.20)
System Integration
– COCOMO II: Not included
– SEER-SEM: Programs Currently Integrating (1.04)
– PRICE-S: INTEGE (1.13)
We did not find many data points in the COCOMO I database suggesting System Integration on the scale of our (next) Baseline example (3-4 million SLOC). Has COCOMO II been able to collect sufficient data to develop a (large-scale) System Integration factor?

Model Analysis - Economic Properties
(Model exponents indicate diseconomies of scale for large projects)
[Chart: effort in person-months versus size (100-600 KSLOC) for COCOMO, SEER-SEM, and PRICE-S. Fitted curves:]
– COCOMO: Effort = 19.4 x KSLOC^1.09
– SEER-SEM: Effort = 15.1 x KSLOC^1.12
– PRICE-S: Effort = 9.7 x KSLOC^1.11
The COCOMO II exponent ranges from 0.91 to approximately 1.26, depending upon the Scale Factors.

Model Analysis - Economic Properties
[Chart: Schedule Variation (SLOC = 50,000) - effort versus schedule in months (20-80) for COCOMO, SEER-SEM, and PRICE-S.]
All software cost models show a schedule compression penalty.

Baseline Example
• Illustrate Software Cost Estimation & Integration
– Baseline
– SEER-SEM - Sensor Software
– PRICE-S - Air Vehicle Software
– COCOMO - Control Station
– Sizing Assumptions and Scenarios - Equivalent New SLOC
– Productivity Range - Hours/SLOC
– Estimating Ranges using Monte Carlo (a sketch follows the productivity ranges below)
– Hours = (SLOC) x (Hours/SLOC)

Baseline Example - Sizing Scenarios (Equivalent New SLOC)
Air Vehicle (original SLOC point estimate: 2,000,000)
– SLOC-Low: 1,615,000 (code growth of -5%, 15% reuse)
– SLOC-Most Likely: 2,070,000 (code growth of 15%, 10% reuse)
– SLOC-High: 2,280,000 (code growth of 20%, 5% reuse)
Sensor (original SLOC point estimate: 500,000)
– SLOC-Low: 380,000 (code growth of -5%, 20% reuse)
– SLOC-Most Likely: 517,500 (code growth of 15%, 10% reuse)
– SLOC-High: 650,000 (code growth of 30%, no reuse)
Ground Control Station (original SLOC point estimate: 1,000,000)
– SLOC-Low: 475,000 (code growth of -5%, 50% reuse)
– SLOC-Most Likely: 825,000 (code growth of 10%, 25% reuse)
– SLOC-High: 1,020,000 (code growth of 20%, 15% reuse)

Baseline Example - Productivity Ranges (Hours/SLOC)
Model / Component                  High Productivity   Medium Productivity   Low Productivity
PRICE-S - Air Vehicle                    3.66                 4.48                 5.74
SEER-SEM - Sensor                        4.02                 4.91                 6.29
COCOMO - Ground Control Station          3.32                 4.06                 5.20
COCOMO Assumptions: the COCOMO estimate is normalized to include Security, HW/SW integration, and System Integration; this adds approximately 50% to the raw COCOMO II estimate. The EAF = 3.32 for the Ground Control Station. The Equivalent New SLOC = 825,000. The assumed module level was 50,000 SLOC. The internal integration = 1.05; the external integration = 1.11 for the "Medium Productivity" (COCOMO) setting.
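The integration step itself is a Monte Carlo roll-up: draw equivalent new SLOC and Hours/SLOC for each component from its low / most-likely / high range, multiply to get component hours, and sum across the three components to build the total-effort probability curve. A minimal sketch follows, using the values quoted on the sizing and productivity slides; the choice of triangular distributions and of independent draws per component are my assumptions, since the presentation does not state the distribution family.

```python
# Sketch of the Monte Carlo roll-up: Hours = SLOC x (Hours/SLOC), summed
# over the three components. Assumptions mine: triangular distributions
# and independence between size and productivity draws.
import numpy as np

rng = np.random.default_rng(seed=1)
N = 100_000

components = {
    #                  equivalent new SLOC (lo, ml, hi)      Hours/SLOC (lo, ml, hi)
    "Air Vehicle":     ((1_615_000, 2_070_000, 2_280_000), (3.66, 4.48, 5.74)),
    "Sensor":          ((380_000,   517_500,   650_000),   (4.02, 4.91, 6.29)),
    "Control Station": ((475_000,   825_000, 1_020_000),   (3.32, 4.06, 5.20)),
}

total_hours = np.zeros(N)
for name, ((s_lo, s_ml, s_hi), (p_lo, p_ml, p_hi)) in components.items():
    sloc = rng.triangular(s_lo, s_ml, s_hi, N)
    hours_per_sloc = rng.triangular(p_lo, p_ml, p_hi, N)
    total_hours += sloc * hours_per_sloc

for pct in (10, 50, 90):
    print(f"P{pct}: {np.percentile(total_hours, pct) / 1e6:.1f} million hours")
```

Under these assumptions the 50th-percentile total lands close to the 15.1 million hours reported on the results slide that follows.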
Baseline Example - Results
[Chart: cumulative probability (0-100%) of total software effort versus hours (0-25 million).]
The 50% probability occurs at 15.1 million hours.

Baseline Example - Air Vehicle, Hours = (Hours/SLOC) x SLOC
Probability (%)   SLOC (thousands)   Hours/SLOC   Hours (millions)
0                     1,625             3.70            6.14
10                    1,794             4.07            7.83
20                    1,864             4.24            8.28
30                    1,914             4.37            8.59
40                    1,960             4.48            8.87
50                    2,002             4.61            9.16
60                    2,038             4.73            9.44
70                    2,071             4.87            9.76
80                    2,110             5.03           10.13
90                    2,164             5.22           10.64
100                   2,275             5.73           12.48

Baseline Example - Sensor, Hours = (Hours/SLOC) x SLOC
Probability (%)   SLOC (thousands)   Hours/SLOC   Hours (millions)
0                       382             4.03            1.66
10                      441             4.48            2.15
20                      467             4.66            2.30
30                      486             4.81            2.42
40                      502             4.93            2.52
50                      518             5.05            2.61
60                      531             5.18            2.71
70                      548             5.33            2.82
80                      566             5.52            2.94
90                      590             5.74            3.13
100                     647             6.28            3.99

Baseline Example - Control Station, Hours = (Hours/SLOC) x SLOC
Probability (%)   SLOC (thousands)   Hours/SLOC   Hours (millions)
0                     478.8             3.33            1.73
10                    608.7             3.69            2.50
20                    669.1             3.86            2.75
30                    715.4             3.97            2.95
40                    753.4             4.07            3.10
50                    786.1             4.17            3.25
60                    815.8             4.28            3.40
70                    845.5             4.40            3.57
80                    878.1             4.55            3.75
90                    923.3             4.73            3.98
100                  1017.1             5.18            5.06

Summary
• Identified major "local domain" cost drivers
– COCOMO II, SEER-SEM, PRICE-S
• Examined economic properties
– Diseconomies of scale, schedule compression
– Productivity ranges
– Scenario analysis using the top 2-3 cost drivers
• Illustrated software cost estimation & integration
– Model integration using Monte Carlo probability curves
– Scenarios for sizing
– Model output for productivity

Object Oriented Analysis (OOA)
• OOA - Unified Modeling Language (UML)
– Several standard diagrams
– Starts with the "Use Case" (the user's view of the world)
– Develops charts including:
– Use Case (actors' use of information in the system)
– Collaboration Diagrams (provide details of each Use Case)
– Class Diagram - specify blueprints (for objects)
– Attributes and Methods
– Relationships between Classes
– Objects are linked across a design
– Object Interfaces - encapsulate local variables within a "Black Box"
– Inheritance of Classes

Snapshot of Unified Modeling Language (UML) - From Use Case, Flow of Information, to Class Diagrams of Systems
[Diagrams: a Medical System UML Use Case diagram (actors Patient, Doctor, Nurse, Pharmacist, Insurance; use cases such as Login, Make Appt, Cancel Appt, View Patient History, Approve/Request Prescription, Request/Receive Co-pay); a Collaboration Diagram for "Make Appointment" showing the numbered message flow from the patient through a Controller, Insurance object, Scheduler, and Appointment object (request appointment, request and send the insurance co-pay amount, request and display schedule availability, book and confirm the appointment); and Class Diagrams with a Person base class (Name, Age, Address, ID; View_Med_Hist(), Cancel_Apt()) specialized by Patient, Doctor, and Nursing Staff subclasses, where an asterisk marks methods overridden in the subclasses.]
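The class diagram summarized above maps directly onto code: a Person base class holds the shared attributes and default behavior, and the Patient and Doctor subclasses add their own operations and override the asterisked methods. A minimal sketch follows; the class and method names track the slide's diagram, while the method bodies are illustrative stubs of my own.

```python
# Sketch of the Medical System class diagram described above.
# Names follow the slide; method bodies are illustrative stubs only.
from dataclasses import dataclass


@dataclass
class Person:
    name: str
    age: int
    address: str
    id: str

    def view_med_hist(self):
        return f"Medical history for {self.name}"

    def cancel_apt(self, appt_id):
        # Overridden in subclasses (the diagram's asterisked method).
        return f"{self.name} cancelled appointment {appt_id}"


@dataclass
class Patient(Person):
    primary_doctor_name: str = ""

    def make_apt(self, when):
        return f"Appointment requested for {when}"

    def chng_doc(self, new_doctor):
        self.primary_doctor_name = new_doctor

    def request_copay(self):
        return "Co-pay amount requested from insurance"


@dataclass
class Doctor(Person):
    dept: str = ""
    location: str = ""

    def write_prescript(self, patient, drug):
        return f"Prescription for {drug} written for {patient.name}"

    def cancel_apt(self, appt_id):
        # Override: a doctor-initiated cancellation also notifies the patient.
        return f"Dr. {self.name} cancelled appointment {appt_id}; patient notified"
```

Calling cancel_apt() on a Doctor instance dispatches to the override rather than the base-class behavior, mirroring the asterisked methods in the diagram and the encapsulation and inheritance points listed on the OOA slide.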
Object Oriented Analysis (OOA) Benefits
• Object Oriented Analysis with the Unified Modeling Language
– Overall project cost savings - similar to the impact of System Engineering:
– Mitigates risk, cost, and schedule impacts
– Decreases defects
– May increase design work, but avoids excessive re-work
– Facilitates Integration and Test
– Better-designed interfaces
– Promotes System Integration - if compatible UML is used on all vendor designs
– Helps the customer understand the purchased product sooner, decreasing requirements volatility
– Need data to measure the impact of OOA (UML)

References
Models:
• Price Systems, PRICE-S model, Los Angeles, California, 1996.
• Galorath Associates, SEER-SEM, Los Angeles, California, 1996.
• USC Associates, COCOMO II.1999, Los Angeles, California, 2004.
Books:
• 1. Boehm, Reifer, et al., Software Cost Estimation with COCOMO II, Prentice-Hall, New Jersey, 2000.
• 2. Boehm, Barry W., Software Engineering Economics, Prentice-Hall, New Jersey, 1981.
• 3. Deming, W. Edwards, Out of the Crisis, Massachusetts Institute of Technology, Cambridge, 1982.
• 4. Cs3 Inc., Object Oriented Analysis Using the Unified Modeling Language (UML), Los Angeles, CA, 2000-2003.