Process Synchronization and Stabilization
February 2007
Rick Selby
Head of Software Products, Northrop Grumman Space Technology, Rick.Selby@NGC.com, 310-813-5570
Adjunct Professor of Computer Science, University of Southern California, Rick.Selby@USC.edu
© Copyright 2007. Richard W. Selby and Northrop Grumman Corporation. All rights reserved.

"Process Owners" Define, Monitor, and Improve Products & Processes Using Metric-Driven Analyses
What do managers control at different levels of an organization?
• Executives define, monitor, and improve: vision, values, high-level policies, financials, etc.
• "Process owners" define, monitor, and improve: processes, using metric-driven analyses to improve products and services
• Project managers (or "product owners") define, monitor, and improve: products and services

Six Sigma Projects Decrease Process or Product Defects by Reducing Variances and Shifting Means
• Six Sigma projects typically focus on decreasing process or product nonconformances ("defects") by reducing variances and shifting means of process performance or product quality metrics
• Projects decrease the variability of process performance or product quality metrics to improve predictability (such as smaller gaps between plans and actuals), efficiency (such as shorter cycletimes), and effectiveness (such as fewer defects)
• Six Sigma gets its name from its stated goal of achieving no worse than 3.4 defects per million opportunities, which is six process sigma (roughly analogous to standard deviations) from the mean (99.9997% accuracy)
• Projects commonly use the DMAIC approach: Define, Measure, Analyze, Improve, and Control

Control Charts Identify Behavior Outside of Expected Control Limit Boundaries
Control chart: defect density for Process X on Project XYZ, plotted against the process average, upper control limit, and lower control limit; points beyond the limits mark exceptional process variation. Actual performance data from Project XYZ.

Control Limits Define "Voice of the Process" and Specification Limits Define "Voice of the Customer"
• Control limits represent the "voice of the process": the upper control limit (UCL) and lower control limit (LCL) are derived statistically from process or product data and help determine whether the process is stable (see the computation sketch below)
• Specification limits represent the "voice of the customer": the upper specification limit (USL) and lower specification limit (LSL) represent goals, requirements, or targets and help determine whether the process is capable

Data-Driven Statistical Analyses Identify Trends, Outliers, and Process Improvements for Defects
• Six Sigma project introduced new peer review process
• Provided training on process
• New web-based peer review tool
• Provided training on tool
These defects are action items resulting from peer reviews of software code and unit testing plans and results.
Control chart of metric data from example Six Sigma projects focusing on fault (or defect) density in peer reviews of software components. Data from 10 systems.
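The slides above describe control limits as derived statistically from process or product data but do not show the derivation used on these projects. As a minimal, hedged sketch, the Python fragment below computes individuals (I) chart limits from a series of defect-density observations using the standard average-moving-range estimate; the function name and sample values are hypothetical and are not the project data cited above.

```python
# Minimal sketch: individuals (I) chart control limits from process data.
# The defect-density observations below are hypothetical placeholders.

def i_chart_limits(observations):
    """Return (LCL, mean, UCL) for an individuals control chart.

    Uses the conventional estimate sigma ~= (average moving range) / 1.128,
    so UCL/LCL = mean +/- 2.66 * average moving range.
    """
    mean = sum(observations) / len(observations)
    moving_ranges = [abs(b - a) for a, b in zip(observations, observations[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar

if __name__ == "__main__":
    defect_density = [0.8, 1.1, 0.9, 1.4, 1.0, 0.7, 1.2, 0.9, 1.3, 1.0]  # defects per unit size (hypothetical)
    lcl, mean, ucl = i_chart_limits(defect_density)
    print(f"LCL={lcl:.2f}  mean={mean:.2f}  UCL={ucl:.2f}")
    print("points outside control limits:",
          [x for x in defect_density if x > ucl or x < lcl])
```

The 2.66 factor is the conventional Shewhart constant for individuals charts (3 / 1.128). Comparing the same observations against USL and LSL would be the separate capability check described above; control limits alone only address stability.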
Data-Driven Statistical Analyses Identify Trends, Outliers, and Process Improvements for Cycletimes
• Web-based tracking tool deployed
• Action tracking process started
I chart for action item closure performance (days past due, by action item open date): the chart plots the delta between each action item's due date and its closure date; positive values indicate actions closed after their due date, and negative values indicate actions closed before their due date. All actions are from SW requirements reviews. For actions opened after April 1, 2004: UCL = 1.631, mean = -19.71, LCL = -41.05. A specification limit defines the goal for action item closure cycletime. A two-sample t-test confirms that data after April 1, 2004 are statistically different from data preceding this date (significant at the α = 0.05 level); see the sketch below.
Control chart of metric data from example Six Sigma projects focusing on action item closure cycletime in peer reviews of software components. Data from one system.

Data-Driven Statistical Analyses Identify Trends, Outliers, and Process Improvements for Cycletimes
• Series of process improvements instituted and new control limits calculated
• Statistically significant improvement in process performance
Control chart of metric data from example Six Sigma projects focusing on change request closure cycletime for software components. Data from one system.
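The before/after comparison cited above (cycletime data following April 1, 2004 versus earlier data) is a standard two-sample t-test. Below is a minimal sketch of that kind of comparison using SciPy; the days-past-due samples are hypothetical, not the project data, and the slide does not state which t-test variant was applied.

```python
# Minimal sketch: compare action item closure performance before and after
# a process change using a two-sample t-test. Data values are hypothetical.
from scipy import stats

# Days past due (negative = closed before the due date), hypothetical samples
before_change = [45, 30, 62, 18, 75, 40, 55, 28, 90, 33]
after_change = [-25, -10, -30, 5, -18, -22, -35, -8, -15, -28]

# Welch's t-test (does not assume equal variances in the two periods)
t_stat, p_value = stats.ttest_ind(before_change, after_change, equal_var=False)

alpha = 0.05
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < alpha:
    print("Difference is statistically significant at the 0.05 level.")
else:
    print("No statistically significant difference detected.")
```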
Synchronize-and-Stabilize Timeline and Milestones: 12-36 Months from "Milestone 0" to Manufacturing
Timeline figure (phases, timeline, milestones, major reviews, and documents and intermediate activities):
• Planning phase (3-12 months): begins at Milestone 0; produces the vision statement, specification document, prototypes, design feasibility studies, testing strategy, schedule, and implementation plan; major reviews include the specification review and project plan approval; ends when the schedule is complete
• Development phase (6-16 months): three sequential development subprojects (Subprojects I, II, and III), each covering roughly 1/3 of all features and lasting 2-4 months; each subproject consists of code and optimizations, testing and debugging, and feature stabilization (6-10 weeks), integration with testing and debugging (2-5 weeks), and buffer time (2-5 weeks); milestones include the Milestone I, II, and III releases, feature complete, code complete, and visual freeze
• Stabilization phase (3-8 months): optimizations, testing and debugging, internal testing, beta testing, and buffer time, leading to the zero bug release and release to manufacturing (ship date); concludes with the project review and postmortem document

Incremental Software Builds Deliver Early Capabilities and Accelerate Integration and Test
We provide incremental software deliveries that support integration and test activities and synchronize with JPL, Hamilton Sundstrand, and Naval Reactors to facilitate teaming, reduce risk, and enhance mission assurance.
Figure 4.3-4. JIMO Incremental Software Builds. The build schedule spans CY 2004-2013 (program phases A through D), from ATP (11/04) and PMSR (1/05) through SM PDR (6/08), SM CDR (8/10), bus I&T (8/12), and SM AI&T (8/13).
• Flight Computer Unit (FCU) builds FCU1-FCU7 incrementally deliver the preliminary and final executive and C&DH software, science computer interface, power controller interface, reactor AACS (including autonomous navigation), thermal and power control, and configuration and fault protection
• Science Computer Unit (SCU) builds SCU1 and SCU2 deliver the preliminary and final executive and C&DH software (Science Computer builds cover common software only; no instrument software included)
• Data Server Unit (DSU) builds DSU1-DSU3 deliver the preliminary and final executive and C&DH software and the data server unique software
• Ground Analysis Software (GAS) computer builds GAS1 and GAS2 deliver the preliminary and final ground analysis software
• Builds are delivered for power controller and reactor integration (NR); AACS validation on the SMTB, TCS/EPS and fault protection software validation on the SSTB, and HCR integration on the SMTB (NGC); preliminary and final hardware/software integration (JPL/NGC, JPL, and NGC); mission module integration (JPL); and preliminary and final integration into the ground system (JPL)
Legend: each build is annotated with its design agent or performer (JPL, NGC, or shared by JPL and NGC), a prototype marker (P), and activity phases 1 Requirements, 2 Preliminary Design, 3 Detailed Design, 4 Code and Unit Test/Software Integration, and 5 Verification and Validation.

Analyses of Software Defect Injection and Detection Phases Reveal Distributions and Gaps
Chart: cumulative percentage of defects injected and defects detected, plotted by system development phase (from proposal and SW requirements through preliminary design, detailed design, code, unit test, integration, SW test, verification, support to I&T, and maintenance/operations).
Cumulative distribution of software defect injection and detection phases based on peer reviews across 12 system development phases. 3418 defects, 731 peer reviews, 14 systems, 2.67 years.

Analyses of Software Requirements Show Leading Indicators for Implementation Scope
Chart: ratio of implementation size to requirements size (source lines of code per requirement) for each of 14 systems, with reference lines for the average and the average plus two standard deviations, both excluding system #14. Data from 14 NASA systems.
• The ratio of implementation size to software requirements has an 81:1 average and a 35:1 median; excluding system #14, the ratio has a 46:1 average and a 33:1 median (see the sketch below)
• The ratio of software requirements to system requirements has a 6:1 average
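The leading-indicator ratios above pair a skewed average with a more robust median. The sketch below shows how both statistics and the "average + 2 std. (ex. #14)" reference line could be computed, and why excluding an outlier system moves the average far more than the median. The per-system values are hypothetical placeholders, not the 14 NASA systems' data.

```python
# Minimal sketch: leading-indicator ratios of implementation size to
# requirements size (SLOC per software requirement). Values are hypothetical.
from statistics import mean, median, stdev

sloc_per_requirement = {
    1: 28, 2: 35, 3: 41, 4: 30, 5: 52, 6: 38, 7: 25,
    8: 44, 9: 33, 10: 47, 11: 36, 12: 29, 13: 40, 14: 510,  # system 14 is an outlier
}

ratios = list(sloc_per_requirement.values())
print(f"all systems:         mean={mean(ratios):.0f}:1  median={median(ratios):.0f}:1")

excluding_14 = [v for k, v in sloc_per_requirement.items() if k != 14]
avg, sd = mean(excluding_14), stdev(excluding_14)
print(f"excluding system 14: mean={avg:.0f}:1  median={median(excluding_14):.0f}:1")
print(f"reference line (ex. #14): average + 2 std = {avg + 2 * sd:.0f}:1")
```

Because the median changes little when the outlier is removed while the mean changes sharply, reporting both, as the slide does, gives a more honest picture of typical implementation scope per requirement.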
Five-Phase Process Defines Structured Approach for Six Sigma Improvement Projects
Phases, exit criteria, and durations:
• Define (5 weeks): project scope defined, business case established, and stakeholders engaged
• Measure (3 weeks): metrics defined, data collected, and data validated
• Analyze (3 weeks): processes and products analyzed, root cause analyses completed, and sources of variation understood
• Improve (5 weeks): potential solutions identified, recommended solution piloted, and improvements documented using data
• Control (ongoing): ongoing monitoring using statistical methods such as control charts, supported by special cause and common cause analysis of violations of control limits and specification limits (see the monitoring sketch at the end of this section)
• Total: 16 weeks plus the Control phase
The Six Sigma implementation approach can span 16 weeks and includes tollgate reviews for the define, measure, analyze, improve, and control phases. Tollgate reviews provide checkpoints for progress, evaluation, and feedback.

Synergistic Strategies Help Enable Large-Scale Software System Development and Management
• Analysis: infrastructure and techniques for system modeling, analysis, and simulation; system modeling, evaluation, tradeoff, and prediction using simulations and empirical studies
• Synthesis: flexible lifecycle process models, extensible system architectures, and proactive development guidance mechanisms
• Connecting elements (from the strategy diagram): modeling infrastructure & techniques, analysis capabilities, evaluations & feedback, processes & architectures, requirements & opportunities, and models, relationships & feedback
Research focus: large-scale, mission-critical embedded software systems
Research themes: early lifecycle, system perspective, frequent design cycles, multi-artifact integration, scalable modelware
© Copyright 2007. Richard W. Selby and Northrop Grumman Corporation. All rights reserved.
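As a closing illustration of the Control phase monitoring referenced in the five-phase table above, the sketch below flags observations that violate control limits (candidate special causes) or specification limits. The limits and observations are illustrative placeholders; the UCL/LCL values merely echo the earlier action item chart and do not reproduce the projects' actual monitoring rules.

```python
# Minimal sketch: Control-phase monitoring that flags observations outside
# control limits (special-cause candidates) or specification limits.
# Limits and data are hypothetical placeholders.

def check_point(value, ucl, lcl, usl, lsl):
    """Return the rule violations for a single observation."""
    flags = []
    if value > ucl or value < lcl:
        flags.append("outside control limits -> investigate special cause")
    if value > usl or value < lsl:
        flags.append("outside specification limits -> customer goal missed")
    return flags

if __name__ == "__main__":
    UCL, LCL = 1.63, -41.05   # voice of the process (echoing the action item I chart)
    USL, LSL = 10.0, -60.0    # voice of the customer (hypothetical goals)
    observations = [-20.0, -45.0, 3.5, -10.0, 12.0]  # days past due, hypothetical
    for i, x in enumerate(observations, start=1):
        for flag in check_point(x, UCL, LCL, USL, LSL):
            print(f"observation {i} ({x} days past due): {flag}")
```

Points outside the control limits prompt special-cause investigation, while points outside the specification limits indicate the process is missing customer goals even if it is statistically stable, mirroring the stability-versus-capability distinction drawn earlier in the deck.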