NDIA Systems Engineering Division
NDIA Software Industry Experts Panel
Paul R. Croll, Chair
14 July 2008

Who We Are
● The NDIA Software Industry Experts Panel acts as a "voice of industry" in matters relating to DoD software issues
● The Panel helps identify and resolve software acquisition and development issues facing the industry and its Government customer base
● The Panel may also, from time to time, identify for investigation certain technologies or practices that promise to improve industry responsiveness to DoD needs
● Members
– Paul Croll, CSC, Chair
– JoAn Ferguson, General Dynamics
– Gary Hafen, Lockheed Martin
– Blake Ireland, Raytheon
– Al Mink, SRA International
– Ken Nidiffer, SEI
– Shawn Rahmani, Boeing
– Rick Selby, Northrop Grumman

What We Do
● Investigate, analyze, and develop recommendations concerning software issues, in response to NDIA and Government requests
● Provide industry comments on Government positions, initiatives, or work products
● Develop industry white papers and position papers
● Reach out to relevant stakeholders through NDIA conferences and other venues as appropriate

NDIA Top Software Issues Workshop, 24-25 August 2006
● Identify the top five software engineering problems or issues prevalent within the defense industry
– Document issues
• Description and current state
• Rationale and software impacts
– Develop recommendations (short term and long term)
– Generate task report
– Submit to OSD

Top Software Issues
1. The impact of requirements upon software is not consistently quantified and managed in development or sustainment
2. Fundamental systems engineering decisions are made without full participation of software engineering
3. Software life-cycle planning and management by acquirers and suppliers is ineffective
4. The quantity and quality of software engineering expertise is insufficient to meet the demands of government and the defense industry
5. Traditional software verification techniques are costly and ineffective for dealing with the scale and complexity of modern systems
6. There is a failure to assure correct, predictable, safe, and secure execution of complex software in distributed environments
7. Inadequate attention is given to total life-cycle issues for COTS/NDI impacts on life-cycle cost and risk

Defense Software Strategy Summit, 18-19 October 2006
● Keynote Address: the Honorable Dr. James I. Finley, Deputy Under Secretary of Defense (Acquisition and Technology)
● Program Executive Officer and Service/Defense Agency panels on software-related acquisition issues and initiatives
● Plenary Session Topics: NDIA Top Software Issues, Software Industrial Base Study, and Software Producibility
● Workshops
– Software Acquisition and Sustainment
• Mr. Mike Nicol, Air Force Aeronautical Systems Center
• Mr. Lawrence T. Osiecki, US Army, Armament Software Engineering Center
– Policy
• Mr. Jim Clausen, DoD CIO, Office of Commercial IT Policy
• Col Peter Sefcik, Jr., USAF, Chief, Air Force Engineering Policy and Guidance Team
• Lt Col Mark Wilson, SAF/AQR Systems & Software Engineering
– Human Capital
• Dr. Kenneth E. Nidiffer, Fellow, Systems and Software Consortium
• Mr. George Prosnik, Defense Acquisition University E&T Center
– Software Engineering Practices
• Mr. Grady Campbell, Software Engineering Institute
• Mr. Paul R. Croll, CSC, Industry Co-Chair, NDIA Software Committee
DoD Software Summit Issues
Software Acquisition and Sustainment
● Software issues not addressed early in the life cycle
● Software requirements not well defined at program start
● Management has limited visibility into software development processes and status
● Risk areas and single-point failures not adequately addressed, e.g., single software providers, incomplete data rights, key personnel stability, life-cycle support of COTS
● Acquirers do not adequately address software sustainment and the total life cycle early in the program
● Some agencies contract before engineering is complete, prior to system design and development

Human Capital
● Experienced system and software engineers seem missing from key DoD leadership positions
● Shortage of highly experienced software managers, architects, domain and technical experts
● Eroding depth and breadth of experience for personnel in DoD
● Young people may consider system and software engineering a career dead end
● Emerging skill sets may be needed for future complex DoD systems, e.g., systems of systems

Software Engineering
● Weak linkage between software requirements and capabilities/portfolios
● System development methods do not properly leverage software's ability to rapidly field new capability
● Systems and software engineering life cycles not always consistent or harmonized
● Software considerations not consistently addressed in architectures
● Inadequate software estimating methods, e.g., for COTS/NDI; best practices not applied

Policy
● PMs need assistance with software policy and analysis
● Arbitrary separation of weapon and information technology software policies
● Policy implementation guidance and follow-up monitoring is limited
● The Department needs a software group with strong expertise to oversee and implement policy
● Need capability to share policy and guidance information

The Summit findings reaffirmed the NDIA Top Software Issues.

Software Issues/Gaps Workshop Findings
Primary Software Focus Areas, based on the NDIA Top Software Issues, OSD Program Support Reviews, and DoD Software Summit findings

Software Acquisition Management
– Standards – O, N
– DAG Ch 4/7 – O
– AF Prog Spt – O
– All Contract Language – A, M, N
– SW Estimation – GAP
– Lifecycle Policy – AF
– Risk Identification – GAP

Software Development Techniques
– Agile – O, SEI
– Architecture – A, SEI
– COTS – SEI
– Open Source – AF
– Sustainment – GAP
– SW Interoperability – GAP
– SW Test – GAP

Knowledge Sharing
– DAU Software ACC – DAU
– Best Practices Clearinghouse – DAU, O
– SW Inventory – L&MR
– Lifecycle Guides – M, N
– Root Cause Analysis – O
– Local Knowledge Portals – N

SW & SE Integration
– Requirements – GAP
– SE/SW Process Int – O
– SW Council – N
– SW Dev Plan – N
– SW in SEP – N
– SW in Tech Reviews – N
– SW Quality Attributes – GAP

Data and Metrics
– SW Metrics – A, O
– SW Cost – O
– SW EVM – DCMA
– SW Estimation – GAP

Human Capital
– Education Sources – N, A
– Leadership Training – A, SEI
– SETA Quals – GAP
– SW Human Cap Strategy – GAP
– Industrial Base – O
– University Curriculum – O
– Workforce Survey – AF

Ongoing initiative owners: O – OSD/SSA; A – Army; N – Navy; AF – Air Force; M – MDA; SEI; DCMA; DAU; L&MR; GAP – no activity
Source: Kristen Baldwin, Deputy Director, Software Engineering and System Assurance, OUSD(AT&L), April 18, 2007
DoD Software Gaps
● Estimation
● Risk Identification
● Sustainment
● Interoperability
● Test
● Requirements
● Quality Attributes
● SETA Qualifications
● Human Capital Strategy
Source: Kristen Baldwin, Deputy Director, Software Engineering and System Assurance, OUSD(AT&L), April 18, 2007

Software Industry Experts Panel Action Plan
● Software Points of Influence list – bi-directional commitment
● Software interested parties list – information awareness
● Supporting resolution of identified DoD Software Gaps
– Human Capital, Requirements
– Risk, Quality Attributes
– Test, Estimation
– Sustainment
– Interoperability, SETA

Overall Workshop Objectives
● Three Workshop Panels
– Software Requirements
– Software Risk and Estimation
– Software Quality Attributes
● Workshop Objectives for Each Panel
– Define a specific plan to crystallize concrete progress within the next 6-18 months
– Define work products and a plan to develop them over 6-, 12-, and 18-month periods
– Identify stakeholders relevant to each of these work products

Software in Acquisition Workshop
Tuesday, October 16, 2007
– 0800-0815 Welcome: Dr. Finley, DUSD(A&T)
– 0815-0900 Keynote: Carl Siel, Navy Chief Systems Engineer
– 0900-1015 Industry Presentation – NDIA Software Experts Panel
– 1025-1445 Presentation Tracks (12 presentations)
– 1500-1700 Focused Workshops
Wednesday, October 17, 2007
– 0800-1500 Workshops, continued
– 1515-1600 Keynote: Dr. Myers, DoD Deputy CIO
– 1600-1645 Workshop Outbriefs (3)
– 1645-1700 Closing Remarks
Attendance: 100+ attendees from the Services, Agencies, Industry, Academia, FFRDCs, and NASA
Workshop Topics: Software Requirements; Software Estimation/Software Risk; Software Quality Attributes

Requirements Workshop Recommendations
● Define an effective "software portfolio" management framework
– Protect the continuity of systems/software and requirements engineering throughout the software life cycle
● Implement the techniques we know will work and identify any shortcomings
● Find ways to leverage the malleability of software
– Software has the ability to adapt to changing requirements
● Change our view of "sustainment" to one of "continuous evolution"
● Establish a research program

Software Estimation/Risk Recommendations
● Establish Work Breakdown Structure guidance to better highlight software engineering activity
● Develop and evolve an integrated software data repository and related tools
● Conduct root cause analysis studies to understand the problems in software estimation and the use of estimates in the acquisition process
● Develop and implement an incremental acquisition approach (as well as the overall acquisition framework) that accommodates the uncertainty associated with early software estimates and allows for adjustment and refinement over time
● Establish policy, related guidance, and recommended implementation approaches for software data collection and analysis across all DoD acquisition programs
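As context for why a shared data repository matters to these recommendations, the sketch below shows the general form of a parametric software effort model; the COCOMO-style form and the numbers are illustrative assumptions, not figures from the briefing:

\[
\text{Effort (person-months)} \;=\; A \cdot (\text{KSLOC})^{E} \cdot \prod_{i} EM_i
\]

With nominal effort multipliers (product of 1.0), A = 2.94, and E = 1.10, an early size estimate of 400 KSLOC yields roughly 2,150 person-months; if the size estimate later grows to 500 KSLOC, the prediction rises to about 2,750 person-months. The constants A and E and the multipliers are calibrated from historical program actuals, which is why DoD-wide data collection and analysis directly improves estimate accuracy and bounds the early-estimate uncertainty that an incremental acquisition approach must accommodate.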
Software Quality Attribute Priority Recommendations
● Develop engineering guidance on quantitatively identifying, predicting, evaluating, verifying, and validating quality attributes
– Address tie-in to KPPs and TPMs
– Identify methods for predicting quality attribute outcomes for the delivered system, throughout the life cycle
● Improve OSD/Service-level acquisition policy regarding quality attributes
– Identify benefits of addressing software quality attributes as part of an acquisition risk-reduction strategy
– Address gaps in SEP, TEMP, JCIDS, DAG, and RFP language
– Define expectations for quality attribute review during Acquisition Milestone Reviews (e.g., PDR)
● Develop a taxonomy of software quality attributes and how they are related
● Develop Program Manager guidance: an introduction to software architectural evaluation of quality attributes
● Develop a collaboration site for collecting data, sharing work products, and facilitating ongoing discussion
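To illustrate what quantitatively specifying and verifying a quality attribute can look like, here is a minimal sketch; the attribute chosen (inherent availability), the threshold, and the numbers are assumptions for illustration, not content from the briefing:

\[
A_i \;=\; \frac{\mathrm{MTBF}}{\mathrm{MTBF} + \mathrm{MTTR}}
\]

A requirement such as "inherent availability of at least 0.998" is directly testable and traceable to a KPP: a demonstrated MTBF of 1,000 hours with an MTTR of 2 hours gives 1000/1002, or about 0.998, meeting the threshold. The same quantities can be predicted from architecture-level reliability and repair models early in the life cycle and tracked as TPMs at technical reviews, which is the kind of prediction and verification method the engineering guidance above would standardize.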
Software in Acquisition Spring Workshop 2008
● The purpose of this workshop was to be a "touch point" for the actions that resulted from the SSA annual software workshop in October 2007
● Review of issues and recommendations in each of the areas covered at the October workshop:
– Requirements
– Risk/Cost
– Quality Attributes
● Three Working Groups
– Software Requirements
– Software Risk and Estimation
– Software Quality Attributes
● Objectives for Each Working Group
– Task statements
– Deliverables
– Schedule

Outcome
● Knit together selected "big ideas" into a unified proposed initiative:
– Leveraging Competitive Prototyping through Acquisition Initiatives in Integrated Systems and Software Requirements, Risk, Estimation, and Quality

Task Definition
● Task 1: Conduct surveys and interviews of leading software professionals (government, industry, academia) to gather ideas, assess impacts, and sense expectations for Competitive Prototyping
● Task 2: Provide amplification of the Competitive Prototyping memo for integrated SE/SW, including where in the life cycle there are opportunities for Competitive Prototyping and how they can be contractually achieved
● Task 3: Identify first adopters of Competitive Prototyping, facilitate and gather insights on effective usage, including collecting and analyzing data
● Task 4: Develop guidance for early selection and application of integrated SE/SW quality systems for Competitive Prototyping (for RFP authors)
● Task 5: Develop a Software Engineering Handbook for Competitive Prototyping, including material explicitly targeted to different audiences (acquirer, supplier, etc.)
● Task 6: Develop training assets (materials, competencies, skill sets, etc.) that capture best-of-class ideas/practices for Competitive Prototyping

Competitive Prototyping Survey
● Purpose: "To gather recommendations, assess impacts, and sense expectations for Competitive Prototyping" from key members of government, industry, and academia
● Conducted by the Center for Systems and Software Engineering (CSSE) at the University of Southern California (USC)
● The domain of interest for the survey and interviews comprises projects considered large-scale "software-intensive systems" (SiS)
– Systems for which software is a principal and defining component
– Systems for which software likely represents the key source of technological and programmatic risk in development
– Projects whose software development costs are valued at $100 million or higher

OUSD(AT&L)/SSE-USC/CSSE CP Workshop
● OUSD(AT&L)/SSE-USC/CSSE Workshop on Integrating Systems and Software Engineering under Competitive Prototyping with the Incremental Commitment Model, Washington, DC, July 14-17, 2008
● Discussion of the way forward from the survey through the remainder of the tasks

For More Information . . .
Paul R. Croll
CSC
17021 Combs Drive
King George, VA 22485-5824
Phone: +1 540.644.6224
Fax: +1 540.663.0276
E-mail: pcroll@csc.com