Capability Analysis
Capability Determination
Capability Prioritization
Feasibility Assessment
Architecture Assessment
Economic Analysis

Capability Assurance Methodology for DBSAE
Version 3.0
July 13, 2009

This report is prepared by the Interoperability Clearinghouse.

The Interoperability Clearinghouse (ICH) was chartered in 2000 by the Department of Defense as a 501(c)(6) solution architecture research institute to assure the solution engineering of commercial items into mission systems. The ICH Capability Assurance Methodology (CAM) and virtual Solution Architecture Integration Lab together provide a collaborative honest broker that can efficiently inform the IT planning, architecture and acquisition processes. CAM is utilized within this document and is licensed free to the ICH Membership; see ichnet.org for CAM updates. ICH provides Program Managers with proven processes and access to a wide range of expertise not available through traditional contracting mechanisms. ICH provides a "conflict-free zone" to all members of the IT value chain: government agencies (federal, state, local), academia, standards bodies, commercial users, and solution providers (large and small) who work together to define solution architecture standards of practice with the associated performance metrics required for modeling, vetting and sharing proven IT capabilities (COTS/GOTS/SOA components). With IT failure rates in government tracking at 72%, and over a third attributed to the inability to align common business needs with proven technical solutions, ICH transforms and informs the solution engineering and portfolio management process.

Interoperability Clearinghouse
704 Clifton Road
Alexandria, VA 22308
www.ichnet.org
703-768-0400

TABLE OF CONTENTS
TABLE OF CONTENTS ..... i
TABLE OF FIGURES ..... ii
Importance of the Capability Assurance Guide ..... 1
PROCESS GUIDE ..... 6
SA Process 1: Capability Analysis ..... 7
SA Process 2: Capability Determination ..... 11
SA Process 3: Capability Prioritization ..... 14
SA Process 4: Feasibility Assessment ..... 18
EA Process 1: Determine Model ..... 23
EA Process 2: TRANSLATING THE SCOPE OF THE EA TO THE FINANCIAL MODEL ..... 25
EA Process 3: DETERMINE THE ALTERNATIVES ..... 31
EA Process 4: COLLECT DATA AND BENCHMARK METRICS ..... 34
EA Process 5: CONDUCT TCO ANALYSIS ..... 37
DBSAE BRIEFING INSTRUCTIONS ..... 39
DECISION POINT/RESULTS PRESENTATION ..... 40
APPENDICES LIST ..... 41
APPENDIX A: CAM SOLUTION ASSESSMENT REPORT TEMPLATE ..... 42
APPENDIX B: DETAILED DESCRIPTION OF PROCESS ..... 47
APPENDIX C: SOURCES FOR COLLECTING DATA ..... 53
APPENDIX D: ACRONYMS ..... 54
APPENDIX E: GLOSSARY ..... 56
APPENDIX F: SAMPLE PROJECT PLAN ..... 57

TABLE OF FIGURES
Figure A-1: Capability List ..... 10
Figure A-2: Illustration of Multiple Levels of Capabilities ..... 12
Figure A-3: Analysis Group ..... 12
Figure A-4: Level-1 Prioritization ..... 15
Figure A-5: Weights of the Sub-Capabilities ..... 16
Figure A-6: Scoring Scale ..... 19
Figure A-7: Matrix 1: Scored Capabilities and Sub-Capabilities ..... 20
Figure A-8: Economic Analysis – an 8-Step Approach to Total Cost of Ownership ..... 24
Figure A-9: TCO Chart of Accounts ..... 24
Figure A-10: Quantities and Scope Analysis ..... 26
Figure A-11: Sub-Models: Direct, Indirect, Migration Costs and Savings ..... 27
Figure A-12: Server-based Computing Artifacts Cost Calculations ..... 29
Figure A-13: AoA Variances: ROIs, Breakeven, Cash Positive ..... 32
Figure A-14: Constructing Results of TCO Analysis ..... 38
Figure A-15: Capabilities Associated with Specific Technology Groups ..... 50
Figure A-16: Analysis Model 1: Collaboration CDSs ..... 50
Figure A-17: Analysis Model 2: Infrastructure ..... 50
Figure A-18: Analysis Model 3: Visualization ..... 51

Importance of the Capability Assurance Guide

A Capability Acquisition Method

ICH's Capability Assurance Methodology (CAM) is a formal acquisition process that has been customized for the Defense Business Systems Acquisition Executive (DBSAE). The process provides guidance to Program Managers (PMs), their staff and contractors in executing the Business Case for achieving Milestone decisions, technology insertion and the Engineering Change Process. The value of the CAM to BTA is achieved by streamlining the execution of the Business Case life cycle through a set of standardized assessment tools and the elimination of often redundant analyses.

The implementation of the CAM is primarily targeted at ACAT programs in compliance with DoDD 5000.1, DoDI 5000.2 (DoD 5000.02, 2009) and CJCS 3170, and statutory compliance documents such as the Clinger-Cohen Act (CCA) – Title 40.

CAM contains instructions on DBSAE management controls – what is to be reported, to whom, what is to be placed on the dashboard, and what is to be placed into a repository. The repository is essential to a knowledge management approach that expedites the process for PMs on future programs. The essence of CAM is to assure all parties of what is to be expected and how to get there by eliminating the duplicate efforts that are inherent in today's programs. The CAM process below shows the components of CAM.

The criteria by which to judge the efficacy of the CAM are its ability to provide added value and reduce the overall time to implement a capability assessment by:
Increasing the accuracy/vitality of the technical assessment by vetting vendor assertions against real-life lessons learned through the COTS assessment framework within the BTA Enterprise
Eliminating duplicate and redundant Technology Assessment processes, while assuring that technology is judged by its ability to perform DoD business and mission capabilities
Standardizing the capability assessment process for technology, including managerial processes, to create an executable, measurable and sustainable process
Reducing the time required to conduct research through the use of repositories/libraries and industry data sources
Providing visibility into the status of related assessments within the standardized processes through in-depth reporting on the BTA dashboard

A brief description of the CAM products follows:

Capability Analysis (CA) ─ the initial process of the CAM identifies the requirements/capabilities for the program and further defines the problem statement and scope of the effort. Capabilities are defined at the Program level as a basis of the business case. This analysis ensures that there is sufficient data to understand the viability of technology and sufficient data to develop the Total Cost of Ownership (TCO) for the materiel solution.

Capability Determination (CD) – The Capability Determination process defines "what" capability gaps are to be evaluated, and by "what" technologies. This process creates groupings (tables) of capabilities and the technologies or solutions that satisfy the capability gaps. This is an important step, which establishes the plan for how the assessment will be conducted.

Capability Prioritization (CP) – The CP process is used to assess the comparative value of the capabilities under review to the various activities/roles (use cases) of the organization. This process of eliminating low-priority business case requirements increases the viability of a COTS solution, reduces the time/cost of implementation and decreases the risk of failure. Some military programs have seen a 20% decrease in capability requirements by using the ICH CAM approach.

Feasibility Assessment (FA) ─ Feasibility Assessment is a process for analysis of emerging/innovative technology products regarding the degree to which they will satisfy the capabilities or gaps identified.

Architecture Assessment (AA) ─ Like the FA, the AA process analyzes emerging/innovative technologies and identifies the degree to which they would satisfy the capabilities or gaps identified. However, the AA process supports ACAT programs with a more detailed analysis of the alternative technologies/solutions being considered for the program's architecture, where the capabilities are decomposed into more detail than in the FA process.

Economic Analysis (EA) ─ a minimal decision support process that identifies alternatives and provides business and technical arguments for selection and implementation to achieve stated organizational objectives. The Economic Analysis is a simplified Business Case Analysis, which provides an analytical and uniform foundation upon which sound decisions are made.

This Guide provides the Program Manager with the information needed to steer the Program towards a standardized approach to conducting a capability assessment. The Flow Chart below illustrates the process, which is the organizational approach for the Guide. Each Process is subsequently described in detail in the Process Guide.
[Flow Chart: the CAM process. Receipt of the CARE Project Strategy from the Functional Sponsor (per DoD 5000) initiates Perform Capability Analysis (Step C1 – Build JOPsC Matrix; Step C2 – Functional Capability Matrix), Perform Capability Determination (Step C3 – Market Survey of Technology; Step C4 – Create Analysis Model(s)), Perform Capability Prioritization (Step C5 – Create Prioritization Criteria; Step C6 – Weight Capabilities by Importance), Perform Feasibility Assessment and Perform Architecture Assessment (Step T1 – Create Scoring Criteria; Step T2 – Score Technologies by Evidence of Achieving Capability), and the Economic Analysis (Step E1 – Setting Up the Model; Step E2 – Determine the Quantities; Step E3 – Setting Up the Sub-Models; Step E4 – Developing the ROI Cost and Returns; Step E5 – Determining the Alternatives; Step E6 – Determining the Financial Indicators; Step E7 – Collect the Model's Data and Assumptions; Step E8 – Conduct the EA Analysis). Each process maps to DoD 5000 Milestones (A – FAA; B – CDD, FSA, CPD, AoA, CONOPS, Economic Analysis, LCCE) and to Clinger-Cohen Act Review Areas 1 through 6, with PM and MDA approval gates leading to implementation.]

CAM is the first set of analytical tools available, sponsored by the AF, for conducting relevant portions of DoD 5000 and Clinger-Cohen Act compliance – more than WHAT to do, but HOW TO!
PM Checklist

Solution Assessments
• Complete Capability Analysis
Step 1 – Assessment of the Problem Statement
Step 2 – Determine JOPsC Capabilities, Functional Requirements, and DOTMLPF Requirements related to the Problem Statement
Step 3 – Develop Capability Analysis Report
• Complete Capability Determination
Step 1 – Market Survey
Step 2 – Build Assessment Model(s)
• Complete Capability Prioritization
Step 1 – Create Prioritization Criteria
Step 2 – Weight Capabilities by Importance – Lower Level Capabilities
Step 3 – Group Normalization
• Complete FA Assessment
Step 1 – Setting the Value Criteria
Step 2 – Conducting the Value Assessment
Step 3 – Sensitivity Analysis
Step 4 – Industry Audits: Strength of Evidence Indicators
Step 5 – Final Report

Economic Analysis
• Determine the Model
Step 1 – Setting Up the Cost Model
• Translating the Scope of the EA to the Financial Model
Step 2 – Determine the Quantities
Step 3 – Setting Up the Sub-Models
Step 4 – Developing the ROI Cost and Returns
• Determine the Alternatives
Step 5 – Determining the Alternatives
Step 6 – Determining the Financial Indicators
• Collect Data & Benchmark Metrics
Step 7 – Collect the Model's Data & Assumptions
• Conduct the EA Analysis
Step 8 – Conduct the EA Analysis
• Approve TA/TCO Go/No Go

Introduction to the User Guide Document

Each process in the Flowchart on Page 4 is presented in a separate tab in the guide, with the content presented in a standardized process description template containing:
Process Identifier
Process Flow Diagram
Application to DoD 5000 documents on a Milestone basis
Required compliance of the document to meet Title 40 – CCA
Entry Criteria for the prior document that must have been approved to enter this Process
Decision Criteria for completing this Process
Exit Criteria for documentation that must be completed and approved for moving to the next Process
A description of the Process
A description of how-to's and examples that will allow completion of the Process

The processes are segmented into two parts:
Solution Assessment
Economic Analysis

Additionally, the following Appendices are provided:
DOTMLPF
Multiple Product Solutions Instructions
Clinger-Cohen Act Compliance
Acronyms
Glossary

Process Guide

SA Process 1: Capability Analysis – Identify mission capability requirements

Touch Points: Functional Sponsor/TP&R development; DBSAE PEO approval; PM approval
Applicability: Milestone A – FAA; Milestone B – CDD
Compliance Requirements: CCA Review Area 1: Acquisition supports core priority functions; CCA Review Area 2: Performance measures linked to strategic goals; DoDD 5000.1; DoDI 5000.2; CJCS 3170 – JOPsC; Clinger-Cohen Act (CCA) – Title 40; DOTMLPF Requirements

Step 1: Assessment of the Problem Statement
Step 2: Determine JOPsC Capabilities, Functional Requirements, and DOTMLPF Requirements related to the Problem Statement
Step 3: Develop Capability Analysis Report

Entry Criteria: Agreed-to Project Plan
Exit Criteria: Approval of CAR; Adequacy of Capabilities or plan for correction
Artifacts/Deliverables: CAR (Requirements and their Justification); Work papers on: Justification of Capabilities; Documentation supporting DOTMLPF and JOPsC if required

Capability Analysis (an ICH CAM Product)

Technology Assessments require a specification of the capabilities required by the Program and the scope under which it must operate.
This may be determined in a formal requirements process within BTA. There are two key entry requirements which initiate the Capability Analysis process:

1. Triggering Events ─ There are several triggering events that would initiate a Technology Assessment. It is important to understand the different requirements for each program in order to establish the proper Problem Statement:
– Pre-Milestone A/FSA in the DoD 5000 series
– Pre-Milestone B/AoA in the DoD 5000 series
– Technology Refresh or ECP
– Non-ACAT Infrastructure Programs

2. Initiation of a Problem Statement by a Functional Sponsor ─ This should answer the question, "What Business Enterprise Architecture capabilities are required for the identified solutions?" and pave the way for building a successful technology assessment and economic analysis for the proposed solution. This is the entry criterion for an in-depth analysis of business and mission capabilities to be conducted.

Step 1 – Assessment of the Problem Statement

The Problem Statement would be provided by the Functional Sponsor. A good Problem Statement should eliminate risks as early as possible. It must answer the questions below:
1. What is the problem we are trying to solve?
2. Who has the problem, or who is the client/customer (organization); who needs the solution, and who will decide that the problem has been resolved?
3. What is the mission and management direction of the organization(s) with the problem (primary and secondary missions)?
4. What are the organizational principles and priorities?
5. Is the organization adequately staffed and funded to deal with the operational deficiencies and/or their primary effects?
6. Is upper management aware of the mission/functional deficiencies?
7. Have the mission/functional deficiencies been identified in any other departments within the organization and, if so, are these deficiencies being resolved? If these deficiencies are not being resolved, why?
8. Who is aware of the impact of these mission/functional deficiencies?
9. What form can the resolution take? What are the scope and limitations (in time, money, resources, and technologies) that can be used to solve the problem?
10. Are functional requirements already contained in other ICDs or CDDs?

The Program Manager (PM) and the PM System Engineering Team (PM-SET) have the responsibility to determine whether the focus of the Problem Statement is too narrow or the scope of the solution too limited, keeping in mind that the creativity and innovation of the solution can be stifled; a balance must therefore be struck. The scope of the Program includes the conditions under which the Program must operate and is essential for conducting the Total Cost of Ownership (TCO) analysis.

The Problem Statement presents a clear, concise description of the issues that need to be addressed by the PM's System Engineering Team and should be presented to them (or created by them) before they try to solve the problem. The Problem Statement must also clearly describe operational and performance objectives, and its scope needs to be stated in the following three dimensions:

Organizational Level – The level at which solutions to capability gaps will be implemented.
Common organizational levels for setting goals include:
BTA-wide – Business case to analyze the scalability or TCO of an implementation across the BTA
Program-wide – Implementation across a specific base(s)
Application-wide – Implementation across a specific application such as ERP
Pilot-wide – Implementation across a pilot which includes a set of users on a base(s)
Special – Implementation across a specially designated set of users

Use Case – At this level, the organizational level is further defined by which user/role will be using the capability.

Time Periods – Establishing appropriate and realistic target dates for goals ensures that they are meaningful and promote change. A combination of short- and long-term goals can be effective, and capabilities may be expressed in either form, e.g., short term by the end of one year and long term by the end of four years.

Step 2 – Determine Joint Operations Concepts (JOPsC) Capabilities, Functional Requirements/Capabilities and DOTMLPF related to the Problem Statement

During Step 2, the following elements must be developed in the analysis:

JOPsC Capabilities: These tie back to the Joint Staff requirements and are necessary for Milestone decisions. To deal with uncertain future threats, JOPsC provides capabilities-based, concepts-driven force planning. Identification of capabilities from these documents is required for all ACAT programs in the ICD/JCID documentation.

Functional Requirements/Capabilities: The PM is responsible for developing the Functional Requirements, i.e., the business capabilities that are decomposed into system capabilities. Typically, business capabilities are decomposed to two levels of granularity and used in the initial business case in the ICD. Further decomposition for Milestone B decisions, with up to two additional levels, provides the high-level system functions that the solution is to meet. The Functional Sponsor must confirm that the capabilities identified will provide the solution required by the Problem Statement. A source for Functional Capabilities may be CDDs in current military repositories or industry templates.

Clinger-Cohen Act Compliance (CCAC): The CAR must demonstrate CCAC to the following:
The acquisition supports core, priority functions of the Department
Outcome-based performance measures are linked to strategic goals
Redesign the processes that the system supports to reduce costs, improve effectiveness and maximize the use of COTS technology (BPR)
The program has an information assurance strategy that is consistent with DoD policies, standards and architectures, to include relevant standards

DOTMLPF: Doctrine, Organization, Training, Materiel, Leadership, Personnel, and Facilities implications may impact the Functional Requirements and guide the Capability Analysis process by identifying factors that contribute to the deficiencies or gaps from a full operational view, and how these deficiencies can be resolved and with what technology or solution. DOTMLPF implications may affect the total cost of any solution, and their impact must be included in the Economic Analysis.

An example from an evaluation illustrating a capability list is shown in Figure A-1.
Figure A-1: Capability List

CAPABILITIES
Capability-1: Comply with Industry Standards
Capability-2: User must be able to collaborate
Capability-3: The solution must be able to support large numbers of documents, their accessibility and access privileges

Step 3 – Develop the Capability Analysis Report (CAR)

The Capability Analysis Report (CAR) is the report provided to the DBSAE after the completion of the Capability Analysis process. This report documents the Problem Statement and identifies the list of capabilities and available solutions to satisfy the capabilities. Based on this report, the DBSAE will decide how to proceed with the Technology Assessment. When the assessment process starts, each of the processes that follow will provide additional information so that a history of the process is documented upon completion of the assessment. The CAR will provide a Capability Matrix and the artifacts used for its development. The six elements of the CAR are:
1. Problem Statement
2. Description of the Capability
3. Justification for the Capability
4. Capability Matrix
5. Sizing of the Problem (quantity of users, locations; differentiate capabilities by location and use case)
6. Timing of Capabilities (when needed)

SA Process 2: Capability Determination – Transforming the list of Capabilities into Analysis Models

Touch Points: DBSAE PEO approval; PM development
Applicability: Milestone B – AoA
Compliance Requirements: CCA Review Area 4: No Private Sector or Government source can better support the function; DoDD 5000.1; DoDI 5000.2; Clinger-Cohen Act (CCA) – Title 40

Step 1: Market Survey
Step 2: Build Assessment Models

Entry Criteria: Approval of CAR
Exit Criteria: Approval of Analysis Groups; Adequacy of industry metrics or plan for correction
Artifacts/Deliverables: Analysis Group (Capability: Solution Tables); Work papers on: Results of the Market Survey; Industry Benchmarking Data

Description of Process

Capability Determination (an ICH CAM Product)

The Capability (Solutions) Determination (CD) process turns the set of capabilities identified in the Capability Analysis process into the canonical form required for the CAM, referred to as an Analysis Model. Analysis Models are two-dimensional tables with capabilities on one dimension and the technologies or products being reviewed on the other dimension, as shown in Figure A-2. This process of creating Analysis Models produces an Analysis Plan.

How To:

Step 1: Market Survey

The CAM requires the collection of research information, studies, and vendor documentation to conduct the Technology Assessment and produce an analytical rating (score) of each technology considered. This effort needs to be in compliance with the Federal Acquisition Regulation (FAR) and must not show preference to any contractor(s). If market data is insufficient, a Request for Information (RFI) or third-party information may be required to collect sufficient data.

Step 2: Build Assessment Models

This effort breaks the capabilities into one or more solution sets relevant for conducting an analytical technology assessment. The simplest solution set is a set of capabilities that can be evaluated by a set of technology products. In more complex technical assessments, several analysis models may be required. Analysis Models can be constructed based on use cases or by subsets of capabilities that are defined by a set of products.
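As an illustration of this canonical form, the following minimal sketch (in Python, with the capability list from Figure A-1 and hypothetical product names) shows an Analysis Model held as a two-dimensional capability-by-product table that the later processes fill with weights and scores; it is not part of the CAM tooling itself.

```python
# Sketch of an Analysis Model: capabilities on one dimension, candidate
# products/technologies on the other. Cells start empty and are filled in
# during the Feasibility/Architecture Assessment.
capabilities = [
    "Comply with Industry Standards",
    "User must be able to collaborate",
    "Support large numbers of documents",
]
products = ["Product1", "Product2", "Product3", "Product4"]  # hypothetical names

# One table per Analysis Model; complex assessments may need several models.
analysis_model = {cap: {prod: None for prod in products} for cap in capabilities}

# Example: record a score of 2 (moderate risk) for one capability-product pair.
analysis_model["User must be able to collaborate"]["Product2"] = 2

for cap, row in analysis_model.items():
    print(cap, row)
```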
The next level of granularity (Level-2) is added to the capability list as shown in Figure A-2. To make the tables more functional, short names are used for the capabilities in the example and throughout the rest of the steps.

Figure A-2: Illustration of Multiple Levels of Capabilities

CAPABILITIES
Capability-1: Comply with Industry Standards
1. DoD Standards
2. Industry Standard
Capability-2: User must be able to collaborate
1. email
2. Instant Messaging/Chat/Blog
3. Voice/IP
Capability-3: The solution must be able to support large numbers of documents, their accessibility and access privileges
1. Google-like search
2. Support federated searches across repositories

The table above shows an example of capabilities decomposed to two levels, as is done in the Feasibility Assessment process. In the Architecture Assessment process, capabilities are decomposed to more than two levels. Based on the Market Survey and a Request for Information (RFI, if conducted), an Analysis Model is formed, mapping capabilities to solutions as shown in Figure A-3.

Figure A-3: Analysis Group
[Table mapping the products under review (Product1 through Product4) against the capability/technology columns: Comply with Industry Standards, Collaboration, Search and Indexing, Content Management and Delivery, Workflow, Document/Records Management, System Monitoring, Customer Service, Knowledge Management, and Personalization.]

More complex solutions may require multiple analysis models, which are described in Appendix B.

The Need for a "Well-Formed" Analysis Model

Keep in mind that a flawed analysis plan can easily skew the results of any assessment. The Analysis Plan, which consists of a set of Analysis Models, should contain sufficient information for conducting a FA/AA. If any of the elements are ambiguous, incomplete or omitted, the PM will attempt to assist the originator in clarifying the issues. This could include reiteration of the capability analysis process described earlier in the Guide. The issue(s) could be with the problem statement, scope or capabilities/gaps.

Review

Upon completion of the CD phase, an Initial Product Review (IPR) is held with the touch points; all IPR action items must be resolved before finalization of this phase.

SA Process 3: Capability Prioritization (CP) – CP is a value-based prioritization of the Program's capabilities

Touch Points: DBSAE PEO approval; PM approval; PM's System Engineering Team development; Functional Sponsor/TP&R/stakeholders approval
Applicability: Milestone A – FAA; Milestone B – CDD
Compliance Requirements: CCA Review Area 1: Acquisition supports core priority functions; DoDD 5000.1; DoDI 5000.2; Clinger-Cohen Act (CCA) – Title 40

Step 1: Create Prioritization Criteria
Step 2: Weight Capabilities by Importance – Lower Level Capabilities
Step 3: Group Normalization

Entry Criteria: Approval of CAR; Adequacy of Capabilities or plan for correction
Exit Criteria: Approved Capability Prioritization Matrix
Artifacts/Deliverables: Capability Prioritization Matrix; Work papers on: Prioritization Scale; Rationale for each weight

Description of Capability Prioritization

Capability Prioritization (an ICH CAM Product)

The goal of the capability prioritization process is to determine the value of each capability in the analysis process and estimate its importance to the technology or solution.
Capability Prioritization (CP) is an important tool in understanding the extent of program requirements and should proceed from a risk management perspective. CP is conducted with the key stakeholders to create an analytical measure of the value of the capability to the enterprise/program/project. Prioritization is the key to understanding the scope of program objectives, which in turn drives the ordering of requirements.

The Capability Prioritization process assigns a numerical weight that represents the value of each individual capability to the solution. This effort produces a consensus on the prioritization of the capability values. A byproduct of this effort is a set of vetted evaluation criteria that can be used in future acquisitions. The value of the assessment may then be used in the Analysis of Alternatives or Economic Analysis to identify low-risk ("80%") solutions. Stakeholder participation is essential to task delivery as the program progresses into the technical assessment process.

To better reach a consensus on the list of capabilities and whether the list addresses the Problem Statement, a collaborative environment, consisting of the PM and the PM's System Engineering Team, is assembled with a set of rules to facilitate data/information exchange among the group for the purpose of defining, developing and evaluating a capability. The PM's System Engineering Team is composed of at least two groups, with each group assessing a different solution component. The PM's System Engineering Team uses group jury-style or Defense Acquisition University (DAU) techniques to discuss why particular scores are assigned, and members defend their positions until there is convergence of the entire team. This is called group normalization. The framework of the group addresses specific aspects of collecting the mission capabilities that would address the Problem Statement of each solution component. If multiple evaluation teams are used, they will have to go through this normalization process with each other. At that point it should be easier to determine how to assign value to each group; again, this is how important each capability is to the mission or the problem being solved.

The final step is to normalize the valuing process, ideally based on firm estimation and response planning techniques, where each of the participants discusses and defends why they weighted the capabilities as they did, with the purpose of reaching a consensus.

The issues addressed by the PM's System Engineering Team need to be clearly delineated and understood by each member of the group to construct a useful problem statement. In formulating the problem statement, it should become clear why the skills of the various group members are needed. All these questions have to be answered with key emphasis placed on lessons learned. A decision on whether to advance to the next phase of the assessment will be reached in the development cycle, eliminating risks as early as possible.

How To:

Step 1: Create Prioritization Criteria

Priority weights will be used to assess the feasibility of COTS utilization by determining how much value an 80% solution can provide and which low-value capabilities can be eliminated. Figure A-4 illustrates the process of assigning weights or value to the capabilities.
Figure A-4: Level-1 Prioritization

Weight (Importance) | CAPABILITIES
500 | Comply with Industry Standards (DoD Standards; Industry Standard)
250 | User must be able to collaborate (Email; Instant Messaging/Chat/Blog; Voice/IP)
250 | The solution must be able to support large numbers of documents, their accessibility and access privileges (Google-like search; Support federated searches across repositories)

Let us assume that the value weighting allocates 1,000 value points to the capability categories. In this example, the capability "Comply with Industry Standards" was rated as the most important and received half (500) of the available 1,000 points. The remaining capabilities were of lesser importance and split the remaining points.

Step 2: Weight Capabilities by Importance – Lower Level Capabilities

The approach to prioritizing sub-capabilities is similar to the approach used for Level-1 capabilities. In prioritizing the sub-capabilities, the points allocated to each individual capability are distributed across its sub-capabilities according to their importance to the capability. Figure A-5 illustrates a sample allocation.

Figure A-5: Weights of the Sub-Capabilities

Weight (Importance) | CAPABILITIES
500 | Comply with Industry Standards
  300 | DoD Standards
  200 | Industry Standard
250 | User must be able to collaborate
  100 | Email
  100 | Instant Messaging/Chat/Blog
  50 | Voice/IP
250 | The solution must be able to support large numbers of documents, their accessibility and access privileges
  200 | Google-like search
  50 | Support federated searches across repositories

In this example, the 500 points allocated to "Comply with Industry Standards" are assigned to the sub-capabilities (DoD Standards, Industry Standard) according to their relative importance to the capability. This process is repeated for each capability, allocating each Level-1 capability's weight across its sub-capabilities. This provides better visibility into the actual value of the sub-capabilities. In this example, the visibility of the 2nd-tier weights may provide a rationale for adjusting the 1st-level weights during the Group Normalization process.

Step 3: Group Normalization

For the third step, the evaluation team uses a "group-jury" style approach to discuss why particular weights or scores were assigned, with members defending their positions until there is convergence of the entire team (Group Normalization). Each of the participants discusses and defends why they weighted the capabilities as they did, with the purpose of reaching a consensus, which may later be used during the Feasibility/Architecture Assessment process.
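A minimal sketch (in Python, using the example weights from Figures A-4 and A-5; capability names shortened) of how the 1,000-point allocation and its consistency checks might be captured. The point values and names here simply mirror the illustrative example above.

```python
# Sketch of capability prioritization: allocate 1,000 value points across
# Level-1 capabilities, then distribute each capability's points across its
# sub-capabilities. Weights follow the example in Figures A-4 and A-5.
LEVEL1_WEIGHTS = {
    "Comply with Industry Standards": 500,
    "User must be able to collaborate": 250,
    "Support large numbers of documents": 250,
}

SUB_WEIGHTS = {
    "Comply with Industry Standards": {"DoD Standards": 300, "Industry Standard": 200},
    "User must be able to collaborate": {"Email": 100, "IM/Chat/Blog": 100, "Voice/IP": 50},
    "Support large numbers of documents": {"Google-like search": 200, "Federated search": 50},
}

def validate(total_points: int = 1000) -> None:
    """Confirm Level-1 weights use all available points and that each
    capability's sub-weights sum back to its Level-1 weight."""
    assert sum(LEVEL1_WEIGHTS.values()) == total_points, "Level-1 weights must sum to 1,000"
    for cap, subs in SUB_WEIGHTS.items():
        assert sum(subs.values()) == LEVEL1_WEIGHTS[cap], f"Sub-weights for {cap!r} must sum to {LEVEL1_WEIGHTS[cap]}"

def weight_percent() -> dict:
    """Express each sub-capability's weight as a share of all points, giving
    the second-tier visibility discussed above."""
    total = sum(LEVEL1_WEIGHTS.values())
    return {f"{cap} / {sub}": w / total
            for cap, subs in SUB_WEIGHTS.items() for sub, w in subs.items()}

if __name__ == "__main__":
    validate()
    for name, pct in weight_percent().items():
        print(f"{name}: {pct:.1%}")
```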
Each dependent capability needs an indication of its value to the problem statement:
CP is an input tool for assessing how a capability can be met based on the availability of existing (COTS/GOTS) technology
The goal is to look at the value of each capability/objective in the environment and assign numerical priorities representing the importance of each individual capability
Each capability must be assessed as to its overall contribution or value to the solution being assessed
All capabilities are not equal
CP is conducted with the key stakeholders to create an analytical measure of the value of the capability to the enterprise/program/project
The outcome is an agreed-upon prioritization of the value of the capabilities
Priority values are used in the Technology Assessment process to determine how much value an 80% solution can provide through the elimination of low-value capabilities
Prioritization can be reused as weighted evaluation factors in RFPs

Review

Upon completion of the CP phase, an Initial Product Review (IPR) is to be held with the touch points; all IPR action items must be resolved before finalization of this phase.

Remarks

If the prioritization is being done in a group session, the team members need to agree on the weights and sign off on them. If the prioritization is being done individually, then there will have to be a session to normalize the individual weightings, and the team members need to agree on the weights and sign off on them. The PM's System Engineering Team needs to document each weighting choice within the electronic log maintained for the weighting artifacts.

SA Process 4: Feasibility/Architecture Assessment (FA/AA) – Determine the ability of a technology to satisfy the business or mission capability

Touch Points: Vendors – input; Third Parties/Industry Audits – inputs; DBSAE PEO approval; PM approval; PM's System Engineering Team development
Applicability: Milestone A – FAA; Milestone B – FSA, CDD, CPD, CONOPS, AoA
Compliance Requirements: CCA Review Area 3: Redesign the processes that the system supports to reduce costs, improve effectiveness and maximize the use of COTS technology; CCA Review Area 5: An analysis of alternatives has been conducted; DoDD 5000.1; DoDI 5000.2; Clinger-Cohen Act (CCA) – Title 40

Step 1: Setting the Value Criteria
Step 2: Conducting the Value Assessment
Step 3: Sensitivity Analysis
Step 4: Industry Audits: Strength of Evidence Indicators
Step 5: Final Report

Entry Criteria: Approved Capability Prioritization Matrix
Exit Criteria: Approval of Feasibility Assessment Report
Artifacts/Deliverables: Analysis of Alternatives (technologies that best fit the capability requirements); Work papers on: Scoring Plan; Scoring Rationale; Sensitivity analyses performed

Description of Process

Feasibility/Architecture Assessments

Feasibility/Architecture Assessments are based on the ability of a technology to satisfy the business or mission capability and further validate that a COTS solution is viable. The assessments accomplish the traditional technology assessment portion of the pre-acquisition process. There are two types of assessments, which occur at different phases of the acquisition process:

Feasibility Assessment (FA) analyzes the degree to which existing technologies meet the identified capabilities (sufficiency) from the CP process.
The FA is used to determine whether there are sufficient COTS products in the marketplace for procurement. If there are insufficient COTS products available, the analysis would suggest a custom-build ("make/buy") decision or waiting until the market has matured. Generally, FA is a Milestone A assessment process and has a level of granularity of one to two levels for each capability. The process does not require a comprehensive evaluation of all products on the market but only representative technology. The approach maximizes high-value capabilities (functional capabilities), determines the "best value" for the solution and identifies a risk-reduced "80%" solution. FAs can be used in Capability-Based Acquisitions as well. The analysis places more emphasis on existing products rather than building custom solutions, which are prone to much higher implementation risk. FA will produce an analytical rating of each technology considered, from no risk to high risk, using the CAM analytical process.

Architecture Assessments (AA), although also based on the ability of a technology to satisfy the business or mission capability, provide a detailed, in-depth analysis of technology solutions and their alternatives as well. The difference between FA and AA is the comprehensive analysis the AA process provides in support of PMs required to conduct AoAs on technology solutions being considered in Milestone B decisions (Milestone A in BCL). The AA process assesses alternative sets of products identified by an architecture team on the products' ability to satisfy the set of capabilities identified (in the CD process). In general, the architecture assessment process utilizes a lower level of detail for each capability and decomposes up to four levels.

Selection Assessment (SA) ─ Although not officially part of this pre-procurement user guide, a selection assessment uses the same methodology as the architecture assessment, but evaluates the bid products. In addition, the assessment identifies the products with the highest value or score (1-5).

Step 1: Setting the Value Criteria

This step sets the criteria by which a capability can be measured against the solution. This requires establishing a relationship between the importance of a capability and the relative score to be assigned as the ranking of the capability. In this example, capabilities are rated on a scale of one to five based on their relative importance to the products or technology. This assessment is managed by the PM's System Engineering Team (PM-SET), ideally based on solid estimation and response planning techniques. Through negotiation among product stakeholders, an agreement defines the scoring.

Assessments can be made on a numerical scoring system according to each of the capabilities. An example of such a scoring system is a scale of 1 to 5, in which a value of 1 indicates a "Meets all requirements"/No Risk capability while a value of 5 represents a "Meets none of the requirements"/High Risk capability. The PM-SET, using the group jury-style approach, discusses why particular scores are assigned, and members defend their positions until there is convergence of the entire team (normalization). If multiple evaluation teams are used, they will have to go through this Group Normalization process with each other. In Figure A-6, capabilities are rated on a scale of one to five based on their relative importance to the solution.
The PM-SET develops a scoring system from 1 to 5; an example is shown in Figure A-6. Detailed information on why a score was assigned is to be maintained in an electronic log to preserve the project's scoring artifacts.

Figure A-6: Scoring Scale

CAPABILITY | SCORE
Almost no risk to satisfy | 1
Moderate risk | 2
Manageable risk | 3
Significant risk | 4
High risk | 5

Guideline for Technology Scoring

Building an analysis plan for scoring products is not a trivial exercise; incorrectly formed, an analysis plan can easily skew the results of any assessment.

Problems to avoid and guidelines for constructing a well-formed Analysis Plan:
Make sure the criteria for scoring a product are based on a specific capability, which actually differentiates the capability from the product.
Make sure the instructions are not ambiguous and the rationale for giving a score does not accidentally give credit for partial compliance; keeping the capability concise prevents the results from being skewed.
This is a technical assessment of functions in technology products that can deliver a solution that provides a capability. While architectural requirements are important, they should not be more than 75% of the weight of the assessment (in Capability Prioritization). If they are, then the functional-oriented capabilities and the architecture-oriented capabilities should be separated into two analysis groups. This will avoid minimizing the technology functionality to the point that it has no impact on the technology assessment.

Step 2: Conducting the Value Assessment

At this point it is important to score the capabilities against the products. Utilizing the data collected during the market survey, every capability-product combination is scored. The strength of the PM-SET in understanding the capabilities and the technologies is a key determinant of the quality and accuracy of the scoring of the assessment, which is illustrated in Figure A-7.

Figure A-7: Matrix 1: Scored Capabilities and Sub-Capabilities
[Value assessment matrix (scale: 1 = lowest risk, 5 = highest risk) scoring sub-capabilities such as CAC & PKI, access to multiple security domains, files stored locally or on the network, access to SDC and distributed functional applications, and access to functional applications not suitable for SBC against six use cases (Baseline User, Hybrid Functional User, Hybrid Power User, Mobile User, Roaming User, Remote Access User); each sub-capability is weighted 0.17, and the bottom row gives the weighted value per use case. Legend: Blue = Low Risk, Green = Moderate Risk, Yellow = Some Risk, Red = High Risk.]

The raw scores for the six use cases under evaluation are shown in columns 3 through 8. To calculate the weighted average for a capability, follow the steps shown below for all sub-capabilities:
Multiply the weight by the raw score for each sub-capability;
Sum the sub-capability scores for the capability and divide by the capability weight;
Record the result as the weighted score for the capability.
Repeat this process for each capability and product.

Finally, Figure A-8 below shows the scored Capabilities Matrix with the sub-capabilities, and the table transposed with products on the left and capabilities across the top.
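The computation just described can be sketched as follows (in Python, with hypothetical weights and raw scores, not taken from the figure); the risk bands follow the categories defined in the next paragraph.

```python
# Sketch of the value assessment roll-up: raw scores (1 = lowest risk,
# 5 = highest risk) are combined with sub-capability weights to give a
# weighted score per capability and an overall score per product.
# Weights and scores below are illustrative only.

def weighted_capability_score(sub_weights, raw_scores):
    """Weighted average for one capability: sum(weight * score) / sum(weights)."""
    return sum(w * s for w, s in zip(sub_weights, raw_scores)) / sum(sub_weights)

def overall_product_score(capabilities):
    """Overall rating: each capability's weighted score times its share of the
    total points (Raw Score x Weight Percent, summed across capabilities)."""
    total_points = sum(weight for weight, _, _ in capabilities)
    return sum(weight / total_points * weighted_capability_score(subs, scores)
               for weight, subs, scores in capabilities)

def risk_band(score):
    """Map an overall score to the risk categories used in this assessment."""
    if score < 2:
        return "No Risk"
    if score < 3:
        return "Moderate Risk"
    if score < 4:
        return "Manageable Risk"
    return "High Risk"

# Each tuple: (capability weight, sub-capability weights, raw scores for one product)
product_a = [
    (500, [300, 200], [1, 2]),         # Comply with Industry Standards
    (250, [100, 100, 50], [2, 3, 1]),  # Collaboration
    (250, [200, 50], [1, 4]),          # Document support
]

score = overall_product_score(product_a)
print(f"Overall score: {score:.2f} -> {risk_band(score)}")
```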
The final scores for each product are computed as follows: Final Score = the sum of (Raw Score x Weight Percent) across the capabilities. The next step in the process is to calculate the weighted average of the scores for each product and provide an overall rating (score), as shown in Figure A-8.

Figure A-8: Final Results

Categories include "No Risk" for a product with a score of 1-1.99. To be considered "Moderate Risk", a product must have a score of 2-2.99. To be considered "Manageable Risk", a product must have a score of 3-3.99. If a product has a score in the range of 4-4.99, it is considered "High Risk".

Step 3: Sensitivity Analysis

Sensitivity analysis looks at "what if" questions: what if a capability is assigned a higher weight; does that change the product's overall score? The question is, if the weight were different, would that better represent the Problem Statement?

Technical Assessment Recommendation

Based on the Sensitivity Analysis, certain recommendations can be provided:
Discontinue investment
Select one of the identified alternatives
Defer the decision pending additional information or changes in business or market conditions
Identify specific actions that are recommended to manage risks
Categorize the assessment into one of four options:
− Category-1: Technology is mature, has value to BTA and has funding
− Category-2: Technology is mature, has value to BTA but is not funded
− Category-3: Technology is mature, has value to BTA but wait for industry improvement or investment
− Category-4: Has no value to BTA

For Category-1 or -2 recommendations, there are two additional considerations that could reduce program risk:
1. A pilot of a representative technology can be conducted to further assess potential recommendations
2. Evidence-based research of industry implementations of similar capabilities can be conducted, which may adjust the original scoring.
The second is the preferred method, since it is the least costly and better represents real situations; pilots are very contained examples, not "real world" situations.

Step 4: Industry Audits: Strength of Evidence Indicators

The last step is to determine whether the assessment requires additional evidence of the products' ability to satisfy the capability. This evidence may be obtained through an industry audit or lab testing. Industry audits provide the PM-SET with an industry-supported research and validation approach that details market capabilities via contextual and evidence-based solution sets. Audits give the PM-SET a view of past implementation and testing data that improves the understanding of market capabilities and best practices, thereby reducing the risk and cost associated with COTS viability.

After the initial scoring of the technology assessment, the next step is to determine the degree or strength of the evidence that is needed to complete the assessment. "Strength of Evidence" options are shown below:
Low strength of evidence – Vendor claim: there is no way to verify whether the technology features have ever been used or the extent to which the features have actually delivered the required capability.
Medium strength of evidence – Lab testing: identifies some risk in a vendor product but is usually not scalable, secured or matured. Scoring will be adjusted based on the lab test.
High strength of evidence – Industry audits: a reality check, verifying what customers of the technology products have actually been able to accomplish and, more specifically, how well the product performed, e.g.
too difficult to use, did not perform well, not stable, met expectations, did not use a feature, etc. Scoring will be adjusted based on the industry audit.

Step 5: Final Report

On completion of the Solution Assessment, the final item is to produce the Solution Assessment Report (SAR).

EA Process 1: Determine Model – Determine the model in which the required analyses are captured

Touch Points: PM approval; PM's Cost Team development
Applicability: Milestone B – Economic Analysis, LCCE
Compliance Requirements: CCA Review Area 6: An economic analysis has been conducted that includes a calculation of the return on investment; or, for non-MAIS programs, a Life-Cycle Cost Estimate (LCCE) has been conducted; DoDD 5000.1; DoDI 5000.2; Clinger-Cohen Act (CCA) – Title 40

Step 1: Setting up the cost model

Entry Criteria: Approved Business Case Part 1
Exit Criteria: Completion of high-level financial model structure; Organizational agreement to continue the process; PM approval
Artifacts/Deliverables: Work papers on: Model Documentation; Documented rationale for the Model

Description of Process:

The BTA Economic Analysis (EA) is an analytical process that provides decision-quality data and financial data to determine a solution while identifying risk mitigation strategies. The EA is compliant with DoD Directive 5000.1 and DoD Instruction 5000.2. The Defense Acquisition System references life-cycle cost and total cost of ownership as follows:

Life-cycle costs consist of research and development costs, investment costs, operating and support costs and disposal costs over the entire life cycle.

Total cost of ownership consists of the elements of a program's life-cycle cost and other infrastructure or business process costs not necessarily attributable to the program. These costs include not only the direct costs of the acquisition program, but also indirect costs that would logically be attributed to the program.

As shown in Figure A-8, there are five processes in performing the EA. These incorporate an eight-step methodology as illustrated.

Figure A-8: Economic Analysis – an 8-Step Approach to Total Cost of Ownership

The EA process utilizes a Total Cost of Ownership (TCO) model with financial indicators to assess the cost of fulfilling mission/business gaps under various alternatives.

STEP 1 – Setting up the cost model

The EA methodology consists primarily of a cost model that includes a chart of accounts allowing the collection of data on direct costs (which include materiel costs), indirect costs and one-time startup costs referred to as migration costs. The cost model is set for a specific evaluation period. The first objective is to build a cost model with a chart of accounts across the period of the evaluation. As with all cost models, some costs may increase over the evaluation period while others decline, based on the different alternative scenarios being evaluated. The goal of the cost model is to determine the savings from the baseline (as declining cost) or between alternatives (savings from the cost difference). The basic structure of the cost model is shown in Figure A-9.
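As a complementary illustration of the structure shown in Figure A-9 below, the following minimal sketch (in Python, with hypothetical cost figures and an assumed four-year evaluation period) rolls up direct, indirect and migration costs by year and derives savings against a baseline alternative. The account names mirror the chart of accounts; the numbers are not from any ICH analysis.

```python
# Sketch of a TCO chart of accounts over a four-year evaluation period.
# Cost figures are illustrative; each account would normally be fed by a
# sub-model (direct, indirect, migration) as described in EA Process 2.
YEARS = ["Yr1", "Yr2", "Yr3", "Yr4"]

def total_cost(model):
    """Total cost per year = direct + indirect + migration."""
    return [model["direct"][i] + model["indirect"][i] + model["migration"][i]
            for i in range(len(YEARS))]

baseline = {          # status quo (e.g., the current environment)
    "direct":    [40, 40, 40, 40],
    "indirect":  [20, 20, 20, 20],
    "migration": [0, 0, 0, 0],
}
alternative = {       # candidate solution with a one-time migration cost
    "direct":    [30, 25, 25, 25],
    "indirect":  [15, 15, 15, 15],
    "migration": [20, 5, 0, 0],
}

savings = [b - a for b, a in zip(total_cost(baseline), total_cost(alternative))]
print("Baseline total cost:   ", total_cost(baseline), "sum =", sum(total_cost(baseline)))
print("Alternative total cost:", total_cost(alternative), "sum =", sum(total_cost(alternative)))
print("Savings vs. baseline:  ", savings, "sum =", sum(savings))
```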
Figure A-9 TCO Chart of Accounts
[Figure: a chart-of-accounts table with rows for Direct Cost, Indirect Cost, Migration Cost, Total Cost, and Savings across columns Yr1–Yr4 and Total, shown alongside an illustrative TCO comparison chart of PC, Thin Client, and Blade PC alternatives.]

The TCO's Chart of Accounts includes the Direct Costs, Indirect Costs, and Migration Costs, summing to the Total Costs. This basic model will be expanded to detail cost elements in the sub-models; portions of the cost elements will then be designated as savings or returns, and other cost elements will be designated as investment, for determining the financial indicators.

EA Process 2: TRANSLATING THE SCOPE OF THE EA TO THE FINANCIAL MODEL
This process sets the scope of the analysis; the quantities and capability timelines that the PM's EA Team sets into the financial model will guide the analysis.
Touch Points: PM approval; PM's Cost Team development
Applicability: Milestone B - Economic Analysis, LCCE
Compliance Requirements: DoDD 5000.1; DoDI 5000.2; Clinger-Cohen Act (CCA) – Title 40; CCA 6 - An economic analysis has been conducted that includes a calculation of the return on investment; or, for non-MAIS programs, a Life-Cycle Cost Estimate (LCCE) has been conducted
Step 2: Determine the Quantities
Step 3: Setting Up the Sub-models
Step 4: Developing the ROI Cost and Returns
Entry Criteria: Completion of high-level financial model structure; organizational agreement to continue the process; PM Approval
Exit Criteria: PM Approval; cost model volumes and timing-of-capabilities structure completed
Artifacts/Deliverables: Work papers on: – Model Documentation on volumes and timings – Sub-models – Financial Indicators to be used

Description of Process:
The initial process, Definition, sets the scope of the analysis. During the definition stage, the PM's Cost Team formulates the assumptions and constraints that will guide the analysis. The PM's Cost Team also identifies the number of alternatives the EA will consider. The definition stage can often make or break the Economic Analysis, and it lays the groundwork for communicating to stakeholders the reasoning of the EA Team, which establishes the credibility of the Economic Analysis. EAs are always forward-looking, and since the future is difficult to predict, assumptions must be stated to guide the EA process. Stating assumptions allows stakeholders to measure the reasonableness of all conclusions. At a minimum, there will always be at least two options or potential outcomes from the analysis, even if they are to adopt a specific decision or to maintain the status quo. Frequently, there will be multiple alternatives under consideration, which makes it critically important to concisely identify the key characteristics or defining features of each option.

STEP 2 - Determine the Quantities
The model is further formed by determining the quantities to be evaluated for each cost element, in each year, and for each alternative being considered. Figure A-10 captures the quantities for the model and demonstrates how multiple scope models can be analyzed.

Figure A-10 Quantities and Scope Analysis

Quantities are captured from the Capability Analysis process and include population sizes for the scope of the EA under consideration, broken out by Organizational Level, Use Case Level, and Short- or Long-Term Level; each is described below, and the sketch that follows shows one way the quantities might be laid out.
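A minimal sketch of Step 2, assuming a hypothetical population size and the implementation strategies listed later in this step; the alternative names echo the server-based computing example used elsewhere in this guide, and the fielding fractions are illustrative only.

```python
# Illustrative sketch of laying quantities into the model by year and alternative.
# The population size, fielding fractions, and alternative names are hypothetical.

POPULATION = 250_000          # e.g., total client devices in scope
YEARS = 4

def even_stagger(total, years=YEARS):
    """Quantities staggered evenly over the life-cycle."""
    return [total // years] * years

def fielding_plan(total, fractions):
    """Quantities staggered by a fielding plan (e.g., replacing units at end of system life)."""
    assert abs(sum(fractions) - 1.0) < 1e-6
    return [round(total * f) for f in fractions]

def all_in_year_one(total, years=YEARS):
    """All quantities delivered at one time, in Year 1."""
    return [total] + [0] * (years - 1)

quantities = {
    "Thin Client":  even_stagger(POPULATION),
    "Managed PC":   fielding_plan(POPULATION, [0.10, 0.20, 0.30, 0.40]),
    "Unmanaged PC": all_in_year_one(POPULATION),
}
for alt, per_year in quantities.items():
    print(alt, per_year)
```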
Organizational Level: The level at which capability gaps will be implemented. Common organizational levels for setting goals include: Organizational-wide: an implementation across the entire organization e.g. Air Force Base-wide: an implementation across a specific base(s) Application-wide: an implementation across a specific application such as Global Command and Control Systems (GCCS). Pilot-wide: an implementation across a pilot which includes a set of users on a base(s) Special: an implementation across a specially designated set of users Use Case Level: At this level, the organizational level is further defined by which user/role will be using the capability. During this process it is important to identify the “Problem statement” and set the scope of the analysis at which time the PM’s Cost Team formulates the assumptions and constraints that will guide the analysis. The PM’s Cost Team also identifies the number of alternatives the EA will consider. This stage can often make or break the EA as it lays the groundwork for communicating to decision-makers the reasoning of the team which establishes the credibility of the EA. Short-term or Long-term Level: A situation where some capabilities will be delivered earlier than others. In addition, implementation strategies and extended evaluations can affect how the evaluation results should be analyzed: a b Determine Model’s Implementation Strategy for the Quantities. The implementation strategy includes: Quantities staggered evenly over the Life-cycle Quantities staggered by fielding plan over the life-cycle, such as replacement of old units at the end of their system-life All Quantities delivered at one-time, in Year 1 Determine Timeframe (Years) for the Evaluation The time frame for measuring TCO has traditionally been anywhere between three years and five years for typical IT projects, with rare instances of seven to ten year analyses for investments with large capital outlays such as extensive Enterprise Resource Planning (ERP) reengineering initiatives. ICH Proprietary - 26 CAM for DBSAE STEP 3 – Setting Up the Sub-Models As noted, the EA model uses the chart of accounts as collection “buckets” and these buckets are linked to submodels. These sub-models calculate costs from two approaches. The first approach uses established industry verified metrics, such as industry migration cost which are averaging 21% of the cost of implementation. The second (if metrics are not available) uses a bottoms-up build of all acquisition cost elements to be developed. In Figure A-14, each EA account requires a sub-model. Figure A-11 Sub-Models: Direct, Indirect, Migration Costs and Savings Building an effective chart of accounts is dependent on using the right “combination” of cost elements sufficient to capture the essence of the analysis. There are several cost elements that have to be considered as well as DoD DOTMLPF (Doctrine, Organization, Training, Materiel, Personnel, Leadership, and Facilities) compliance issues that may in turn affect the implementation of any solution. DOTMLPF Implications– in reviewing the cost implications of DOTMLPF, all associated costs must be captured in sub-models that pertain to the full LCCE of the Economic Analysis which is detailed in the Appendix B, Part 1 of this User Guide and illustrated below: D for DOCTRINE – Are there any doctrines or procedures in place which are contributing to the identified gaps or capabilities. O for ORGANIZATION – What is the problem we are trying to solve and should explain? 
Who has the problem or who is the client/customer (organization) and should explain who needs the solution and who will decide that the problem has been resolved? T for TRAINING – Are the identified deficiencies or gaps caused at least in part, by the complete lack of training, or inadequate training? M for MATERIEL – Are the deficiencies or gaps caused by inadequate systems or equipment? L for LEADERSHIP – Are the deficiencies or gaps in part caused by the inability or decreased ability to cooperate/coordinate/communicate with upper management or external departments? P for PERSONNEL – Is the operational deficiency caused in part by the inability to place qualified and trained personnel in specialized occupations. F for FACILITIES – Is the deficiency or gap caused in part by inadequate infrastructure and if so, was the problem a result of: − − Aging/wear? New engineering processes didn’t meet needs? ICH Proprietary - 27 CAM for DBSAE Direct costs─ are considered to be fixed costs and include infrastructure and Information Technology components. Costs include hardware and software expenses and communications equipment. The direct cost represents typical costs and captures actual costs for all direct expenses related to the clients mobile and desktop computers, servers, peripherals, and network in the distributed computing environment which serves the distributed computing users. Elements of direct cost include: Component Description Hardware and Software The capital expenditures on hardware and software as well as lease fees for hardware as it pertains to the company’s distributed computing assets including servers, client computers (desktops and mobile computers), peripherals, and network components. IT hardware and software expenses are included as well. Expensed/ Depreciated Lease Accounts for the annualized acquisition cost amount of server, client, peripheral and network hardware. Portions of fees that are paid for services should be separated from the fees entered here, and allocated to the appropriate operations labor category. Messaging and groupware Accounts for the annualized capital expenditures on new and upgraded email, groupware and collaboration software. Network, systems, storage and asset management Accounts for the annualized costs of all software used by Operations staff to control the distributed computing environment and assets. Expenses for Network, Systems, Storage, and Asset Management software tools that are purchased rather than developed should be included. Other Accounts for the total annualized capital expenditures for other software, such as new operating systems on production servers and client computers, and for operating system upgrades or additional licenses on user based licensed systems. Also, accounts for the annualized expenditures for new and upgraded software foundation or middleware such as internet/intranet, e-commerce, transaction processing products, and remote access software that facilitate business applications. Indirect costs─ Indirect costs are a second order effect of the IT spending. Indirect cost elements to consider include: Component End-User Operations System Administration Training Description End-user Operations is designed to capture and compare lost productivity costs in this area. End-user operations costs consist of peer support, casual learning, formal learning, application development, and file and data management. 
The direct labor staffing, activity costs, and outsourced fees spent by corporate, or business unit IT to deliver technical support and infrastructure operations for distributed computing users. Accounts for the annual labor expenses of end users training and supporting themselves in lieu of formal training and support programs. Time includes reading manuals, using online help, trial and error, and other self-learning ICH Proprietary - 28 CAM for DBSAE File and data management methods to learn programs and resolve issues. Typically, costs are higher for casual learning/support than formal learning (accounted for in Support), and inadequate formal training will result in high casual learning expenses here. Accounts for the annual labor expenses for the end-user time spent managing files and data including end-user performed file system maintenance, organization, optimization, backup and recover. Industry Metrics (calculated costs). In many cases there is industry data available on the cost to calculate an element in the sub-model. If this data is to be substituted for an actual cost estimate, it must be stated as an assumption and the calculation method shown. As illustrated in Figure A-15, an example of cost calculation assumptions is shown from the Server-based Computing artifacts. Figure A-12 Server-based Computing Artifacts Cost Calculations Direct cost of thin client at 21% of Unmanaged PC costs - Burton Group Migration Cost are 20% of the implementing of the total cost – IDC Managed PCs add a management server to the Unmanaged PC environment – Burton The ratio between direct (fixed ) costs and indirect (variable costs) are 50:50 in Unmanaged PC environment – Gartner, Mitre The ratio between direct (fixed ) costs and indirect (variable costs) are 65:35 in Managed PC environment – Gartner, Mitre The ratio between direct (fixed ) costs and indirect (variable costs) are 74:26 in Thin Client environment – Gartner, Mitre Variable cost will increase each year in-line with increasing SBC population STEP 4 - Developing the Cost and Returns (Saving) for Calculating ROI After the financial model is established, the next step is to determine which cost elements represent investment and which elements represent return. Savings from lower costs are considered a return on investment. This part of the EA can calculate return on investment, breakeven point, and cash-positive month on an end-item delivery component or between alternative approaches in an AoA. Modeling consideration based on Sustaining vs. Disruptive Technologies Determine whether the TCO model will be based on a Sustaining or Disruptive technology. A disruptive technology can be a totally new technology direction that will most likely “disrupt” the status quo and require significant new investment. Sustaining technologies relate to life-cycle replacement of existing systems with upgraded components, producing similar or improved capabilities within existing expenditures based on a product’s life-cycle sustaining budget. The significance of the two models is the difference in how to treat the calculation of investment costs and the return on investment in the cost model. These types of scenarios are described below providing guidelines on the calculations of investment and return elements: Disruptive Technology - If the technology under review is classified as a disruptive technology, such as Unified Communications, new cost elements are introduced into the system Sustaining Technology - Every product has a life cycle. 
It is launched; it grows, and at some point, may die. Desktop clients are considered a sustaining technology. Upgrades or new kinds of clients are typically still part of the sustainment of end user capabilities. In the case of Server-based Computing, replacement of client devices at normal life-cycle refresh periods would not be considered an investment. However, client device cost savings would be considered in the calculation on the return on investment. Baseline Data Availability - Where baseline data on cost is available, TCO costs and ROIs can be developed between the baseline and alternative replacements for the baseline. This is accomplished by taking the cost of the alternative (the investment) and using the savings between the baseline and the alternative. This can be ICH Proprietary - 29 CAM for DBSAE accomplished for each alternative to determine the ROI differences for each alternative. If no baseline data exists, the lowest TCO would be considered the baseline. Other Considerations Finally, how many technical refreshes during the life-cycle need to be considered? Capabilities are stated in the Capability Analysis process in terms of short or long-term goals so that the EA can assign the associated cost to the proper time period. How to: A. Chart of Accounts The process is as follows: Begin with the problem statement in mind. Consider organizational growth and reductions plans Start with a basic structure and then enhance. However, ensure there is a match between level of detail and ability to maintain this – KISS. Gain consensus with the stakeholders and eventual user Employ a third party review to determine accuracy and consistency on more sophisticated evaluations (IV&V). B. Model of Specific Products vs. Generic Normalized technology If Budget constraints exist on first year cost, each product under consideration could have a TCO, as part of the decision data when there are cost-constraints during the first year. ICH Proprietary - 30 CAM for DBSAE EA Process 3: DTERMINE THE ALTERNATIVES EA Process 3: DETERMINE THE ALTERNATIVES Define the alternatives that will be supported in the analysis T Touch Points: Applicability: Compliance Requirements: PM approval PM’s Cost Team development Functional Sponsor Approval Milestone: B - Economic Analysis, LCCE DoDD 5000.1 DoDI 5000.2 Clinger-Cohen Act (CCA) – Title 40 CCA 6 - An economic analysis has been conducted that includes a calculation of the return on investment; or for non-MAIS programs, a Life-Cycle Cost Estimate (LCCE) has been conducted Step 5: Determining the Alternatives Step 6: Determining the Financial Indicators Entry Criteria: Exit Criteria: Artifacts/Deliverables: PM Approval Cost model volumes and timing of capabilities structure completed Completion of financial model alternatives PM Approval Functional Sponsor Approval Work papers on: – Documentation of each Alternative Description of Process: Business case development is a process of analyzing the objectives defined in the given Problem Statement to derive quantifiable as well as qualitative savings and to determine likely costs under alternative “solution” scenarios. Having the financial model in place, the next step is to consider the alternatives that will be applied to the model. Once all potential alternatives have been identified, the PM’s Cost Team must follow a process to narrow the realm of possibilities down to a few viable alternatives. 
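Before the alternatives are narrowed and detailed, the sustaining-versus-disruptive distinction from Step 4 above is worth making concrete, since it changes what counts as investment in each alternative's model. The sketch below is hypothetical: the cost elements, dollar amounts, and savings figure are illustrative, and the only rule it encodes is the one described above, that normal life-cycle refresh is excluded from the investment base for a sustaining technology while the associated savings still count as return.

```python
# Hypothetical sketch of how the investment base differs for disruptive vs. sustaining technology.
# Cost elements and dollar amounts are illustrative only.

cost_elements = [
    # (name, cost, is_lifecycle_refresh)
    ("New management servers",      5_000_000, False),
    ("Client device replacement",  20_000_000, True),   # normal refresh of existing clients
    ("Migration / one-time setup",  4_000_000, False),
]
annual_savings = 12_000_000   # savings vs. the baseline, counted as return

def investment(elements, disruptive):
    """Disruptive: all new cost elements count as investment.
    Sustaining: life-cycle refresh is funded by the existing sustainment budget, so it is excluded."""
    return sum(cost for _, cost, refresh in elements if disruptive or not refresh)

for label, disruptive in [("Disruptive", True), ("Sustaining", False)]:
    inv = investment(cost_elements, disruptive)
    print(f"{label}: investment={inv:,}  simple ROI={annual_savings * 4 / inv:.0%}")
```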
Using primarily the Problem Statement from the Capability Analysis report, alternatives can be determined based on their ability to fill the gaps between where BTA is now and where it wants to be in the future. Asking whether the organization can absorb the change and extrapolate the probable long-term success of the investment are critical issues to be addressed before calculating costs and savings. Other factors used to determine viability include technical or programmatic feasibility, cost, regulatory compliance, etc. This first analysis can reduce the range of alternatives to a manageable number that can then be more fully quantified. ICH Proprietary - 31 Economic Analysis CAM for DBSAE STEP 5 - Determining the Alternatives In this Step primary and alternative solution scenarios are defined: Analyze each alternative to detail benefits and costs. Remember that in many cases the alternatives provide different approaches to accomplish the same business goal – in which case benefits may be constant for all alternatives and the goal is to find the lowest cost / risk approach. In other instances, both benefits and costs may differ for each alternative Capture key assumptions in spreadsheet and validate financial model. Capture the model of benefits and costs at the level of estimation desired. Analysis of Alternatives (AoA) provides quality decision-making data as to the tradeoff of different alternatives, providing insight on the direction the sponsor as illustrated in Figure A-16. Alternatives that can be considered are: Different implementation plans for a technology Comparison of different technology solutions for a capability Different time periods Different short-term or long-term goals affecting the quantities delivered per year Different support strategies Different architecture alternatives Base-line vs. alternatives comparisons Figure A-13: AoA Variances: ROIs, Breakeven, Cash Positive The cost team should analyze the alternatives selected and differentiate them in specific instantiations of the cost model that calculate each alternative’s TCO with its financial indicators. STEP 6 - Determining the Financial Indicators The final step is to create the model’s financial indicators. These include: Return on Investment (ROI) – the number of dollars that the investment generated Investment – the dollars (budget) needed to fund the project ROI Percent – return over investment Breakeven – the time it takes for the entire investment to be paid back. Note- this is approximated based on the models results. Cash Positive – the time when no further investment is required and the project is returning savings. Note- this is approximated based on the model’s results, the cost team will select and get approval on the alternative to be analyzed. Initial guidance should be the Capability Analysis Report (CAR). ICH Proprietary - 32 CAM for DBSAE How to: Determine Alternatives Determining alternatives answers a series of questions to make proper selections, such as: Have all possible solutions been identified? Have all viable alternatives been determined? Is there sufficient reason for the exclusion of possible solutions? Are the alternatives truly distinguishable? Are the viable alternatives defined at a sufficient level of detail to define costs and benefits? Have all constraints for each alternative been identified? List all possible solutions that may meet the business problem or opportunity. 
Based on a practical and common sense analysis, narrow the list to include only viable alternatives, stating the reason for excluding an alternative. Valid alternatives can be simply excluded due to funding constraints. Only the viable alternatives will be further detailed and carried forward into following sections of the business case. 1. Differentiate Alternative Approaches: Review previous alternatives analyses, cost/benefit analyses, or risk/benefit analysis studies that may be used in lieu of conducting a new analysis if the previous studies are consistent with the current requirements and weighted criteria. Search the BTA or other DoD Repositories when it is available. Define the criteria and assumptions for various alternatives whether they are vendor specific, technology specific or functionality specific. Select alternatives in concert with the BCL Problem Statement. ICH Proprietary - 33 CAM for DBSAE EA Process 4: COLLECT DATA AND BENCHMARK METRICS EA Process 4: COLLECT DATA AND BENCHMARK METRICS Assemble data from the myriad sources available, both internal and external, based on the benchmarks established earlier Touch Points: Applicability: Compliance Requirements: PM approval PM’s Cost Team development Milestone: B - Economic Analysis, LCCE DoDD 5000.1 DoDI 5000.2 Clinger-Cohen Act (CCA) – Title 40 CCA 6 - An economic analysis has been conducted that includes a calculation of the return on investment; or for non-MAIS programs, a Life-Cycle Cost Estimate (LCCE) has been conducted Step 7: Collect the Model’s Data & Assumptions Entry Criteria: Exit Criteria: Completion of financial model alternatives PM Approval Functional Sponsor Approval Exit Criteria: Artifacts/Deliverables: Completion of populating financial model PM Approval Work papers on: – Data developed or collected – Documentation of Industry Metrics Step 7 – Collect Model’s Data and Assumptions This section primarily captures industry metrics to speed the time the TCO can be accomplished. It assumes the hierarchal approach (bill of material-like approach) for doing a cost build up of the chart of account is a well established process. Market research data is vital to building out the TCO model. During this stage of the EA process, the PM’s Cost Team identifies the types of data they will need, and classify it into categories. The PM’s Cost Team then identifies potential data sources, and creates a methodology for pursuing and obtaining the data. The PM’s Cost Team should note that the goal of the process is not only collect and analyze cost data. The PM’s Cost Team must identify all relevant data, to include performance data, so that the EA can identify the best overall value among alternatives, and not just the lowest cost. The analysts will also need to develop models to organize the data, such as spreadsheets and databases, which can be used to store the data once it is obtained. Once the data is received, analysts must measure the integrity of the data, and normalize it so that "apples to apples" comparisons can be made. ICH Proprietary - 34 Economic Analysis CAM for DBSAE The challenge is to determine from where the data will be collected. First, there is the Internet and its ability to find public information. Quite often that information is insufficient to determine its utilization in the cost model. The process of─ get the data─ although this step may be obvious, it is rarely easy. Data is often obscured in databases in remote locations, or buried in budget documents. 
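One common form of the normalization noted above, and detailed further in this process, is applying an inflation or deflation index so that costs observed in different years can be compared on an "apples to apples" basis. A minimal sketch, with hypothetical index values and prices:

```python
# Minimal sketch of cost normalization using an inflation index (index values are hypothetical).
# Costs observed in different years are restated in a common base year before comparison.

inflation_index = {2006: 0.94, 2007: 0.97, 2008: 1.00, 2009: 1.03}   # base year 2008 = 1.00

def normalize(cost, observed_year, base_year=2008):
    """Restate a cost observed in one year into base-year dollars."""
    return cost * inflation_index[base_year] / inflation_index[observed_year]

# e.g., a $500 unit price quoted in 2006 compared with a $505 price quoted in 2009
print(round(normalize(500, 2006), 2), round(normalize(505, 2009), 2))
```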
Usually estimating the cost for a contractor to provide product support services is simple. The government formally requests a price from the contractor to provide the services, and the contractor replies with a proposal and price. But as often is the case, the required cost or performance data cannot be found, or in the case of a new weapon system, the data is not available. When data is not available, the resourceful analyst must estimate the data. There is nothing wrong with making estimates, so long as the reasons for why estimates were used, and the methodology for calculating the estimates are clearly explained. Once the data has been collected, the quality of the data must be assessed before it can be used. Analysts should determine whether the data is complete and accurate. Then, the data must be normalized to support "apples to apples" comparisons. The most common form of data normalization consists of applying inflation/deflation to indices to future or past costs. Verify what predefined Market Research firms are available to BTA. Alternatively, there are over 40 possible sources, which can be roughly grouped into three categories: Consulting Firms, Industry Groups, and Publications, detailed information is illustrated in Appendix C. Identify Data Sources The extent of the market research will vary depending on the complexity, past experience, and the amount of information already available. The following are examples of market research techniques that compliment government request-for -information (RFI) approaches: Identify commercial sources/practices/standards. Collect market research information from government and non-government sources. Develop questionnaire for industry and government agencies. Review recent market research from similar acquisitions. Query government and commercial databases. Conduct exchanges with industry. Review marketing literature/brochures. Identify new technology in the market place. Issue RFI or pre-solicitation notices for planning purposes. Conduct an industry panel. Meet one-on-one with commercial sources. Collected Data Independent Audit To insure the validity and accuracy of the data collected and to enhance the resources available, it is always a good practice to engage a third-party audit organization to work in concert. Populate and Compile The EA process is now ready to populate the model for each alternative and compile each model to review and analyze the results. ICH Proprietary - 35 CAM for DBSAE How to: Collect Data and Industry Metrics There are a number of different ways that the actual collection of data could be managed. The method chosen must be selected to fit the needs of the particular analysis. The goal of the team should be to maximize the accuracy and completeness of the data collected while minimizing the time and expense of collecting it. One method that has been used with good results is as follows: Provide industry request for information through formal government actions. Provide the Data Collection package to the industry partners in advance of the PM’s Cost Team interview. Finally, before leaving this discussion of data collection planning, it is worth making the point that the PM’s Cost Team must either include a member who is very proficient at designing and using complex systems of interconnected spreadsheets, or have access to a specialist who can provide this service. 
The design of such a system of spreadsheets can be a very complex task when the analysis has to account for a multidimensional presentation of data. The set of cost elements will represent one dimension, and the set of product lines or business areas will represent a second dimension. The two dimensional case is relatively uncomplicated and easily managed. The steps are as follows: 1. Create a data collection plan 2. Identify and classify by type of required data elements, i.e., workload, cost, etc. 3. Create data collection procedures and forms. 4. Design and construct spreadsheet architecture for archiving, manipulation, and presentation of business case data. 5. Collect data and populate the spreadsheet database − − − − Workload data─ This is data that quantifies the amount of work that is performed in each business area Performance data─ This is data that quantifies how efficiently work actually is accomplished Cost data─ This is data that captures the total cost of operating a business area Performance standards data ─ this is data that describes minimum acceptable levels of the efficiency of work performed. It includes customer requirements as well as internal measures of performance 6. Analyze data for consistency and any obvious anomalies. Revalidate with the source if required. ICH Proprietary - 36 CAM for DBSAE EA Process 5: CONDUCT TCO ANALYSIS EA Process 5: CONDUCT TCO ANALYSIS Execute a detailed total cost of ownership analysis that encompasses all of the elements defined to this point Touch Points: Applicability: Compliance Requirements: PM approval PM’s Cost Team development Functional Sponsor approval DBSAE PEO approval DBSAE briefing and approval Milestone: B - Economic Analysis, LCCE DoDD 5000.1 DoDI 5000.2 Clinger-Cohen Act (CCA) – Title 40 CCA 6 - An economic analysis has been conducted that includes a calculation of the return on investment; or for non-MAIS programs, a Life-Cycle Cost Estimate (LCCE) has been conducted Step 8: Conduct the EA Analysis Exit Criteria: Completion of populating financial model PM Approval Exit Criteria: Approval of Economic Analysis Report Completion of DBSAE Briefing Package Artifacts/Deliverables: Economic Analysis Report Document supporting CCA compliance Work papers on: – Model Documentation – Documentation of each Alternative Documentation on costs developed for the Model Documentation of Industry Metrics STEP 8 - Conduct TCO Analysis An Analysis of Alternative (AoA) approach is core to the EA, which allows capabilities and technologies to be articulated in concrete solution sets. Alternatives may include different grouping of capabilities or implementation approaches. This final step is running the financial model(s) and compares each of the alternatives. During this phase of the EA, the PM’s Cost Team begins the actual "number crunching" by using the data collected in the initial process to build a case for each alternative and by employing both quantitative and qualitative data. Each alternative, to include the baseline prepared during the process, is compared against each other, in an effort to identify a best value alternative. The PM’S Cost Team should not seek to merely determine which alternative has the lowest cost, but to determine which alternative provides the optimal combination of price and performance (Best Value). 
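As a sketch of how the financial indicators named in Step 6 might be computed and compared across alternatives in this step, the example below derives 4-year TCO, return versus a baseline, ROI percent, and an approximate breakeven year. All figures are hypothetical; in practice each value would come from that alternative's populated cost model.

```python
# Illustrative comparison of alternatives against a baseline, computing indicators from Step 6.
# All figures are hypothetical and would come from each alternative's populated cost model.

baseline_annual = [100.0, 100.0, 100.0, 100.0]          # baseline cost per year ($M)

alternatives = {
    "Alternative A": {"investment": 60.0, "annual_cost": [80.0, 55.0, 50.0, 50.0]},
    "Alternative B": {"investment": 30.0, "annual_cost": [90.0, 80.0, 75.0, 70.0]},
}

def indicators(investment, annual_cost, baseline):
    savings = [b - a for a, b in zip(annual_cost, baseline)]     # return = savings vs. baseline
    total_return = sum(savings)
    cumulative, breakeven = 0.0, None
    for year, s in enumerate(savings, start=1):
        cumulative += s
        if breakeven is None and cumulative >= investment:
            breakeven = year                                     # approximate, per the model's granularity
    return {
        "4-yr TCO": sum(annual_cost) + investment,
        "Return ($)": total_return,
        "ROI (%)": round(100 * total_return / investment, 1),    # benefit / investment
        "Breakeven year": breakeven,
    }

for name, alt in alternatives.items():
    print(name, indicators(alt["investment"], alt["annual_cost"], baseline_annual))
```

A cash-positive point could be approximated the same way, by finding the period after which no further investment is required and cumulative savings keep growing.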
The PM's Cost Team must ensure that each alternative can comply with existing DoD guidelines, identify all risk factors for each alternative, address risk mitigation strategies for each identified risk, and develop contingency plans to mitigate unforeseen circumstances.

Risk areas that must be addressed are:
− Cost risk (mitigated through recovery actions)
− Performance risk (mitigated through metrics)
− Surge capacity (mitigated through contractual agreement)

After the PM's Cost Team has gathered all cost data and performance/qualitative data, they must accomplish a risk analysis and a sensitivity analysis. Risk analysis attempts to predict the likelihood of an event occurring and its impact on the case outcome. In some situations, risk analysis can occupy the largest volume and level of effort of the entire EA development, whether quantitative or qualitative. A sensitivity analysis attempts to explain what happens if assumptions change or prove wrong ("what if" drills): how sensitive are the financial model's overall outputs to changes in individual inputs? If this cost changes, how does it affect the bottom line? The analysis can be quantitative or qualitative. After these analyses are completed, it is time to identify an alternative and recommend it to the decision-makers, as shown in Figure A-14.

Figure A-14 Constructing Results of TCO Analysis
[Figure: a worked TCO comparison for 250,000 client devices across Unmanaged PC, Managed PC, and Server-Based Computing (thin client) alternatives, showing per-unit direct costs, total direct, indirect, and migration costs, 4-year TCO per client, and annual savings by year; in this example the SBC alternative's breakeven year is the 2nd year, with an ROI of 468% (benefit/investment).]

DBSAE Briefing Instructions

DECISION POINT/RESULTS PRESENTATION
DBSAE Decision Point Brief
By the time the CAM process has reached this stage, the analysis is complete. But the work is not done until the results have been presented in the written report provided in Appendix A. This report is the exit criterion for providing a standardized block of information to acquisition officials. The report provides a framework for the Briefing Package.
Timing: Keep the main presentation focused and to the point. A good rule of thumb is not more than 60 minutes. Questions and discussion quickly consume the briefer's time, so schedule a time block for the briefing period that accommodates interaction between the briefer and the audience.
Content: The content of the briefing has been described in the earlier processes. This should be able to be accomplished in about 10 to 15 slides.
The Briefing Package needs to communicate with clarity and brevity but, at the same time have sufficient detail to allow an independent review of the methodology, analysis, findings, and recommendation. Did the recommended alternative cost less? Was it best value approach to achieving the capabilities? Were there any surprising or unexpected results or findings that could be misinterpreted? An effective CAM process must recommend a course of action to the decision-makers which would bring closure to the case and remind the stakeholders that in order to move forward, they (stakeholder) will have to make a decision. The course actions must be within Defense acquisition FAR policies. CAM does not select a solution but addresses whether there is sufficient market base that can satisfy the government’s mission needs. Pre-acquisition data under CAM is valuable for determining what capabilities are required in an Acquisition’s Statement of Work (SOW), the evaluation Criteria and independent cost estimate. Assessment data or solutions cannot be used in a procurement process as this data must come directly from vendor submissions in the RFP process. ICH Proprietary - 40 CAM for DBSAE APPENDICES LIST Appendix A: CAM Assessment Templates Part 1 & 2 Appendix B: Detailed description of Process Part 1 DOTMLPF Part 2 Multiple Product Solutions Part 3: Clinger-Cohen Act Compliance Appendix C: Sources for Collecting Data Appendix D: Acronyms Appendix E: Glossary Appendix F: Sample Project Plan ICH Proprietary - 41 CAM for DBSAE APPENDIX A: CAM SOLUTION ASSESSMENT REPORT TEMPLATE Part 1 EXECUTIVE SUMMARY <Optional> < An overview of the evaluation (the 'what', 'why' and the results) This will cover as appropriate: A brief description of the Problem Statement and its underlying high level requirement and specific capability) The brief description of the analysis group(s) The brief description of the prioritization table for each analysis group A brief description of the scoring and presentation of the CAM color-coded “value” matrix A brief discussion of the sensitivities of the risk scores > Capability Analysis < Identifying requirements and conducting the initial analysis, which will cover: Identification of the Sponsor, potential funding and BTA priority Description of the Project Statement to include scope ( BTA-wide. Use-case-wide, base-wide) Description of High-level Capabilities (e.g., JCID documents) Description of Functional Capabilities related to problem statement and a mapping to high-level capabilities Description of architectural constraints in the technology assessment Team Assigned Describe any organizational, industry, or government mandates that guide or constrain alternatives available > < Capability Map Result Table> Capability Determination <. Design of the Technology Assessment study Items here include: Description of the each analysis group Identification of the products, services components or technology to evaluated on the capabilities in each analysis group Short description of each products, services components or technology > < Capability Analysis Group(s) Result Table> Capability Prioritization < Valuing the capabilities of the Technology Assessment study, Items here include: ICH Proprietary - 42 CAM for DBSAE Description the valuing plan for how weights are to be assigned to each capability at level-1 and level2. 
Discussion on the necessity of low value weight capability > < Capability Prioritization Result Table> Feasibility or Architecture Assessment < Scoring the technology Items here include: Description the scoring plan for how scores are to be assigned to each capability Interpretations of the findings in the light of project and evaluation goals Any limitations or weaknesses in the findings, methods, data, etc. (i.e. validity issues) Judgments against the evaluation criteria Comment on the generalization of the findings Any 'unexpected' findings > < Scored Result Table> Conclusion, Recommendations and Options < Evaluation of the technology It should include: Description of the overall risks identified in the Value Matrix Discussion of the sensitivity/tradeoff of the technologies (e.g., one technology was better in one area and another was better in another. Overall judgments of the worth of the project Comment on the validity and reliability of the findings on which the judgments are made Any recommendations for change/improvement in product, process, outcomes, or application of the innovation to include Recommend any adjustments required in evaluation design should the study be replicated (lessons to be learned. Provide recommendations based on the analysis above. Discontinue investment, select one of the identified alternatives, or defer decision pending additional information or changes in business or market conditions. Identify any specific actions that are recommended to ensure that risks and uncertainties are effectively managed throughout Categorized the Assessment into one of these four options: − − − − Category -1: Technology is mature, has BTA value and has funding Category -2: Technology is mature, has BTA value and is not funding Category -3: Technology is immature, has BTA value (wait for industry improvement or invest. Category -4: has no BTA value > < Value Matrix Table> Acknowledgements / Credits <Give credit where due> References ICH Proprietary - 43 CAM for DBSAE APPENDIX A: CAM SOLUTION ASSESSMENT REPORT TEMPLATE Part 2 INTRODUCTION Background <Provide limited background on the needs that initiates an Economic Analysis>. Subject of the business case <Describe the problem statement under consideration and key objectives that drove the decision to develop a business case.> Scope of the business case <Describe the intended use of the Economic Analysis, including audience (system-wide, base-wide, usecase-wide, etc. and timing.> Governing mandates <Describe any organizational, industry, or government mandates that guide or constrains alternatives available.> METHODS AND ASSUMPTIONS Alternative Solutions and data <Describe the primary alternatives considered. Provide the basis for any comparable projects and or benchmark models used to validate estimation data sources. Describe the estimation approach (ballpark, parametric, or detail labor and material) used.> Scope of the case <Describe the time frame and resources available for the business case development process. 
Also, describe any constraints in alternatives considered.> Financial Metrics <Describe which financial metrics will be used to evaluate the effectiveness of an investment and / or to compare alternative solutions.> Benefits <Describe the expected primary benefit areas (mission effectiveness and cost reduction) for each alternative solution.> Costs <Describe the expected primary cost areas (direct costs, indirect costs, migration cost, saving) for each alternative solution.> ICH Proprietary - 44 CAM for DBSAE Major Assumptions <Describe key assumptions that were used to develop the TCO; e.g., industry metrics, cost-buildup> BUSINESS IMPACTS Overall Results <Present the net result (TCO, ROI, etc.) that demonstrates the impact of primary alternatives under consideration. Compare primary alternative solutions in side-by-side table or chart (or possibly overlapping chart if it is not overly complex) to assist in evaluating decision> <Supplement with tables and / or charts that demonstrate accrual sensitivity analysis of other results or external factors outside the scope of the analysis: Present significant benefit impacts (both revenue enhancement and cost reduction) for each project alternative under consideration Present significant cost impacts (both capital and expense) for each project alternative under consideration.> <Identify key parameters and conditions that impact the investment decision. Present potential contingent actions that could mitigate the uncertainty. Identify how such uncertainties impact the analysis and investment decision > RECOMMENDATIONS AND CONCLUSIONS <Provide recommendations based on the analysis above. Invest/Discontinue investment, select one of the identified alternatives, or defer decision pending additional information or changes in business or market conditions. Identify any specific actions that are recommended to ensure that risks and uncertainties are effectively managed throughout> REVIEW AND APPROVAL PROCESS Acknowledgements / Credits <Give credit where due.> Derived from Defense Acquisition University, Business Case Cots Toolkit, Version 1.0 ICH Proprietary - 45 CAM for DBSAE References OMB Circular A-94, Guidelines and Discount Rates for Benefit-Cost Analysis of Federal Programs. http://www.whitehouse.gov/OMB/circulars/a094/a094.pdf OMB Circular A-76, Performance of Commercial Activities. http://www.whitehouse.gov/OMB/circulars/a076.pdf OMB Circular A-130, Management of Federal Information Resources. http://www.whitehouse.gov/omb/circulars/a130/a130trans4.html NIH IT Management Guide. http://irm.cit.nih.gov/itmra/itmgmtgd.html VA Information Technology Capital Investment Guide (specifically Appendix F, Cost Benefit Analysis Guide). http://www.va.gov/oirm/ITplanning/IT_Capital_Investment_Guide.asp Making Hard Decisions with Decision Tools Suite. Robert T. Clemen, Terry Reilly, and Terence Reilly. (Duxbury Press, 2000). Cost Estimation, From Concept to Bid. John D. Bledsoe, (R.S. Means, 1992). The Business Case Guide, 2nd edition, Marty Schmidt, Solution Matrix, (www.solutionmatrix.com). Numerous sample business cases. ICH Proprietary - 46 CAM for DBSAE APPENDIX B: DETAILED DESCRIPTION OF PROCESS Part 1 DOTMLPF D for DOCTRINE– Are there any doctrines or procedures in place which are contributing to the identified gaps or capabilities. The Capability Analysis report will address essential information on any Doctrine (DOTMLPF) or operational procedures in place that are not being followed which may be contributing to the identified deficiencies. 
The report will also tackle the challenges which could correct the deficiencies or at least reduce its impact. If no doctrine or procedures are in place which pertain to the identified gaps or capabilities, the Capability Analysis report will furthermore provide recommendation if new doctrine or new procedures need to be developed and implemented. This will provide either a complete or partial solution to the identified capabilities or gaps. O for ORGANIZATION– What is the problem we are trying to solve and should explain? Who has the problem or who is the client/customer (organization) and should explain who needs the solution and who will decide that the problem has been resolved? The “Problem statement” presents a clear concise description of the issues that need to be addressed by the PM’s Cost Team and should be presented to them (or created by them) before they try to solve the problem. A good “Problem Statement” should answer the questions below and place emphasis on the Organizational (DOTMLPF) scope of the problem: What is the problem we are trying to solve and should explain? Who has the problem or who is the client/customer (organization) and should explain who needs the solution and who will decide that the problem has been resolved? The mission and management direction of the organization/s with the problem (Primary and secondary missions). What are the organizational principles and priorities? Is the organization adequately staffed and funded to deal with the operational deficiencies and/or its primary effects? Is upper management aware of the mission/functional deficiencies? Whether the mission/functional deficiencies have been identified in any other departments within the organization and if so, are these deficiencies being resolved? If these deficiencies are not being resolved, why? Who is aware of the impact of these mission/functional deficiencies? What form can the resolution be? What is the scope and limitations (in time, money, resources, and technologies) that can be used to solve the problem? The primary purpose of a “Problem Statement” is to focus the attention of the team. However, if the focus of the problem is too narrow or the scope of the solution too limited, the creativity and innovation of the solution can be stifling, therefore a balance must be created. T for TRAINING– Are the identified deficiencies or gaps caused at least in part, by the complete lack of training, or inadequate training? In many organizations, certain IT tasks are performed by end users, either as a conscious decision because IT resources are not relied upon, or because IT resources were not allocated for the support and service functions. These indirect expenses are often hidden, and if not measured, the true costs of IT systems are often underestimated. Training (DOTMLPF) of the End-user can play a key role in cost of any program and during the Capability Analysis process the PM’s Cost Team must document whether: The deficiency or gap is caused at least in part, by the complete lack of training, or inadequate training Does training exist which addresses the deficiency/gap? ICH Proprietary - 47 CAM for DBSAE Is training being delivered effectively? How are training results being measured and monitored? Is the deficiency/gap caused by a lack of competency or proficiency on existing systems and equipment? Was the deficiency/gap discovered during an exercise? Do personnel affected by the deficiency/gap have access to training? 
Is upper management supporting and enforcing the training effort? Is training properly staffed and funded? M for MATERIEL– Are the deficiencies or gaps caused by inadequate systems or equipment? The purpose of the Capability Analysis report is not just to establish a clear problem statement but to understand why there is a problem and how to resolve this problem. In an attempt to understand “why we have the problem”, the PM’s Cost Team may consider factors such as: Whether the deficiency or gap may be caused in part by inadequate systems or equipment? What current systems are part of the occurring problem? What functionality would a new system provide that is currently not available? Are there increases in operational performance that are needed to resolve the deficiency/gap? Is the operational deficiency caused by a lack of competency or proficiency on existing systems and equipment? Can increase in performance be achieved without development of a new system, and if so, how? Who would be the primary and secondary users of the proposed systems or equipment? L for LEADERSHIP– Are the deficiencies or gaps in part caused by the inability or decreased ability to cooperate/coordinate/communicate with upper management or external departments? The Solution Assessment places a lot of emphasis on Leadership or Upper Management participation to provide governance over the program and offer feedback for continuous process improvement. The PM’s Cost Team will be tasked to identify whether: The deficiencies /gaps are in part caused by the inability or decreased ability to cooperate/coordinate/communicate with upper management or external departments Senior management understands the scope of the problem There are resources at its disposal to correct the deficiency/gaps Leadership is being trained on effective change management principles Upper management has properly assessed the level of criticality, threat, urgency and risk of the operational results of the deficiencies/gaps Upper management is aware of the constraints for resolving the deficiencies or gaps within the organization P for PERSONNEL– Is the operational deficiency caused in part by the inability to place qualified and trained personnel in specialized occupations. The PM’s office will assign the resources to perform the Solution Assessment, and it is importance for them to understand how lack of trained personnel can impact the Solution Assessment process. It is just as important for them to assess whether the operational deficiency is caused in part by the inability to place qualified and trained personnel in specialized occupations. If materiel or equipment is involved, the PM’s Cost Team needs to determine whether these personnel were tested and certified on their ability to operate all equipment used in executing the mission/function. If the solution requires new materiel, systems or equipment, different occupational specialty codes or sub codes are required to properly staff new systems─ primary users, maintenance personnel, and support personnel. The PM’s Cost Team must also assess whether new training programs need to be developed for newly recruited personnel. ICH Proprietary - 48 CAM for DBSAE F for FACILITIES–The cost of acquiring new facilities can have a huge impact on the initial cost of any program; therefore it is important for the PM’s Cost Team to assess the existing facilities to determine if: The deficiency or gap is caused in part by inadequate infrastructure and if so, was the problem a result of: Aging/wear? 
New engineering processes didn’t meet needs? ICH Proprietary - 49 CAM for DBSAE APPENDIX B: DETAILED DESCRIPTION OF PROCESS Part 2 Multiple Assessment Capability Determination - Multi-Product Solutions Multi-product solutions to a capability pose another dilemma. Comparing products when they meet only a subset of the capabilities does not provide a useful assessment. It is recommended that each subset of the capabilities that have specific clusters of products that provide those capabilities be placed into separate Analysis Groups. This avoids the sparse-matrix problem that portrays more risk to the solution that is really a lower risk score. In Figure A-18, the number of null scores will make every technology score low, due to the fact that not a single technology group is capable of solving the problem. Figure A-15 Capabilities Associated with Specific Different Technology Groups Analysis Group Collaboration Capability 1 Analysis Group Infrastructure 2 Analysis Group 3Visualization Cap # 1 Email Cap # 2 Voice/IP Cap # 3 Telephony Cap # 4 Graphical Simulation Cap # 5 Simple 2 – D plots Cap # 6 Graphical Surfaces Cap # 7 IM/Blog Cap # 8 Networks Cap # 9 Security Cap #10 Storage Three technologies are required to solve a subset (Email is an e.g. of a subset for collaboration) of the capability gap, for that reason 3 analysis groups (dense) may be required to avoid a sparse matrix problem. We will use an example for cross-domain solution (CDS) and illustrate the formation of Analysis Group # 1 for Collaboration in Figure A-19 below. Figure A-16 Analysis Model 1: Collaboration CDSs Capability Analysis Model Collaboration 1 Cap # 1 Email Cap # 2 Voice/IP Cap # 3 Telephony Cap # 7 IM/Blog Using the same example for Cross-Domain Solution (CDS), Figure A-20 illustrates the formation of Analysis Model # 2 for Infrastructure and depicts the assessment for capabilities 8, 9, and 10. Figure A-17 Analysis Model 2: Infrastructure Capability Cap # 8 Networks Cap # 9 Analysis Model 2 Infrastructure Security Cap # 10 Storage ICH Proprietary - 50 CAM for DBSAE The Analysis Model 3 is formed to assess the Visualization component of Cross-Domain Solution. Figure A-21 illustrates the model to assess capabilities # 4, 5, and 6. Figure A-18 Analysis Model 3: Visualization Capability Analysis Model 3 Visualization Cap # 4 Graphical Simulation Cap # 5 Simple 2 – D plots Cap # 6 Graphical Surfaces Combining multiple Analysis Models may yield useful information as well as the overall relative scoring related to fulfilling the total capability gap. During the Solution Assessment process the Analysis Plan must assure that quality and insightful decision-making information is generated from the assessment. 
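A minimal sketch of the grouping idea above: each Analysis Group is scored only over the capabilities it covers, which keeps each scoring matrix dense and avoids penalizing a product for capabilities outside its group. The group and capability names follow the cross-domain solution example; the product names and scores are hypothetical.

```python
# Sketch of splitting capabilities into dense Analysis Groups so products are scored
# only on the capabilities their group addresses. Product names and scores are hypothetical.

analysis_groups = {
    "Collaboration":  ["Email", "Voice/IP", "Telephony", "IM/Blog"],
    "Infrastructure": ["Networks", "Security", "Storage"],
    "Visualization":  ["Graphical Simulation", "Simple 2-D plots", "Graphical Surfaces"],
}

# Scores exist only where a product actually addresses a capability (no forced null scores).
scores = {
    "Product X (collaboration suite)": {"Email": 2, "Voice/IP": 3, "Telephony": 2, "IM/Blog": 3},
    "Product Y (network stack)":       {"Networks": 2, "Security": 3, "Storage": 2},
}

def group_score(product, group):
    caps = analysis_groups[group]
    rated = [scores[product][c] for c in caps if c in scores[product]]
    return sum(rated) / len(rated) if rated else None   # None: product not assessed in this group

for product in scores:
    for group in analysis_groups:
        s = group_score(product, group)
        if s is not None:
            print(f"{product}: {group} average = {s:.2f}")
```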
ICH Proprietary - 51 CAM for DBSAE APPENDIX B: DETAILED DESCRIPTION OF PROCESS Part 3 Clinger-Cohen Act Compliance Requirements Clinger-Cohen Act Compliance Requirements CCA 1 - Make a determination that the acquisition supports core, priority functions of the Department CCA 2 - Establish outcome-based performance measures linked to strategic goals CCA 3 - Redesign the processes that the system supports to reduce costs, improve effectiveness and maximize the use of COTS technology CCA 4 - No Private Sector or Government source can better support the function CCA 5 - An analysis of alternatives has been conducted CCA 6 - An economic analysis has been conducted that includes a calculation of the return on investment; or for non-MAIS programs, a Life-Cycle Cost Estimate (LCCE) has been conducted CCA 7 - There are clearly established measures and accountability for program progress CCA 8 - The acquisition is consistent with the Global Information Grid policies and architecture, to include relevant standards CCA 9 - The program has an information assurance strategy that is consistent with DoD policies, standards and architectures, to include relevant standards CCA 10 - To the maximum extent practicable, (1) modular contracting has been used, and (2) the program is being implemented in phased, successive increments, each of which meets part of the mission need and delivers measurable benefit, independent of future increments CCA 11 -The system being acquired is registered ICH Proprietary - 52 CAM for DBSAE APPENDIX C: SOURCES FOR COLLECTING DATA Consulting Firms Tier 1 Benchmarkers These are individually commissioned and priced benchmarking projects in which the benchmarker compares the customer's metrics with data from a database built from previous benchmarks. Such projects typically take 8 to 12 weeks. Forrester Research says they can cost up to $200k. There are four main players: Interoperability Clearinghouse, consortium-based with quick action capabilities to gather detailed deep dives in capabilities delivery of technology specific to a business case analysis at a low cost Gartner Market leader, 20 year track record. Compass 20 year track record. Global reach. Burton group Foresster Nautilus, another newer player, founded in the US in 2005. Offer innovative Pro-Benchmark tool. Other Benchmarkers & Consultancies There are a number of smaller players who carry out benchmarking work within specific IT departments Industry Groups Hackett Group─ Atlanta GA based Hackett have offered some form of bench- marking for many years Global Information Partners─ small US-based benchmarker associated with IMF QPMG─ Application Development/Function Points only SPR─ Application Development/Function Points only Industry Process Models ITIL, the Information Technology Infrastructure Library, is a set of books and training courses describing the major processes and best practices in running an IT organization. COBIT. Widely adopted best practice and process guide which in v4 includes process metrics definition for over 300 processes. CMM, CMM-I Process maturity scorecard aimed mainly at software development and system integration. IFPUG Application Development only - industry body specifying Function Point counting rules Benchmark Portal is a service run through Purdue University which benchmarks contact centers, offering a free "Reality Check", reports on industry performance levels, and a specific peer comparison service. 
Other Industry Bodies
- APQC -- US-based not-for-profit organization offering a high- to medium-level list of recommended IT metrics.
- IMF -- US-based CIO trade federation which includes a benchmarking service for members.
- SPEC -- industry body providing standard performance ratings of CPU/motherboard configurations.
- TPC -- industry body providing standard performance ratings of servers running (mostly) database workloads.

Publications
Publications are books or libraries which can be bought for a one-off or a monthly subscription fee.

Industry Analyst Firms
- Gartner TCO Studies. In addition to its benchmarking research, Gartner introduced the term Total Cost of Ownership (TCO) in 1993 and still publishes TCO figures in its research papers.
- Gartner Key Metrics Reports. Gartner also offers an annual publication, formerly called the Worldwide Benchmark Report (published by META Group), which reports typical ranges for some IT costs and staffing levels. The report is believed to cost in the region of US$5,000 per year.
- Forrester Research. There are one or two research notes available on metrics from Forrester, but many IT functions are not covered, and many of the notes are extremely old (2002 or 2003). Only available to paid-up Forrester subscribers.

APPENDIX D: ACRONYMS

AA -- Architecture Assessment
ACAT -- Acquisition Category
AF -- Air Force
AFCA -- Air Force Communication Agency
AFCYBER -- Air Force Cyber Command
AOA -- Analysis of Alternatives
ASAP -- AF Solution Assessment Program
ASAP-AP -- ASAP Analytical Process
ASV -- Application Streaming and Virtualization
BCA -- Business Case Analysis
CAC -- Common Access Card
CAM -- Capability Assurance Methodology
CD -- Capability Determination
CDD -- Capabilities Determination Document
CDS -- Cross-Domain Solution
CI&D -- Capabilities, Implementation and Deployment (Process)
COTS/GOTS -- Commercial off-the-shelf / Government off-the-shelf
CP -- Capability Prioritization
DAU -- Defense Acquisition University
DOTMLPF -- Doctrine, Organization, Training, Materiel, Leadership and Education, Personnel, Facilities
EA -- Economic Analysis
EBR -- Evidence Based Research
EP&I -- Enterprise, Planning and Investment
FA -- Feasibility Assessment
FA/AA -- Feasibility / Architecture Assessment
FAA -- Functional Area Analysis
GDB -- Governance Database
HVAC -- Heating, Ventilation, Air Conditioning
I&A -- Identification and Authentication
ICH -- Interoperability Clearinghouse
IM -- Instant Messaging
IP -- Internet Protocol
IT -- Information Technology
ITF -- Integration Task Force
i-TRM -- Infrastructure Technology Reference Model
JCIDS -- Joint Capabilities Integration Development System
JOPsC -- Joint Operations Concepts
JWICS -- Joint Worldwide Intelligence Communication System
NIPRNet -- Unclassified but Sensitive Internet Protocol Router Network (formerly Non-Classified Internet Protocol Router Network)
PC -- Personal Computer
PfM -- Portfolio Management
PKI -- Public Key Infrastructure
RAS -- Reliability, Availability, Survivability
RFI -- Request for Information
RFP -- Request for Proposal
RIE -- Rapid Improvement Event
SA -- Solution Assessment
SA -- Selection Assessment
SBC -- Server Based Computing
SDC -- Standard Desktop Configuration
SIPRNet -- Secure Internet Protocol Router Network
TA -- Technology Assessment
TCO -- Total Cost of Ownership
TOA -- Target(s) of Assessment
TP&R -- Transformation, Priorities & Requirements
VCA -- Value Chain Analysis
WAN -- Wide Area Network
XML -- Extensible Markup Language

APPENDIX E: GLOSSARY

Air Force Solution Assessment Program (ASAP) -- ASAP is a Technology Assessment tool.
The ASAP management and technical assessment framework assures that AF-sought capabilities are comprehensively defined and evaluated and that end results are developed that provide for best-fit and best-value solutions. The ASAP process is measurable, executable and repeatable.

Capability Assurance Methodology (CAM) -- The CAM is an Information Technology investment decision support methodology designed to better enable sound investment decisions in the acquisition process. Its goals are threefold: validate the priority and clarity of requirements in terms of capabilities; establish objective, service-oriented evaluation criteria and metrics; and increase the efficiency, efficacy and utilization of COTS/GOTS products in Federal Agency operations.

Capability Analysis (CA) -- The initial process of the CAM identifies the requirements/capabilities for the program and further defines the problem statement and scope of the effort. Capabilities are defined at the Program level as a basis of the business case. This analysis ensures that there is sufficient data to understand the viability of the technology and to develop the Total Cost of Ownership (TCO) for the materiel solution.

Capability Determination (CD) -- A Capability Determination is the process that defines "what" capabilities are to be evaluated by "what" technologies. This process creates groupings (tables) of capabilities and technologies that satisfy the capability gaps. These groups are utilized in the follow-on processes of the assessment.

Capability Prioritization (CP) -- A Capability Prioritization is a process used to assess the comparative value of the capabilities under review to the various activities/roles (use cases) of the organization. This prioritization should be provided by the potential "users" of the products/solutions under evaluation.

Feasibility/Architecture Assessments (FA/AA) -- The Feasibility Assessment is a process for analyzing emerging/innovative technology products regarding their ability to fulfill the high-level capability gaps. The Architecture Assessment is a process that supports ACAT I/II programs with a more detailed analysis of the alternative technology products being considered in the program's architecture, where the capabilities have been decomposed into more detail than in the FA.

Economic Analysis (EA) -- An Economic Analysis is a simplified Business Case Analysis: a decision-support process that identifies alternatives and provides business and technical arguments for selection and implementation to achieve stated organizational objectives. An EA provides an analytic and uniform foundation upon which sound decisions are made.

Total Cost of Ownership (TCO) -- The Total Cost of Ownership is a financial estimate designed to help consumers and enterprise managers assess the direct and indirect costs commonly related to the purchase of Information Technology (IT) components. It is an all-encompassing collection of the costs associated with IT investments, including capital investments, license fees, leasing costs and service fees, as well as direct (budgeted) and indirect (unbudgeted) labor costs. Typically, direct costs are made up of labor and capital costs. Indirect costs are often expressed as the costs associated with the factors that drive, or are driven by, direct cost decisions, e.g., downtime, quality of service, training, etc.
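As an illustration of how the TCO definition above rolls direct (budgeted) and indirect (unbudgeted) costs into a single figure, the sketch below computes a three-year TCO for a hypothetical server acquisition. Every cost category and dollar amount is invented for the example and is not drawn from the CAM chart of accounts.

```python
# Hypothetical 3-year TCO roll-up: direct (budgeted) plus indirect (unbudgeted) costs.
direct_costs = {
    "hardware_capital":  120_000,   # servers, storage, network gear
    "software_licenses":  45_000,   # license and lease fees over 3 years
    "service_fees":       30_000,   # hosting and maintenance contracts
    "operations_labor":   90_000,   # budgeted admin and support staff
}
indirect_costs = {
    "downtime":           25_000,   # estimated productivity loss from outages
    "end_user_training":  15_000,   # unbudgeted training and peer support
}

direct_total = sum(direct_costs.values())
indirect_total = sum(indirect_costs.values())
tco = direct_total + indirect_total

print(f"Direct (budgeted) costs:     ${direct_total:>9,}")
print(f"Indirect (unbudgeted) costs: ${indirect_total:>9,}")
print(f"3-year TCO:                  ${tco:>9,}")
# Direct: $285,000; indirect: $40,000; TCO: $325,000 for this hypothetical case.
```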
APPENDIX F: SAMPLE PROJECT PLAN

The objective of this effort is captured in the following Problem Statement: "What are the best-value Hosting Environments for BTA Systems?"

The Project Plan for the DBSAE Hosting Assessment is shown in six diagrams. Each Phase diagram shows the Phase number and name and describes the entry point into the Phase, the activities to be performed, the documents to be produced and stored in a repository and, finally, the exit criteria. The overall sequence of the phases is shown in Figure E-1.

Fig. E-1

The six Phases are shown in Figures E-2 through E-7, and the Plan of Actions and Milestones (POAM) in Figure E-8.

Fig. E-2

Fig. E-3

Fig. E-4

Fig. E-5

Fig. E-6

Fig. E-7

Finally, the Project POAM, which provides the Project Schedule, is presented in Figure E-8:

Fig. E-8