A Case Study of GAO's Review of FY06 Exhibit 300s
"Agencies Need to Improve the Accuracy & Reliability of Investment Information"
GAO-06-250
Carol Cha

Presentation Overview
• Audit Objectives
• Background
• Audit Scope & Methodology
• Discussion of Key Exhibit 300 Sections:
  • Performance Goals & Measures
  • Analysis of Alternatives
  • Risk Management
  • Security & Privacy
  • Project & Funding Plan
  • Reported cost data across sections
• Wrap-up/Questions
• Resources

Exhibit 300 Review Objective
• To determine the extent to which selected agencies have underlying support for the information in their FY 2006 exhibit 300s.
• Requested by the House Committee on Government Reform.

Background: Legislative Mandate
Clinger-Cohen Act of 1996:
• improve the implementation and management of IT projects
• expand the responsibilities of OMB to establish processes to analyze, track, and evaluate major capital IT investments
• expand the responsibilities of the agencies to engage in capital planning and performance- and results-based decisions

Background: OMB Requirements
Capital Asset Plan and Business Case, or Section 300 of Circular A-11:
• provides policy for planning, budgeting, acquisition, and management of federal capital assets
• instructs agencies on budget justification and reporting requirements for investments
• agencies submit it to OMB to justify resource requests for investments

Background: Processes Behind the 300
Agencies will develop capital planning and investment control (CPIC) processes that include:
• evaluation and selection of investments that will support core mission functions and demonstrate projected returns on investment
• institution of performance measures and management processes that monitor and compare actual investment performance to planned results
• establishment of oversight mechanisms that require periodic review of operational capital assets

Background: OMB Use of the 300
OMB uses the exhibit 300 to:
• make quantitative decisions about budgetary resources
• make qualitative assessments about whether the agency's investments are consistent with OMB policy and guidance
• identify and correct poorly planned or performing investments
• find real or potential systemic weaknesses in federal information resource management

Background: Usefulness of the 300
Provides a good overview of the investment, including:
• Project status
  • Investment stage
  • Documentation available
• Project processes
  • Budget
  • Program decision-making
• Institutional processes
  • Board review and approval

Background: Usefulness of the 300
A lack of required information in the 300, or of documentary support:
• may indicate that the information was developed for the exhibit 300 alone and is not actually used to manage the investment
• may indicate an underlying weakness in the management of the project
• may indicate a lack of guidance or a weakness in agency oversight of the project

Audit Scope & Selection Methodology
• The 5 selected agencies accounted for about one-third of civilian agencies' planned FY 2006 IT spending. These were:
  • USDA
  • Commerce
  • Energy
  • Transportation
  • Treasury

Audit Scope & Selection Methodology
• Agency selection was based on two criteria:
  • Planned spending of at least $1 billion on IT investments in FY06
  • First and second largest number of IT investments in each of three categories of the federal government's Business Reference Model: (1) Services for Citizens, (2) Support Delivery of Services, (3) Management of Government Resources
• DOD and DHS were excluded because of completed and ongoing IT investment reviews.
• This was a non-probability sample; therefore, the results cannot be used to make inferences across the government.

Audit Methodology
Three key questions of our review:
• Was the exhibit 300 complete?
• Did the agency documentation comply with various OMB requirements and laws?
• Did the agency documentation match what was reported in the exhibit 300?

Audit Tool
• The team designed a criteria matrix that included:
  • Evaluative questions for each question in the 300
  • Applicable citations/requirements (e.g., legislative mandates, guidelines, circulars)
  • Suggested documentary support
• The criteria matrix enabled the team to perform a detailed and consistent evaluation of the information contained in the 300 for:
  • Completeness
  • Adherence to federal requirements/guidelines

Performance Goals and Measures
Purpose:
• Describes the link between the agency's annual goals and mission and how the investment will help the agency meet those goals. This section illustrates the performance measures and results of the investment.
Documentation:
• Annual performance plan and/or annual performance budget, and documentation to support the calculations presented with the performance goals.

Performance Goals and Measures
Guidance and Criteria:
• Paperwork Reduction Act, Clinger-Cohen Act, OMB's Capital Programming Guide
Management Processes:
• Both agency-wide and IT strategic planning processes, such as the enterprise architecture management process (e.g., the Federal Enterprise Architecture Performance Reference Model).
Management Use:
• Senior agency officials should leverage IT investments to help fill gaps in agency performance. OMB reviews whether an investment helps to meet agency goals as well as goals across the federal government.

Performance Goals and Measures
Findings:
• Many investments did not follow the instructions for properly reporting performance goals in the exhibit 300; this was generally observed in the new performance table related to the FEA Performance Reference Model.
• Most investments (23 of 29) did not have support for the performance goals listed in the exhibit 300.
Cause:
• Agency officials cited a lack of training as one of the reasons for not following the exhibit 300 instructions for reporting performance goals.
• Investment officials stated that they did not know they needed documentary support for reported performance goals.
• One investment official noted that because performance goals and the related actual results are managed externally to the investment team, the team has little control over what is reported.

Analysis of Alternatives
Purpose:
• Provide a summary of the comparison of viable alternative solutions that includes a general rationale and an analysis of the monetized costs and benefits of each alternative presented.
Documentation:
• Cost-benefit analysis (CBA), cost-effectiveness analysis (CEA), life-cycle cost analysis
Guidance and Criteria:
• Clinger-Cohen Act, Paperwork Reduction Act, OMB Circular A-94, and OMB's Capital Programming Guide

Analysis of Alternatives
Management Processes:
• The analysis of alternatives covering all alternatives considered should be developed prior to selecting an investment. Thereafter, a new CBA or CEA should be conducted as needed.
• The development of an analysis of alternatives should be an integral part of the capital planning and investment control process, whereby an investment is selected, controlled, and evaluated throughout its useful life cycle.

Analysis of Alternatives
Management Use:
The analysis of alternatives can be used by management and GAO to:
• Help make investment decisions based upon a comparison of viable alternatives;
• Confirm whether a selected investment continues to be cost effective or the best viable alternative; and
• Review the investment's cost performance in terms of expected versus actual life-cycle costs.

Analysis of Alternatives
Findings:
• Most investments (72%) did not have support for what was reported in the analysis of alternatives section of the exhibit 300. For example, net present value and payback period calculations often could not be supported.
• Numbers reported were often not representative of the alternatives analysis conducted. For example, one agency hard-coded numbers such as the discount rates used for financial calculations. The expected life cycles of some investments were also altered.
• In all cases where documentation was available, none fully complied with A-94 and A-11 criteria.
Cause:
• Agency officials were confused by the guidance provided in Circular A-94.
• One agency instructed investments in steady state not to conduct an analysis of alternatives.
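To make the kinds of calculations summarized in this section concrete, the minimal sketch below computes net present value and simple payback period for two hypothetical alternatives using a single real discount rate, in the spirit of OMB Circular A-94. The cash flows, alternative names, and analysis horizon are illustrative assumptions, not figures from the GAO report.

```python
# Illustrative sketch (not from the GAO report): net present value and
# payback period for hypothetical investment alternatives.

def net_present_value(cash_flows, discount_rate):
    """Discount a list of annual net cash flows (year 0 first) to present value."""
    return sum(cf / (1 + discount_rate) ** year for year, cf in enumerate(cash_flows))

def payback_period(cash_flows):
    """Return the first year in which cumulative net cash flow turns non-negative."""
    cumulative = 0.0
    for year, cf in enumerate(cash_flows):
        cumulative += cf
        if cumulative >= 0:
            return year
    return None  # never pays back within the analysis horizon

# Hypothetical alternatives: year-0 acquisition cost followed by annual net benefits.
alternatives = {
    "Alternative A (custom build)": [-5_000_000, 1_200_000, 1_500_000, 1_800_000, 1_800_000, 1_800_000],
    "Alternative B (COTS product)": [-3_500_000, 900_000, 1_100_000, 1_300_000, 1_300_000, 1_300_000],
}

DISCOUNT_RATE = 0.07  # real discount rate assumed for this sketch (A-94 base case)

for name, flows in alternatives.items():
    npv = net_present_value(flows, DISCOUNT_RATE)
    payback = payback_period(flows)
    print(f"{name}: NPV = ${npv:,.0f}, payback in year {payback}")
```

A supportable exhibit 300 would trace the reported NPV and payback figures back to calculations like these, with the discount rate and life-cycle assumptions documented rather than hard-coded.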
Risk Management
Purpose:
• Identify potential problems before they occur so that risk-handling activities can be planned and risks can be mitigated.
• Risk management is a continuous process and should start early in the project with the collaboration of relevant stakeholders.
• According to OMB's guidance, agencies need to actively manage risks covering the 19 risk areas that OMB cites specifically (e.g., cost, schedule, technology, security, privacy), plus any others identified by the agency.
Documentation:
• Risk management plans, risk assessments, risk reports

Risk Management
Guidance and Criteria:
• SEI Capability Maturity Model Integration (CMMI)
Management Processes:
• Prepare for risk management
  • Establish a risk management strategy
• Identify and analyze risks
  • Evaluate, categorize, and prioritize risks (see the sketch after this section)
• Mitigate risks
  • Develop risk mitigation plans and implement them
Management Use:
• Regardless of the type of risk management, all agencies should have risk management teams that practice informal, if not formal, risk management.
• Risks should be assessed continuously and used for decision-making in all phases of a project. Risks are carried forward and dealt with until they are resolved.

Risk Management
Findings:
• While most investments provided documentation to show that some risks were being managed (e.g., security risks), most of the other risks identified by OMB were not being managed (75% of investments).
• Risk management plans were out of date or did not reflect the current operating environment.
Cause:
• Officials stated that some of the 19 risks were not applicable to their investment.
• Officials did not understand some of the risk categories.
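As one illustration of how a project team might evaluate, categorize, and prioritize risks across the OMB-cited areas before summarizing them in the exhibit 300, the sketch below scores a hypothetical risk register by probability and impact. The risk entries, the probability-times-impact exposure score, and the priority thresholds are assumptions made for illustration; they are not OMB's method or anything prescribed in the report.

```python
# Illustrative sketch (not OMB's method): score and rank hypothetical risks
# by probability and impact, grouped by risk area.

from dataclasses import dataclass

@dataclass
class Risk:
    area: str           # e.g., an OMB-cited risk area: cost, schedule, technology, security, privacy
    description: str
    probability: float  # likelihood of occurrence, 0.0-1.0
    impact: int         # consequence if realized, 1 (negligible) to 5 (severe)

    @property
    def exposure(self) -> float:
        """Simple exposure score: probability times impact."""
        return self.probability * self.impact

# Hypothetical risk register entries.
register = [
    Risk("schedule", "Key subsystem delivery slips past integration test", 0.6, 4),
    Risk("cost", "Contractor labor rates exceed the baseline estimate", 0.4, 3),
    Risk("security", "Certification & accreditation incomplete before deployment", 0.3, 5),
    Risk("technology", "COTS product does not scale to projected transaction volume", 0.2, 4),
]

def priority(risk: Risk) -> str:
    """Bucket a risk for mitigation planning; thresholds are illustrative."""
    if risk.exposure >= 2.0:
        return "HIGH - mitigation plan required"
    if risk.exposure >= 1.0:
        return "MEDIUM - monitor and reassess at each review"
    return "LOW - accept and track"

for risk in sorted(register, key=lambda r: r.exposure, reverse=True):
    print(f"[{risk.area:>10}] exposure={risk.exposure:.1f} {priority(risk)}: {risk.description}")
```

A risk management plan that stays current would re-score entries like these at each review, so that the exhibit 300 reflects the investment's actual operating environment rather than a one-time assessment.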
Security & Privacy
Purpose:
• Summarizes the agency's ability to manage security at the system or application level
• Ensures that security planning for the investment is proceeding in parallel with the development of the investment
• Demonstrates that privacy has been considered in the context of the investment
Documentation:
• Security plans, certification & accreditation package, security awareness training logs, incident handling procedures

Security & Privacy
Guidance and Criteria:
• Adherence to Federal Information Security Management Act (FISMA) requirements
• Meets National Institute of Standards and Technology (NIST) and OMB guidelines
Management Processes:
• Demonstrate compliance with the certification and accreditation process
• Undertake mitigation of security weaknesses and risks

Security & Privacy
Findings:
• Security plans generally adhered to NIST guidance (for 21 of 22 operational investments).
• Most investments (86%) had documented security awareness training and a mechanism for tracking training.
• 77% of operational investments did not have incident handling procedures documented at the system level.
Cause:
• Most agency officials stated that they were not aware that incident handling needed to be addressed at the system level.

Project & Funding Plan (i.e., EVM Section)
Purpose:
• Demonstrate to OMB and senior agency executives that the investment is being managed using a performance-based management process. Specifically,
  • new and mixed life-cycle investments must use an earned value management process that meets the American National Standards Institute (ANSI) standard;
  • steady state investments must use an operational analysis process that meets Capital Programming Guide criteria.

Project & Funding Plan (i.e., EVM Section)
Documentation:
• EVM system validation: may include integrated baseline review (IBR) briefings or a system validation letter from an independent organization (e.g., DCMA, private firms).
• EVM deliverables: include the work breakdown structure, cost performance reports, and the integrated master schedule/plan.
• Operational analysis: agency or investment procedures, analysis documents.
Guidance and Criteria:
• EVM: ANSI/EIA-748-A; OMB August 2005 Memorandum (M-05-23)
• Operational analysis: Capital Programming Guide

Project & Funding Plan (i.e., EVM Section)
Management Use:
• Conformance with the ANSI criteria provides assurance that EVM data are reliable (see the sketch after this section).
Findings:
• 15 of the 21 investments did not implement ANSI-compliant EVM processes.
• 6 of 8 steady state investments did not conduct an operational analysis.
Cause:
• Lack of understanding at the department and project levels of what EVM is and how to implement it.
• Lack of guidance from OMB on what constitutes an operational analysis.
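To make the EVM terminology in this section concrete, the sketch below derives the basic earned value metrics (cost and schedule variances, CPI, SPI, and a CPI-based estimate at completion) from planned value, earned value, and actual cost. The formulas are the standard EVM ones; the dollar figures are hypothetical and are not drawn from the GAO report.

```python
# Illustrative sketch using standard earned value management formulas;
# the dollar figures are hypothetical, not from the GAO report.

def evm_metrics(bac, planned_value, earned_value, actual_cost):
    """Compute basic EVM metrics from cumulative values at a reporting date.

    bac           -- budget at completion (total planned budget)
    planned_value -- budgeted cost of work scheduled to date (PV/BCWS)
    earned_value  -- budgeted cost of work performed to date (EV/BCWP)
    actual_cost   -- actual cost of work performed to date (AC/ACWP)
    """
    cost_variance = earned_value - actual_cost        # CV > 0 means under cost
    schedule_variance = earned_value - planned_value  # SV > 0 means ahead of schedule
    cpi = earned_value / actual_cost                  # cost performance index
    spi = earned_value / planned_value                # schedule performance index
    eac = bac / cpi                                   # estimate at completion (CPI-based)
    return {"CV": cost_variance, "SV": schedule_variance, "CPI": cpi, "SPI": spi, "EAC": eac}

# Hypothetical cumulative figures at a reporting date.
metrics = evm_metrics(
    bac=10_000_000,
    planned_value=4_000_000,
    earned_value=3_600_000,
    actual_cost=4_200_000,
)

for name, value in metrics.items():
    if name in ("CPI", "SPI"):
        print(f"{name}: {value:.2f}")
    else:
        print(f"{name}: ${value:,.0f}")
```

An ANSI-compliant EVM process is what supplies reliable planned value, earned value, and actual cost inputs for calculations like these, which in turn is what allows OMB and agency executives to trust the reported variances.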
Reliability of Cost Data Reported in the 300
• In all cases, cost information reported in the exhibit 300 was derived from ad hoc processes rather than from cost accounting systems with adequate controls.
• Summary of Spending: figures for FY04 were not reliable.
• Project & Funding Plan: government costs were derived from ad hoc systems.

Recommendations
Recommended that OMB:
• Direct agencies to determine the extent to which the information contained in their exhibit 300s is accurate and reliable, and to disclose where it is not and how the agency plans to remedy the problem.
• Develop and promulgate clearer guidance.
• Provide for training of agency personnel responsible for completing exhibit 300s.

Resources
• GAO-06-250, Information Technology: Agencies Need to Improve the Accuracy and Reliability of Investment Information, January 2006.
• GAO-04-49, Information Technology Management: Governmentwide Strategic Planning, Performance Measurement, and Investment Management Can Be Further Improved, January 2004.