INDEPENDENT VERIFICATION AND VALIDATION GUIDELINES

June 14, 2010
Quality Assurance Bureau
Project Oversight and Compliance Division
Department of Information Technology

ABOUT THIS DOCUMENT

This document is intended to help New Mexico state agencies prepare independent verification and validation (IV&V) information technology (IT) professional services contracts with appropriate IV&V deliverables, and to help IV&V vendors prepare IV&V reports. IV&V is required for all certified IT projects except those for which IV&V is waived by the Secretary of the Department of Information Technology (DoIT).

This document:
1. Covers IV&V as part of the DoIT mandate to ensure adequate risk management for information technology projects.
2. Clarifies the IV&V procurement process.
3. Reviews templates that serve as the foundation for the IV&V review of certified projects.
4. Covers agency and IV&V vendor requirements.

REVISION HISTORY

REVISION NUMBER   DATE            COMMENT
1                 June 14, 2010   Quality Assurance Bureau initial publication

TABLE OF CONTENTS

ABOUT THIS DOCUMENT
REVISION HISTORY
BACKGROUND AND DOIT AUTHORITY
    Project Certification and IV&V
    IV&V Definitions
        Verification
        Validation
        Independent
        Independent Verification and Validation
    IT Projects Must Follow Product Development Life Cycle
    Certification Process
        Project Certification Phases and Implementation Phase Reporting
LEAD AGENCY AND IV&V VENDOR RESPONSIBILITIES
    Lead Agency IV&V Responsibilities by Project Certification Phase
    IV&V Procurement and Contract Process
    IV&V Scope of Work
    Report Submission
APPENDIX A
    Project Deliverables Task Items / Reporting Topics
APPENDIX B
    IV&V Reporting Template

BACKGROUND AND DoIT AUTHORITY

The DoIT Secretary's July 5, 2007 IT Project Oversight Process Memorandum (Project Oversight Memo) is a restatement of 1.12.5.2 NMAC, which was repealed September 30, 2005.
The authority for this memorandum is contained in the Department of Information Technology Act, Chapter 9, Article 27 NMSA 1978, which states that the secretary shall:

"Provide oversight of information technology projects, including ensuring adequate risk management, disaster recovery and business continuity practices and monitoring compliance with strategies recommended by the information technology commission for information technology projects that impact multiple agencies."

The Project Oversight Memo delineates agency responsibilities for following the state project management methodology, including IV&V, and expectations for IV&V vendor reporting.

Project Certification and IV&V

The project certification and IV&V processes are DoIT risk management tools that rely heavily on strong project management.

• Project management is risk mitigation. It is about how we create and manage a temporary organization to deliver a unique product, service or result.
• Project management reduces the risk of counterproductive chaos by providing structure to the temporary organization and the solution development/deployment process.
• Acknowledging and communicating risks to stakeholders keeps them in the loop and often involves them in risk mitigation or reducing impact.

Project certification provides risk management through controlled release of project funds. Agencies are required to provide evidence of adequate planning through identification of business requirements and project organizational structure to ensure project success and responsible expenditure of state and federal funds.

The IV&V process reduces project failure through identification and mitigation of risks that are possible roadblocks to success. Projects may have unique risks and issues; however, common risks are undocumented processes and procedures, or not adhering to them when documented. Verification and validation by independent entities is fundamental to DoIT's responsibility to "provide oversight of information technology projects, including ensuring adequate risk management."

IV&V Definitions

Consistent with the oversight responsibilities of the DoIT Project Oversight and Compliance Division, the following definitions describe independent review of a project, its process and its deliverables.

VERIFICATION

"Verification" means that the project is adhering to project management disciplines, is planned and performed according to its project plans, and that such adherence can be verified by an independent examination of project documents and other evidence.

VALIDATION

"Validation" means that the project deliverables and project results meet the business and technical objectives established by the project sponsors, ensuring that the end product meets the documented performance outcomes and requirements of the project.

INDEPENDENT

"Independent" means an autonomous and impartial verification and validation assessment of a project's adherence to project management plans and compliance with business requirements. These independent assessments are performed by an entity that is not responsible for developing the product or performing the activity being evaluated.

INDEPENDENT VERIFICATION AND VALIDATION

"Independent verification and validation (IV&V)" is the means of obtaining an independent and objective view of an IT project with the intent of protecting the state of New Mexico's interests. IV&V is focused on the management of the project and its compliance with specified requirements through its development stages.
IT Projects Must Follow Product Development Life Cycle

The Project Oversight Memo establishes the requirement for a product or solution development life cycle approach to state IT projects and mandates its use as the basis of agency and IV&V reporting:

"Product development life cycle" is a series of phases comprised of iterative disciplines such as requirements, analysis and design, implementation, test and deployment, implemented to build a product or develop a service.

"During the project management lifecycle, agencies shall select and implement a phased product development lifecycle methodology approved by the Department."

"Lead Agency shall: Prepare a written risk assessment report at the inception of a project and at the end of each product development lifecycle phase or more frequently for large and high-risk projects."

"IV&V Reporting: Prepare interim reports based on the phases as indicated within the project schedule. Included in the report will be an evaluation on whether product development requirements are being met, project management is effective, continuing risk analysis, and how the project is implementing previous recommended risk mitigation strategies."

PRODUCT DEVELOPMENT LIFE CYCLE APPROVAL

Agencies must present the project's product development life cycle in Section 4.2 of the Department of Information Technology project management plan template. When the version of the Project Management Plan containing the product development life cycle is accepted by the Project Oversight and Compliance Division, that life cycle is approved. The IV&V vendor is expected to follow the phases presented by the agency as the basis for monthly and development phase assessments.

Certification Process

The IT certification process establishes basic requirements for each phase in order to recommend release of funds commensurate with project status. The Initiation, Planning and Closeout phases within the product development life cycle are synonymous with their corresponding phases in the certification process. The remaining phases of any approved product development life cycle are contained in the Implementation phase of the certification process.

The Project Initiation and Planning phases of the certification process address the organization of the project around business and technical objectives. The focus is on the governance structures, the budget and schedule, and the potential impact on the state's information technology infrastructure – in particular any impact on DoIT. These phases may also include development of business requirements.

The Implementation phase, using an approved product development life cycle, may include the elaboration of business and technical requirements; a system design that evolves from those requirements; building, testing and accepting the solution; deployment into production; and transfer to operations.

PROJECT CERTIFICATION PHASES AND IMPLEMENTATION PHASE REPORTING

The illustration below shows the product development life cycle using a traditional implementation methodology as the structure for the implementation phase of the certification process. The implementation methodology is defined in the agency's Project Management Plan and approved by the Department of Information Technology.
LEAD AGENCY AND IV&V VENDOR RESPONSIBILITIES

The lead agency on a multi-agency project is the agency with fiscal accountability for the project or the agency designated lead by legislation, usually the General Appropriation Act. The majority of IT projects are deployed by a single agency, which is the lead by default.

Lead Agency IV&V Responsibilities by Project Certification Phase

The lead agency is accountable for quality strategies, including IV&V, in compliance with DoIT best practices and standards.

Initiation Phase – Project Charter included with the certification request indicates IV&V plans.
Planning Phase – Project Management Plan indicates the IV&V planning and procurement approach.
Implementation Phase – Executed IV&V contract is included with the certification request, and periodic IV&V reports are received for the duration per that contract.
Closeout Phase – Final IV&V report, preferably a post-implementation report, must be submitted, specifically addressing any open items.

IV&V Procurement and Contract Process

When selecting a procurement method, it is important to consider the likelihood of an IV&V engagement requiring more than one year or ultimately costing over $200,000. Consultation with the agency's DoIT IT Business Consultant prior to embarking on a specific approach is highly recommended.

The General Services Department State Purchasing Division maintains price agreements with responsive vendors for IV&V services under $200,000 and lasting no more than 365 days. If there is any possibility that either of these limitations will be exceeded during project execution – taking into consideration the not uncommon possibility of project setbacks, delays, and changes in scope and schedule – an RFP procurement provides the most flexibility for changes to the scope and total cost of IV&V services warranted by anticipated, and unanticipated, implementation delays and changes. A simple sketch of this decision rule follows at the end of this section.

IV&V services are required to be maintained continuously throughout project implementation regardless of setbacks, delays and price agreement limitations. This underscores the need to include adequate time for IV&V procurement in the project schedule, and for vigilance by the project director, manager(s), steering committee(s) and executive sponsor(s) to ensure successful procurement planning and navigation of procurement intricacies and processes.

All IV&V contracts must include: language that requires submission of all IV&V deliverables to DoIT at a location designated by DoIT; and a report format consistent with the current DoIT IV&V reporting template. When an IV&V contract is fully executed, i.e. all signatories have signed, an electronic copy of the executed contract must be emailed to a location designated by DoIT.

The scope of work contained in IV&V contracts executed under a price agreement may be extended beyond one year if: sole source documentation submitted with a new contract with the same IV&V vendor is approved; any additional activities and costs do not cause total project lifetime costs for IV&V services to exceed the $200,000 limit; and the new contract scope of work does not duplicate the original contract scope of work.

Price agreement IV&V services may be modified by amendment as long as the amendment does not extend the term beyond 365 days nor increase the total amount of the contract above $200,000. IV&V services procured by RFP may be amended at any time without these limitations, as long as the contract has not expired.
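The price agreement versus RFP choice above reduces to two thresholds: estimated total IV&V cost and estimated engagement length. The sketch below is a hypothetical illustration only, not part of DoIT policy or any State system; the function name, inputs and the wording of the returned recommendation are assumptions introduced here.

```python
# Hypothetical sketch of the procurement-method decision described above.
# The dollar and term thresholds come from the guideline text; everything
# else (names, wording) is illustrative.

PRICE_AGREEMENT_COST_LIMIT = 200_000   # dollars, ceiling for price agreement IV&V services
PRICE_AGREEMENT_TERM_LIMIT = 365       # days, maximum price agreement engagement length


def recommended_procurement_method(estimated_cost: float, estimated_days: int) -> str:
    """Suggest an IV&V procurement method based on the two price agreement limits.

    Any realistic chance of exceeding either limit (setbacks, delays, scope or
    schedule changes) argues for an RFP, which allows later changes to scope and cost.
    """
    if estimated_cost > PRICE_AGREEMENT_COST_LIMIT or estimated_days > PRICE_AGREEMENT_TERM_LIMIT:
        return "RFP"
    return "State Purchasing Division price agreement (or RFP for added flexibility)"


# Example: an 18-month engagement exceeds the 365-day term limit, so an RFP is indicated.
print(recommended_procurement_method(estimated_cost=150_000, estimated_days=540))
```

As the sketch suggests, the limits should be tested against realistic worst-case estimates rather than the baseline schedule, since IV&V must continue through any delays.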
IV&V Scope of Work

Deliverables should be tailored to the project in its entirety, which means tailored to the project risk, project scope and project budget. The deliverables listed below are high-level suggestions. Project deliverables and task items may also include the reporting topics in Appendix A.

Small Projects – Scope of work may include COTS Applications: Installation, Configuration and Migration. Deliverables: IV&V Project Management Plan, Conduct Initial Review, Conduct Periodic Review(s), Planning Oversight, Project Management, Quality Management, Training, Requirements Management, and System and Acceptance Testing.

Medium Projects – Scope of work may include COTS Applications: Installation, Modification, Conversion, Implementation, Testing and Multi-Stakeholders. Deliverables: IV&V Project Management Plan, Conduct Initial Review, Conduct Periodic Review(s), Planning Oversight, Project Management, Quality Management, Training, Requirements Management, System and Acceptance Testing, and Data Management.

Large Projects – Scope of work may include New Development, Requirements Gathering, Architecture Design, Implementation, Conversion, Testing, Deployment, Multi-Stakeholders, and Multiple Agencies. Deliverables: IV&V Project Management Plan, Conduct Initial Review, Conduct Periodic Review(s), Planning Oversight, Project Management, Quality Management, Training, Requirements Management, Operating Environment, Development Environment, Software Development, System and Acceptance Testing, Data Management and Operations Oversight.

All projects subject to IV&V must include, at a minimum, the following IV&V activities:
• IV&V project management plan
• Initial assessment
• Initial status reports
• Interim project progress
• Ongoing risk analysis
• Deliverable forecast

DoIT may require other specific deliverables for a project depending on the risk of the project.

Deliverable One: IV&V Project Management Plan

In general, the plan will describe the activities, vendor personnel, project schedule and methodology for conducting the reviews.

Project Reporting Topics and Frequency

The project management plan will include any deliverables, timeline, frequency and personnel negotiated in the IV&V contract, as well as details on the conduct of specific topics for verification and validation of the Implementation and Closeout phases, including post-implementation reporting. It is highly recommended that periodic reports use the IV&V report template in Appendix B, and they are required to contain the mandatory reporting criteria. Reporting topics should be tailored to the project respective to risk, scope, and budget. Monthly reporting is recommended and preferred.

Upon acceptance of the IV&V project management plan deliverable, the next report is usually an initial assessment of the project.

Deliverable Two: Initial IV&V Report

Following acceptance of the IV&V Project Management Plan, the vendor shall conduct a review of project risks and activities to produce an initial IV&V assessment. The report must contain the following sections.
Risk Management Assessment – The IV&V vendor shall verify and evaluate the approach and potential effectiveness of the risk management strategy for the project. Verify project risk identification in the Project Management Plan and Risk Analysis Matrix. Verify that risk management mitigation strategies are appropriate and traceable to identified risks.

Project Management Assessment – The IV&V vendor shall verify and assess project management and organization, and verify that lines of reporting and responsibility provide adequate managerial oversight of the project. The IV&V vendor shall validate identified roles and responsibilities of key project personnel, including technical and business ownership.

Verify Project Deliverables – The IV&V vendor shall verify that Project Deliverables are appropriately scoped and detailed respective to the current phase, including but not limited to:
- Project Management Plan
- Project Budget
- Project Deliverables and associated milestones

Verify Product Deliverables – The IV&V vendor shall verify that Product Deliverables are clearly organized and defined. Expected products and/or services and associated results should be explicitly defined, including:
- Identification of Critical Components to achieve lasting business value

During the course of the project, at prescribed intervals, the IV&V vendor shall prepare interim project progress reports as follows.

Deliverable Three: IV&V Periodic Reports

Executive Summary – A summary of the overall project progress must include status and trend per the Appendix B template. A sketch of this rating rule appears at the end of this deliverable description.

Green – All scope, budget, schedule or IV&V project issues are manageable by the project team, and issues are resolved within an appropriate time period, e.g. 30 days for short-term projects to 90 days for long-term projects.

Yellow – All scope, budget, schedule or IV&V project issues are manageable, but one or more has been escalated to the Executive Steering Committee or requires executive management intervention for resolution in accordance with the project's Issue Escalation and Resolution Procedure, e.g. 60 days for short-term projects to 120 days for long-term projects. Includes all Watchlist projects.

Red – Scope, budget, schedule or IV&V project issues have been escalated to the Executive Sponsor in accordance with the project's Issue Escalation and Resolution Procedure and the project requires DoIT assistance. Includes all projects with outstanding issues that cannot be resolved by the Executive Sponsor within 45 days and all At Risk projects.

Trend – Improving, Static or Deteriorating.

Project Reporting Topics – The IV&V vendor will report on the Reporting Topics agreed upon in Deliverable One above. Periodic reports shall reflect the current phase of the project and associated reporting topics. Reporting topics should be tailored to the project respective of risk, scope and budget. Monthly reporting is preferred.

Ongoing Risk Analysis – The IV&V vendor shall report on project risk activity, including agency management of identified risks.

Deliverable Forecast – The IV&V vendor shall report on completion or non-completion of critical project components and/or deliverables, especially when they are in danger of not being achieved.

It is highly recommended that the IV&V vendor conduct a post-implementation assessment after the project has been in production for several months to report whether business and technical objectives were achieved based on project scope and acceptance criteria. This report, or the final IV&V report, is submitted with the request to certify project closeout. If the report contains any outstanding issues, the agency must address them in writing.
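The Green/Yellow/Red rating above is essentially a classification based on where open issues currently sit in the escalation chain. The sketch below is a hypothetical illustration only, not part of the DoIT reporting template; the enum, function name and parameters are assumptions introduced for this example.

```python
# Hypothetical sketch of the overall-status rating described above.
# The escalation levels and color rules follow the guideline text;
# the names and structure are illustrative assumptions.
from enum import Enum


class Escalation(Enum):
    PROJECT_TEAM = 1          # issues manageable and resolved by the project team
    STEERING_COMMITTEE = 2    # one or more issues escalated to the Executive Steering Committee
    EXECUTIVE_SPONSOR = 3     # issues escalated to the Executive Sponsor; DoIT assistance required


def overall_status(highest_escalation: Escalation,
                   on_watchlist: bool = False,
                   at_risk: bool = False) -> str:
    """Map the highest current escalation level to a Green/Yellow/Red status."""
    if at_risk or highest_escalation is Escalation.EXECUTIVE_SPONSOR:
        return "Red"
    if on_watchlist or highest_escalation is Escalation.STEERING_COMMITTEE:
        return "Yellow"
    return "Green"


# Example: issues escalated to the steering committee yield a Yellow status.
print(overall_status(Escalation.STEERING_COMMITTEE))
```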
Report Submission

The IV&V vendor shall submit all reports to the agency contractual recipients, the agency executive steering committee and the Department of Information Technology Project Oversight and Compliance Division (IVANDV.REPORTS@state.nm.us) within five (5) business days of each deliverable due date as indicated in the IV&V contract.

APPENDIX A

Project Deliverables Task Items / Reporting Topics

Planning Oversight

Procurement
  PO-1  Verify that the Agency procurement strategy supports State of New Mexico regulations and applicable Federal procurement policies.
  PO-2  Review and make recommendations on the quality of the solicitation documents relative to their ability to adequately inform potential vendors about Project objectives, requirements, risks, etc.
  PO-3  Verify that procurement evaluation criteria are consistent with Project objectives and that evaluation processes are consistently applied; verify that all evaluation criteria are metrics based and clearly articulated within the solicitation documents.
  PO-4  Verify that the obligations of the vendor, sub-contractors and external staff (terms, conditions, statement of work, requirements, technical standards, performance standards, development milestones, acceptance criteria, delivery dates, etc.) are clearly defined. This includes verifying that performance metrics have been included, allowing the tracking of Project performance and progress against criteria set by the State.
  PO-5  Verify that the final contract for the vendor team states that the vendor will participate in the IV&V process and be cooperative in the coordination and communication of information.

Project Justification
  PO-6  Assess and review the agency's methodology used for the feasibility study, verifying it was objective, reasonable, measurable, repeatable, consistent, accurate and verifiable.
  PO-7  Review the agency's project justification and business case, and assess the alignment of the project with the purpose and mission of the agency or agencies.
  PO-8  Review and evaluate the Agency's Cost Benefit Analysis to assess its reasonableness relative to the project.

Project Complexity
  PO-9  Verify that the assigned complexity level is current and accurate. If the project complexity is not current and/or accurate, re-assess its complexity to make sure the appropriate project tasks and reporting are being carried out.

Project Management

Project Sponsorship
  PM-1  Assess participation, support and commitment by the Executive Sponsor(s) and Business Owner(s), and that open pathways of communication exist between Project Management and the Executive Sponsor(s).
  PM-2  Verify that there is a Project Governance plan and an active Executive Steering Committee whose role is to acknowledge all changes impacting Project objectives, cost, or schedule.

Management Assessment
  PM-3  Verify and assess Project management and organization; verify that lines of reporting and responsibility provide adequate technical and managerial oversight of the Project.
  PM-4  Evaluate Project progress, resources, budget, schedules, work flow, and reporting.
  PM-5  Assess coordination, communication and management to verify that agencies and departments are following the communication plan and are not working independently of one another.

Project Management
  PM-6  Verify that a Project Management Plan is being followed.
        Evaluate the Project management plans and procedures to verify that they are developed, communicated, implemented, monitored and complete.
  PM-7  Evaluate the Project reporting plan and actual Project reports to verify that Project status is accurately traced using Project metrics.
  PM-8  Verify that milestones and completion dates are planned, monitored, and met.
  PM-9  Verify the existence and institutionalization of an appropriate Project issue tracking mechanism that documents issues as they arise, enables communication of issues to the proper state agency individuals/entities, documents a mitigation strategy as appropriate, and tracks the issue to closure. This should include, but is not limited to, technical and development efforts.
  PM-10 Evaluate the project's planned product life-cycle development methodology or methodologies (waterfall, evolutionary spiral, rapid prototyping, incremental, etc.) to see if they are appropriate for the product being developed.

Business Process Reengineering
  PM-12 Evaluate the Project's ability and plans to redesign business systems to achieve improvements in critical measures of performance, such as cost, quality, service, and speed.
  PM-13 Verify that the reengineering plan has the strategy, management backing, resources, skills and incentives necessary for effective change.
  PM-14 Verify that resistance to change is anticipated and prepared for by using principles of change management at each step (such as excellent communication, participation, incentives) and having the appropriate leadership (executive pressure, vision, and actions) throughout the reengineering process.

Risk Management
  PM-15 Verify that the Project is managing project risk through reporting, logging and acting to reduce risk. Ascertain that the project is following the State's requirement for an initial and then periodic risk assessment reports. Evaluate the Project's risk management plans and procedures to verify that risks are identified and quantified and that mitigation and contingency plans are developed, communicated, implemented, monitored, and complete.

Change Management
  PM-16 Verify that a Change Management Plan is created and being followed. Evaluate the change management plans and procedures to verify they are developed, communicated, implemented, monitored, and complete, and that Project Managers anticipate and prepare for resistance to change.

Communication Management
  PM-17 Verify that a Communication Plan is created and being followed. Evaluate the communication plans and strategies to verify they support communications and work product sharing; assess communication plans and strategies for effectiveness, implementation, monitoring and completeness.

Configuration Management
  PM-18 Review and evaluate the configuration management (CM) plans and procedures associated with the project management documents and with the product development process; configuration management deals with both.
  PM-19 Verify that all critical documents, project management and development, including but not limited to requirements, design, code and JCL, are maintained under an appropriate level of control.
  PM-20 Verify that the processes and tools are in place to identify code versions and to rebuild system configurations from the source code.
  PM-21 Verify that appropriate source and object libraries are maintained for training, testing, and production, and that a formal sign-off procedure is in place for approving deliverables.
  PM-22 Verify that appropriate processes and tools are in place to manage system changes, including formal logging of change requests and the review, prioritization and timely scheduling of maintenance actions.
  PM-23 Verify that mechanisms are in place to prevent unauthorized changes to the system and to prevent authorized changes from being made to the wrong version.
  PM-24 Review the use of corrective maintenance information (such as the number and type of corrective maintenance actions over time) in Project management.

Project Estimating and Scheduling
  PM-25 Evaluate and make recommendations on the estimating and scheduling process of the Project to ensure that the Project planning assumptions, budget and resources are adequate for the work-breakdown structure and schedule.
  PM-26 Review schedules to verify that adequate time and resources are assigned for planning, development, review, testing and rework.
  PM-27 Examine relevant supporting data to determine if the Project team has accurately estimated the time, labor and cost of software development efforts.

Project Personnel
  PM-28 Examine the job assignments, skills, training and experience of the personnel involved in program development to verify they have the appropriate skills for the development task.
  PM-29 Evaluate the staffing plan for the Project to verify that adequate human resources will be available for development and maintenance.

Project Organization
  PM-30 Evaluate the Project Manager's qualifications and comment on the project's fulfillment of the State requirement for qualified project managers.
  PM-31 Verify that lines of reporting and responsibility provide adequate technical and managerial oversight of the Project.
  PM-32 Verify that the Project's organizational structure supports training, process definition, independent Quality Assurance, Configuration Management, risk and issue management, product evaluation, and any other functions critical for the Project's success.

Subcontractors and External Staff
  PM-33 Evaluate the use of sub-contractors or other external sources of Project staff (such as IS staff from another State organization) and its impact on budget and knowledge transfer.
  PM-34 Verify that the obligations of sub-contractors and external staff (terms, conditions, statement of work, requirements, standards, development milestones, acceptance criteria, delivery dates, etc.) are clearly defined.
  PM-35 Verify that the subcontractors' software development methodology and product standards are compatible with the system's standards and environment.
  PM-36 Verify that the subcontractor has and maintains the required skills, personnel, plans, resources, procedures and standards to meet their commitments. This will include examining the feasibility of any offsite support of the Project.
  PM-37 Verify that any proprietary tools used by subcontractors do not restrict the future maintainability, portability, and reusability of the system.

State Oversight
  PM-38 Verify that the project is in compliance with the State of New Mexico's Enterprise Architecture, Information Security and other IT policies and guidelines.
  PM-39 Verify that the project is in compliance with the DoIT memorandum on the project oversight process and project certification process.
  PM-40 Verify that State staff has the ultimate responsibility for monitoring Project cost and schedule.

Quality Management

Quality Assurance
  QA-1  Evaluate and make recommendations on the Project's Quality Assurance plans, procedures and organization.
  QA-2  Verify that QA has an appropriate level of independence from Project management.
  QA-3  Verify that the QA organization monitors the fidelity of all defined processes in all phases of the Project.
  QA-4  Verify that the quality of all products produced by the Project is monitored by formal reviews and sign-offs.
  QA-5  Verify that Project self-evaluations are performed and that measures are continually taken to improve the process.
  QA-6  Monitor the performance of the QA team and/or contractor by reviewing its processes and reports and performing spot checks of system documentation; assess findings and the performance of the processes and reports.
  QA-7  Verify that QA has an appropriate level of independence; evaluate and make recommendations on the Project's Quality Assurance plans, procedures and organization.
  QA-8  Evaluate whether appropriate mechanisms are in place for Project self-evaluation and process improvement.

Process Definition and Product Standards
  QA-9  Review all defined processes and product standards associated with the system development and identify gaps, if any.
  QA-10 Verify that all major development processes are defined and that the defined and approved processes and standards are followed in development.
  QA-11 Verify that the processes and standards are compatible with each other and with the system development methodology.
  QA-12 Verify that all process definitions and standards are complete, clear, up-to-date, consistent in format, and easily available to Project personnel.

Training

User Training and Documentation
  TR-1  Review and make recommendations on the training provided to system users.
  TR-2  Verify that training for users is instructor-led and hands-on and is directly related to the business process and required job skills.
  TR-3  Verify that user training materials and help desk support documentation match the accepted system.
  TR-4  Verify that all necessary policy, process and procedure documentation is easily available to users.
  TR-5  Verify that all training is given on time and is evaluated and monitored for effectiveness, with additional training provided as needed.

Developer Training and Documentation
  TR-6  Review and make recommendations on the training provided to system developers.
  TR-7  Verify that developer training is technically adequate, appropriate for the development phase, and available at appropriate times.
  TR-8  Verify that all necessary policy, process, procedure and standards documentation is easily available to developers.
  TR-9  Verify that all training is given on time and is evaluated and monitored for effectiveness, with additional training provided as needed.

System Administrators Training and Documentation
  TR-10 Review and make recommendations on the training provided to system administrators.
  TR-11 Verify sufficient knowledge transfer for maintenance and operation of the new system.
  TR-12 Verify that systems administrator training is technically adequate, appropriate for the development phase, and available at appropriate times.
  TR-13 Verify that all necessary policy, process, procedure and standards documentation is easily available to systems administrators.

Requirements Management

Requirements Management
  RM-1  Evaluate and make recommendations on the Project's process and procedures for managing requirements.
  RM-2  Verify that business, technical and system requirements, as well as operations requirements, are well-defined, understood and documented.
  RM-3  Verify that business, technical and system requirements, as well as operations requirements, can be traced through the technical design, system build, software coding/configuration and test phases to verify that the system performs as intended and contains no unnecessary software elements.
  RM-4  Verify that requirements are under formal configuration control.

Security and Privacy Requirements
  RM-5  Evaluate and make recommendations on Project policies and procedures for ensuring that the system is secure and that the privacy of client data is maintained.
  RM-6  Evaluate the project's compliance with State of New Mexico IT security policies, processes and procedures in the development of the product's IT security.
  RM-7  Evaluate the project's planning for compliance with Federal security and privacy requirements.
  RM-8  Verify that processes and equipment are in place to back up client and Project data and files and archive them safely at appropriate intervals.

Requirements Analysis
  RM-9  Verify that an analysis of client, State and federal needs and objectives has been performed to verify that the requirements of the system are well understood, well defined, and satisfy state and federal regulations.
  RM-10 Verify that all necessary state agency individuals/entities have been consulted on the desired functionality of the system, and that users have been involved in prototyping of the user interface.
  RM-11 Verify that all state agency individuals/entities have bought in to all changes which impact Project objectives, cost, or schedule.
  RM-12 Verify that performance requirements (e.g. timing, response time and throughput) satisfy user needs.
  RM-13 Verify that users' maintenance requirements for the system are completely specified.

Interface Requirements
  RM-14 Verify that all system interfaces are exactly described, by medium and by function, including input/output control codes, data format, polarity, range, units, and frequency.
  RM-15 Verify that approved interface documents are available and that appropriate relationships (such as interface working groups) are in place with all agencies and organizations supporting the interfaces.

Requirements Allocation and Specification
  RM-16 Verify that all system requirements have been allocated to either a software or hardware subsystem.
  RM-17 Verify that requirements specifications have been developed for all hardware and software subsystems in a sufficient level of detail to ensure successful implementation.
  RM-18 Validate that there are objective acceptance criteria for validating the requirements of the project, including those in any requirements specification documents.

Reverse Engineering
  RM-19 If a legacy system or a transfer system is or will be used in development, verify that a well-defined plan and process for reengineering the system is in place and is followed.
        The process, depending on the goals of the reuse/transfer, may include reverse engineering, code translation, re-documentation, restructuring, normalization, and re-targeting.

Operating Environment

System Hardware
  OE-1  Evaluate new and existing system hardware configurations to determine if their performance is adequate to meet existing and proposed system requirements.
  OE-2  Determine if the hardware is compatible with the State's existing processing environment, if it is maintainable, and if it is easily upgradeable. This evaluation will include, but is not limited to, CPUs and other processors, memory, network connections and bandwidth, communication controllers, telecommunications systems (LAN/WAN), terminals, printers and storage devices.
  OE-3  Evaluate current and projected vendor support of the hardware, as well as the State's hardware configuration management plans and procedures.

System Software
  OE-4  Evaluate new and existing system software to determine if its capabilities are adequate to meet existing and proposed system requirements.
  OE-5  Determine if the software is compatible with the State's existing hardware and software environment, if it is maintainable, and if it is easily upgradeable. This evaluation will include, but is not limited to, operating systems, middleware, and network software including communications and file-sharing protocols.
  OE-6  Has the Project evaluated current and projected vendor support of the software, as well as the State's software acquisition plans and procedures?

Database Software
  OE-7  Has the project evaluated new and existing database products to determine if their capabilities are adequate to meet existing and proposed system requirements?
  OE-8  Has the Project determined if the database's data format is easily convertible to other formats, if it supports the addition of new data items, if it is scalable, if it is easily refreshable and if it is compatible with the State's existing hardware and software, including any on-line transaction processing (OLTP) environment?
  OE-9  Has the Project evaluated current and projected vendor support of the software, as well as the State's software acquisition plans and procedures?

System Capacity
  OE-10 Has the project evaluated the existing processing capacity of the system and verified that it is adequate for current statewide needs for both batch and on-line processing?
  OE-11 Has the project evaluated the historic availability and reliability of the system, including the frequency and criticality of system failure?
  OE-12 Evaluate the results of any volume testing or stress testing.
  OE-13 Has the Project evaluated any existing measurement and capacity planning program and evaluated the system's capacity to support future growth?
  OE-14 Has the project investigated changes in processing hardware, storage, network systems, operating systems, COTS software, and software design to meet future growth and improve system performance?

Network Capacity and Connectivity
  OE-15 Verify that the project has ascertained that network connections and bandwidth, communication controllers, and telecommunications systems (LAN/WAN) are being benchmarked and are sufficient to support the solution.
  OE-16 Verify that coordination is occurring with the State DoIT Telecommunications Infrastructure Voice and Radio staff.
Operational Staffing
  OE-17 Has the project verified and evaluated operations staffing requirements for implementation and post-implementation, including but not restricted to existing agency or DoIT staff and staffing levels? Identify any needs for specialized skill sets.

Development Environment

Development Hardware
  DE-1  Has the project evaluated new and existing development hardware configurations to determine if their performance is adequate to meet the needs of system development?
  DE-2  Determine if the hardware is maintainable, easily upgradeable, and compatible with the agency's or State's existing development and processing environment. This evaluation will include, but is not limited to, CPUs and other processors, memory, network connections and bandwidth, communication controllers, telecommunications systems (LAN/WAN), terminals, printers and storage devices.
  DE-3  Has current and projected vendor support of the hardware been evaluated, as well as the State's hardware configuration management plans and procedures?

Development Software
  DE-4  Evaluate new and existing development software to determine if its capabilities are adequate to meet system development requirements.
  DE-5  Determine if the software is maintainable, easily upgradeable, and compatible with the State's existing hardware and software environment.
  DE-6  Evaluate the environment as a whole to see if it shows a degree of integration compatible with good development. This evaluation will include, but is not limited to, operating systems, network software, CASE tools, Project management software, configuration management software, compilers, cross-compilers, linkers, loaders, debuggers, editors, and reporting software.
  DE-7  Language and compiler selection will be evaluated with regard to portability and reusability (ANSI standard language, non-standard extensions, etc.).
  DE-8  Has the Project evaluated current and projected vendor support of the software as part of the State's software acquisition plans and procedures?

Software Development

High-Level Design
  SD-1  Evaluate and make recommendations on existing high-level design products to verify the design is workable, efficient, and satisfies all system and system interface requirements.
  SD-2  Evaluate the design products for adherence to the Project design methodology and standards.
  SD-3  Evaluate the design and analysis process used to develop the design and make recommendations for improvements. Evaluate the design standards, methodology and CASE tools used and make recommendations.
  SD-4  Verify that design requirements can be traced back to system requirements.
  SD-5  Verify that all design products are under configuration control and formally approved before detailed design begins.

Detailed Design
  SD-6  Evaluate and make recommendations on existing detailed design products to verify that the design is workable, efficient, and satisfies all high-level design requirements.
  SD-7  The design products will also be evaluated for adherence to the Project design methodology and standards.
  SD-8  The design and analysis process used to develop the design will be evaluated and recommendations for improvements made.
  SD-9  Design standards, methodology and CASE tools used will be evaluated and recommendations made.
  SD-10 Verify that design requirements can be traced back to system requirements and high-level design.
  SD-11 Verify that all design products are under configuration control and formally approved before coding begins.

Job Control
  SD-12 Perform an evaluation and make recommendations on existing job control and on the process for designing job control.
  SD-13 Evaluate the system's division between batch and on-line processing with regard to system performance and data integrity.
  SD-14 Evaluate batch jobs for appropriate scheduling, timing and internal and external dependencies.
  SD-15 Evaluate the appropriate use of OS scheduling software.
  SD-16 Verify that job control language scripts are under an appropriate level of configuration control.

Code
  SD-17 Evaluate and make recommendations on the standards and process currently in place for code development.
  SD-18 Evaluate the existing code base for portability and maintainability, taking software metrics including but not limited to modularity, complexity and source and object size.
  SD-19 Code documentation will be evaluated for quality, completeness (including maintenance history) and accessibility.
  SD-20 Evaluate the coding standards and guidelines and the Project's compliance with these standards and guidelines. This evaluation will include, but is not limited to, structure, documentation, modularity, naming conventions and format.
  SD-21 Verify that developed code is kept under appropriate configuration control and is easily accessible by developers.
  SD-22 Evaluate the Project's use of software metrics in management and quality assurance.

Unit Test
  SD-23 Evaluate the plans, requirements, environment, tools, and procedures used for unit testing system modules.
  SD-24 Evaluate the level of test automation, interactive testing and interactive debugging available in the test environment.
  SD-25 Verify that an appropriate level of test coverage is achieved by the test process, that test results are verified, that the correct code configuration has been tested, and that the tests are appropriately documented.

System and Acceptance Testing

System Integration Test
  ST-1  Evaluate the plans, requirements, environment, tools, and procedures used for integration testing of system modules.
  ST-2  Evaluate the level of automation and the availability of the system test environment.
  ST-3  Verify that an appropriate level of test coverage is achieved by the test process, that test results are verified, that the correct code configuration has been tested, and that the tests are appropriately documented, including formal logging of errors found in testing.
  ST-4  Verify that the test organization has an appropriate level of independence from the development organization.

Pilot Test
  ST-5  Evaluate the plans, requirements, environment, tools, and procedures for pilot testing the system.
  ST-6  Verify that a sufficient number and type of case scenarios are used to ensure comprehensive but manageable testing and that tests are run in a realistic, real-time environment.
  ST-7  Verify that test scripts are complete, with step-by-step procedures, required pre-existing events or triggers, and expected results.
  ST-8  Verify that test results are verified, that the correct code configuration has been used, and that the test runs are appropriately documented, including formal logging of errors found in testing.
  ST-9  Verify that the test organization has an appropriate level of independence from the development organization.
Interface Testing
  ST-10 Evaluate interface testing plans and procedures for compliance with industry standards.

Acceptance and Turnover
  ST-11 Verify that acceptance procedures and acceptance criteria for each product are defined, reviewed, and approved prior to test and that the results of the test are documented. Acceptance procedures must also address the process by which any software product that does not pass acceptance testing will be corrected.
  ST-12 Verify that appropriate acceptance testing based on the defined acceptance criteria is performed satisfactorily before acceptance of software products.
  ST-13 Verify that the acceptance test organization has an appropriate level of independence from the subcontractor.
  ST-14 Verify that all training materials, end user and system administrator, for using the contractor-supplied and/or contractor-customized software are taken through a testing and user acceptance process to assure that they match the actual finished and delivered software solution.
  ST-15 Verify the existence of agency or DoIT operations management acceptance of the solution to be placed under their operational management.

Implementation Planning
  ST-16 Review and evaluate the implementation plan, including consideration of training, agency productivity, scheduling and other factors that could interfere with a successful implementation.

Data Management

Data Conversion
  DM-1  Evaluate the project's existing and proposed plans, procedures and software for data conversion.
  DM-2  Verify that procedures are in place and are being followed to review the converted data for completeness and accuracy and to perform data clean-up as required.
  DM-3  Determine conversion error rates and whether the error rates are manageable.
  DM-4  Make recommendations on making the conversion process more efficient and on maintaining the integrity of data during the conversion.

Database Design
  DM-5  Evaluate new and existing database designs to determine if they meet existing and proposed system requirements.
  DM-6  Recommend improvements to existing designs to improve data integrity and system performance.
  DM-7  Evaluate the design for maintainability, scalability, refreshability, concurrence, normalization (where appropriate) and any other factors affecting performance and data integrity.
  DM-8  Evaluate the Project's process for administering the database, including backup, recovery, performance analysis and control of data item creation.

Operations Oversight

Operational Issues and Change Management Tracking
  OO-1  Evaluate post-implementation plans for tracking issues and formulating change requests and fixes.
  OO-2  Evaluate post-implementation monitoring and maintenance processes to assure proper solution operations and verify operations team sign-off.
  OO-3  Evaluate the post-implementation change management plan.

Customer & User Operational Satisfaction
  OO-3  Evaluate the methods to be used to obtain user satisfaction feedback to determine areas for improvement.

Operational Goals
  OO-4  Evaluate the impact of the system on program goals and performance standards.

Operational Documentation
  OO-5  Evaluate whether operational plans and processes are properly documented for turnover to operations teams.

Operational Processes and Activity
  OO-6  Evaluate the implementation of the process activities, including backup, disaster recovery and day-to-day operations, to verify the processes are being followed.
APPENDIX B

IV&V Reporting Template (starts next page)

TITLE
Agency and Project Names
Date
etc.

REVISION NUMBER   REVISION DATE   SUMMARY OF CHANGE
#.#               mm/dd/yyyy      IV&V Report – deliverable ## [description]

[Which, e.g. Initial, Interim, Final] IV&V Report: [Deliverable Name and/or #]
[Agency Name]
[Project Name]

PROJECT AT A GLANCE

Overall Status* (see footnote for definitions) [Check ONLY one box]:  Green [ ]  Yellow [ ]  Red [ ]
[Insert brief new comments in blue to explain choice of overall status G/Y/R rating]

Overall Trend [Check ONLY one box]:  Improving [ ]  Static [ ]  Deteriorating [ ]
[Insert brief new comments in blue to explain choice of overall trend rating]

Immediate Attention Required
Issue/Action | Status | By When | By Whom
[Describe until next report after status marked "Done"; new in blue]

Accomplishments
[Describe key accomplishments briefly; new in blue]

Highest Risks
[Describe highest risks briefly; new in blue]

High Level Recommendations
[Describe until next report after status marked "Done"; new in blue]

Executive Summary
[Narrative in blue]

*Green – All scope, budget, schedule or IV&V project issues are manageable by the project team & issues are resolved within an appropriate time period, e.g. 30 days for short-term projects to 90 days for long-term projects.
*Yellow – All scope, budget, schedule or IV&V project issues are manageable, but one or more is escalated to the Executive Steering Committee or require(s) executive management intervention for resolution in accordance with the project's Issue Escalation and Resolution Procedure, e.g. 60 days for short-term projects to 120 days for long-term projects. Includes all Watchlist projects.
*Red – Scope, budget, schedule or IV&V project issues have been escalated to the Executive Sponsor in accordance with the project's Issue Escalation and Resolution Procedure and the project requires DoIT assistance. Includes all projects with outstanding issues that cannot be resolved by the Executive Sponsor within 45 days and all At Risk projects.

DETAIL REPORT

The tables below cover status and trend for each task identified in the IV&V contract.
[Choose appropriate format, status and trend indicators for the detail tasks below]

[Task 1 description]                                      Status: ___   Trend: ___
  Overall status: [Brief summary]
  Details: [Describe key details briefly; new in blue]

[Task 2 description]                                      Status: ___   Trend: ___
  Overall status: [Brief summary]
  Details:
  - [Describe key detail briefly; new in blue]
  - [Describe key detail briefly; new in blue]
  - [Describe key detail briefly; new in blue]

[Task 3 description]                                      Status: ___   Trend: ___
  Overall status: [Brief summary]
  Details:
  - [Describe key details briefly; new in blue]
  - [Describe key details briefly; new in blue]
  - [Describe key details briefly; new in blue]

DOCUMENTS REVIEWED FOR THIS REPORT
[List document names and versions]
[List document names and versions]
[List document names and versions]

RISKS, MITIGATION, CONTINGENCIES AND ACTIONS
Risk | Mitigation | Contingency | Action(s) Taken
[Describe key details briefly; new in blue] | [Describe key details briefly; new in blue] | [Describe key details briefly; new in blue] | [Describe key details briefly; new in blue]
[Describe key details briefly; new in blue] | [Describe key details briefly; new in blue] | [Describe key details briefly; new in blue] | [Describe key details briefly; new in blue]
[Describe key details briefly; new in blue] | [Describe key details briefly; new in blue] | [Describe key details briefly; new in blue] | [Describe key details briefly; new in blue]