Constructing Independent Verification and Validation Life Cycles Using Process Kernels

1. Introduction

Effectively applied, Independent Verification and Validation (IV&V) is a catalyst for improving the quality of a computer software system and the quality of the software development process. Verification activities provide assurance that the products of a particular software development stage comply with the pre-defined specifications and standards applicable to that stage; they occur throughout the software development life cycle (SDLC). Validation activities provide assurance that the end product meets the original specifications and standards; they occur predominantly at the end of the SDLC. Independence of the IV&V organization from the software development organization, while not mandatory, is recommended because it assures the objectivity of the verification and validation process and its results. Various degrees of independence are employed in practice. The highest independence is achieved when the IV&V organization is contractually obligated to the highest authority responsible for procuring the subject system. Limited independence is achieved when the IV&V organization reports to the procuring agent, or even to the development organization above the level of project management. Although verification and validation can be accomplished as part of the development project with no independence from project management, the greatest independence assures the most objectivity. IV&V is employed most extensively for high-assurance systems (safety-critical systems, real-time systems, aerospace systems, traffic control systems, etc.) where a high level of integrity is required. IV&V provides added assurance that potential risk areas are considered and risk mitigation is performed.
Although the recommendation for high-assurance systems is a robust, full life cycle IV&V (with an emphasis on the early phases of the development life cycle), not all systems warrant the same level or type of IV&V. The fundamental goals of all levels and types of IV&V are twofold. One goal is to continuously improve the quality of the development process and products. The other is to improve the quality of IV&V's own process and products. Improvements are achieved by measuring quality, setting progressive goals for improvement, and monitoring progress. To do this effectively, IV&V must be a formalized, repeatable, and optimized process that employs proven methods and standards. The IV&V process must also be tailorable to match the needs of any development model (waterfall, evolutionary, incremental, spiral, rapid-prototype, etc.), any type of system (management information systems, safety systems, real-time systems, knowledge-based systems, simulation and modeling, etc.), and any development methodology (object-oriented, top-down structured, etc.). This presents a unique challenge for conducting IV&V. The kernel concept and process was developed by Humphrey in Chapter 13 of Managing the Software Process [6]. The Entry, Task, Verification, and Exit (ETVX) paradigm is a concept initially developed by Radice et al. at IBM in "A Programming Process Architecture" [5]. The IV&V life cycle methodology is based on a pre-defined repository of process "kernels" from which the IV&V life cycle for a given project is defined. A kernel is defined for each IV&V function, such as Requirements Analysis, Document Review, and Code Analysis. Each kernel contains entry and exit criteria, inputs and outputs, activities, process controls, and metrics for a given IV&V activity. These kernels are then mapped to the development model as determined by the needs of the project.
Section 2 provides an overview of the IV&V life cycle methodology. Section 3 describes the construction of IV&V life cycles using IV&V kernels, and Section 4 details the IV&V Process Kernel using the Requirements Tracing kernel as an example. Section 5 describes the metrics data collection process and provides an example for the Requirements Tracing kernel.

2. Overview of the IV&V Life Cycle Methodology

Unique challenges are continuously presented to IV&V. Schedule slips may erode or eliminate the time allocated to perform important activities. Documentation and other artifacts may be out of date or non-existent. Therefore, a delicate balance between the specific needs of a project and the most robust, full life cycle IV&V is required to reduce risk and maximize the quality of a system. The IV&V life cycle methodology provides the means to meet these challenges through a formalized, repeatable process that is optimized for the development model, development methodology, application type, and life cycle phase(s) that apply. The methodology is designed to elevate the maturity of the IV&V process to ensure improvement in the quality of both software development and IV&V processes and products. The IV&V life cycle methodology includes a set of stable, well-defined IV&V Process Kernels supported by robust program control and metrics. Because IV&V must match the needs of varied system development efforts while remaining formalized and repeatable, a modular construction is necessary. Webster's dictionary defines modular as "designed with standardized units or dimensions for flexible use." This modular concept was proposed for software development in IEEE Std 1074-1991 [3], "Standard for Developing Software Life Cycle Processes." The kernel approach is expanded and applied to IV&V, with the result being a repository of IV&V Process Kernels.
Basically, an IV&V Process Kernel is a modular, reusable, self-contained building block defining inputs, entry criteria, activities, exit criteria, outputs, process controls, and metrics. The complete set of kernels represents a partitioning of the total IV&V process into building blocks, a proper subset of which can address any development life cycle and any abbreviated permutation of IV&V. These building blocks can be used to construct a standard IV&V methodology tailored to any particular system, allowing one to apply tried and true IV&V practices to radically different projects. Although a mature process must be well-defined and repeatable, IV&V must retain much flexibility because of the unique challenges mentioned above. It is most advisable to employ full life cycle IV&V with emphasis on the early phases of the software development life cycle. This is seldom realized because of lack of resources (IV&V is often under-funded) and lack of opportunity (IV&V is often not funded until after coding is complete). At times, IV&V may be tasked to perform only validation or only verification of a particular life cycle phase. At times, IV&V may be called to rescue a faltering project that is already in the test and integration life cycle phase. The focus of IV&V should remain on risk abatement, where discrepancies are identified and ranked, associated risks are identified, and solutions are recommended. This focus is packaged within each IV&V Process Kernel and remains intact regardless of the size or type of IV&V life cycle that is constructed.

3. IV&V Life Cycle Construction

Each IV&V activity is defined as a self-contained IV&V Process Kernel. A configuration of kernels is mapped to the specific software development model being employed for a project. The most complete example of a full life cycle IV&V can be demonstrated for a simple waterfall development model.
In a waterfall model, requirements are presumed to be stable at the beginning of the project, and each development phase is executed consecutively until delivery of the final system.

3.1 The IV&V Life Cycle for a Waterfall Software Development Life Cycle

Figure 3-1 depicts a simple waterfall software development life cycle model with cascading, shaded boxes for each phase: Requirements, Design, Coding, Integration and Test, and Delivery. Behind each development phase are the IV&V Process Kernels relevant for that phase. The implementation of each process kernel depends on the requirements and constraints of the project and the specified IV&V statement of work. For instance, an IV&V change assessment may not be required at all, or IV&V audits may be needed at each phase along with milestone reviews. IV&V Document Review and Audit kernels may also be required for developer management, quality assurance, and/or configuration management products or processes. IV&V Project Management (PM), Configuration Management (CM), and Quality Assurance (QA) kernels are shown spanning the entire life cycle. They interact with all IV&V Process Kernels, including each other, to monitor and control IV&V processes. An IV&V life cycle such as the one shown in Figure 3-1 then becomes the basis for the project management plan and supports costing, sizing, and scheduling of all IV&V activities.

[Figure 3-1: IV&V Life Cycle for a Simple Waterfall SDLC — each SDLC phase (Requirements, Design, Coding, Integration & Test, Install, Delivery) is backed by its IV&V kernels (e.g., Requirements Analysis, Requirements Tracing, Document Review, Formal Reviews, Criticality Analysis, Change Assessment, Design Analysis, Code Analysis, Interface Verification, Test Planning, Test Data/Tool Generation, Test Monitoring, Test Execution, Install Planning, Training, Audit), with Project Management and CM/QA spanning the entire life cycle.]

3.2 The IV&V Life Cycle for Model Verification

In contrast to the robust, full life cycle IV&V process for the simple waterfall SDLC model shown in Figure 3-1, a statement of work for IV&V may define a narrower role. It may restrict the IV&V role to a subset of development life cycle phases, or to a role that does not parallel the development process in any way. An example of such a situation is shown in Figure 3-2, IV&V Life Cycle for Model Verification. In this example, IV&V is restricted to only a portion of a total Verification, Validation & Accreditation (VV&A) process that is typical in the Department of Defense. Verification is the process of ensuring that the software does what its author intended it to do. Validation addresses the question of whether the software properly represents the real world (the model should be neither too crude nor too refined for the purpose intended). Accreditation is the assertion by a responsible authority or regulating agent that the model is suitable and sufficient for a specific use.

[Figure 3-2: IV&V Life Cycle for Model Verification — a completed model (external interface) enters the IV&V process, which applies the Special Analysis, Change Analysis, Test Planning, Test Data/Tool Generation, Code Analysis Verification, and Interface Verification kernels, with Project Management and CM/QA spanning the process; the IV&V products are a verified model and a special/certification report.]

The IV&V task in Figure 3-2 addresses only the verification process; Validation and Accreditation are not performed by IV&V. Verification begins with the hand-off of a completed model to the IV&V process (as defined by the contract or statement of work). Model in this context includes all types of coded simulations, algorithms, and formulas.
The Special Analysis kernel is invoked first and outlines the analysis and data gathering needed to support the remaining kernels. If the model is a re-submittal of one previously verified, the Change Analysis kernel is invoked. After the Special Analysis kernel is complete (all exit criteria and outputs have been addressed), the Test Planning kernel is invoked. It outlines activities to develop verification tests and establishes the possible need for special test tools or data that may need to be generated by the Test Data/Tool Generation kernel. Two types of verification activities (Code Analysis and Interface Verification) are conducted, both using test cases previously defined in the Test Planning kernel. Finally, the results are documented, and certification (that the model meets its intended requirements) is granted or denied on the basis of code analysis results and interface verification test results.

3.3 Optimized IV&V Life Cycle Process

The structure of the IV&V life cycle for the waterfall SDLC in Figure 3-1 is significantly different from the IV&V life cycle for model verification in Figure 3-2. The waterfall IV&V life cycle parallels, step by step, the development phases and spans the entire SDLC. The result is a series of assessments and activities that recommend improvements in the quality of development products. The model verification IV&V life cycle is independent of the software development life cycle and occurs on a completed model before release to its intended users. The result of the IV&V verification process is a fully verified model with known qualities as determined by code analysis and interface verification test results. What is not readily apparent from the IV&V life cycle figures are the similarities between the two examples. Both life cycles draw from the same pre-defined repository of kernels.
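This shared-repository construction can be sketched in code. The following is an illustrative sketch, not part of the methodology itself: the data structures and the `build_life_cycle` function are assumptions, while the kernel names follow the paper's Table 4-1.

```python
# Illustrative sketch: an IV&V life cycle is a mapping from phases to a
# subset of one shared kernel repository. Kernel names follow the paper;
# the code structures are assumptions for illustration only.

KERNEL_REPOSITORY = {
    # 18 IV&V Process Kernels (cf. Table 4-1)
    "Requirements Analysis", "Requirements Tracing", "Interface Verification",
    "Criticality Analysis", "Change Assessment", "Formal Reviews",
    "Document Review", "Design Analysis", "Special Analysis",
    "Requirements Acquisition", "Code Analysis", "Test Data/Tool Generation",
    "Audit", "Test Planning", "Test Monitoring", "Installation Planning",
    "Test Execution", "Training",
    # 3 continuing management and control kernels
    "Project Management", "Configuration Management", "Quality Assurance",
}

def build_life_cycle(phase_map):
    """Check that every kernel named in the phase map comes from the shared
    repository, then return the life cycle mapping unchanged."""
    for phase, kernels in phase_map.items():
        unknown = [k for k in kernels if k not in KERNEL_REPOSITORY]
        if unknown:
            raise ValueError(f"Not in kernel repository ({phase}): {unknown}")
    return phase_map

# A narrow model-verification life cycle (cf. Figure 3-2) draws on the same
# repository as the full waterfall life cycle (cf. Figure 3-1).
model_verification = build_life_cycle({
    "Verification": [
        "Special Analysis", "Change Assessment", "Test Planning",
        "Test Data/Tool Generation", "Code Analysis", "Interface Verification",
    ],
    "Continuing": ["Project Management", "Configuration Management",
                   "Quality Assurance"],
})
```

Because every life cycle is validated against the same repository, radically different IV&V efforts still share one formal, repeatable definition of each activity.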
Each kernel is a formalized set of activities that is repeatable and is supported by a repeatable management infrastructure that uses process control and metrics consistently across all activities and projects. By example, the IV&V life cycle methodology has been applied to vastly different levels and roles of IV&V: the waterfall SDLC required a robust, full life cycle IV&V, while the model verification required only a restricted role for a portion of the SDLC. In each case, IV&V was optimized for the needs of the particular project. The process remains standard because the kernels themselves are well-defined and formalized.

4. Kernel Description

There are currently 21 IV&V Process Kernels defined: 18 for IV&V activities plus 3 for continuing management and control activities. The phases at which each IV&V kernel may be applied are shown in Table 4-1. The table illustrates the flexibility of the kernels and is not intended to be an all-inclusive list. The rows represent the 18 IV&V Process Kernels. Sub-kernels for the Formal Reviews kernel are available but are not listed (e.g., for SRR, PDR, CDR, etc.). The columns represent the major phases of the software development life cycle. The entry in each cell of the table indicates a particular instance for which the kernel is activated. For example, the Requirements Analysis kernel is activated during the Requirements software development phase for system-level requirements documents such as the System/Segment Specification (SSS), Software Requirements Specification (SRS), and Concept of Operations (CONOPS). Many kernels can be, and should be, utilized at several phases of the life cycle depending on where they are needed and how the IV&V is scoped. Kernels for Program Management, Configuration Management, and Quality Assurance define on-going activities that span the entire IV&V life cycle and are shown at the bottom of the table.
[Table 4-1: IV&V Kernel and Development Phase Cross-reference — rows list the 18 IV&V Process Kernels (Requirements Analysis, Requirements Tracing, Interface Verification, Criticality Analysis, Change Assessment, Formal Reviews, Document Review, Design Analysis, Special Analysis, Requirements Acquisition, Code Analysis, Test Data/Tool Generation, Audit, Test Planning, Test Monitoring, Installation Planning, Test Execution, Training); columns list the development phases (Requirements, Design, Coding, Integration/Test, Install, Delivery); each cell names the documents or products for which the kernel is activated in that phase (e.g., the Requirements Analysis kernel is activated in the Requirements phase for the SSS, SRS, and CONOPS). Program Management, Configuration Management, and Quality Assurance span all phases. All acronyms are defined in the Acronym list after the Reference List.]

Entry criteria, inputs, activities, exit criteria, outputs, process controls, and metrics are defined for each kernel. As an example, the IV&V Requirements Tracing kernel is shown in Figure 4-2. Entry Criteria describe the circumstances under which a kernel becomes activated. All entry criteria should be fulfilled before commencing the activities defined for the kernel. If some entry criteria cannot be fulfilled, a work-around may be necessary. All such deviations from what is prescribed in the kernel must be performed so as to maximize risk reduction and minimize adverse impacts on quality, and all deviations must be documented appropriately.
In Figure 4-2, availability of requirements tracing tools is listed as one of the entry criteria. If, for example, the desired tool is not available and completion of the tracing activity is needed to support an impending review, manual tracing (or a preliminary manual trace) of the most critical sections of the document may be a viable, temporary solution. Inputs identify the data items that are required to support the activities of the kernel. For the most part, these are outputs of other kernels or products of the software development process, such as test plans or design documents. Activities describe a minimum set of actions that will produce the output items and meet the exit criteria objectives. For each related set of actions, step-by-step procedures are available to support consistency among analysts, adherence to proven practices, and training. If all activities cannot be performed, management steps to reduce risk should be taken; these should be noted in the output products (such as the Requirements Analysis Report) and the kernel closed. Exit Criteria identify the circumstances under which the kernel is completed or de-activated. These include delivery or presentation of results and the passing of information to other kernels (such as the passing of comments to the Configuration Management kernel for tracking). Outputs identify the products of the kernel activities; they are either deliverable items or are required to support other kernels. Process Controls define quality assurance activities that are performed for the kernel. These are detailed in the Project Management and Quality Assurance kernels and are documented in the IV&V Project Management Plan. Metrics are the categories of measures collected and maintained for each kernel. The details of each metric are specific to each kernel and are defined in an IV&V Metrics Program Plan. The metrics allow the monitoring of trends and the identification of problem areas.
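The components listed above can be captured in a simple record type. The following is an illustrative sketch (the class, field, and method names are assumptions, not the methodology's notation); the sample values abbreviate the Requirements Tracing kernel of Figure 4-2.

```python
# Illustrative sketch of an IV&V Process Kernel as a record type.
# Field names mirror the kernel components described in the text.
from dataclasses import dataclass

@dataclass
class ProcessKernel:
    """A modular, reusable, self-contained IV&V building block."""
    name: str
    entry_criteria: list
    inputs: list
    activities: list
    exit_criteria: list
    outputs: list
    process_controls: list
    metrics: list

    def unmet_entry_criteria(self, satisfied):
        """All entry criteria should be fulfilled before the kernel's
        activities begin; any unmet criterion requires a documented
        work-around. Returns the criteria not yet satisfied."""
        return [c for c in self.entry_criteria if c not in satisfied]

# Abbreviated instance based on Figure 4-2 (criterion wording shortened).
tracing = ProcessKernel(
    name="Requirements Tracing",
    entry_criteria=["document received", "tracing directed",
                    "supporting data available", "tracing tool available"],
    inputs=["S/W Requirements Spec", "Design Document"],
    activities=["document preparation", "requirements preparation",
                "requirements tracing", "report generation"],
    exit_criteria=["all requirements assigned a status", "report delivered"],
    outputs=["Requirements Analysis Report", "Metrics Data"],
    process_controls=["Quality Assurance", "Peer Review",
                      "Configuration Management"],
    metrics=["Document Characteristics", "Size", "Effort",
             "Schedule", "Quality"],
)

# The kernel cannot activate while the tracing tool remains unavailable.
unmet = tracing.unmet_entry_criteria({"document received", "tracing directed",
                                      "supporting data available"})
```

Here `unmet` would list only `"tracing tool available"`, the case the text resolves with a temporary manual-tracing work-around.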
Figure 4-2: IV&V Requirements Tracing Kernel

Entry Criteria:
- A requirements document or other requirements artifact has been received at the tracing work site
- The contract, project manager, and/or task leader directs that tracing is needed
- All supporting data and references are available in the needed format (e.g., softcopy, database files)
- Any supporting tools, such as requirements tracing tools, needed for the task are available for the duration of the task as specified by the planned schedule and planned effort metrics

Activities (see IV&V Procedure):

Document Preparation
- Assemble all current, approved applicable documents (e.g., SRS and SDD, SRS and Test Plan, SDD and code, Version 1 and Version 2 of the SRS, etc.)
- Convert to softcopy as needed
- Determine the parsing method (manual, automatic, level of detail, etc.) for input and parse
- Import parent and child documents into the requirements tracing tool (e.g., SuperTracePC) so that each requirement item is unique and distinct

Requirements Preparation
- Generate the parent requirements list
- Generate the child requirements list
- For each requirements list, identify pertinent requirements, N/As, etc.
- Perform change assessment when needed
- Assign keywords for attributes, characterization, and automatic candidate-link generation

Requirements Tracing
- Generate candidate links automatically or perform interactive link analysis
- Verify completeness (all parent requirements are satisfied by child requirements)
- Verify traceability (all child requirements are decomposed from parent requirements)
- Determine whether each parent-child verification is full, partial, or anomalous
- For each discrepancy, assess potential severity: critical, major, or minor
- For each discrepancy (especially partial verifications), include a comment on the nature of the partial status to assist in future updates

Report Generation
- Generate metrics data for discrepancies by severity
- Generate the Requirements Trace Report
- Generate tables with the status of parent and of child requirements
- Generate tables with the discrepancies of parent and of child requirements
- Assess the acceptability of documents based on the acceptability threshold defined in the IV&V Metrics Program Plan

Inputs: System Spec.; Operational Requirements; Test & Eval. Master Plan; S/W Requirements Spec.; Design Document; Test Plan/Procedure; Source Code; VDD; IV&V Test Plan/Procedure

Process Controls (see IV&V Project Management Plan): 1. Quality Assurance; 2. Peer Review; 3. Configuration Management

Exit Criteria:
- All parent and child requirements have been assigned a verification status of full, partial, or anomaly, and a comment has been attached to explain anomalies and partials (traceability and completeness)
- The IV&V Requirements Analysis Report has received appropriate QA, and CM has recorded all metrics data
- The IV&V Requirements Analysis Report has been delivered in the format specified by the contract, project manager, and/or task leader
- All outstanding comments have been entered into the tracking system
- The requirements database has been created or updated as required

Outputs: Requirements Analysis Report; Metrics Data; List of Discrepancies (with severities); Parent/Child Requirement Trace Reports; Parent/Child Anomaly Report; Requirements database

Metrics (see IV&V Metrics Program Plan and Data Recording Instruction): 1. Document Characteristics; 2. Size; 3. Effort; 4. Schedule; 5. Quality (product, process)

5. IV&V Metrics

The key to a successful IV&V effort is knowing how effective the IV&V process is at identifying defects. Effective defect detection for software products and processes promotes high-quality software development and results in a more reliable system. Kernels provide a consistent way to gather metrics that gauge quality and productivity. To accomplish this, a strong metrics program to support the management of IV&V activities and to standardize the reporting of results is essential. The fundamental purpose of metrics is to set a mark of where you are so that goals for improvement can be set and achievement of those goals can be recognized. Metrics also allow trend analysis and the setting of formal acceptance thresholds.
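The completeness and traceability checks at the heart of the Requirements Tracing kernel reduce to simple set operations over parent-child links. The following is a minimal sketch (the function and the sample data are illustrative, not from the methodology):

```python
# Sketch of the two core trace checks from the Requirements Tracing kernel:
# completeness (every parent requirement is satisfied by some child) and
# traceability (every child requirement decomposes from some parent).

def trace_status(parents, children, links):
    """links: set of (parent_id, child_id) pairs established during
    interactive link analysis or automatic candidate-link generation."""
    linked_parents = {p for p, _ in links}
    linked_children = {c for _, c in links}
    return {
        # Parents with no satisfying child: completeness discrepancies
        "untraced_parents": sorted(set(parents) - linked_parents),
        # Children not decomposed from any parent: traceability discrepancies
        "untraced_children": sorted(set(children) - linked_children),
    }

# Hypothetical example: P3 has no satisfying child requirement, and C4
# is not decomposed from any parent requirement.
parents = ["P1", "P2", "P3"]
children = ["C1", "C2", "C3", "C4"]
links = {("P1", "C1"), ("P1", "C2"), ("P2", "C3")}

status = trace_status(parents, children, links)
```

Each untraced item found this way would then be classified as full, partial, or anomalous and assigned a severity, as the kernel's activities prescribe.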
Although some resistance to implementing a new metrics program has been reported by almost everyone who has tried it, experience has shown that once the products of metrics are available (graphics, charts, etc.), they become essential to effective project management. The Department of Defense (DOD) has mandated the use of a core set of metrics to manage DOD system acquisition [1]. Of the core set, the IV&V methodology employs Size, Cost, Effort, Schedule, Quality, and Rework metrics. Each of these metrics, plus additional data regarding system and task characterization, is defined for each IV&V Process Kernel in a Metrics Data Recording Sheet. A data recording sheet, used as a procedure to gather information for a typical requirements tracing activity, is shown in Table 5-1. The table lists the actions to be performed, the source of the data, and the data items required for the activity. An IV&V Metrics Program Plan provides the management and details of the metrics for all IV&V Process Kernels. Roles, responsibilities, and methods for metrics collection, analysis, and reporting should be detailed as well.

Table 5-1: Metrics Data Recording Sheet, IV&V Requirements Tracing
Collect the following data for each tracing activity (example results from one tracing activity are shown with each data item):

1. Activity Characteristics (sources: IV&V Report; Task Leader)
   a. System name: Project A Baseline
   b. Task leader name: Leader One
   c. CM name: CM One
   d. Date: 9/12/94
   e. Parent document type, version, ID, date: Operational Rqts; Final; ORD-ID-1234; 28MAR94
   f. Child document type, version, ID, date: System Rqts/Design; Final (V 2.3); SSS-ID-1234; 28JAN94

2. Document Size (sources: IV&V Report; Document)
   a. Parent document: number of requirements (109), valid requirements (76), pages (15), Function Points if applicable (N/A)
   b. Child document: number of requirements (728), valid requirements (376), pages (71), Function Points if applicable (N/A)

3. Effort (sources: Task Leader BOE; Timecard; Database)
   a. Staff hours budgeted, by level of expertise, for document preparation (48 hrs Mid-level, 8 hrs Jr-level), tracing (24 hrs Mid-level, 1 hr Admin.), report generation (6 hrs Mid-level, 2 hrs Sr-level), QA (4 hrs Mid-level), and rework
   b. Staff hours expended, by level of expertise, for document preparation (12.5 hrs Jr-level, 18 hrs Sr-level), tracing (113.5 hrs Sr-level, 21 hrs Mid-level, 4 hrs Jr-level), report generation (23 hrs Sr-level, 22.5 hrs Mid-level, 5 hrs PM-level), QA (8 hrs Sr-level, 5.5 hrs Mid-level, 5 hrs PM-level), and rework; calculated total for the entire activity (162.5 hrs Sr-level, 54 hrs Mid-level, 16.5 hrs Jr-level); total edit time in hours as recorded by MS Word statistics (6.5 hrs)

4. Schedule (sources: Task Leader; Metrics Database; Timecard)
   a. Planned start and end dates for document preparation (5/3/94-5/17/94), tracing (5/17/94-6/27/94), report generation (6/27/94-7/4/94), QA (7/4/94-7/5/94), rework (7/5/94-7/8/94), and the entire activity (5/3/94-7/8/94)
   b. Calculated planned durations: 11d, 29d, 5d, 1d, 3d; 49d for the entire activity
   c. Actual start and end dates for document preparation (6/17/94-6/30/94), tracing (7/1/94-8/26/94), report generation (8/29/94-9/6/94), QA (9/7/94-9/7/94), rework (9/7/94-9/9/94), and the entire activity (6/17/94-9/9/94)
   d. Calculated actual durations: 10d, 35d, 6d, 1d, 2d; 54d for the entire activity

5. Quality of document and acceptability (sources: IV&V Report; Metrics Database)
   Parent-to-child trace (completeness):
   a.1 Number of full, partial, anomaly, derived traces: Full 60, Partial 10, Anomaly 6
   a.2 Number of each priority (critical, major, minor) for partial, anomaly, derived traces: Partial — 0 critical, 1 major, 9 minor; Anomaly — 0 critical, 0 major, 6 minor
   b.1 Percentage of total requirements that are full, partial, anomaly, derived traces: Full 78.95%, Partial 13.16%, Anomaly 7.89%
   b.2 Percentage of total requirements at each priority: Partial — 0% critical, 0.03% major, 0.12% minor; Anomaly — 0% critical, 0% major, 0.8% minor
   Child-to-parent trace (traceability):
   c.1 Number of full, partial, anomaly, derived traces: Full 370, Partial 0, Anomaly 6
   c.2 Number of each priority for partial, anomaly, derived traces: Partial — 0% critical, 0% major, 0% minor; Anomaly — 0% critical, .002% major, 0 minor
   d.1 Percentage of total requirements that are full, partial, anomaly, derived traces: Full 98.40%, Partial 0%, Anomaly 1.6%
   d.2 Percentage of total requirements at each priority: Partial — 0% critical, 0% major, 0% minor; Anomaly — 0% critical, .002% major, 0% minor
   e. Recommended approval (see IV&V Metrics Program Plan for approval criteria; e.g., acceptable if: < 0.001 critical faults per requirement, < 0.025 major faults per requirement, < 0.2 minor faults per requirement): Approved

6. Quality of IV&V Report (sources: IV&V Report; QA Redlines)
   a. Number of pages in the IV&V report (body, appendices, etc.): Body 5 pgs; Appendix A 128 pgs; Appendix B 126 pgs; Appendix C 9 pgs; Appendix D 3 pgs
   b. Number of QA comments on the IV&V report (typos, clarity, completeness, etc.): typos 4; clarity 3; completeness 2; enhancement
   c. QA quality assessment of the IV&V report (score out of 10): 8 out of 10

7. Metrics Report Generation
   a. Bar graph depicting anomalies per requirement and partials per requirement for each IV&V requirements tracing activity
   b. Bar graph depicting planned and actual labor hours per requirement for each activity

One product of this data recording procedure is a measure of the quality of the requirements document (item 5). Quality is represented by the completeness of the trace and a recommendation of acceptability based on an objective threshold. Items 5 a.1-b.2 of Table 5-1 list metrics that record the trace status of each requirement and the criticality of each discrepancy. Item 5 a.1 of the example shows that six anomalies were found during the trace of parent-to-child requirements. Item 5 a.2 indicates that all six were minor; none were major or critical. All discrepancies are tabulated in this way and compared to predefined acceptance criteria (item 5 e).
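The approval decision itself is a straightforward rate check against the thresholds quoted in item 5 e. The following sketch assumes faults are counted per valid requirement in the traced direction; the function name is illustrative, while the threshold values and example counts come from Table 5-1.

```python
# Acceptance check per Table 5-1, item 5 e: the document is recommended for
# approval only if every severity's faults-per-requirement rate falls below
# its predefined threshold from the IV&V Metrics Program Plan.
THRESHOLDS = {"critical": 0.001, "major": 0.025, "minor": 0.2}

def acceptable(fault_counts, total_requirements):
    """Return True if every severity's fault rate is under its threshold."""
    return all(fault_counts.get(sev, 0) / total_requirements < limit
               for sev, limit in THRESHOLDS.items())

# Parent-to-child example from Table 5-1: 76 valid parent requirements,
# 0 critical faults, 1 major (partial), 15 minor (9 partial + 6 anomaly).
approved = acceptable({"critical": 0, "major": 1, "minor": 15}, 76)
# → True (0 < 0.001, 1/76 ≈ 0.013 < 0.025, 15/76 ≈ 0.197 < 0.2),
# matching the "Approved" result recorded in the example.
```

Note how tight the minor-fault margin is in this example: one more minor discrepancy (16/76 ≈ 0.21) would have pushed the document over the 0.2 threshold.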
Acceptance criteria are based on the results of similar activities with similar documents and are defined in the IV&V Metrics Program Plan. The metrics also measure the productivity of IV&V processes and the quality of IV&V products. The Effort and Schedule metrics (items 3 and 4) help monitor the planned versus actual productivity and progress of the kernel. For a kernel of extended duration, such as requirements tracing, careful tracking of these metrics can be a valuable management aid. The quality of kernel products is measured by the metrics in item 6. Undesirable trends in a certain type of QA comment can be noted (such as problems in the completeness of reports) and steps for improvement (such as increased training) can be taken. Regardless of process tailoring for a specific project, a standard set of metrics must be collected for all projects. Item 7 identifies the graphic representation of results for the metrics gathered. Figure 5-1 shows the requirements tracing results, such as those indicated by item 7.a, for several systems.

[Figure 5-1. Requirements Trace Results by System: a bar chart of the percentage of total requirements that fully traced, partially traced, or didn't trace, for System A SRS to SSS, System A SSS to SRS, System B SS to Segment B1 SRS, Segment B1 SRS to System B SS, System C ORD to System C SSDD, and System C SSDD to System C ORD.]

6. Summary

This lesson has described a process that uses IV&V Process Kernels to define a life cycle for performing Independent Verification and Validation. The modular nature of the kernels facilitates the construction of an IV&V life cycle that meets the needs and constraints of any software development effort or level of IV&V involvement. Kernel components were described in detail, and examples of an IV&V life cycle for a simple waterfall development life cycle and an IV&V model verification task were provided.
It was shown how this kernel approach achieves a high level of maturity (formalized and repeatable) in the IV&V process and how metrics and process controls are used to monitor status and quality. This provides a defined and repeatable process for performing IV&V. An optimized IV&V process can be achieved by using this process to tailor standard, proven methods for different development models and system types, or to meet the constraints of narrowly scoped IV&V. The processes defined will be continuously refined and expanded as they are used on more projects and as emerging technologies become available. All kernels will be fully defined, and detailed step-by-step procedures will be developed to support each kernel. As data for the initial selection of metrics becomes more abundant, additional metrics will be added.

References

[1] Kind, Lt. General Peter A. "Software as a Force Multiplier." CROSSTALK: The Journal of Defense Software Engineering, Software Technology Support Center, July 1994.
[2] Paulk, M.C., et al. Capability Maturity Model for Software, Version 1.1. SEI Technical Report CMU/SEI-93-TR-24, February 1993.
[3] IEEE Standard for Developing Software Life Cycle Processes, IEEE Std 1074-1991.
[4] Zelinski, Lillian K. "Constructing Independent Verification and Validation Life Cycles Using Process Kernels." Proceedings of the 10th Annual IEEE COMPASS Conference, June 26-30, 1995, Gaithersburg, MD, USA.
[5] Radice, R.A., N.K. Roth, A.C. O'Hare, Jr., and W.A. Ciarfella. "A Programming Process Architecture." IBM Systems Journal, vol. 24, no. 2, 1985, pp. 79-90.
[6] Humphrey, Watts S. Managing the Software Process. Reading, MA: Addison-Wesley Publishing Company, 1989, Chapter 13.
Acronyms

BOE      Basis of Estimate
CDR      Critical Design Review
CM       Configuration Management
CMP      Configuration Management Plan
CONOPS   Concept of Operations
DD       Detail Design
FCA      Functional Configuration Audit
FQR      Formal Qualification Review
IPR      In-Progress Review
IRS      Interface Requirements Review
IV&V     Independent Verification and Validation
O-O      Object Oriented
OTR      Operational Test Review
PCA      Physical Configuration Audit
PDL      Preliminary Design Language
PDR      Preliminary Design Review
PM       Project Management
QA       Quality Assurance
QAP      Quality Assurance Plan
SDD      Software Detailed Design
SDLC     Software Development Life Cycle
SDP      Software Development Plan
SDR      System Design Review
SPS      Software Product Specification
SRS      Software Requirements Specification
SSDD     System/Software Detailed Design
SRR      System Requirements Review
SSS      System/Segment Specification
STD      Software Test Description
STP      Software Test Plan
SUM      Software User's Manual
TPR      Training Package Review
TRR      Test Readiness Review
VDD      Version Description Document