VALIDATION DEFINITIONS

1. U.S. Department of Defense (DoD):

The definition of validation as used by the U.S. DoD has been evolving. Examples of that evolution are:

• The process of determining the extent to which a model or simulation is an accurate representation of the real world from the perspective of the intended use(s) of the model or simulation. [Extracted from DoD 5000.59-P, “Modeling and Simulation (M&S) Master Plan,” Oct 1995]

• The process of determining the degree to which a model or simulation is an accurate representation of the real world from the perspective of the intended uses of the model or simulation. [Extracted from DoDD 5000.59-M, “DoD Modeling and Simulation (M&S) Glossary,” Jan 1998]

• The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model. [Extracted from DoDD 5000.59, “DoD Modeling and Simulation (M&S) Management,” January 4, 1994 (but incorporating Change 1 of January 20, 1998, and certified current as of December 1, 2003)]

• The process of determining the degree to which a model and its associated data are an accurate representation of the real world from the perspective of the intended uses of the model. [Extracted from DoDI 5000.61, “DoD Modeling and Simulation (M&S) Verification, Validation, and Accreditation (VV&A),” May 13, 2003, but attributed to DoDD 5000.59]

Note the inclusion of “associated data” in the latest definition.

Supplemental DoD definitions (taken from DoDD 5000.59-M, “M&S Glossary”):

Output Validation:

The process of determining the extent to which the output (outcome distributions for the M&S and/or sub-models) represent the significant and salient features of distributions of real world systems, events, and scenarios.*

• Note: the original definition carried with it a second sentence, “The application determines which features are ‘significant and salient.’”

• Also note the correspondence to the DOE definition of Validation.

• Also note that the group that originally defined this used the word “or” in the definition. A typo was made, and DoDD 5000.59-M has the word “of” in the definition, which changes the definition significantly. Note that the M&S Glossary uses the word “or” in its definition.

Structural Validation:

The process of determining that the M&S assumptions, algorithms, and architecture provide an accurate representation of the composition of the real world as relevant to the intended use of the M&S.*

• Note the correspondence to the DOE definition of Qualification.

2. UK MOD:

VALIDATION - To establish that the model/process is fit for purpose.

“Have we built the right model/process?”

[The rest is tailored, somewhat, to cost modelling]

The primary aim of the V&V process is to ensure that objective and auditable evidence on the credibility of the modelling capability, and the underlying data, is available to both customers and the scrutiny community. With such information available in an auditable document to both user and customer, the possibility that a capability is inappropriately applied, thereby rendering a study invalid or ‘unfit for purpose’, will be reduced. The continuous availability of such evidence will also enable more effective scrutiny at an earlier stage in the work, such as when concepts are being formulated, rather than when results are emerging.

Validation aims to determine the extent to which a model represents the real world situation, thus providing information from which judgements can be made on whether the model can be used as a credible analytical tool for a specific purpose.

Control of the following key areas dictates the effectiveness of a model in use and its consequent fitness for purpose:

a) Modelling reality: the model's processes and/or algorithms.

b) Data: whether used to feed or calibrate a model; the data have different sources, which may include other models.

c) Model and data management: this covers the development of a model, its use and maintenance, and impinges on the previous two areas.

d) Operator skills: the availability of people with skills and experience to support the model, and assessment of their experience and specialised competence.

Three levels of Validation:

Un-validated: Little or no assessment of the modelling capability has been undertaken or documented, or fundamental shortfalls have been identified. Minimum standards of validation have not been reached. The model may not be used as the primary tool for a cost forecasting task on which key decisions will be based. With care the model may be used, by experienced staff, for budgetary and ROM cost forecasts.

Level 1 Validated: The model has undergone full assessment by review and its strengths and weaknesses recorded. Areas in which it should not be used for key decisions have been identified, together with the risks to decisions in areas where it is to be used. Both the data supporting the model and the model management system have also been reviewed and any strengths and weaknesses recorded. The model and data have been judged as likely to provide a realistic representation of the costs of providing a defined military capability. Validation based on real events has not been completed (or indeed may not be feasible) at this stage, but there needs to be a record such that later-occurring data can be used for validation at a later stage if the model is still in use. The model may be used, with care, as a tool in support of cost forecasting tasks on which key decisions may be based, but the recipient of outputs should be made aware of this.

Level 2 Validated: As for Level 1, but, additionally, validation against trials/exercises and/or historical cost outturns has taken place. The aim is that historical analysis should have been used, if appropriate. Particular care should be taken to ensure that the use of historical analysis takes account of technological advances, economic changes and the restructuring of the Defence Industry. The model may be used as a primary tool for cost forecasting tasks on which key decisions may be based.

Source: 20070126 - UK MOD Guidelines For The Verification And Validation Of Cost Modelling Used For Forecasts Of Future Cost – Vers 2 Final – U

3. AIAA:

Validation is defined as

The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model. (AIAA G-077-1998)

Validation has also been described as "solving the right equations". It is not possible to validate an entire CFD code; one can only validate the code for a specific range of applications for which there is experimental data. Thus one validates a model or simulation. Applying the code to flows beyond the region of validity is termed prediction.

Validation examines whether the conceptual models, the computational models as implemented into the CFD code, and the computational simulation agree with real world observations. The strategy is to identify and quantify error and uncertainty through comparison of simulation results with experimental data. The experimental data sets themselves will contain bias errors and random errors, which must be properly quantified and documented as part of the data set. The accuracy required in the validation activities depends on the application, and so the validation should be flexible enough to allow various levels of accuracy.
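As a concrete illustration of that comparison step, the sketch below computes the point-wise comparison error between simulation results and measurements, combining the documented bias and random errors in quadrature. This is a minimal sketch, not taken from the AIAA guide; all variable names and numbers are illustrative assumptions.

```python
# Illustrative sketch: comparison error between CFD results and
# experimental data, with combined experimental uncertainty.
import numpy as np

def comparison_error(sim, exp, u_bias, u_random):
    """Return E = sim - exp at each measurement point, and the
    combined experimental uncertainty (bias and random contributions
    added in quadrature)."""
    E = np.asarray(sim) - np.asarray(exp)
    u_exp = np.sqrt(np.asarray(u_bias) ** 2 + np.asarray(u_random) ** 2)
    return E, u_exp

# Made-up data: surface pressure coefficient at four taps.
sim = [0.91, 0.72, 0.45, 0.18]   # CFD predictions
exp = [0.88, 0.75, 0.47, 0.20]   # measured values
u_b = [0.02, 0.02, 0.02, 0.02]   # documented bias uncertainty
u_r = [0.01, 0.01, 0.01, 0.01]   # documented random uncertainty

E, u_exp = comparison_error(sim, exp, u_b, u_r)
for e, u in zip(E, u_exp):
    flag = "within" if abs(e) <= 2 * u else "outside"
    print(f"E = {e:+.3f}, u_exp = {u:.3f} ({flag} ~2*u_exp)")
```

Whether a comparison error lying inside roughly twice the experimental uncertainty counts as "good agreement" depends on the accuracy required by the application, as noted above.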

The approach to Validation Assessment is to perform a systematic comparison of CFD simulation results to experimental data from a set of increasingly complex cases.

Each CFD simulation requires verification of the calculation as specified in the discussion of Verification Assessment.

The process for Validation Assessment of a CFD simulation can be summarized as:

1. Examine Iterative Convergence.

Validation assessment requires that a simulation demonstrate iterative convergence. Further details can be found on the page entitled Examining Iterative Convergence.

2. Examine Consistency.

One should check for consistency in the CFD solution. For example, the flow in a duct should maintain mass conservation through the duct. Further, the total pressure recovery in an inlet should stay constant or decrease through the duct.

3. Examine Spatial (Grid) Convergence.

The CFD simulation results should demonstrate spatial convergence. Further details and methods can be found on the page entitled Examining Spatial (Grid) Convergence; a minimal worked sketch appears after this list.

4. Examine Temporal Convergence.

The CFD simulation results should demonstrate temporal convergence. Further details and methods can be found on the page entitled Examining Temporal Convergence.

5. Compare CFD Results to Experimental Data.

Experimental data are observations of the "real world" obtained in some controlled manner. By comparing the CFD results to experimental data, one hopes to find good agreement, which increases confidence that the physical models and the code represent the "real world" for this class of simulations. However, the experimental data contain some level of error, usually related to the complexity of the experiment. Validation assessment calls for a "building block" approach of experiments, which sets a hierarchy of experiment complexity.

6. Examine Model Uncertainties.

The physical models in the CFD code contain uncertainties due to a lack of complete understanding or knowledge of the physical processes. One of the models with the most uncertainty is the turbulence model. This uncertainty can be examined by running a number of simulations with various turbulence models and examining the effect on the results.
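The spatial-convergence check in step 3 is commonly carried out with Richardson extrapolation and the Grid Convergence Index (GCI). The sketch below is a minimal illustration under the assumptions of three systematically refined grids with a constant refinement ratio and a monotonically converging scalar result; the function name and all numbers are made up for demonstration.

```python
# Illustrative sketch: grid convergence via Richardson extrapolation
# and the Grid Convergence Index (GCI).
import math

def grid_convergence(f1, f2, f3, r, Fs=1.25):
    """f1, f2, f3: results on fine, medium, coarse grids;
    r: constant grid refinement ratio; Fs: safety factor
    (1.25 is customary when three grids are available)."""
    # Observed order of accuracy from the three solutions.
    p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)
    # Richardson-extrapolated estimate of the zero-spacing result.
    f_exact = f1 + (f1 - f2) / (r ** p - 1.0)
    # GCI on the fine grid: a relative error band on f1.
    gci_fine = Fs * abs((f1 - f2) / f1) / (r ** p - 1.0)
    return p, f_exact, gci_fine

# Made-up drag-coefficient results on three grids with r = 2.
p, f_exact, gci = grid_convergence(f1=0.02134, f2=0.02212, f3=0.02456, r=2.0)
print(f"observed order p    = {p:.2f}")
print(f"extrapolated value  = {f_exact:.5f}")
print(f"GCI (fine grid)     = {100 * gci:.2f}%")
```

If the observed order p is far from the formal order of the scheme, or the solutions do not converge monotonically, the grids are likely not yet in the asymptotic range and the GCI estimate should not be trusted.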

4. ASME:

The “Guide for Verification and Validation in Computational Solid Mechanics” was published a few months ago. The next step is to identify and define best practices. Completion of that step will enable the development of a formal standard in the future.

The document points out that computational solid mechanics is increasingly important in the development and testing of engineered systems from cars to aircraft and weapons. At the same time, "the state of the art of V&V [that is, verification and validation] does not currently lend itself to writing a step-by-step performance code/standard."

The Guide outlines procedures for developing methods to verify engineering software and validate the results from models by comparing the results from simulations with experiments. The publication's purpose is to give those involved in computational solid and structural mechanics "a common language, a conceptual framework, and general guidance for implementing the processes of computational V&V."

As part of the contribution to a common language, the Guide defines "verification," for instance, as "the process of determining that a computational model accurately represents the underlying mathematical model and its solution." "Validation" is "the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model."

According to the document, verification and validation, taken together, "are the processes by which evidence is generated, and credibility is established, that computer models have adequate accuracy and fidelity for their intended use."

The publication also includes figures illustrating the verification and validation processes, and discussions of ideas to be considered in establishing a verification and validation program.

The publication, which contains just under 30 pages, is available from the Society, and designated as ASME V&V 10-2006.

5. NASA:

“Validation: The process of assessing by independent means the quality of the data products derived from the system outputs.” (Committee on Earth Observing Satellites (CEOS))

IV&V Overview

What is Verification and Validation?

Verification answers the question, "Are we building the product right?"

Verification is the process of determining whether or not the software products of a given phase of the SDLC fulfill the established requirements for that phase.

Validation answers the question, "Are we building the right product?"

Validation evaluates the software products throughout the SDLC to ensure those products meet the mission and customer's needs.

What is Independence?

IEEE defines independence in IV&V as three parameters: technical independence, managerial independence, and financial independence.

Technical independence is achieved by IV&V practitioners who use their expertise to assess development processes and products independent of the developer.

Managerial independence requires responsibility for the IV&V effort to be vested in an organization separate from the organization responsible for performing the system implementation. The IV&V effort independently selects the segments of the software and system to analyze and test, chooses the IV&V techniques, defines the schedule of IV&V activities, and selects the specific technical issues and problems to act upon. Most projects view V&V as sufficient and do not recognize the added value that independence brings.

Financial independence: The NASA IV&V Program is funded from Corporate General & Administrative (Expense). Projects may directly fund services.

6. DOE:

The Department of Energy (DOE) takes its definition from AIAA. However, DOE departs significantly from everyone else with respect to the process. To understand the DOE validation process, one must also look at their very different definition of “verification” and its process.

First, in DOE, Software Quality Engineering (SQE) follows IEEE 1012 for IV&V.

Then, in DOE, the DoD two-path process (V&V) is replaced by a three-stage process (with SQE proceeding in parallel):

Qualification: Determination of the adequacy of the conceptual model to provide an acceptable level of agreement for the domain of application [original source: Society for Computer Simulation (SCS)]

Verification: The process of determining that a model implementation accurately represents the developer’s conceptual description of the model and the solution to the model (emphasis added) [original source: AIAA]

Validation: The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model [original source: AIAA]

Qualification activities must be completed before Verification or Validation can be started, because both operate only on implemented code.

Verification activities: “Accuracy is generally measured in relation to benchmark solutions of simplified model problems. Benchmark solutions refer to either analytical solutions or highly accurate numerical solutions.”*

Validation activities: “Accuracy is measured in relation to experimental data, i.e., our best indication of reality.”*

[*According to “Verification and Validation in Computational Fluid Dynamics,” by William L. Oberkampf and Timothy G. Trucano, March 2002]
