Design verification and validation in product lifecycle

P.G. Maropoulos (1) a,*, D. Ceglarek (1) b
a Department of Mechanical Engineering, University of Bath, Claverton Down, Bath BA2 7AY, UK
b Warwick Digital Laboratory, University of Warwick, Coventry, UK
* Corresponding author.

Keywords: Design; Validation; Verification; Lifecycle management

Abstract: The verification and validation of engineering designs are of primary importance as they directly influence production performance and ultimately define product functionality and customer perception. Research in aspects of verification and validation is widely spread, ranging from tools employed during the digital design phase to methods deployed for prototype verification and validation. This paper reviews the standard definitions of verification and validation in the context of engineering design and progresses to provide a coherent analysis and classification of these activities from preliminary design, to design in the digital domain and the physical verification and validation of products and processes. The scope of the paper includes aspects of system design and demonstrates how complex products are validated in the context of their lifecycle. Industrial requirements are highlighted and research trends and priorities identified.
© 2010 CIRP.

1. Introduction

Globalisation coupled with product customisation and short time to market have spearheaded new levels of competition among manufacturers. In CIRP, the needs for design adaptability [1], the ability to develop products and services for the e-commerce era [2] and the issues of dealing with design complexity [3] have been recognised. To be successful in the global market, manufacturing companies are increasingly expanding simulation models from product and process based (value chains) to service based (value networks) by focusing on lifecycle simulations and design for product variation [4] to obtain both quality of product and robustness of processes, and to enable the validation and verification of products and processes to 6-sigma. These methods are vital to reduce process faults and facilitate efficient and effective engineering changes. Current validation and verification-based approaches mainly focus on product conformance to specifications, product functionality and process capability. However, even the most robust systems can be subject to failures during product verification and validation.

This paper presents the concepts of validation and verification in the product lifecycle by including analysis and review of literature and state-of-the-art in: (i) preliminary design; (ii) digital product and process development; (iii) physical product and process realisation; (iv) system and network design; and (v) complex product verification and validation.

The paper starts with a summary of the scientific motivation for the review of design verification and validation. The definitions of verification and validation are then covered, including concepts and definitions arising from ISO standards as well as software development. The paper also defines the design application areas in terms of products, processes and systems and reviews mainstream methods and systems.
2. Motivation, scope and definitions of verification and validation methods and technologies

2.1. Motivation

The current product and production system requirements that influence the way products are developed and verified include:
- Mass customisation and personalisation.
- Reconfigurability and flexibility of production systems.
- Responsive factories.

Products and processes need to be designed, verified and validated in a manner that is compatible with the above industrial requirements. Fig. 1 shows a representation of validating products and processes after the digital modelling phase, clearly identifying the research questions and business drivers. Validation in the digital space is a key objective and industrial requirement that drives research and development. If this were feasible, the results would be reduced lead times and, critically, fewer failures and better perceived product quality by the customers.

Fig. 1. Validation and verification requirements in the product lifecycle.

Fig. 2 shows the closed-loop nature of the process required for managing the lifecycle data capture for design validation. This ability presupposes:
- Integrated and holistic views of design in order to be able to validate in an integrated manner.
- Digital modelling and representation ability for both the product and the process (function and specification testing).
- A time horizon that includes the product lifecycle.

Fig. 2. Closed-loop validation and verification.

The following observations are valid in relation to the present industrial practice for design verification and validation:
- Such activities are usually executed when the design process is almost complete, during prototyping and first-off testing and development. This results in frequent deviations from the required form, dimensions or function, extending development times and increasing the compliance cost.
- This problem is both procedural (stage or time of execution of such activities and requirement for different skills) and theoretical (lack of robust verification and validation methods for deployment during the digital design stages).
- The aim is to execute verification and validation as early as possible during the design process, by developing new-generation digital or virtual testing methods.
- Complexity in design makes verification and validation even more difficult to apply as part of the design process.

2.2. Scope of the keynote paper

2.2.1. A framework for design verification and validation

Fig. 3 shows the scope of the new framework for engineering design verification and validation, which is lifecycle based, tracking the progression of engineering designs across four key stages: (i) from the preliminary design stage that sets the requirements, (ii) to the digital design domain, (iii) the physical product and process development and prototyping phase, and (iv) the consequent design of the production system and network for the realisation of complex products and processes.

Product and process designs are developed in the digital domain and the final validation usually requires the execution of physical trials to confirm the product properties, dimensions and overall functionality at component, subsystem and complete product level. Processes are also validated at each one of their physical levels so as to provide the required physical attributes of components, sub-assemblies and the overall product. The system and network design and development also includes a digital phase and major considerations are confirmed by validating real system performance. Product lifecycle aspects are best exemplified by considering how complex products are validated in the context of lifecycle considerations. The framework shown in Fig. 3 provides a coherent structure for the multiplicity of digital analyses, manufacturing processes and metrology technologies needed for the verification and validation of complex products in their lifecycle. These techniques and methods and their relevance to design verification and validation are analysed herein.

Fig. 3. A conceptual framework for design verification and validation.

2.2.2. Keynote scope

The scope for this keynote is outlined in Fig. 4. The main focus of the paper is on product and process verification and validation. System perspectives are also included for completeness, and lifecycle aspects are covered by reviewing standards and practices in relation to the verification and validation of complex products. The paper principally deals with mechanical engineering design from meso-scale to large-scale, and the corresponding processes, typical of high complexity and value industry sectors such as aerospace, marine and automotive.

Fig. 4. Scope of the keynote paper.

2.3. Definitions of verification and validation

Verification and validation are the methods that are used for confirming that a product, service, or system meets its respective specifications and fulfils its intended purpose. In general terms, verification is a quality control process that is used to evaluate whether or not a product, service, or system complies with regulations, specifications, or conditions imposed at the start of a development phase [5,6]. Validation, on the other hand, is a quality assurance process of establishing evidence that provides a high degree of assurance that a product, service, or system accomplishes its intended use requirements [5,6].

Verification and validation have been defined in various ways that do not necessarily comply with standard definitions. For instance, journal articles and textbooks use the terms "verification" and "validation" interchangeably [7,8], or in some cases there is reference to "verification, validation, and testing (VV&T)" as if it were a single concept, with no discernible distinction among the three terms [9]. Table 1 shows definitions of verification and validation as provided by international and national bodies. The definitions given by ISO 9000 [16] originate from the general field of quality and focus on the provision of "objective evidence" that specified requirements have been fulfilled.
The verification process according to ISO is broadly defined, and validation is focused on fulfilling an intended use or application. The Global Harmonisation Task Force, defines verification in a manner compatible with ISO, and process validation is based on consistent generation of results that satisfy predetermined requirements [19]. However, such generic definitions evolved due to the specific demands of application domains. For example, in the field of metrology, the Joint Committee for Guides in Metrology defines verification on the basis that a ‘‘target measurement uncertainty has been met’’ [17]. The definition of validation is much less specific, referring to the adequacy of requirements for an intended use. The verification definition by the International Organisation of Legal Metrology [18] is based on the interpretation of the word ‘‘accurate’’, and it clearly creates a direct link with metrology in the process of establishing how different the real artefact is from its modelling representation. There are extensive definitions of verification and validation in the context of digital design and these definitions also cover aspects of modelling and simulation. These include the IEEE Standard 610 [10] and the definitions of the US Department of Defence (DoD) [12], as shown in Table 1. The US Department of Navy [13] and the CFD Committee of AIAA [14] provide definitions for modelling and simulation software systems that are derivatives of those provided by the US DoD. The US Food and Drug Administration has given definitions of digital systems verification and validation [15], which explicitly include references to the ‘‘consistency’’ and ‘‘correctness’’ of the software. SAE Aerospace [20] and Sargent [21] reported a variety of design verification aspects, as shown in Fig. 5. In summary, the generic definitions for design verification and validation are given by ISO 9000 [16]. As the digital stages of design become increasingly important, the verification of the modelling Table 1 Definitions of verification and validation in the digital and physical domains. 
V&V processes in digital design phase

Verification:
- The process of evaluating software to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase [10].
- The process of determining that a computational model accurately represents the underlying mathematical model and its solution [11].
- The process of determining that a computer model, simulation, or federation of models and simulations implementations and their associated data accurately represent the developer's conceptual description and specifications [12].
- The process of determining the degree to which a modelling and simulation (M&S) system and its associated data are an accurate representation of the real world from the perspective of the intended uses of the model [13].
- The process of determining that a model accurately represents the developer's conceptual description of the model and the solution to the model [14].
- Providing objective evidence that the design outputs of a particular phase of the software development lifecycle meet all of the specified requirements for that phase [15].

Validation:
- The process of evaluating software during or at the end of the development process to determine whether it satisfies specified requirements [10].
- The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model [11].
- The process of determining the degree to which a model, simulation, or federation of models and simulations, and their associated data are accurate representations of the real world from the perspective of the intended use(s) [12].
- The process of determining that an M&S implementation and its associated data accurately represent the developer's conceptual description and specifications [13].
- The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model [14].
- Confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through software can be consistently fulfilled [15].

V&V processes in physical world

Verification:
- Confirmation, through the provision of objective evidence, that specified requirements have been fulfilled [16].
- Provision of objective evidence that a given item fulfils specified requirements, such as confirmation that a target measurement uncertainty can be met [17].
- Pertains to the examination and marking and/or issuing of a verification certificate for a measuring system [18].
- Confirmation by examination and provision of evidence that the specified requirements have been fulfilled [19].
- The verification process ensures that the system implementation satisfies the validated requirements [20].

Validation:
- Confirmation, through the provision of objective evidence, that the requirements for a specific intended use or application have been fulfilled [16].
- Where the specified requirements are adequate for an intended use [17].
- Objective evidence that a process consistently produces a result or product meeting its predetermined requirements [19].
- Validation of requirements and specific assumptions is the process of ensuring that the specified requirements are sufficiently correct and complete so that the product will meet applicable airworthiness requirements [20].
CIRP Annals Manufacturing Technology (2010), doi:10.1016/j.cirp.2010.05.005 G Model CIRP-598; No. of Pages 20 4 P.G. Maropoulos, D. Ceglarek / CIRP Annals - Manufacturing Technology xxx (2010) xxx–xxx Fig. 6. Transition of designer’s intent to physical realisation through GPS guidelines. Fig. 5. Verification in digital and physical world (adapted from Refs. [20,21]). and simulation aspects [10,12] will become increasingly applicable. The overall process for integrated digital and physical prototype verification and validation is exemplified by SAE Aerospace [20], see Fig. 5, and the metrological practice governing the physical prototypes is given by VIM [17]. 3. International standards related to product and process design in the lifecycle perspective International standards play an important role in preserving the designer’s intent and seamlessly utilising the associated information and manufacturing practices in a heterogeneous manufacturing environment. The transition of the designer’s intent from the digital design specification to the actual product and associated service realisation is illustrated in Fig. 5. Today, as each phase of the product’s lifecycle is globally dispersed in supply and knowledge chains [2], international standards are essential to deploy standardised manufacturing execution protocols in order to establish an unambiguous definition ‘‘language’’ throughout a global supply chain and ensure consistent product performance in the service phase. Hence, the provisions of the most relevant to product and process verification and validation standards are analysed herein. 3.1. Standards for representing product information Computer interpretable representation of product information is utilised within a variety of CAx applications for design verification and validation. The majority of these standards represent geometric information and evolved to cover other aspects. Standards such as Geometrical Product Specification (GPS) [22], ASME Y14.5: Geometric Dimensioning and Tolerancing (GD&T) [23], STandard for Exchange of Product model data (STEP) [24] have thus evolved for modelling and preserving other aspects of product related information such as tolerances, kinematics, dynamics and manufacturing processes. For example, the STEP and GPS standards have evolved, providing product specific informa- tion constructs known as ‘‘application protocols’’ in STEP and ‘‘GPS matrix’’ in GPS. Current GPS standards define global guidelines along with fundamental principles for capturing designer’s intent and expressing design requirements. Product and process design characteristics such as size, angle, orientation and surface texture are considered as individual chains as shown in Fig. 6. The information regarding each characteristic is categorised according to its relevance in the product lifecycle. Each category is called a ‘‘link’’ within the GPS masterplan [22]. Thus, a comprehensive ‘‘chain-link’’ matrix (Fig. 6) has resulted in a number of GPS standards which address how product specific characteristics can be represented and utilised throughout the design, manufacture and verification phases of the product. For example, designer’s intent regarding the size of the product’s feature is preserved in the ‘‘size’’ chain of the GPS matrix. Mathieu and Dantan [25] proposed to ISO a new model for Geometric Specification and Verification called ‘‘GeoSpelling’’ as a basis for GPS standards rebuilding. 
The merits of GPS standards have been exploited in a variety of digital product design applications such as coherent tolerancing process [26], evaluation of measurement uncertainty [27] and quantitative characterisation of surface texture [28,29]. Srinivasan [30] identified the merits of unifying and standardising ad hoc approaches practiced by industry. GPS allows such unification and standardisation through global guidelines described in the GPS masterplan [22]. More recent GPS standards [31] introduced the concepts of specification uncertainty and correlation uncertainty that directly influence validation and verification. A symbolic language called GD&T [23] has been developed for describing nominal geometry of parts and assemblies and allowable variation in the product design and verification phase. GD&T brings significant benefits in design and inspection activities as a correct GD&T representation captures design intent and shows the functional requirements of the part as well as the method for its inspection [23]. Arguably, the most important benefit of the GD&T approach lies in ensuring, at the design phase, that component parts will assemble into the final product and function as intended [32]. Shen et al. [33] proposed a semantic GD&T representation model, named the ‘‘constraint-tolerance-feature-graph’’ that is claimed to satisfy all tolerance analysis needs. Kong et al. [34] formulated an approach for the analysis of non-stationary Please cite this article in press as: Maropoulos PG, Ceglarek D. Design verification and validation in product lifecycle. CIRP Annals Manufacturing Technology (2010), doi:10.1016/j.cirp.2010.05.005 G Model CIRP-598; No. of Pages 20 P.G. Maropoulos, D. Ceglarek / CIRP Annals - Manufacturing Technology xxx (2010) xxx–xxx tolerance variation during a multi-station assembly process with GD&T considerations. The application of GD&T for mechanical design has gained widespread acceptance by industry [35]. However, several organisations have attempted to implement the method without a fundamental understanding of how the design process is impacted [36]. Poorly applied GD&T, ambiguous plus/minus location or orientation controls, and sometimes no variation specifications are commonly encountered [37]. The need to capture functional requirements and improve the design of parts as well as to consider the cost and quality issues defined by GD&T makes this subject an even more important element of mechanical engineering design [38]. In summary, the GPS [22,31] and GD&T [23] standards are vital for the correct and efficient verification of mechanical engineering designs. There are exciting new research opportunities arising from the utilisation of these standards to automate the bidirectional relationships between design specifications, process capability and measurement uncertainty. The STEP project was launched with the objective of conserving the manufacturing context and developing information bridges between segregated CAx domains [24]. EXPRESS [39] is used to specify requirements on information content as ‘‘it consists of language elements that allow an unambiguous data definition and specification of constraints on the data defined’’. The development of the STEP standard was governed by industry’s need to overcome interoperability problems. The standard established a neutral data file format that is used for developing domain specific applications using application protocols (APs). 
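To make the idea of a computer-interpretable, toleranced product definition more concrete, the sketch below shows a minimal, hypothetical data structure for a feature carrying a GD&T-style size tolerance, together with a simple conformance check of the kind used downstream in verification. The class and field names are illustrative assumptions and do not reproduce STEP, EXPRESS or GPS constructs.

```python
# Minimal, illustrative sketch of a computer-interpretable toleranced feature.
# Names and structure are hypothetical; they do not reproduce STEP/EXPRESS or GPS entities.
from dataclasses import dataclass

@dataclass
class SizeTolerance:
    nominal: float        # nominal dimension, mm
    upper: float          # upper allowance, mm (e.g. +0.05)
    lower: float          # lower allowance, mm (e.g. -0.02)

    def limits(self):
        return self.nominal + self.lower, self.nominal + self.upper

@dataclass
class Feature:
    name: str
    feature_type: str     # e.g. "cylindrical_hole"
    size: SizeTolerance
    datum_refs: tuple     # datum features referenced by the tolerance frame

    def conforms(self, measured: float, measurement_uncertainty: float = 0.0) -> bool:
        # A conservative acceptance rule: shrink the tolerance zone by the
        # measurement uncertainty before declaring conformance (guard banding).
        lo, hi = self.size.limits()
        return lo + measurement_uncertainty <= measured <= hi - measurement_uncertainty

hole = Feature("H1", "cylindrical_hole",
               SizeTolerance(nominal=10.0, upper=+0.05, lower=-0.02),
               datum_refs=("A", "B"))
print(hole.conforms(10.03, measurement_uncertainty=0.005))   # True
print(hole.conforms(10.049, measurement_uncertainty=0.005))  # False: inside tolerance but not provably so
```

The guard-banding rule in the sketch reflects the point made later in the paper that measurement uncertainty has to be taken into account before a toleranced dimension can be declared conforming.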
For example, AP 219 [40] provides information requirements for analysing the dimensional inspection data and results of solid parts and assemblies. Fig. 7 shows a selected set of application protocols that are vitally important for the communication and sharing of data required in design verification and validation of mechanical components. 3.2. Standards for representing manufacturing processes A ‘‘process’’ in a manufacturing context is defined as a combination of activities that occur over a period of time in which objects participate [41]. The National Institute of Standards and Technology (NIST) in the USA developed the Process Specification Language (PSL) [42] to create a generic, neutral and high-level language for specifying processes and the integration of multiple process-related applications. PSL uses the ontology based Knowledge Interchange Format to specify concepts, terminology and relationships for processes. Similarly, a data model for representing manufacturing processes was developed by NIST, which later became a part of the international standard ISO 16100 for exchanging information between design and manufacturing process planning software systems for mechanical products [43]. The need for comprehensive information regarding specific manufacturing processes and the verification of components, compelled practitioners to develop process specific international 5 standards such as DMIS [44], DML [45] and I++DME [46] for the exchange of inspection process information and measurement results in the production environment. Similarly, the BS EN ISO 8062 series [47] and the BS EN ISO 10135 [48] series of standards within the GPS framework cover the requirements for casting and moulding processes. Another set of process specific standards is the ISO 14649 series [49], with parts corresponding to different processes; for instance, part 16 [50] for performing inspection operations in a STEP-NC manufacturing environment. 3.3. Standards for representing manufacturing resources A typical manufacturing system consists of a range of resources such as machine tools, material handling systems, fixtures, robotic arms, and measurement systems [51]. Each resource has a distinct purpose and thus provides specific capabilities that are utilised in manufacturing decision-making. A variety of international standards have evolved in order to utilise and exchange the information regarding manufacturing resources and their capabilities in a digital environment [52]. For example, ISO 13584 [53] with the acronym PLIB is a series of standards for the computerbased representation and exchange of part library data. PLIB is fully inter-operable with STEP [24]. Resource specific standards have evolved to satisfy business needs. For example, ISO 13399 [54] deals with the representation and exchange of cutting tool data and ASME B5.59-2 [55] is an information model for machine tools. Measurement equipment related GPS standards [56,57] were developed to describe the acceptance tests for co-ordinate measuring machines and general requirements for GPS measuring equipment respectively. 3.4. Standards for preserving design verification knowledge International standards are used to preserve and seamlessly transfer context specific knowledge obtained through design verification, within a heterogeneous manufacturing environment. 
Business sectors such as, aerospace manufacturing, defence, ship building and military equipment manufacturing intensively invest in research and development activities and have a strong requirement to conserve and reuse knowledge acquired through the design verification processes. Consequently, ISO 10303 AP 209 [58] has been developed by aerospace and commercial research organisations for associating engineering analysis data with geometric data. ISO 10303 AP 237 deals with the exchange of computational fluid dynamics (CFD) information, including product geometry, associated meshes defining the computational details and CFD boundary conditions [59]. 4. Verification and validation in the early stages of design – capture intent and confirm requirements The early design stages are vitally important for the correct capture of technical and lifecycle requirements arising from understanding and interpreting market needs. Verification is inherent in methods deployed during these important early stages, although this is not always appreciated by designers and manufacturing practitioners. This section outlines methods for design idea validation and quality function deployment (QFD) as well as the more technical aspects of ensuring that consistency in terms of key design objectives is maintained using key characteristics (KCs) and Design for X (DFX) techniques. 4.1. Product idea validation and market analysis Fig. 7. Integration of designer’s intents within STEP framework. There are three key considerations that are applied in the early stages of design: (1) to prioritise customer needs (CNs) in a quantitative manner based on market analysis; (2) to select the best design schema; and (3) to improve communication at all levels of the organisation. Methods such as matrix prioritisation and analytical hierarchy process [60] are applied to help the Please cite this article in press as: Maropoulos PG, Ceglarek D. Design verification and validation in product lifecycle. CIRP Annals Manufacturing Technology (2010), doi:10.1016/j.cirp.2010.05.005 G Model CIRP-598; No. of Pages 20 P.G. Maropoulos, D. Ceglarek / CIRP Annals - Manufacturing Technology xxx (2010) xxx–xxx 6 4.4. The use of key characteristics in early design Fig. 8. Four-phase process planning by QFD [63]. enterprise determine where to invest the development resources to achieve maximum payoff. The traditional way is to analyse CNs systematically and to transform them into the appropriate product features. However, it is difficult to assess the performance of the transformation process with an accurate quantitative evaluation. Büyüközkan et al. [61] presented a fuzzy group decision-making approach to better align CNs with objectives of product development in QFD. This prioritisation of customer needs creates a set of criteria that is used for validating the final product i.e., assessing whether the enterprise is building the right product, service or system. 4.2. Quality function deployment QFD is a customer-driven methodology for product design and development that underpins quality systems and has found extensive applications in industry via the development of a multiplicity of tools and systems that aid an enterprise in understanding the voice of the customer [60]. QFD efficiently translates CNs into design requirements and parts deployment [62]. As shown in Fig. 
8, a generic QFD process consists of four phases in order to relate the voice of the customer to product design requirements (phase 1), and then translate these into parts characteristics (phase 2), manufacturing operations (phase 3), and production requirements (phase 4) [63]. During early design, the first and second phases of the four QFD phases are implemented [63] and part characteristics are defined. In summary, QFD is critical to design validation as it translates customer needs into part characteristics and production controls that can then be used for design verification, by forming the set of criteria against which product and process compliance can be assessed. 4.3. Functional decomposition and flow analysis The verification and validation process of a function can be viewed as functional decomposition and flow analysis which aim to break overall functionalities down to functionally independent sub-functions as finely as possible [64]. A functional structure can be validated by considering both logical and physical dependencies and confirming matching inputs and outputs among sub-functions [65]. Several flow analysis methods such as bond graph and Petri nets [66] and modularity methods such as function structure heuristic method [67], design structure matrix [68] and modular function deployment [69] are applicable to the verification and validation of functional structures. In an era of increasing product sophistication, engineered systems are likely to become more complicated, increasing the functional requirements [3]. Suh [3] defined complexity as the measure of uncertainty in achieving the functional requirements of a complex system and outlined how axiomatic design can be used to reduce design complexity while satisfying the functional requirements within given constraints. As such, axiomatic design can enhance the functional validation of designs. Variability in production and measurement procedures can result in lower than expected quality levels, compromised product performance and increased rectification costs. Key characteristics (KCs) are being used to help identify and reduce important root causes of variability [70]. Research focused on KCs has had a significant impact in improving product and process performance in the context of the lifecycle [71,72]. KC methodologies have been introduced into the product development practices of world-class companies [73]. Thornton [74] categorised product related KCs according to the level of the product model as KCs belonging to; product, subsystem, component, feature and feature face. Thornton [75] proposed a method for variation risk management in aircraft and automotive production by establishing a direct link between KCs and the type of inspection process used for verification. The use of KCs for manufacturing planning during early design enhances process verification. Dai and Tang [76] defined verification parameters by prioritizing KCs. Whitney [77] proposed a KC oriented method for assembly planning by selecting the necessary part features, tools and machine capabilities. Wang and Ceglarek [78] developed a KC based methodology for quality-driven sequence planning. Suri et al. [79] introduced a technique based on key inspection characteristics to enhance process capability. Maropoulos et al. [80] proposed the use of aggregate product models as a method for the early integration of dimensional verification and process planning for complex product design and assembly. Maropoulos et al. 
[81] outlined the verification and validation related benefits arising from the integration of measurement and assembly using a digital enterprise framework that links key elements of the product, process and resource models. 4.5. Design for X Design for X (DFX) is an umbrella term used to denote design philosophies and methodologies which aim to improve designs by raising the designer’s awareness for a certain product lifecycle value or characteristic represented by ‘X’ [82]. The design considerations applied in DFX have a direct relationship to the verification methods for the ‘‘X’’ objective. Design for Manufacture (DFM) [77,83] includes a wide range of design rules and guidelines defined from the perspective of improving the manufacturability of parts. For example, the design guidelines for end milling stipulate that milled features should be designed in such a way so that the end mill required is limited to 3:1 in length to diameter ratio; the reason being that longer end mills are prone to chatter that deteriorates surface quality. Applying this DFM guideline will impact directly on end milling process capability in terms of surface quality and this will influence the process verification procedure, such as the sampling method deployed and the method of surface roughness measurement. The impact of Design for Assembly (DFA) [77,83] on verification is also direct. For instance, the part reduction of an electromechanical sub-assembly as a consequence of applying DFA may result in more complex parts that have additional features. This will directly change the inspection plan in terms of the number, type and sequence of measurement operations, the measurement points per operation and the selection of the measuring device. Also, DFA for automated assembly stipulates design methods so that parts can be supplied in the right orientation and do not tangle with other parts [84]. This again increases process yield and influences the sampling method deployed for assembly verification data collection and analysis. Design for Ergonomics is important in labour intensive industries [85] and has a noticeable and positive effect on process verification, as controls and displays are re-designed so that readings cannot be misinterpreted. Design for changeover is vital in high variety environments [86] and improves process verification as a consequence of high repeatability set-ups. Please cite this article in press as: Maropoulos PG, Ceglarek D. Design verification and validation in product lifecycle. CIRP Annals Manufacturing Technology (2010), doi:10.1016/j.cirp.2010.05.005 G Model CIRP-598; No. of Pages 20 P.G. Maropoulos, D. Ceglarek / CIRP Annals - Manufacturing Technology xxx (2010) xxx–xxx Design for 6-sigma (DFSS) is a design activity that aims to generate high capability, 6s processes, before production commences. DFSS is usually deployed within QFD and is also referred to as ‘‘Define–Measure–Analyse–Design–Verify’’ [87]. This is an explicit reflection of the inherent ability of DFSS to enhance the verification and validation of processes. There are considerable research challenges in developing new methodologies that link DFSS with KCs, so that key product features and dimensions are specified and evaluated by applying process capability criteria. Such methods would need to be directly integrated with the definition of GD&T, so that datum points, key dimensions, inspection methods and process capability are interlinked in an unambiguous manner. 5. 
Design verification and validation in the digital environment Digital prototyping helps manufacturers to virtually simulate a product and its associated lifecycle phases such as, product manufacture, assembly and functionality, before the product is physically realised. This gives manufacturers an excellent opportunity to visualise and anticipate aspects of the physical performance of a design with less reliance on costly physical experimentation. Physical prototyping and testing is still a requirement, especially for complex products. However, the clear current industry trend is toward reducing physical testing by replacing suitable aspects by virtual testing and verification. The digital verification results are compared with the experimentation results; this validates and certifies computational code embedded in a digital prototype. Thus, a validated digital prototype can be utilised for verifying the physical performance of the product manufactured in the globally dispersed supply chain. 5.1. Digital mock-up A digital mock-up (DMU), sometimes referred to as a virtual prototype, is essentially a digital simulation of a physical prototype and is increasingly used for the verification of product functionality. DMU is emerging as the core design collaboration tool, around which different engineering teams verify the product through its entire lifecycle, from production planning to functional testing, maintenance and recycling [88,89]. Multiple engineering teams can now operate in parallel, working on the same DMU, and this facilitates the enterprise wide application of concurrent engineering practice. Recently, the usage of DMU has increased, mainly among aerospace and automotive companies, owing in a large part to the availability of more robust models and enhanced computing resources. For instance, the Chrysler Corporation, used DMU to reduce automobile development cycle by half, while resolving 1200 potential issues before the first physical mock-up was built [90]. Using proprietary DMU systems, Boeing was able to reduce errors and rework on its 777 airliner by 70–80%, saving 100,000 design hours and millions of dollars [90]. Similarly, Airbus is also increasingly exploiting the advantages of DMU [91]. For complex engineering products, the use of DMU is not without problems, the largest of which is ensuring data quality between all of its suppliers, customers and design offices. For instance, data loss when transferring from one CAD format to another remains a major issue [91]. In summary, DMU is a powerful verification tool and research for its development should be based on: (i) enhanced capabilities to simulate functional performance using functional mock-up methods, and (ii) the solid foundation of international standards. The existing STEP (ISO 10303) standard captures adequately geometric data, while data pertaining to history based modelling [92], assembly [93], and kinematics linkages are less well represented [94]. ISO 10303-105 [95] is a good base for kinematic structure representation and supports case studies for machine tool modelling [96]. 7 5.2. Tolerance analysis and optimisation The primary function of tolerance setting is to balance the product functionality with economic factors [97]. Excessively tight tolerances will add cost due to more complex processing stages whereas inadequately wide tolerances will result in insufficient quality and costly rework. 
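As a concrete illustration of the stack-up estimates reviewed in Section 5.2.1 below, the following sketch compares the worst-case, root sum square (RSS) and Monte Carlo estimates of a one-dimensional assembly gap. The dimensions and tolerances are invented for illustration only.

```python
# One-dimensional tolerance stack-up: worst-case vs RSS vs Monte Carlo (illustrative values).
import math
import random

# Component dimensions as (nominal, symmetric tolerance) in mm; the gap is the
# housing length minus the sum of the stacked parts.
housing = (50.00, 0.10)
parts = [(12.00, 0.05), (12.00, 0.05), (12.50, 0.04), (13.00, 0.06)]

nominal_gap = housing[0] - sum(n for n, _ in parts)

# Worst-case: all tolerances accumulate in the unfavourable direction.
worst_case = housing[1] + sum(t for _, t in parts)

# Root sum square: statistical addition assuming independent, normally
# distributed contributions (tolerance taken as +/-3 sigma).
rss = math.sqrt(housing[1] ** 2 + sum(t ** 2 for _, t in parts))

# Monte Carlo: sample each dimension and observe the resulting gap distribution.
random.seed(1)
def sample_gap():
    h = random.gauss(housing[0], housing[1] / 3.0)
    p = sum(random.gauss(n, t / 3.0) for n, t in parts)
    return h - p

gaps = [sample_gap() for _ in range(100_000)]
mean = sum(gaps) / len(gaps)
sigma = math.sqrt(sum((g - mean) ** 2 for g in gaps) / (len(gaps) - 1))

print(f"nominal gap         : {nominal_gap:.3f} mm")
print(f"worst-case variation: +/-{worst_case:.3f} mm")
print(f"RSS variation       : +/-{rss:.3f} mm")
print(f"Monte Carlo 3-sigma : +/-{3 * sigma:.3f} mm")
```

As expected, the worst-case estimate is the most pessimistic, while the RSS and Monte Carlo results agree closely for a linear response with independent, normally distributed contributions.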
Tolerances are vitally important in the process of dimensional verification of mechanical parts and assemblies as the uncertainty of the measurement instrument needs to be an order of magnitude smaller than the tolerance value. Historically, tolerances are decided on the basis of legacy practice within a company and as Maropoulos et al. [81] suggest, many tolerances are set based on process capability and not on the study of tolerance build-up during assembly. A review of tolerancing methods by Singh et al. [98] identifies the main academic and industrial practices dealing with tolerancing as belonging to either ‘‘tolerance analysis’’ or ‘‘tolerance synthesis’’. In essence, tolerance analysis attempts to estimate the assembly tolerance stack-up, while synthesis considers the assembly and product requirements and distributes the assembly tolerances accordingly [99]. 5.2.1. Modelling assembly tolerances Dantan and Qureshi [100] describe statistical tolerance analysis as a 2D method that computes the probability that the product can be assembled and will function under a given set of tolerances. The assembly response function can be expressed as a function of the individual and independent component dimensions [101]. As shown in Fig. 9, there are two basic approaches to tolerance analysis, the worst-case method and the root sum square method [98]. The worst-case method assumes that the tolerances are at their respective extremities and the stack-up is consistently accumulative (i.e., there is no tolerance cancellation). This is a pessimistic estimate, but due to its simplicity it is still relevant today; however it can only be employed in one-dimension at a time [102]. The root sum square (RSS) method conversely gives a rather optimistic assembly tolerance estimate, as it is a simple statistical model based on the normal distribution. As before, the RSS method is only suited to single dimensional tolerance problems [103]. A more advanced method that is somewhat more indicative of tolerance stack-up in the physical world, is the Spotts modified approach [104]; this is essentially an average of the worse-case and the RSS model. The ‘‘correction factor’’ approach is also experimentally based, based on scaling the RSS to make it a more realistic figure. However, this method has particular limitations if the tolerances/dimensions in the stack-up vary greatly and/or are of small quantities [98]. More complex assembly response functions and non-normal tolerance distributions can cause difficulties when using traditional analytical techniques as a high number of samples is required to create an accurate estimation of the assembly response. In such cases, Monte Carlo Simulation (MCS) has become a viable solution. MCS can be applied when the assembly response function cannot be expressed analytically as a linear model and also when dealing with the effects of tolerance stack-up within kinematic systems [105]. In the ‘‘kinematic’’ approach [106], the tolerance chain is treated as a kinematic loop, with the understanding that the movements of the links are actually small displacements within prescribed tolerance zones. This approach involves modelling the small displacements using small displace- Fig. 9. Tolerance analysis [98]. Please cite this article in press as: Maropoulos PG, Ceglarek D. Design verification and validation in product lifecycle. CIRP Annals Manufacturing Technology (2010), doi:10.1016/j.cirp.2010.05.005 G Model CIRP-598; No. of Pages 20 8 P.G. Maropoulos, D. 
Ceglarek / CIRP Annals - Manufacturing Technology xxx (2010) xxx–xxx ment torsors [107] and modelling the effects that local small displacement have on the remote functional requirement using Jacobian transforms [108]. Desrochers et al. [109] proposed a unified Jacobian-torsor model for statistical or worst-case tolerance analysis or synthesis [110]. 5.2.2. Digital tolerancing methods and tolerance optimisation Optimizing tolerances aims to maximise the functional performance and economic factors associated with tolerances. The economic factor is often expressed in a quality loss function [111] and in most applications the Taguchi loss function is used. Govindaluri et al. [97] consider the quality loss from the perspective of the customer and the manufacturing and rejection costs by the manufacturer. When incorporating Taguchi’s quality loss function Cheng and Maghsoodloo [112] found that when a component’s mean varies, only the quality loss associated with that component will be changed; whereas when a component’s variance shifts, the optimal allowance, tolerance costs, and quality losses associated with each component will be affected. Tolerance optimisation methods are classed as either deterministic or stochastic; the former considers the nominal values of design variables with respect to given input values, using a single point for evaluation, whereas the latter consider the statistical variation of the design variables [113,114]. Computer Aided Tolerancing systems can provide a simulation platform for modelling the effects of tolerance setting within a manufacturing process or assembly [115,116]. Tolerance analysis and synthesis are considered within a DMU to include aspects of tolerance build-up and assembly clashes [117]. Tolerance design methods have been summarised by Singh et al. [99] as shown in Fig. 10, including traditional and advanced methods. design has made a direct and very positive impact on part verification as helped to codify and standardise both the manufacturing processes and the inspection methods used for types of features, thus improving design verification. Research is still required to provide coherence in relating inspection systems and methods to processes, especially in cases where there is a wide range of measurement options available, such as the verification of machined features, or complex assembly features. Case [122] used methods associated with external approach directions for features to enhance process capability and Wong and Wong [123] used volumetric machining features for part modelling in their feature-based design system. Several feature-based design systems are reported with a focus on prismatic machining process. In the case of machining, feature-based design allows the corresponding definition of ‘‘standardised’’ machining processes that are proven in terms of process capability. This is of major significance, as it allows rapid verification of a design in terms of its modelling entities and the corresponding machining process. Feature-based methods had a profound effect on computer automated process planning (CAPP) for machining. Gu et al. [124] identified the sequence of machining process in four stages namely; feature extraction, feature prioritisation, clustering of operations and the identifying of precedence relationships. Laperriere and ElMaraghy used precedence graphs for assembly sequence planning [125]. Qiao et al. [126] used a genetic algorithm method to sequence the machining operations for prismatic parts. Li et al. 
[127] and Ong et al. [128] tried to solve the process planning problems by combining the non-traditional optimisation techniques, namely genetic algorithm and simulated annealing. Azab and ElMaraghy used quadratic assignment for reconfiguring process plans [129]. The common problems and characteristics of these CAPP approaches for machining are one or more of the following: 5.3. Features for machining CAD/CAM/CAPP verification In the last two decades, extensive research efforts in various segments of CAx integration using feature technology have been reported especially for the integration of CAD and CAM. Salomons et al. [118], and Subrahmanyam and Wozny [119] have identified three major approaches of feature technology namely; interactive feature definition, automatic feature recognition and design by features. In interactive feature definition, features are defined with human assistance after creating the geometric model. Automatic feature recognition involves the comparison of the geometric model with pre-defined generic features. Many approaches for feature recognition have been reported; Lin et al. [120] extracted manufacturing features present in a feature-based design model, while ElMaraghy and ElMaraghy [121] introduced the concept of functional and manufacturing features. Presently, the design – by – features approach has become the core technology for product modelling. Feature definitions (templates) are placed in the feature library, from which features are instantiated by specifying dimension parameters, location parameters and application related attributes. Feature-based Feature recognition is used in most of the approaches. Hence, the feature-based databases of commercial software are not utilised. After recognition, the features (mostly design oriented) are converted into application (manufacturing) features using a knowledge base or heuristic rules. The common attributes are not directly transferred to application features. The process plans produced by these systems consider only a single machine set-up. But, in the factory environment, several machines may be used in different set-ups. The precedence constraints in the component are represented with respect to features and not with respect to low-level entities, namely operations. The set-ups were optimised with respect to the tool approach directions. This in turn reduces the search space or looses feasible design points. To conclude, process planning research has not as yet reached the maturity of key methods to focus on verification and validation in an integrated manner. The feature recognition approach is theoretically the most generic approach to process planning but it partly negates the design and process standardisation and verification benefits of feature-based design. 5.4. Virtual assembly modelling and simulation Fig. 10. Tolerance design methods [99]. Virtual or digital assembly modelling is a powerful and effective technology for the verification of assemblies during the digital design phase. Assembly process planning (APP) is a core component of virtual assembly modelling as it deals with assembly constraint identification, equipment selection and sequence generation [130]. 
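The role of precedence constraints in restricting the sequence search space, which is common to both CAPP and assembly process planning, can be illustrated with a short sketch that enumerates the operation orderings admitted by a small, invented precedence graph. Real planners would evaluate each feasible sequence against set-up, tooling, cost and quality criteria rather than simply listing them.

```python
# Enumerate operation sequences that respect precedence constraints (illustrative data only).
from itertools import permutations

operations = ["rough_mill", "drill", "ream", "finish_mill", "inspect"]

# (a, b) means operation a must be completed before operation b.
precedence = {
    ("rough_mill", "finish_mill"),
    ("rough_mill", "drill"),
    ("drill", "ream"),
    ("finish_mill", "inspect"),
    ("ream", "inspect"),
}

def feasible(seq):
    pos = {op: i for i, op in enumerate(seq)}
    return all(pos[a] < pos[b] for a, b in precedence)

feasible_sequences = [seq for seq in permutations(operations) if feasible(seq)]
print(f"{len(feasible_sequences)} feasible sequences out of 120 permutations")
for seq in feasible_sequences:
    print(" -> ".join(seq))
```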
Wang and Ceglarek [131] proposed an assembly sequence planning method which comprises: (1) sequence generation for predetermined line configurations using k-piece mixed-graph representation of assembly; (2) dimensional quality model of variation propagation for assembly processes with compliant parts; and (3) evaluation of sequences based on the multivariate process capability index. Please cite this article in press as: Maropoulos PG, Ceglarek D. Design verification and validation in product lifecycle. CIRP Annals Manufacturing Technology (2010), doi:10.1016/j.cirp.2010.05.005 G Model CIRP-598; No. of Pages 20 P.G. Maropoulos, D. Ceglarek / CIRP Annals - Manufacturing Technology xxx (2010) xxx–xxx 9 Ceglarek [144] proposed a GA and low-discrepancy sampling technique-based optimal design space reduction method to optimise the locator positions in a multi-station assembly system while ensuring the robustness of the fixturing system in terms of the product’s dimensional quality. Fig. 11. The VADE usage scenario [134]. Using Virtual Reality (VR), the 3D digital mock-up of the product can be manipulated with the assistance of VR interactive devices. It, therefore, attracted great interest from researchers dealing with assembly planning. The advantages of applying virtual engineering for assembly process planning were summarised by Jun et al. [132]. From the concurrent engineering perspective, it is preferable to implement the assembly and disassembly process in a virtual environment at an early stage of design, when only the geometric forms are determined and the functions can still be defined [132,133]. The Virtual Assembly Design Environment (VADE) was created to demonstrate the potential and the challenges involved in the design and manufacturing processes [134]. Fig. 11 illustrates the usage scenario of VADE. The VADE system allows the user to perform assembly processes by hand and assembly tools on the virtual product with the import data from a parametric CAD system. By maintaining a dynamic correlation with a CAD system, the design information created during the virtual assembly process is updated at the end of using VADE. Banerjee et al. [135] studied the effectiveness of VR in assembly planning by comparing: blueprints, a non-immersive desktop VR environment and an immersive projection-base VR environment. The results showed that the completion time of the assembly process was approximately halved by utilising VR. An Augmented Reality (AR) based human-computer interface was developed by Ong et al. [136] to provide an immersive and intuitive environment. Unlike VR, the assembly design and planning using AR can be verified by manipulating the virtual prototypes in the real assembly environment, which will decrease the possibility of redesigning and re-planning. 5.4.1. Digital tooling and fixturing for assembly Digital assembly modelling is now well established in the advanced engineering industries, like aerospace and automotive, for the design of assemblies and their integration with the design of tooling and the associated jigs and fixtures. Commercial software systems allow the seamless integration of product, process and resource models [137]. The data generated during assembly tolerance analysis can be utilised by tool designers to define appropriate tooling tolerances. Such systems are also being deployed within ITER – the nuclear fusion project – to model the manipulation of cassette tooling, the loading of which is robot controlled [138]. 
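A very simple numerical check that is often performed when fixture and tooling concepts are assessed digitally is whether a set of locators deterministically constrains the part: a rigid body requires six independent constraints, which can be tested through the rank of the locator constraint matrix. The locator positions and normals below are invented for a 3-2-1 scheme, and the sketch is only a first-order rigid-body check, not a substitute for the fixture design and workspace synthesis methods cited in this section.

```python
# Rigid-body check of a 3-2-1 fixture layout: six locators deterministically
# locate a part if the 6xN constraint (wrench) matrix has rank 6.
# Locator data are invented for illustration.
import numpy as np

def constraint_matrix(locators):
    """Each locator is (position, contact normal); each contributes a column
    [n; r x n] restraining motion along its normal."""
    cols = []
    for r, n in locators:
        r, n = np.asarray(r, float), np.asarray(n, float)
        n = n / np.linalg.norm(n)
        cols.append(np.concatenate([n, np.cross(r, n)]))
    return np.column_stack(cols)

# 3-2-1 scheme on a 100 x 60 x 20 mm block: three supports under the base (z),
# two stops on a side face (y), one stop on an end face (x).
locators = [
    ((10, 10, 0), (0, 0, 1)), ((90, 10, 0), (0, 0, 1)), ((50, 50, 0), (0, 0, 1)),
    ((20, 0, 10), (0, 1, 0)), ((80, 0, 10), (0, 1, 0)),
    ((0, 30, 10), (1, 0, 0)),
]

G = constraint_matrix(locators)
print("rank:", np.linalg.matrix_rank(G))   # 6 -> deterministic location
```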
Additionally, the digital mock-up of tooling can simulate accessibility issues and lines of sight for an optical measurement system [139]. Digital fixturing is a key enabling technology for low cost tooling that will enhance industry’s capability for batch production and customisation of products [140]. As an extension from the established methods of rapid prototyping (RP) from a DMU to a physical mock-up, a range of rapid tooling applications are being developed [141]. An alternative to rapid tooling is to employ reconfigurable tooling; this generally requires modular components that allow a virtually unlimited number of tooling configurations. Ceglarek et al. [142] extended the ‘‘N-2-1’’ fixture layout design methodology by introducing a movability restraint condition which is essential for material handling fixture design. Kong and Ceglarek [143] addressed a fixture workspace synthesis method for reconfigurable assembly systems. Phoomboplab and 5.4.2. Stream-of-variation modelling and design synthesis Stream-of-Variation Analysis (SOVA) is a mathematical model to describe the relation between final product quality and process parameters of complex multistage assembly [145,146]. SOVA can predict potential downstream assembly problems, based on evaluations of the design using a large array of process variables. By integrating product and process design in a pre-production simulation, SOVA can head off individual assembly errors that contribute to an accumulating set of dimensional variations, which ultimately result in out-of-tolerance parts and products. Once in the ramp-up stage of production, SOVA can compare predicted misalignments with actual measurements to determine the degree of mismatch in the assemblies and diagnose the root causes of the errors [145,146]. Individual design tasks must be integrated in order to optimise the design of the entire system. Phoomboplab and Ceglarek [147] proposed a design synthesis method based on a hybrid design structure matrix which integrates design tasks with design configurations of key control characteristics, especially for dimensional management in multistage assembly systems. The method can generate design tasks sequences to minimise simulation time as well as benchmark design task sequences in terms of dimensional quality improvement. 5.5. Digital measurement modelling and planning 5.5.1. Measurement and inspection planning techniques The measurement process, often called inspection process, is now a vital element of integrated design and manufacturing [148]. Computer Aided Inspection Planning (CAIP) systems have been developed to accomplish the measurement planning task by the following generic procedures: (1) CAD interface and feature recognition, (2) determination of the inspection sequence of the features of a part, (3) determination of the number of measuring points and their locations, (4) determination of the measuring paths, and (5) simulation and verification [149]. The stages of CAIP for Co-ordinate Measuring Machines (CMMs), are defined as; establish the best sequence of inspection steps, the detailed inspection procedure of each feature, feature accessibility by probes, probe path planning and collision checking, generating the CMM control commands, and the post-processing of measured data such as statistical and cost analysis [150]. The first generation of inspection planning systems was developed by Hopp [151] and ElMaraghy and Gu [152]. 
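One of the generic CAIP steps listed above, the determination of the measuring path, can be illustrated with a simple nearest-neighbour ordering of measurement points. The point coordinates are invented, and a production planner would additionally check probe accessibility and collisions, as noted above.

```python
# Nearest-neighbour ordering of CMM measurement points to shorten the probe path.
# Point coordinates are invented; accessibility and collision checks are omitted.
import math

points = {
    "P1": (0, 0, 10), "P2": (80, 5, 10), "P3": (10, 60, 12),
    "P4": (75, 55, 12), "P5": (40, 30, 15),
}

def dist(a, b):
    return math.dist(points[a], points[b])

def nearest_neighbour_path(start="P1"):
    remaining = set(points) - {start}
    path, current = [start], start
    while remaining:
        current = min(remaining, key=lambda p: dist(current, p))
        path.append(current)
        remaining.remove(current)
    return path

path = nearest_neighbour_path()
length = sum(dist(a, b) for a, b in zip(path, path[1:]))
print(" -> ".join(path), f"({length:.1f} mm)")
```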
Automatic inspection planning for dimensional and geometric inspections has two distinguished levels: macro- and micro-level planning [153,154]. Subsequently, Lee et al. [155] divided the planning process into two steps: global inspection planning that is focused on the generation of an optimum inspection sequence and local inspection planning that is focused on minimizing errors and times throughout the measurement process. Research in CAIP falls into two categories: (a) tolerance-driven inspection process planning and (b) geometry-based inspection process planning [148]. The former considers inspections on features with allocated tolerance requirements while the latter aims to conduct an entire geometry inspection by comparing the obtained complete geometric description of a part or product with the design model. The geometry-based CAIP systems theoretically offer a more coherent inspection process but at a high time and cost [148]. Recent research has been carried out aiming at the automation of the inspection features reorganisation, by extracting from the CAD model directly. Similar research concerning feature clustering, probe accessibility and orientation analysis dominated research interest for CMM-based inspection planning carried out Please cite this article in press as: Maropoulos PG, Ceglarek D. Design verification and validation in product lifecycle. CIRP Annals Manufacturing Technology (2010), doi:10.1016/j.cirp.2010.05.005 G Model CIRP-598; No. of Pages 20 10 P.G. Maropoulos, D. Ceglarek / CIRP Annals - Manufacturing Technology xxx (2010) xxx–xxx Fig. 12. Overview of the theoretical framework for integrating measurement with assembly planning. by Limaiem and ElMaraghy [156], Zhang et al. [157] and Hwang et al. [158]. With the rapid development of artificial intelligence and knowledge-based techniques, Expert Systems, Neural Networks and Fuzzy Logic were used to automate the measurement planning process. The expert system developed by Moroni et al. [159] tackles the problem of selecting touch probes and generating the measurement configurations. Lu et al. [160] and Hwang et al. [158] employed artificial neural network techniques to obtain the optimum inspection sequence while Beg and Shunmugam [161,162] achieved the same objective utilizing Fuzzy Logic on a prismatic part inspection process. Mohib et al. [163] used knowledge rules to select the most appropriate probe type and optimised the planned inspection tasks using a hybrid laser/CMM for complex geometries. 5.5.2. Metrology process modelling for verification planning Process modelling is an essential technology for design evaluation and process planning based on the codification of engineering knowledge and analytical methods [164,165]. There is a scarcity of metrology process models for measurement planning and this may be due to the traditional industrial perception of metrology simply being a verification step, rather than being an essential element of the production process [166]. Moreover, new frameless metrology systems have been integrated with production and assembly, enhancing the need for developing a process model to codify their capabilities [80,81]. Maropoulos et al. [166] proposed a theoretical framework for the development of metrology process models for integrating product design with assembly planning, based on the Digital Enterprise Technology methodology [167,168]. Fig. 
12 shows the metrology framework, with the metrology process model positioned central to the integration of the design verification process with the verification of assembly operations and the subsequent deployment of measurement systems that support measurementassisted automation. The framework explicitly recognises the need to co-ordinate the digital verification aspects (left part of Fig. 12), with those that involve the physical deployment of measurement equipment for product and process verification (right part of Fig. 12) [166,168]. Industry requires the definition of new research projects addressing the development and evaluation of integrated metrology and assembly methods and systems that offer superior positional and orientation accuracy, with in-built verification capability. Such systems must be fully compliant with relevant standards and best practice guides including; ISO GUM [169], ASME B89.4.19 [170] and STEP (ISO 10303) [24]. 5.5.3. Measurement and inspection equipment selection A vitally important stage in the digital verification planning is the identification and selection of inspection equipment. This largely refers to measuring systems deployed for dimensional and shape validation of parts and assemblies. There is a very wide spectrum of physical scale and accuracy requirements for which such systems need to be selected covering industrial production from small parts (measured in millimeters) to large, complex products such as aircraft, ships, and wind turbines [166,171,172]. New techniques such as absolute length measuring interferometry and six-degrees-of-freedom probes are frequently combined with more traditional systems such as CMMs to cover the dimensional and shape verification needs of modern products [171,172]. The selection process needs to be based on metrology process models and employs multiple criteria with a key requirement being the definition and minimisation of measurement uncertainty [163,171]. Cai et al. [168,173] proposed an approach for large volume metrology instruments selection based on measurability characteristics (MCs) analysis. Inspired by the concept of quality characteristics, MCs can be used for instrument selection on the basis of measurement capability, cost and technology readiness. Muelaner et al. [174] proposed an approach employing a data filtering technique for instrument selection and Cuypers et al. [175] specify the task requirements and part restrictions before selecting instruments manually. There are exciting, new research challenges in generic measurement systems modelling and capability derivation that are essential for instrument selection and measurement planning within CAIP. Research is also needed for the integration of CAIP with CAPP, based on the coherent modelling of capabilities. 5.6. Computational and virtual methods for functional product verification and optimisation 5.6.1. Structural function verification and finite elements analysis The growing interest in reducing reliance on testing and cut the cost and time of certification of structural systems has pushed the academic and industrial world toward the development of Virtual Testing Labs (VTL) where the Finite Element Analysis (FEA) technique is employed to predict the possible behaviour of real world structures until failure (Fig. 13). 
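As a minimal illustration of the finite element idealisation that underpins such virtual tests, the sketch below assembles and solves a one-dimensional axial bar model; the material values, cross-sections and load are invented, and the example is of course far simpler than the models used in a Virtual Testing Lab.

# Minimal FEA sketch (illustrative values only): a stepped bar built from
# 1D axial elements, fixed at one end and loaded at the other.  Element
# stiffness is k = E*A/L, assembled into a global matrix and solved for the
# nodal displacements.
import numpy as np

def assemble_bar(E, areas, lengths):
    n_nodes = len(areas) + 1
    K = np.zeros((n_nodes, n_nodes))
    for e, (A, L) in enumerate(zip(areas, lengths)):
        k = E * A / L
        K[e:e + 2, e:e + 2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])
    return K

E = 70e9                       # Pa, aluminium-like modulus (assumed)
areas = [4e-4, 3e-4, 2e-4]     # m^2, element cross-sections (assumed)
lengths = [0.5, 0.5, 0.5]      # m
F = np.array([0.0, 0.0, 0.0, 10e3])   # N, tip load on the free end

K = assemble_bar(E, areas, lengths)
# Apply the fixed boundary condition at node 0 by solving the reduced system.
u = np.zeros(4)
u[1:] = np.linalg.solve(K[1:, 1:], F[1:])
print("nodal displacements [mm]:", np.round(u * 1e3, 4))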
However, to reduce and replace physical testing by virtual FEA testing, procedures must be put in place to demonstrate that the virtual tests are able to replicate actual tests and to generate the necessary confidence within the design and certification communities. The first stage of FEA is the ‘‘idealisation process’’ which takes the real-life structural design problem and turns it into an idealised mathematical model, the Finite Element Model (FEM). The second stage involves selecting appropriate finite elements, mesh layouts and solution algorithms to define the structural behaviour of the idealised mechanical system. The creation of an error-control Fig. 13. Virtual testing procedure. Please cite this article in press as: Maropoulos PG, Ceglarek D. Design verification and validation in product lifecycle. CIRP Annals Manufacturing Technology (2010), doi:10.1016/j.cirp.2010.05.005 G Model CIRP-598; No. of Pages 20 P.G. Maropoulos, D. Ceglarek / CIRP Annals - Manufacturing Technology xxx (2010) xxx–xxx procedure to facilitate the user of the FEA in solving structural design problems has been extensively studied. Other methods for creating error-free FE models may involve the use of sensitivity analyses [176]. Besides these intrinsic FEA errors, other uncertainties are present such as the experimental boundary conditions, exact panel geometry and presence of initial imperfections that affect the accuracy of the virtual testing. Such issues are more pronounced for structures made of newly developed materials such as hybrid materials, fibre reinforced plastics (composites) due to their high dimensional variability of products. This is becoming an important issue for thick large-scale structures where measurement of residual stresses and distortion are challenging tasks. To solve these issues, upstream 3D digital measurement and quality control techniques need to be employed in a synergistic manner with the finite element method for accurate representation of structural and material behaviour under in-service loads (static, vibration, cyclic loads and impact). While classical computational stress analyses provide good predictions in the elastic regime, they have not previously achieved predictive accuracy in the presence of damage and fracture. This limitation is starting to be overcome by new simulation strategies, which combine advances in the generality and physical realism of damage formulations with new experimental techniques for probing the physics of failure at the micron and nanometer scales. These research advances are making possible high-fidelity virtual tests, where the mechanical behaviour of a structure up to ultimate failure is computed through simulations of the physical processes involved at the atomic [177], microscopic and structural scales [178]. 5.6.2. Design function verification using computational fluid dynamics With the increasing availability of affordable access to substantial computing resources, computational fluid dynamics (CFD) is now becoming established as a viable tool for computer aided engineering and design, in spite of uncertainties that continue to surround the topics of automated mesh generation, solution sensitivity to mesh size and distribution, and the verification and realism of turbulence models. CFD software offers increasingly sophisticated (and computationally demanding) analysis features such as free-surface modelling, fluid-structure interaction (FSI) and large eddy simulation (LES). 
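One widely used way of examining the solution sensitivity to mesh size mentioned above is a systematic grid-refinement study with Richardson extrapolation. The sketch below assumes a quantity of interest computed on three uniformly refined meshes with a constant refinement ratio, and uses invented values; it is a generic illustration rather than the procedure of any particular code or standard.

# Illustrative grid-convergence check (values are invented): given a quantity
# of interest computed on three systematically refined meshes, estimate the
# observed order of accuracy and a Richardson-extrapolated "mesh-free" value.
import math

def grid_convergence(f_fine, f_med, f_coarse, r):
    """r is the constant grid refinement ratio between successive meshes."""
    p = math.log((f_coarse - f_med) / (f_med - f_fine)) / math.log(r)
    f_extrap = f_fine + (f_fine - f_med) / (r**p - 1.0)
    return p, f_extrap

# Example: drag coefficient from coarse, medium and fine meshes (assumed data).
cd_coarse, cd_med, cd_fine, ratio = 0.0312, 0.0298, 0.0293, 2.0
p, cd_inf = grid_convergence(cd_fine, cd_med, cd_coarse, ratio)
print(f"observed order of accuracy p = {p:.2f}")
print(f"Richardson-extrapolated Cd  = {cd_inf:.5f}")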
The turbomachinery and aircraft industries have made use of CFD for many years to study flows around smooth-shaped aerodynamic surfaces. Calibrated physical models are used for these flows using highly structured "curvilinear" (body-fitted) meshes to make best use of available resources. CFD has resulted in significant improvements to the design of compressor and turbine blades [179], including the use of inverse design and multi-objective optimisation techniques [180], with the attention of the industry and researchers now turning ever more assiduously to improving the use of valuable compressor bleed air in gas-turbine internal-air cooling systems [179,181]. In aircraft design, the requirement to carry out large-scale computations of complete aircraft configurations motivated the development of empirical "one-equation" models of turbulence for computational economy [182]. Following a period in which turbulence models tended to move toward more complicated, multiple-equation closures (such as shear–stress, v2-f or the even more substantial Reynolds–Stress models), the robustness and relative economy of one-equation models, such as that of Spalart and Allmaras [182], are enjoying a return to more widespread favour, and developments of such models to account for more complicated flow situations are now being proposed and introduced [183].

An important issue with the handling of complex geometries such as car body surfaces is the efficient translation from solid model geometry (CAD) representations into a form suitable for automated mesh generation for CFD. Dawes [184] proposes a tightly integrated approach in which a pre-defined mesh also acts as the surface geometry detection mechanism (using algorithms derived from medical imaging). This also lends itself to boundary surface adaptation in response to the flow, a process known as sculpting. Similar modelling of the interface between flexible membranes or solid surfaces and the forces exerted on them by a fluid medium is the basis of FSI, where finite element modelling can be integrated with CFD to calculate structural deformation in response to varying fluid dynamic loads.

LES offers the prospect of reduced reliance on the often incomplete representation of flow physics by turbulence models. In LES, an unsteady turbulent flow is simulated in full three-dimensional and time-accurate detail, with the sole exception of the very small-scale (so-called "sub-grid") energy dissipation processes. The matching of LES techniques to more traditional modelling methods in regions of low turbulence, such as near walls, offers the prospect of high-fidelity "numerical experiments" replacing the need for large-scale physical testing. The unsteady information provided by the LES technique also lends itself naturally to the unsteady aerodynamics of separated flows, for example around wind turbine blades or around aircraft at very high angles of attack as shown in Fig. 14, as well as providing the fluctuating pressure information that is vital for controlling unsteady vibrations or acoustic signatures.

Fig. 14. Isosurface of instantaneous vorticity over an F-18C aircraft at 30° angle of attack [185].

6. Physical product and process verification and validation

6.1.
Product design – physical verification and validation Before digital prototyping and testing became the prerequisites of rapid product development, physical prototyping techniques were prevalent in industry and have influenced product performance, quality and competitiveness in the global markets. Physical testing is still an expected industry practice, frequently linked to product certification. For example, aerospace products undergo strict testing to pass certification criteria and automobile manufacturers are required to test their prototypes following combustion and safety standards. Moreover, physical testing generates valuable knowledge and data that can be utilised to enhance the design of future products or variants. 6.1.1. Dimensional and shape verification and validation Component verification is the process of assessing the conformance of key features and characteristics of a manufactured component to the specifications prescribed by the product designers, as these are captured by the GD&T notations. The scope of this paper is according to the GPS standard [186], that prescribes the surface, geometric and dimensional characteristics involved in verification, as shown in Fig. 15. Fig. 15. Dimensional and shape characteristics of GPS standards [186]. Please cite this article in press as: Maropoulos PG, Ceglarek D. Design verification and validation in product lifecycle. CIRP Annals Manufacturing Technology (2010), doi:10.1016/j.cirp.2010.05.005 G Model CIRP-598; No. of Pages 20 12 P.G. Maropoulos, D. Ceglarek / CIRP Annals - Manufacturing Technology xxx (2010) xxx–xxx Designers define tolerances on core models that are intended to describe the maximum allowable variation from the nominal size. Tolerances do not include any allowance for, or knowledge of, the measurement uncertainty of the equipment used to verify the dimensions. The standard ISO 14253 [187] makes it clear that the onus is on the supplier of the measurement data to guarantee the conformance to specification (tolerance) of the measurements, and that the data takes account of measurement uncertainty. There are several ways of carrying out dimensional and shape verification [171] including direct or indirect measurements, and measuring either all the parts (100% inspection) or a selection of parts. Direct measurements are taken off the part itself by deploying metrology systems suitable for the physical size and scale of the artefacts and these systems are outlined in the enabling technologies Section 6.4. Indirect dimensional verification requires taking inferred dimensions from something other than the part, for example by measuring the jig that is used to assemble the part. Verification may also be inferred statistically through controlling and measuring the process, as outlined in Section 6.3, and this can bring significant cost benefits through improvements to process capability. The level of inspection required for any given feature is dictated by the risk of non-conformance. Depending on the industry sector, design risk is driven by performance, safety and fit. Process and inspection risks are dictated by the capability of the process and inspection systems. Due to the criticality of aerospace components, high-risk features will always be subject to 100% inspection. Features that can be effectively controlled by validating the manufacturing process can be subjected to a reduced inspection regime, typically yielding a 50–75% reduction in final inspection load, reducing measurement time per part. 
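The influence of measurement uncertainty on conformance decisions under ISO 14253 [187], discussed above and again in Section 6.1.3, can be illustrated with a short guard-banding sketch. It implements only the simple rule that conformance is proven when the result lies inside the tolerance band reduced by the expanded uncertainty U; the standard's full decision rules are more extensive than this simplification.

# Simplified illustration of a guard-banded conformance decision: conformance
# is only proven when the measured value lies inside the specification band
# reduced by the expanded measurement uncertainty U (see ISO 14253-1 for the
# complete set of decision rules; this sketch is a simplification).
from dataclasses import dataclass

@dataclass
class Dimension:
    lsl: float   # lower specification limit
    usl: float   # upper specification limit
    U: float     # expanded measurement uncertainty (same units)

def conformance(dim: Dimension, measured: float) -> str:
    if dim.lsl + dim.U <= measured <= dim.usl - dim.U:
        return "conformance proven"
    if measured < dim.lsl - dim.U or measured > dim.usl + dim.U:
        return "non-conformance proven"
    return "neither proven: result lies in the uncertainty range"

# Example (invented): a 50 mm nominal with +/-0.05 mm tolerance, U = 0.01 mm.
hole = Dimension(lsl=49.95, usl=50.05, U=0.01)
for y in (50.00, 50.045, 50.07):
    print(f"measured {y:.3f} mm -> {conformance(hole, y)}")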
A freeform surface, also known as a complex or sculpted surface, is classified in ISO 17450-1:2007 [186] as a complex feature with no invariance degree. Existing technologies for measuring freeform surfaces are detailed by Savio et al. [188]. Photogrammetry and laser scanning are mature technologies for surface characterisation, with measurement accuracies of 5 parts in 10^5 [189] and 1 part in 10^4, respectively. Structured light devices are less mature technologies with an accuracy of 1 part in 10^5, but they have the potential to achieve higher accuracy than laser line scanners, which face fundamental limits imposed by speckle effects [190,191]. This is where a hybrid system [163] would be advantageous. While ISO GPS standards exist for form characteristics such as straightness [192], roundness [193] and cylindricity [194], there is no standard for the verification of freeform surfaces. Multiple instruments are applicable for surface verification, as shown in Fig. 16.

Fig. 16. Examples of freeform surface verification applications.

The production uncertainties of a freeform surface, compounded by the edge trimming and assembly processes in which freeform surfaces are typically involved, eventually manifest themselves as gaps, steps and interferences between surfaces. Gap and flush problems on a fluid dynamic device, such as an aircraft wing, are detrimental to its performance, while the fit of automotive panels is indicative of the build quality of the product. The assembly methods used to minimise freeform surface interface problems can be classified as follows:
Build to nominal: the assembled product tolerance is met by simply making the key features of the parts as accurately as possible. Typically used for small products with features that can be accurately produced.
Measure and adjust: the assembled product tolerance is met by measuring the interfaces and adjusting the position and/or orientation of some of the parts to minimise interface problems. For larger parts which can be difficult and expensive to produce to tight tolerances (such as door panels in the automotive industry), the position and orientation may be manipulated manually or automatically to minimise the overall interface problems [195,196].
Measure for production: the assembled product tolerance is met by measuring one side of the interface and producing the other side using the measured data. For very large freeform shapes such as wings and wind turbine blades, it is very difficult and expensive to produce parts to tight tolerances. It is often preferable to tailor parts to fit the specific physical assembly by producing parts directly using measurements from the assembly [90,188].

6.1.2. Design structure mapping and hidden features
Hidden features can be defined as those which do not easily provide line-of-sight access, as occurs commonly in cluttered assembly environments and complex and enclosed products. Measurement of these features generally requires an ability to "see around corners" or measure directly through opaque objects. Possible approaches include: networks of line-of-sight instruments; mirrors; articulated CMM arms; through-skin sensing (using Hall effect sensors to locate holes fitted with magnets on components hidden by other components); and six-degrees-of-freedom probing. A key issue with networks of line-of-sight instruments is closing the metrological loop and including sufficient common points from one instrument to the next, so as to minimise error build-up.
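A common ingredient of such instrument networks is the registration of one instrument's coordinate frame to another using the shared common points. The sketch below assumes a generic SVD-based rigid-body fit with invented coordinates; it is not taken from the cited works, but the post-fit residuals give a first indication of the error build-up referred to above.

# Illustrative registration of one instrument's coordinate frame to another
# using shared common points (generic SVD-based rigid-body fit, assumed for
# illustration).  Residuals after the fit indicate error build-up across the
# measurement network.
import numpy as np

def rigid_fit(P, Q):
    """Find rotation R and translation t such that R @ P_i + t ~= Q_i."""
    Pc, Qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - Pc).T @ (Q - Qc)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = Qc - R @ Pc
    return R, t

# Common points seen by instrument A (P) and instrument B (Q), in mm (invented).
P = np.array([[0.0, 0.0, 0.0], [1000.0, 0.0, 0.0],
              [0.0, 800.0, 0.0], [0.0, 0.0, 600.0]])
angle = np.deg2rad(5.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
Q = (R_true @ P.T).T + np.array([50.0, -20.0, 5.0]) + np.random.normal(0, 0.05, P.shape)

R, t = rigid_fit(P, Q)
residuals = np.linalg.norm((R @ P.T).T + t - Q, axis=1)
print("RMS registration residual [mm]:", round(float(np.sqrt((residuals**2).mean())), 4))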
6.1.3. Measurement equipment deployment
Production metrology begins with the set-up of systems and continues through in-process measurement and metrology-enabled automation [80,81]. Metrology must be seen as a manufacturing process, and Muelaner et al. [174] developed a method for measurement planning and instrument deployment. Specification of the environmental conditions in which the measurement is to be carried out should include the average temperature, temperature gradients, pressure, humidity and carbon dioxide content [197]. Accuracy, properly defined as measurement uncertainty [169], is a key performance indicator for metrology. Much work has already been carried out to model measurement uncertainty in industrial measurement processes, especially for large volume applications [171], using models created for laser-based spherical co-ordinate measurement systems, such as laser trackers and laser radar [170,197]. Coordinate measurements may be calculated from a number of angular measurements obtained using cameras, theodolites and iGPS [198]. Calculating the measurement uncertainty is a complex but important task, since measurement uncertainty impacts on part rejection rates [173,174] and on the accuracy of manufacturing processes. Decision rules for proving conformance or non-conformance with specifications are clearly defined by international standards. A component dimension must be accompanied by a tolerance [199] giving a lower specification limit (LSL) and an upper specification limit (USL), while a measurement result must be accompanied by an estimate of measurement uncertainty (U) [169]. Product conformance can be proven by a measurement result that is greater than LSL + U and less than USL − U [187].

6.2. Product testing and validation

6.2.1. Mechanical design testing
The effective mechanical design of a stand-alone product or a structural component is predicated on key stages of development which are summarised in Fig. 17. As already described in Section 5.6.1, the output of FEA modelling depends on the construction of accurate meshed or meshless continua and the correct assignment of materials properties. In many cases such materials property information is available from materials textbooks [200] or in the form of software [201], but if new materials or bespoke composite materials are to be used, materials evaluation is needed to define mechanical properties.

Fig. 17. Mechanical design, verification and validation of products.

Using a range of test coupon geometries, materials evaluation performs the dual role of firstly confirming the correct selection of materials and secondly providing materials properties for FE modelling. Mechanical tests are published by standards bodies such as ASTM International and BSI British Standards. The mechanical testing of fibre composites is given by Hodgkinson [202]. Some materials parameters and associated tests are given in Table 2. Materials tests will determine elastic properties and the onset of yield, and will determine whether a linear or a non-linear FE model is required to model the mechanical behaviour of parts.

Table 2
Selected materials parameters and associated test methods.
Property | Parameter | Test method
Strength (maximum, yield, etc.) | σ (MPa) | Tension, compression, flexure, etc.
Strain (maximum, yield, etc.) | ε | Tension, compression, flexure, etc.
Young's modulus, stiffness | E, c_ij (GPa) | Tension, compression, flexure, etc.
Dynamic stiffness | E_dyn (GPa) | Vibration, time of flight
Shear strength | τ (MPa) | Torsion, shear, tension
Shear strain | γ | Torsion, shear, tension
Shear modulus, stiffness | G, c_ij (GPa) | Torsion, shear, tension
Elastic compliance | S_ij (m² N⁻¹) | All of the above
Poisson's ratio | ν_ij | Tension, compression
Work of fracture | γ_f (J m⁻²) | Pendulum and drop impact
Critical strain energy release rate | G_c (J m⁻²) | Fracture mechanics tests
Critical stress intensity factor | K_c (Pa m^1/2) | Fracture mechanics tests
Thermal expansion coefficient | α (K⁻¹) | Dilatometer
Glass transition temperature | T_g (K) | DSC, DMTA

A key feature of the measurement of materials parameters is the effective use of instrumentation. Strain measurement devices such as strain gauges, extensometers and lasers are well known, but techniques such as Electronic Speckle Pattern Interferometry (ESPI), Holographic Interferometry and Digital Image Correlation (DIC) [203] provide more accurate 2D and 3D information on strain distributions around stress concentrations. An obvious method of evaluating products and components is to perform static structural tests in tension, compression and shear to destruction. Performance under cyclic load (fatigue), constant stress (creep) and constant strain (stress relaxation) will allow the determination of parameters such as fatigue life (constant amplitude and complex load or strain), fatigue limit, creep compliance and stress relaxation modulus. The observation and understanding of fracture is achieved by the application of optical, electron and atomic force microscopy. Non-destructive evaluation (NDE) includes a plethora of techniques, often used to locate defects. Some NDE methods are summarised in Table 3.

Table 3
Non-destructive evaluation techniques.
NDE method | Principle of operation
Acoustic emission | Detection of stress waves from defects in materials
C-scan | Ultrasonic detection of sub-surface defects
Eddy current | Monitoring of metallic structures under a magnetic field
Dye penetrant | Colour change of dyes in cracks based on capillary action
Infrared thermography | IR camera measures thermal profile of structures
Photothermal imaging | Pulsed light generates radiation from sub-surface defects
Laser vibrometry | Laser beam Doppler shift detects vibrations and defects
Shearography | Sheared laser-generated image acts as a reference image of a surface; application of load or heat reveals defects
Acoustography | Ultrasonic imaging process

6.2.2. Flow related physical verification and validation
The validation of CFD analysis deals with the assessment of the comparison between computational and experimental results [14,204], as shown in Fig. 18, and this generates valuable data for improving the convergence of Large Eddy Simulations and experimental tests. The key parameters in CFD validation tests deal with the aerodynamic forces, which consist of three force components (lift, drag, side force) and three moments (pitching, yawing, rolling). The static aerodynamic forces and moments can be measured indirectly by integrating the surface pressure distribution [204] or directly by strain gauge balance, internal spring balance and load cell.

Fig. 18. Flow validation process [14].
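As a simple illustration of the indirect approach, the sketch below integrates a pressure-coefficient distribution over a two-dimensional aerofoil section to estimate the sectional normal-force coefficient. The tabulated Cp values are invented, and skin-friction and chordwise contributions are neglected.

# Illustrative indirect force estimation: integrate an (invented) pressure
# coefficient distribution over a 2D aerofoil section to obtain the
# normal-force coefficient, c_n = integral of (Cp_lower - Cp_upper) d(x/c).
import numpy as np

x_c = np.linspace(0.0, 1.0, 21)                                            # chordwise tap positions
cp_upper = -1.2 * (1.0 - x_c) * np.exp(-3.0 * x_c) - 0.3 * (1.0 - x_c)     # suction side (assumed)
cp_lower = 0.6 * (1.0 - x_c) * np.exp(-5.0 * x_c) + 0.1 * (1.0 - x_c)      # pressure side (assumed)

d = cp_lower - cp_upper
c_n = float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(x_c)))                 # trapezoidal integration
print(f"normal-force coefficient c_n ~= {c_n:.3f}")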
The unsteady aerodynamic forces and moments acting on a maneuvering air vehicle [205] can be measured by using strain gauge balance and load cell. The external flow structure of an air vehicle can be illustrated qualitatively by flow pattern images and quantitatively by measuring flow velocities. Qualitative flow patterns can be obtained by using flow visualisation techniques such as light scattering particles, dye visualisation, smoke wire, the tuft-grid method and the oil-film method. The Laser-Induced Fluorescence technique can visualise the flow pattern in a 2D plane of a 3D flow field [206]. Quantitative data on the flow structures can be obtained by measuring flow velocities using pitot tubes (one velocity component) and five-hole probes (three velocity components) for steady velocity measurement at a point. Fluctuating velocities can be measured by using thermal anemometers (intrusive) and laser Doppler velocimetry (non-intrusive). Particle tracking velocimetry and particle image velocimetry are capable of obtaining velocity information on a 2D plane, and volumetric three-component velocimetry has been applied successfully in capturing whole-volume flow information [207,208].

6.3. Physical process verification and validation
The formal manufacturing process verification involves the stages of inspection, analysis, testing and demonstration. Process validation is a means of ensuring that manufacturing processes are capable of consistently producing a finished product of the required quality, and it typically involves the following formal methods: fault inspection, dependability analysis, hazard analysis, reproducibility analysis and risk analysis [11]. Process validation is conducted in the context of a system including design control, quality assurance, process control, and corrective and preventive action [19].

6.3.1. Statistical process control and Taguchi's robust design
Within the field of statistical process control (SPC), a large number of techniques [209] have become established with the goal of improving the quality of manufactured products through the reduction of variability. SPC uses empirical evidence and statistical analysis to identify quality problems. All processes contain some unavoidable random variability, with random causes referred to in SPC as chance causes. Avoidable sources of variability such as faults in machinery, operator errors or defects in materials are referred to as assignable causes. A primary objective in SPC is to detect when processes are out of statistical control so that assignable causes can be identified and eliminated.

Taguchi's robust design objective is to reduce the output variation from the target by reducing the sensitivity to noise, such as manufacturing variations and deterioration over time [210]. The approach uses the "loss model" because it actually fits a loss measure in a signal-to-noise (S/N) ratio format. The idea is to maximise the S/N ratio through design of experiments. The focus is to increase the robustness of the system's performance.

6.3.2. Six sigma and root cause analysis
Developed at Motorola in the early 1980s, 6-sigma is a business process methodology that enhances customer satisfaction from products or services by improving manufacturing processes [211]. Design for Six Sigma (DFSS) is a methodology utilizing tools, training and measurements to enable the design of products and processes that meet customer needs and can be produced at six sigma quality levels [87,212]. To control dimensional variations during manufacturing, efficient six sigma fault root cause diagnosis is critical for improving the quality and productivity of processes [144,213]. Ceglarek and Shi [214] proposed a diagnostic approach involving single faults in a single assembly fixture, and this work was extended by Ding et al. [215] using the state space modelling technique. In order to overcome problems related to an ill-conditioned system, Rong et al. [216] have proposed unrotated Singular Value Decomposition and matrix partitioning techniques. Liu and Hu [217] proposed designated component analysis for dimensional fault diagnosis by pre-defining a set of fault patterns called designated components. Apley and Lee [218] proposed independent component analysis to model the fault variation pattern with the assumption that no more than one error source follows a normal distribution. However, these approaches are insufficient in the case of an ill-conditioned system. An Enhanced Piecewise Least Squares approach was proposed by Ceglarek et al. [219] to diagnose the six sigma root causes associated with product variation.

6.4. Enabling verification technologies
The physical scale and shape of the component and the accuracy of the required measurement tasks are key determinant factors for the selection of verification methods and technologies. Fig. 19 shows a classification of digitisation methods for dimensional verification and validation. Broadly speaking, contact methods are suitable for small to medium size components, of <1 m³ volume, while non-contact methods can be applied to much larger parts. There is merit in combining both contact and non-contact methods in one hybrid measurement system, as demonstrated by Mohib et al. [163]. Over the past ten years there has been a rapid growth of large volume metrology systems that can deploy contact or non-contact methods. Another classification of these systems relates to their configuration [220]: centralised systems have one main unit (such as a laser tracker), while distributed systems have more than one unit (such as the infrared GPS) that work together for measurement of the same point. The result of any measurement is inevitably affected by a number of systematic and non-systematic errors which contribute to the overall value of measurement uncertainty, as shown in Fig. 20. Therefore, regardless of the scale, dimensional measurement results need to be accompanied by a statement of uncertainty as defined by GUM [169].

Fig. 19. Digitisation methods for dimensional verification.

Fig. 20. Contributing factors to measurement uncertainty [223].

7. Verification of systems and networks
The design of manufacturing systems is carried out using criteria related to the flow of materials and to values of quality, cost and delivery, as dictated by just-in-time methods. Such considerations are beyond the scope of this paper, which focuses on the use of discrete event simulation (DES) and radio frequency identification (RFID) for manufacturing system verification.

7.1. Discrete event modelling and simulation
Manufacturing systems are designed by considering a variety of parameters, such as material flow, resource allocation and utilisation, that define performance within the factory and the supply chain. DES is widely used for the design of systems. DES allows the dynamics of complex manufacturing systems to be verified without physical implementation. Simulation is identified as the second most widely used technique in the field of operations management after modelling [221]. DES utilises a multi-state mathematical model of the system where events happen in a chronological sequence, are instantaneous and change the state of the system [222]. DES environments with 3D capabilities have been developed, leading to the concept of the "digital factory" [224]. The results from continuing research into the field of DES have been implemented in industry to the extent where almost every major manufacturing enterprise uses this technique to verify facility configuration, throughput times, material inventory issues and logistics in the digital design phase, or to evaluate improvement ideas before their physical deployment. Johansson et al. [225] conducted a survey and reported the need to improve the coherence and reliability of data provided via DES to enhance its use by industry for verification and validation activities.

7.2. RFID methods for the verification of production logistics
RFID technology uses tags that respond to radio frequency (RF) signals by transmitting their constituent data, readers for sending and receiving RF signals, and software to process the data. RFIDs have seen rapid adoption in the manufacturing, service and logistics industries, and this section outlines the use of RFID technology for system design verification and validation. RFID sensors are an effective means of collecting and processing real-time data from manufactured parts, products, processes and resources, thus creating a traceable, real-time view of the production system and the supply chain, allowing the verification of production schedules and logistics [226]. For modelling large or complex systems, DES systems require modelling assumptions regarding the behaviour of elements of the system (statistical distributions, etc.) and the input of a large amount of data. The quality of DES system output is a function of the correctness of these assumptions and input data. The use of RFIDs in conjunction with DES can dramatically improve the quality of DES decision-making through the provision of verified input data regarding key behaviours of the real system. Due to the quality and quantity of real-time RFID data, there is extensive potential to utilise such data for the active adaptation and reconfiguration of a system, as reported by Huang et al. [227], and for the creation of wireless kanbans with embedded RFIDs, as outlined by Zhang et al. [228]. RFID technology finds widespread exploitation in supply chain management and logistics for improving decision responsiveness and reducing supply chain cost via the provision of verified data [229,230].

7.2.1. Managing information loss in product manufacture
Jun et al. proposed a framework for the utilisation of RFIDs in the product lifecycle [231].
Here the main hypothesis is that during the digital phases of design and planning, the data and knowledge is usually captured and codified at acceptable levels using commercial CAx systems. As product development transits from the digital to physical phase the information flow become less and less complete and the wideranging applications of RFIDs can enhance the information capture and utilisation during product manufacture, product service and recycling [231]. This ability can arguably enhance the design of production systems and networks and improve new product designs by the utilisation of service data. 15 Fig. 21. A conceptual framework for PLM strategy development [235]. common themes. Stark [233] introduces PLM by stating that it is ‘‘the activity of managing a company’s products all the way across their lifecycles in the most effective way’’. Ameri and Dutta [234] evolved the definition further by arguing that PLM is a ‘‘knowledge management solution which supports processes throughout the product lifecycle within the extended enterprise’’. Abramovici and Sieg [235] published details of a major PLM survey in which key findings included the maturity of PLM interaces with CAD and the corresponding maturity in capturing product design data as shown in Fig. 21. The trend clearly demonstrates the considerable prospects available for improving verification during the physical stages of product lifecycle, by improving the rate of capturing and re-using relevant data using PLM. As reported by Jun et al. [231], the increasing use of RFIDs will impact positively on PLM data completeness. As lifecycle management covers the complete period from product concept definition to disposal, it generates a compelling context in which to analyse the sharing and exchange of data between the plethora of CAx systems and the impact of respective standards [236] as shown in Fig. 22. It can be seen from Fig. 22 that STEP has a dominant position in terms of PLM data exchange and Peak et al. [237] and Ming et al. [238] argue that XML and UMLbased STEP are promising technologies for improving PLM interoperability. Despite all the activity in developing open standards, Gielingh [239] points out that the uptake of open standards, in general, has been very poor and that one of the major reasons is that meaning is often lost in data translation. This is a key research area for PLM. 8.2. Verification and validation of complex products in the context of the lifecycle Complex engineering products, like automobiles and commercial aircrafts, require a set of verification and validation stages that satisfy respective legislative requirements governing their use and the increasingly demanding nature of customer aspirations, all within a cost competitive package. In addition, the products themselves are highly complex and designed by large engineering teams spread across many countries and organisations – factors that, when combined with the exacting requirements, necessitate a formal and robust design and development methodology in terms of verification be employed. 8. Methods for the lifecycle verification of complex products 8.1. Enabling technologies and standards for product lifecycle management There are many definitions of Product Lifecycle Management (PLM). While no single definition has emerged [232], there are Fig. 22. Current standards and their coverage [236]. Please cite this article in press as: Maropoulos PG, Ceglarek D. Design verification and validation in product lifecycle. 
CIRP Annals Manufacturing Technology (2010), doi:10.1016/j.cirp.2010.05.005 G Model CIRP-598; No. of Pages 20 16 P.G. Maropoulos, D. Ceglarek / CIRP Annals - Manufacturing Technology xxx (2010) xxx–xxx Fig. 23. The V model for the verification of complex engineering products (adapted from Refs. [20,240]). The V model of verifying product development, as shown in Fig. 23, is capturing the aerospace recommended practice for the development of civil aircraft and systems according to ARP4754a standard [20]. Broadly speaking, the left side of the V model shows the top-down requirements development and validation starting from the product and cascading down to systems and discrete items, the design of which corresponds to the very bottom of the V model. The right side of the V model represents the bottom up process of verification that starts by verifying the design of discrete items by evaluating whether the respective requirements have been met, and proceeding with the verification of systems and the complete aircraft [20]. The same verification process has been adopted for the development of complex automotive systems, like power-trains. A significant common aspect is the verification of functional requirements as captured by QFD. For complex systems, QFD needs some augmentation where the customer responses are complex, as in the subjective assessment of vehicle acceleration. Pickering and Brace [240] describe an automated method to analyse data from driveability tests of existing vehicles in order to generate correlations between the subjective driver ratings and objective test data. This allows objective assessments of power-train performance to ensure that requirements are fulfilled. For commercial aircraft, the process depicted in Fig. 23 has duration of several years and involves the use of a raft of engineering and software design systems and methods with the process being managed using PLM systems across the enterprise. For instance, the verification of stress requirements will involve use of FEA at item level and the flow performance will be verified using CFD and LES as outlined in Sections 5.6.1 and 5.6.2 respectively. verified GD&T [166,168,173]. Other trends include the expansion of Design for ‘X’ to include measurability and the application of new design guidelines [166]. These are examples of ‘‘frontloading’’ by building in measurement process knowledge early into the lifecycle. Measurement uncertainty is being measured, but not used adequately. Recently, there have been considerable efforts in evaluating the uncertainty of different measurement techniques [241], and software tools are emerging to allow predictions of measurement uncertainty to be made [242]. However, typically this information is only used within ‘islands’ of automated inspection processes. Measurement uncertainty is expected to be accounted for early in the product lifecycle. Research in improving measurement simulation is ongoing to make these kind of environments easier for designers to use [243]. 9.3. Verification modelling and planning 9. Key future requirements and trends Metrology is integral to manufacturing processes, but its development in terms of measurement modelling and planning is embryonic when compared to processes and additional research is needed in metrology process modelling [166]. 
In the modern production environment, metrology is becoming tightly integrated with the manufacturing processes and such integration can provide valuable information about process capability and improve the design of future products [81]. For example, new measurement techniques allow many devices to be taken to the part [171], on-machine measurement is more common [241], or measurement is used to facilitate assembly [80]. However, measurement and manufacturing process planning are still not sufficiently interlinked [166,241] and considerable verification benefits will arise from their integration. 9.1. PLM and international standards 9.4. Early design verification in the digital domain One of the major trends in PLM will be to attempt to build product and process knowledge earlier in the product lifecycle; this is termed as ‘‘frontloading’’. In Srinivasan’s review [35] of standards for product geometry specification, verification, and exchange it was observed that standards have developed rapidly since the advent of the digital enterprise. However, some of the standards, especially the open standards, have poor uptake within industry [239]. Furthermore, Zheng et al. [73] found that a key priority is to provide feedback to ‘‘close the gap’’ between the physical and digital world in the context of PLM. Integration is expected to become easier over time with the increasing emphasis on open standards for data exchange. A key future trend is the requirement for early design verification. It is well documented that early design phases account for a large percentage of lifecycle costs. This is especially true for complex engineering products and for such applications the early verification of components and the corresponding functional verification of systems are critically important tasks. The challenges are significant, including; methods to deal with verification using low design data-intensity, enhance the scope of functional verification with the development of integrated functional mock-up, and techniques for integrated product and process verification. 10. Concluding comments 9.2. GD&T and measurement uncertainty The use of GD&T is widespread in industry [35], but is not adjusted for measurability. Although it is normal for manufacturing process capability to be considered during the design stages, the GD&T that is applied rarely takes account of measurement processes and their capabilities. Research is ongoing to address this issues, including measurement instrument selection that is carried out from early design, allowing the setting of measurability- This paper analysed methods and techniques for design verification and validation, especially focusing on mechanical engineering products of meso to large-scale, and the corresponding manufacturing processes. There is clear evidence that digital domain design verification and validation is a high industrial priority and there is evident research focus in such methods, as well as considerable coverage via international standards. Physical product and process verification and validation remain important Please cite this article in press as: Maropoulos PG, Ceglarek D. Design verification and validation in product lifecycle. CIRP Annals Manufacturing Technology (2010), doi:10.1016/j.cirp.2010.05.005 G Model CIRP-598; No. of Pages 20 P.G. Maropoulos, D. Ceglarek / CIRP Annals - Manufacturing Technology xxx (2010) xxx–xxx requirements, especially for complex products that require certification, such as aerospace. 
There is a gradual, but clear development of new measurement, inspection and verification modelling and planning methods, to underpin design verification both at the digital phase and the physical testing of products and processes. Such methods are underpinned by new enabling technologies and the trend for the integration of metrology with production processes. The development of enhanced PLM capabilities, in terms of codifying and capturing post-design verification data and knowledge, will be vitally important for the successful adoption and implementation of new design verification and validation methods by manufacturing industry. Acknowledgements The authors would like to gratefully acknowledge the support and contribution of many colleagues, from CIRP and other academics, in the development of this paper. Contributions were received from; Prof Luc Mathieu, Prof Luc Laperriere, Prof Hoda ElMaraghy, Prof Torsten Kjellberg, Prof Robert Wilhelm, Prof Gunnar Sohlenius, Prof Stephen Newman, Prof Rainer Stark, Dr Aydin Nassehi, Dr Martin Ansell, Dr Michele Meo, Dr Michael Wilson, Dr Alicia Kim, Dr Zhijin Wang and Dr Chris Brace. Last, but not least, we would like to note that we are especially grateful to Dr Parag Vichare, from the University of Bath, whose contribution in relation to the preparation of this keynote paper has been outstanding. References [1] Gu P, Hashemian M, Nee AYC (2004) Adaptable Design. CIRP Annals – Manufacturing Technology 53(2):539–557. [2] Tseng MM, Kjellberg T, Lu SCY (2003) Design in the New E-Commerce Era. CIRP Annals – Manufacturing Technology 52(2):509–519. [3] Suh NP (2005) Complexity in Engineering. CIRP Annals – Manufacturing Technology 54(2):46–63. [4] ElMaraghy HA (2008) Changing and Evolving Products and Systems – Models and Enablers. in ElMaraghy HA, (Ed.) Changeable and Reconfigurable Manufacturing Systems.. Springer-Verlag Publ, pp. 25–45. [5] Babuska I, Oden JT (2004) Verification and Validation in Computational Engineering and Science: Basic Concepts. Computer Methods in Applied Mechanics and Engineering 193(36–38):4057–4066. [6] Plant R, Gamble R (2003) Methodologies for the Development of Knowledgebased Systems. Knowledge Engineering Review 18(1):47–81. [7] Jagdev HS, Browne J, Jordan P (1995) Verification and Validation Issues in Manufacturing Models. Computers in Industry 25(3):331–353. [8] Dzida W, Freitag R (1998) Making Use of Scenarios for Validating Analysis and Design. IEEE Transactions on Software Engineering 24(12):1182–1196. [9] Allen NA, Shaffer CA, Watson LT (2005) Building Modeling Tools That Support Verification Validation, and Testing for the Domain Expert. Proceedings of the 2005 Winter Simulation Conference, Orlando, FL, 419–426. [10] Geraci A (1991) IEEE Standard Computer Dictionary Compilation of IEEE Standard Computer Glossaries . [11] ASME (2006) Guide for Verification and Validation in Computational Solid Mechanics, Ptc 60/V&V 10. [12] U.S. Department of Defence (2008) Documentation of Verification, Validation & Accreditation (VV&A) for Models and Simulations. [13] U.S. Department of Navy (2004) Modelling and Simulation Verification, Validation, and Accreditation Implementation Handbook. [14] American Institute of Aeronautics and Astronautics (AIAA) (1998) Guide for the Verification and Validation of Computational Fluid Dynamics Simulations, AIAA-G-077-1998. [15] U.S. Food and Drug Administration (FDA) (2002) General Principles of Software Validation, Final Guidance for Industry and FDA Staff. 
[16] ISO 9000 (2005) Quality Management Systems: Fundamentals and Vocabulary. [17] Joint Committee for Guides in Metrology (JCGM) (2008) VIM: International Vocabulary of Metrology – Basic and General Concepts and Associated Terms. [18] International Organization of Legal Metrology (OIML) (2009) International Vocabulary of Terms in Legal Metrology (VIML). [19] Global Harmonization Task Force (GHTF) (2004) Quality Management System – Process Validation Guidance, 2 ed. [20] SAE Aerospace ARP4754a (2009) Aerospace Recommended Practice. [21] Sargent RG (2005) Validation and Verification of Simulation Models. Proceedings of the 2005 Winter Simulation Conference, 130–143. [22] ISO/TR 14638 (1995) Geometrical Product Specification (GPS) – Masterplan. [23] ASME Y14.5 (2009) Dimensioning and Tolerancing. The American Society of Mechanical Engineers. [24] ISO 10303-1 (1994) Industrial Automation Systems and Integration – Product Data Representation and Exchange. Part 1: Overview and Fundamental Principles. 17 [25] Mathieu L, Dantan JY (2003) Geospelling: A Common Language for Geometrical Product Specification and Verification to Express Method Uncertainty. Proceedings of 8th CIRP Seminar on Computer Aided Tolerancing, NC, USA, 70– 79. [26] Dantan JY, Ballu A, Mathieu L (2008) Geometrical Product Specifications – Model for Product Life Cycle. Computer-Aided Design 40:493–501. [27] Heping P, Xiangqian J (2009) Evaluation and Management Procedure of Measurement Uncertainty in New Generation Geometrical Product Specification (GPS). Measurement 42(5):653–660. [28] De Chiffre L, Lonardo P, Trumpold H, Lucca DA, Goch G, Brown CA, Raja J, Hansen HN (2000) Quantitative Characterisation of Surface Texture. CIRP Annals – Manufacturing Technology 49(2):635–652. [29] Balsamo A, Di Ciommo M, Mugno R, Rebaglia BI, Ricci E, Grella R (1999) Evaluation of CMM Uncertainty through Monte Carlo Simulations. CIRP Annals – Manufacturing Technology 48(1):425–428. [30] Srinivasan V (2007) Computational Metrology for the Design and Manufacture of Product Geometry: A Classification and Synthesis. Journal of Computing and Information Science in Engineering 7(1):3–9. [31] DD ISO/TS 17450-2 (2002) Geometrical Product Specifications (GPS) – General Concepts. Part 2: Basic Tenets, Specifications, Operators and Uncertainties. [32] Chiabert P, Lombardi F, Orlando M (1998) Benefits of Geometric Dimensioning and Tolerancing. Journal of Materials Processing Technology 78(1–3):29– 35. [33] Shen ZS, Shah JJ, Davidson JK (2008) Analysis Neutral Data Structure for GD&T. Journal of Intelligent Manufacturing 19(4):455–472. [34] Kong Z, Huang W, Oztekin A (2009) Variation Propagation Analysis for Multistation Assembly Process with Consideration of GD&T Factors. Journal of Manufacturing Science and Engineering 131(5). 10.1115/1111.4000094. [35] Srinivasan V (2008) Standardizing the Specification, Verification, and Exchange of Product Geometry: Research, Status and Trends. Computer-Aided Design 40:738–749. [36] Rennels KE (2003) Current Methodologies for Geometric Dimensioning and Tolerancing. Proceedings of the Electrical Insulation Conference and Electrical Manufacturing & Coil Winding Technology Conference, 565–569. [37] Watts D (2007) The ‘‘GD&T Knowledge Gap’’ In Industry. Proceedings of the ASME International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Las Vegas, NV, 597– 604. 
[38] Zhang Y, Yang M (2009) A Coordinate SPC Model for Assuring Designated Fit Quality Via Quality-oriented Statistical Tolerancing. Computers and Industrial Engineering 57(1):73–79. [39] ISO 10303-11 (1994) Industrial Automation Systems and Integration – Product Data Representation and Exchange. Part 11: Description Methods: EXPRESS Language Reference Manual. [40] ISO CD 10303-219 (2005) Industrial Automation Systems and Integration – Product Data Representation and Exchange. Part 219: Application Protocol: Dimensional Inspection Information Exchange. [41] Bock C (2006) Interprocess Communication in the Process Specification Language, NISTIR 7348, NIST, Gaithersburg, MD. [42] ISO 18629-1 (2004) Industrial Automation Systems and Integration – Process Specification Language. Part 1: Overview and Basic Principles. [43] Matsuda M, Arai E, Nakano N, Wakai H, Takeda H, Takata M, Sasaki H (2005) An Interoperability Framework and Capability Profiling for Manufacturing Software Knowledge and Skill Chains in Engineering and Manufacturing Information Infrastructure in the Era of Global Communications. Springer, Boston. pp. 75– 84. [44] DMIS 4.0 (2001) Dimensional Measuring Interface Standard, ANSI/CAM-I104.0. [45] DML 2.0 (2004) Dimensional Markup Language. [46] International Association of Co-ordinate Measuring Machine Manufacturers (2009) Dimensional Measurement Equipment Interface. [47] BS EN ISO 8062 (2007) Geometrical Product Specifications (GPS) – Dimensional and Geometrical Tolerances for Moulded Parts Part 1: Vocabulary. [48] BS EN ISO 10135 (2009) Geometrical Product Specifications (GPS) – Drawing Indications for Moulded Parts in Technical Product Documentation (Tpd). [49] ISO 14649-1 (2002) Industrial Automation Systems and Integration – Physical Device Control – Data Model for Computerized Numerical Controllers. Part 1: Overview and Fundamental Principles. [50] ISO WD 14649-16 (2004) Industrial Automation Systems and Integration – Physical Device Control – Data Model for Computerized Numerical Controllers. Part 16: Data for Touch Probing Based Inspection. [51] Vichare P, Nassehi A, Newman ST (2009) A Unified Manufacturing Resource Model for Representation of CNC Machine Tools. Proceeding of the IMechE Part B Journal of Engineering Manufacture 223(5):463–483. [52] Vichare P, Nassehi A, Kumar S, Newman ST (2009) A Unified Manufacturing Resource Model for Representation of CNC Machining Systems Elements. Robotics and Computer Integrated Manufacturing 25(6):999–1007. [53] ISO 13584-42 (1998) Industrial Automation Systems and Integration. Part 42: Parts Library, Methodology for Structuring Parts Families. [54] ISO 13399-1 (2006) Cutting Tool Data Representation and Exchange. Part 1: Overview, Fundamental Principles and General Information Model. [55] ASME B5.59-2 (2005) Information Technology for Machine Tools. Part 2: Data Specification for Properties of Machine Tools for Milling and Turning (Draft). [56] BS EN ISO 10360 (2001) Geometrical Product Specifications (GPS) – Acceptance and Reverification Tests for Coordinate Measuring Machines (CMM). Part 1: Vocabulary. [57] BS EN ISO 14978 (2006) Geometrical Product Specifications (GPS) – General Concepts and Requirements for GPS Measuring Equipment. [58] ISO 10303-209 (2001) Industrial Automation Systems and Integration – Product Data Representation and Exchange. Part 209: Application Protocol: Composite and Metallic Structural Analysis and Related Design. Please cite this article in press as: Maropoulos PG, Ceglarek D. 
[59] ISO WD 10303-237 (2003) Industrial Automation Systems and Integration – Product Data Representation and Exchange. Part 237: Application Protocol: Fluid Dynamics Data.
[60] Ramanathan R, Yunfeng J (2009) Incorporating Cost and Environmental Factors in Quality Function Deployment Using Data Envelopment Analysis. Omega 37(3):711–723.
[61] Büyüközkan G, Ertay T, Kahraman C, Ruan D (2004) Determining the Importance Weights for the Design Requirements in the House of Quality Using the Fuzzy Analytic Network Approach. International Journal of Intelligent Systems 19(5):443–461.
[62] Shimomura Y, Hara T, Arai T (2008) A Service Evaluation Method Using Mathematical Methodologies. CIRP Annals – Manufacturing Technology 57(1):437–440.
[63] Chen LH, Ko WC (2009) Fuzzy Linear Programming Models for New Product Design Using QFD with FMEA. Applied Mathematical Modelling 33(2):633–647.
[64] Ullman D (1992) The Mechanical Design Process. McGraw-Hill, New York.
[65] Kim D, Xirouchakis P (2010) CO2DE: A Decision Support System for Collaborative Design. Journal of Engineering Design 21(1):31–48.
[66] Deng YM, Britton G, Tor S (2000) Constraint-based Functional Design Verification for Conceptual Design. Computer-Aided Design 32:889–899.
[67] Stone RB, McAdams DA, Kayyalethekkel VJ (2004) A Product Architecture-based Conceptual DFA Technique. Design Studies 25(3):301–325.
[68] Ulrich K, Eppinger S (2004) Product Design and Development. McGraw-Hill, New York.
[69] Ericsson A, Erixon G (1999) Controlling Design Variants: Modular Product Platforms. ASME Press, New York.
[70] Ceglarek D, Shi J (1995) Dimensional Variation Reduction for Automotive Body Assembly. Manufacturing Review 8(2):139–154.
[71] Whitney DE (2006) The Role of Key Characteristics in the Design of Mechanical Assemblies. Assembly Automation 26(4):315–322.
[72] Mathieu L, Marguet B (2001) Integrated Design Method to Improve Producibility Based on Product Key Characteristics and Assembly Sequences. Annals of the CIRP 50(1):85–88.
[73] Zheng LY, McMahon CA, Li L, Ding L, Jamshidi J (2008) Key Characteristics Management in Product Lifecycle Management: A Survey of Methodologies and Practices. Proceedings of the IMechE Part B Journal of Engineering Manufacture 222(8):989–1008.
[74] Thornton AC (1999) A Mathematical Framework for the Key Characteristic Process. Research in Engineering Design 11(3):145–157.
[75] Thornton AC (2004) Variation Risk Management: Focusing Quality Improvements in Product Development and Production. Wiley, NJ.
[76] Dai W, Tang XQ (2008) Quality Plan Model for Product Development. Proceedings of the 38th International Conference on Computers & Industrial Engineering, Beijing, China, 1535–1541.
[77] Whitney DE (2004) Mechanical Assemblies: Their Design, Manufacture and Role in Product Development. Oxford University Press, USA.
[78] Wang H, Ceglarek D (2005) Quality-driven Sequence Planning and Line Configuration Selection for Compliant Structure Assemblies. CIRP Annals – Manufacturing Technology 54(1):31–35.
[79] Suri R, Frey DD, Otto KN (2001) Key Inspection Characteristics. Journal of Mechanical Design 123(4):479–485.
[80] Maropoulos P, Zhang D, Rolt S, Chapman P, Rogers B (2006) Integration of Measurement Planning with Aggregate Product Modelling for Spacecraft Design and Assembly. Proceedings of the IMechE Part B Journal of Engineering Manufacture 220(10):1687–1695.
[81] Maropoulos P, Zhang D, Chapman P, Bramall D, Rogers B (2007) Key Digital Enterprise Technology Methods for Large Volume Metrology and Assembly Integration. International Journal of Production Research 45(7):1539–1559.
[82] Kuo T-C, Huang SH, Zhang H-C (2001) Design for Manufacture and Design for X: Concepts, Applications and Perspectives. Computers and Industrial Engineering 41(3):241–260.
[83] Boothroyd P, Knight WA (2002) Product Design for Manufacture and Assembly. CRC Press, New York, USA.
[84] Andreasen MM (1988) Design for Assembly. Springer-Verlag, New York, USA.
[85] Pahl G, Beitz W, Wallace K (1996) Engineering Design: A Systematic Approach. Springer, New York, USA.
[86] Reik MP, McIntosh RI, Culley SJ, Mileham AR, Owen GW (2006) A Formal Design for Changeover Methodology. Part 1: Theory and Background. Proceedings of the IMechE Part B Journal of Engineering Manufacture 220(8):1225–1235.
[87] Yang K, El-Haik BS (2003) Design for Six Sigma: A Roadmap for Product Development. McGraw-Hill.
[88] Wang GG (2002) Definition and Review of Virtual Prototyping. Journal of Computing and Information Science in Engineering 2(3):232–237.
[89] Wörn H, Frey D, Keitel J (2000) Digital Factory – Planning and Running Enterprises of the Future. Proceedings of the 26th Annual Conference of the IEEE Electronics Society, 1286–1291.
[90] Rooks B (1998) A Shorter Product Development Time with Digital Mock-up. Assembly Automation 18(1):34–38.
[91] Garbade R, Dolezal WR (2007) DMU at Airbus – Evolution of the Digital Mock-up (DMU) at Airbus to the Centre of Aircraft Development. User Keynote, The Future of Product Development, Proceedings of the 17th CIRP Design Conference, 3–12.
[92] Pratt MJ, Anderson BD, Ranger T (2005) Towards the Standardized Exchange of Parameterized Feature-based CAD Models. Computer-Aided Design 37(12):1251–1265.
[93] ISO 10303-109 (2004) Industrial Automation Systems and Integration – Product Data Representation and Exchange. Part 109: Integrated Application Resource: Kinematic and Geometric Constraints for Assembly Models.
[94] Contero M, Company P, Vila C, Aleixios N (2002) Product Data Quality and Collaborative Engineering. IEEE Computer Graphics and Applications 22(3):32–42.
[95] ISO 10303-105 (1996) Industrial Automation Systems and Integration – Product Data Representation and Exchange. Part 105: Integrated Application Resources: Kinematics.
[96] Kjellberg T, Von Euler-Chelpin A, Hedlind M, Lundgren M, Sivard G, Chen D (2009) The Machine Tool Model – A Core Part of the Digital Factory. Annals of the CIRP 58(1):425–428.
[97] Govindaluri MS, Shin S, Cho BR (2004) Tolerance Optimization Using the Lambert W Function: An Empirical Approach. International Journal of Production Research 42(16):3235–3251.
[98] Singh PK, Jain PK, Jain SC (2009) Important Issues in Tolerance Design of Mechanical Assemblies. Part 1: Tolerance Analysis. Proceedings of the IMechE Part B Journal of Engineering Manufacture 223(10):1225–1247.
[99] Singh PK, Jain PK, Jain SC (2009) Important Issues in Tolerance Design of Mechanical Assemblies. Part 2: Tolerance Synthesis. Proceedings of the IMechE Part B Journal of Engineering Manufacture 223(10):1249–1287.
[100] Dantan J-Y, Qureshi A-J (2009) Worst-case and Statistical Tolerance Analysis Based on Quantified Constraint Satisfaction Problems and Monte Carlo Simulation. Computer-Aided Design 41(1):1–12.
[101] Nigam SD, Turner JU (1995) Review of Statistical Approaches to Tolerance Analysis. Computer-Aided Design 27(1):6–15.
[102] Shen Z, Ameta G, Shah JJ, Davidson JK (2005) A Comparative Study of Tolerance Analysis Methods. Journal of Computing and Information Science in Engineering 5(3):247–256.
[103] Lin CY, Huang WH, Jeng MC, Doong JL (1997) Study of an Assembly Tolerance Allocation Model Based on Monte Carlo Simulation. Journal of Materials Processing Technology 70(1–3):9–16.
[104] Spotts MF (1978) Dimensioning Stacked Assemblies. Machine Design 50(9):60–63.
[105] Chun H (2008) Multibody Approach for Tolerance Analysis and Optimization of Mechanical Systems. Journal of Mechanical Science and Technology 22(2):276–286.
[106] Rivest L, Fortin C, Desrochers A (1993) Tolerance Modeling for 3D Analysis: Presenting a Kinematic Formulation. Proceedings of the 3rd CIRP Seminar on Computer Aided Tolerancing, France, 51–74.
[107] Bourdet P, Ballot E (1995) Geometrical Behaviour Laws for Computer Aided Tolerancing. Proceedings of the 4th CIRP Seminar on Computer Aided Tolerancing, Tokyo, 143–154.
[108] Laperrière L, Desrochers A (2001) Modeling Assembly Quality Requirements Using Jacobian or Screw Transforms: A Comparison. Proceedings of the IEEE International Symposium on Assembly and Task Planning, 330–336.
[109] Desrochers A, Ghie W, Laperrière L (2003) Application of a Unified Jacobian-Torsor Model for Tolerance Analysis. ASME Journal of Computing and Information Science in Engineering 3(1):2–14.
[110] Ghie W, Laperrière L, Desrochers A (2009) Statistical Tolerance Analysis Using the Unified Jacobian-Torsor Model. International Journal of Production Research. doi:10.1080/00207540902824982.
[111] Shin S, Govindaluri MS, Cho BR (2005) Integrating the Lambert W Function to a Tolerance Optimization Problem. Quality and Reliability Engineering International 21(8):795–808.
[112] Cheng B-W, Maghsoodloo S (1995) Optimization of Mechanical Assembly Tolerances by Incorporating Taguchi's Quality Loss Function. Journal of Manufacturing Systems 14(4):264–276.
[113] Benanzer TW, Grandhi RV, Krol WP (2009) Reliability-based Optimization of Design Variance to Identify Critical Tolerances. Advances in Engineering Software 40(4):305–311.
[114] Ballu A, Plantec JY, Mathieu L (2008) Geometrical Reliability of Overconstrained Mechanisms with Gaps. Annals of the CIRP 57(1):159–162.
[115] Jeang A (2001) Computer-aided Tolerance Synthesis with Statistical Method and Optimization Techniques. Quality and Reliability Engineering International 17(2):131–139.
[116] Gadallah M, ElMaraghy HA (1998) A New Algorithm for Combinatorial Optimization: Application to Tolerance Synthesis with Optimum Process Selection. In: ElMaraghy HA (Ed.) Geometric Design Tolerancing: Theories, Standards and Applications. Kluwer Academic Publishers, pp. 265–281.
[117] Curran R, Gomis G, Castagne S, Butterfield J, Edgar T, Higgins C, McKeever C (2007) Integrated Digital Design for Manufacture for Reduced Life Cycle Cost. International Journal of Production Economics 109(1–2):27–40.
[118] Salomons OW, Houten FJAMV, Kals HJJ (1993) Review of Research in Feature-based Design. Journal of Manufacturing Systems 12:113–132.
[119] Subrahmanyam S, Wozny M (1995) An Overview of Automatic Feature Recognition Techniques for Computer-aided Process Planning. Computers in Industry 26(1):1–21.
[120] Lin AC, Lin SY, Cheng SB (1997) Extraction of Manufacturing Features from a Feature-based Design Model. International Journal of Production Research 35(12):3249–3288.
[121] ElMaraghy HA, ElMaraghy WH (1994) Computer-aided Inspection Planning (CAIP). In: Shah JJ (Ed.) Advances in Feature Based Manufacturing. Elsevier, pp. 363–396.
[122] Case K (1994) Using a Design by Features CAD System for Process Capability Modeling. Computer Integrated Manufacturing Systems 7(1):39–49.
[123] Wong TN, Wong KW (1998) Feature-based Design by Volumetric Machining Features. International Journal of Production Research 36(10):2839–2862.
[124] Gu Z, Zhang YF, Nee AYC (1997) Identification of Important Features for Machining Operations Sequence Generation. International Journal of Production Research 35(8):2285–2307.
[125] Laperriere L, ElMaraghy HA (1994) Assembly Sequences Planning for Simultaneous Engineering Applications. International Journal of Advanced Manufacturing Technology 9:231–244.
[126] Qiao L, Wang XY, Wang SC (2000) A GA-based Approach to Machining Operation Sequencing for Prismatic Parts. International Journal of Production Research 38(14):3283–3303.
[127] Li WD, Ong SK, Nee AYC (2002) Hybrid Genetic Algorithm and Simulated Annealing Approach for the Optimization of Process Plans for Prismatic Parts. International Journal of Production Research 40(8):1899–1922.
[128] Ong SK, Ding J, Nee AYC (2002) Hybrid GA and SA Dynamic Set-up Planning Optimization. International Journal of Production Research 40(18):4697–4719.
[129] Azab A, ElMaraghy HA (2007) A Novel QAP Mathematical Programming Formulation for Process Planning in Reconfigurable Manufacturing. Proceedings of the 4th International CIRP Conference on Digital Enterprise Technology (DET'07), Bath, UK, 259–268.
[130] Zhao J, Masood S (1999) An Intelligent Computer-aided Assembly Process Planning System. International Journal of Advanced Manufacturing Technology 15:332–337.
[131] Wang H, Ceglarek D (2005) Quality-driven Sequence Planning for Compliant Structure Assemblies. Annals of the CIRP 54(1):31–35.
[132] Jun Y, Liu JH, Ning RX, Zhang Y (2005) Assembly Process Modeling for Virtual Assembly Process Planning. International Journal of Computer Integrated Manufacturing 18(6):442–451.
[133] Bullinger HJ, Richter M, Seidel KA (2000) Virtual Assembly Planning. Human Factors and Ergonomics in Manufacturing 10:331–341.
[134] Jayaram S, Jayaram U, Wang Y, Tirumali H, Lyons K, Hart P (1999) VADE: A Virtual Assembly Design Environment. IEEE Computer Graphics and Applications 19(6):44–50.
[135] Banerjee A, Banerjee P, Ye N, Dech F (1999) Assembly Planning Effectiveness Using Virtual Reality. Presence 8(2):204–217.
[136] Ong SK, Pang Y, Nee AYC (2007) Augmented Reality Aided Assembly Design and Planning. Annals of the CIRP 56(1):49–52.
[137] Butterfield J, Crosby S, Curran R, Price M, Armstrong CG, Raghunathan S, McAleenan D, Gibson C (2007) Optimization of Aircraft Fuselage Assembly Process Using Digital Manufacturing. Journal of Computing and Information Science in Engineering 7(3):269–275.
[138] Esque S, Mattila J, Siuko M, Vilenius M, Järvenpää J, Semeraro L, Irving M, Damiani C (2009) The Use of Digital Mock-ups on the Development of the Divertor Test Platform 2. Fusion Engineering and Design 84(2–6):752–756.
[139] Wöhlke G, Schiller E (2005) Digital Planning Validation in Automotive Industry. Computers in Industry 56(4):393–405.
[140] Bernard A, Fischer A (2002) New Trends in Rapid Product Development. CIRP Annals – Manufacturing Technology 51(2):635–652.
[141] Rosochowski A, Matuszak A (2000) Rapid Tooling: The State of the Art. Journal of Materials Processing Technology 106(1–3):191–198.
[142] Ceglarek D, Li H, Tang Y (2001) Modeling and Optimization of Fixture for Handling Compliant Sheet Metal Parts. Transactions of ASME Journal of Manufacturing Science and Engineering 123(3):473–480.
[143] Kong Z, Ceglarek D (2006) Fixture Workspace Synthesis for Reconfigurable Assembly Systems. Journal of Manufacturing Systems 25(1):25–38.
[144] Phoomboplab T, Ceglarek D (2008) Process Yield Improvement through Optimal Design of Fixture Layout in 3D Multi-station Assembly Systems. ASME Transactions Journal of Manufacturing Science and Engineering 130:061005.
[145] Huang W, Lin J, Bezdecny M, Kong Z, Ceglarek D (2007) Stream-of-Variation (SOVA) Modeling I: A Generic 3D Variation Model for Rigid Body Assembly in Single Station Assembly Processes. ASME Transactions Journal of Manufacturing Science and Engineering 129(4):821–831.
[146] Huang W, Lin J, Kong Z, Ceglarek D (2007) Stream-of-Variation (SOVA) Modeling II: A Generic 3D Variation Model for Rigid Body Assembly in Multi Station Assembly Processes. ASME Transactions Journal of Manufacturing Science and Engineering 129(4):832–842.
[147] Phoomboplab T, Ceglarek D (2007) Design Synthesis Framework for Dimensional Management in Multi-Station Assembly Systems. Annals of the CIRP 56(1):153–158.
[148] Zhao F, Xu X, Xie SQ (2009) Computer-aided Inspection Planning – The State of the Art. Computers in Industry 60:453–466.
[149] Cho MW, Lee H, Yoon GS, Choi J (2005) A Feature-based Inspection Planning System for Coordinate Measuring Machines. The International Journal of Advanced Manufacturing Technology 26:1078–1087.
[150] Lin YJ, Mahabaleshwarkar R, Massina E (2001) CAD-based CMM Dimensional Inspection Path Planning: A Generic Algorithm. Robotica 19(2):137–148.
[151] Hopp TH (1984) CAD-directed Inspection. Annals of the CIRP 33(1):357–361.
[152] ElMaraghy HA, Gu P (1987) Expert System for Inspection Planning. Annals of the CIRP 36(1):85–89.
[153] ElMaraghy HA (1993) Evolution and Future Perspectives of CAPP. Annals of the CIRP 42:739–751.
[154] ElMaraghy HA (2007) Reconfigurable Process Plans for Responsive Manufacturing Systems. In: Cunha PF, Maropoulos PG (Eds.) Digital Enterprise Technology: Perspectives and Future Challenges. Springer US, Boston, pp. 35–44.
[155] Lee H, Cho MW, Yoon GS, Choi JH (2004) A Computer-Aided Inspection Planning System for On-machine Measurement. Part I: Global Inspection Planning. KSME International Journal 18(8):1349–1357.
[156] Limaiem A, ElMaraghy HA (1999) CATIP: A Computer-aided Tactile Inspection Planning System. International Journal of Production Research 37(2):447–465.
[157] Zhang SG, Ajmal A, Wootton J, Chisholm A (2000) A Feature-Based Inspection Process Planning System for Coordinate Measuring Machine. Journal of Materials Processing Technology 107:111–118.
[158] Hwang CY, Tsai CY, Chang CA (2004) Efficient Inspection Planning for Coordinate Measuring Machines. The International Journal of Advanced Manufacturing Technology 23(9–10):732–742.
[159] Moroni G, Polini W, Semeraro Q (1998) Knowledge Based Method for Touch Probe Configuration in an Automated Inspection System. Journal of Materials Processing Technology 76(1–3):153–160.
[160] Lu CG, Morton D, Wu MH, Myler P (1999) Genetic Algorithm Modelling and Solution of Inspection Path Planning on a Coordinate Measuring Machine. The International Journal of Advanced Manufacturing Technology 15(6):409–416.
[161] Beg J, Shunmugam MS (2003) Application of Fuzzy Logic in the Selection of Part Orientation and Probe Orientation Sequencing for Prismatic Parts. International Journal of Production Research 41(12):2799–2815.
[162] Beg J, Shunmugam MS (2002) An Object Oriented Planner for Inspection of Prismatic Parts: OOPIPP. The International Journal of Advanced Manufacturing Technology 19(12):905–916.
[163] Mohib A, Azab A, ElMaraghy H (2009) Feature-based Hybrid Inspection Planning: A Mathematical Programming Approach. International Journal of Computer Integrated Manufacturing 22(1):13–29.
[164] Baxter D, Gao J, Case K, Harding J, Young B, Cochrane S, Dani S (2007) An Engineering Design Knowledge Reuse Methodology Using Process Modeling. Research in Engineering Design 18(1):37–48.
[165] Maropoulos P, Bramall D, McKay K (2003) Assessing the Manufacturability of Early Product Designs Using Aggregate Process Models. Proceedings of the IMechE Part B Journal of Engineering Manufacture 217:1203–1214.
[166] Maropoulos P, Guo Y, Jamshidi J, Cai B (2008) Large Volume Metrology Process Models: A Framework for Integrating Measurement with Assembly Planning. Annals of the CIRP 57:477–480.
[167] Maropoulos P, Rogers B, Chapman P, McKay K, Bramall D (2003) A Novel Digital Enterprise Technology Framework for the Distributed Development and Validation of Complex Products. Annals of the CIRP 52(1):389–392.
[168] Cai B, Guo Y, Jamshidi J, Maropoulos PG (2008) Large Volume Measurability Analysis for Early Design. Proceedings of the 5th International Conference on Digital Enterprise Technology, Nantes, France, 807–819.
[169] ISO/IEC Guide (1995) General Metrology. Part 3: Guide to the Expression of Uncertainty in Measurement (GUM).
[170] ASME B89.4.19 (2006) Performance Evaluation of Laser-based Spherical Coordinate Measurement Systems.
[171] Peggs G, Maropoulos P, Hughes E, Forbes A, Robson S, Ziebart M, Muralikrishnan B (2009) Recent Developments in Large-scale Dimensional Metrology. Proceedings of the IMechE Part B Journal of Engineering Manufacture 223(6):571–595.
[172] Estler W, Edmundson K, Peggs G, Parker D (2002) Large-scale Metrology – An Update. Annals of the CIRP 51(2):587–609.
[173] Cai B, Dai W, Muelaner J, Maropoulos P (2009) Measurability Characteristics Mapping for Large Volume Metrology Instruments Selection. Proceedings of the 7th International Conference on Manufacturing Research (ICMR09), Warwick, UK, 438–442.
[174] Muelaner J, Cai B, Maropoulos P (2010) Large Volume Metrology Instrument Selection and Measurability Analysis. Proceedings of the IMechE Part B Journal of Engineering Manufacture. doi:10.1243/09544054JEM1676.
[175] Cuypers W, Van Gestel N, Voet A, Kruth J, Mingneau J, Bleys P (2009) Optical Measurement Techniques for Mobile and Large-scale Dimensional Metrology. Optical Measurements 47(3–4):292–300.
[176] Morris AJ (1996) The Qualification of Safety Critical Structures by Finite Element Analysis Methods. Proceedings of the IMechE Part G Journal of Aerospace Engineering 210:203–208.
[177] Rossi M, Meo M (2009) On the Estimation of Mechanical Properties of Single-walled Carbon Nanotubes by Using a Molecular-mechanics Based FE Approach. Composites Science and Technology 69(9):1394–1398.
[178] Guida M, Meo M, Riccio M, Marulo F (2008) Analysis of Bird Impact on a Composite Tailplane Leading Edge. Applied Composite Materials 15(4–6):241–257.
[179] Denton JD, Dawes WN (1998) Computational Fluid Dynamics for Turbomachinery Design. Proceedings of the IMechE Part C Journal of Mechanical Engineering Science 213:107–124.
[180] Samad A, Kim KY (2008) Shape Optimization of an Axial Compressor Blade by Multi-objective Genetic Algorithm. Proceedings of the IMechE Part A Journal of Power and Energy 222(6):599–611.
[181] Chew JW, Hills NJ (2007) Computational Fluid Dynamics for Turbomachinery Internal Air Systems. Philosophical Transactions of the Royal Society (Series A), Theme Issue: Computational Fluid Dynamics in Aerospace Engineering 365:2587–2611.
[182] Spalart PR, Allmaras SR (1994) A One-equation Turbulence Model for Aerodynamic Flows. Recherche Aerospatiale 1:5–21.
[183] Shur ML, Strelets MK, Travin AK, Spalart PR (2000) Turbulence Modeling in Rotating and Curved Channels: Assessing the Spalart–Shur Correction. AIAA Journal 38(5):784–792.
[184] Dawes WN (2008) Rapid Prototyping Design Optimization Using Flow Sculpting. Journal of Turbomachinery 130(3). doi:10.1115/1.2777178.
[185] Morton S, Steenman M, Cummings R, Forsythe J (2003) DES Grid Resolution Issues for Vortical Flows on a Delta Wing and an F/A-18C. AIAA Journal, 2003-1103.
[186] DD CEN ISO/TS 17450-1 (2007) Geometrical Product Specifications (GPS) – General Concepts. Part 1: Model for Geometrical Specification and Verification.
[187] BS EN ISO 14253-1 (1999) Geometrical Product Specifications (GPS) – Inspection by Measurement of Workpieces and Measuring Equipment. Part 1: Decision Rules for Proving Conformance or Non-Conformance with Specifications.
[188] Savio E, Chiffre LD, Schmitt R (2007) Metrology of Freeform Shaped Parts. CIRP Annals – Manufacturing Technology 56(2):810–835.
[189] Shortis MR, Clarke TA, Robson S (1995) Practical Testing of the Precision and Accuracy of Target Image Centring Algorithms. Proceedings of the SPIE 2598:65–76.
[190] Dorsch RG, Hausler G, Herrmann JM (1994) Laser Triangulation: Fundamental Uncertainty in Distance Measurement. Applied Optics 33(7):1306–1314.
[191] Huntley JM (1999) Simple Model for Image-plane Polychromatic Speckle Contrast. Applied Optics 38(11):2212–2215.
[192] DD CEN ISO/TS 12780-2 (2007) Geometrical Product Specifications (GPS) – Straightness. Part 2: Specification Operators.
[193] DD CEN ISO/TS 12181-2 (2007) Geometrical Product Specifications (GPS) – Roundness. Part 2: Specification Operators.
[194] DD CEN ISO/TS 12180-2 (2007) Geometrical Product Specifications (GPS) – Cylindricity. Part 2: Specification Operators.
[195] Kimberley W (2004) The Drive for Quality. Automotive Engineer 29(9):40–42.
[196] Quin WH, Hsieh LH, Seliger G (1996) On the Optimization of Automobile Panel Fitting. Proceedings of the IEEE Conference on Robotics and Automation, Minneapolis, MN, USA, 1268–1274.
[197] Ciddor PE (1996) Refractive Index of Air: New Equations for the Visible and Near Infrared. Applied Optics 35:1566–1573.
[198] Muelaner J, Wang Z, Jamshidi J, Maropoulos PG, Mileham AR, Hughes EB, Forbes AB (2008) iGPS – An Initial Assessment of Technical and Deployment Capability. Proceedings of the 3rd International Conference on Manufacturing Engineering, Kassandra-Chalkidiki, Greece, 805–810.
[199] BS EN ISO 1101 (2005) Geometrical Product Specifications (GPS) – Tolerances of Form, Orientation, Location and Run-Out.
[200] Ashby MF (2005) Materials Selection in Mechanical Design. Elsevier, Butterworth-Heinemann.
[201] CES EduPack (2009) Materials Selection Software. Granta Design.
[202] Hodgkinson JM (2000) Mechanical Testing of Advanced Fibre Composites. Woodhead Publishing Ltd.
[203] Sutton MA, McNeill SR, Helm JD, Chao YJ (2000) In: Rastogi PK (Ed.) Photomechanics. Springer-Verlag, pp. 323–372.
[204] William LO, Timothy GT (2002) Verification and Validation in Computational Fluid Dynamics. Progress in Aerospace Sciences 38:209–272.
[205] Williams NM, Wang Z, Gursul I (2008) Active Flow Control on a Nonslender Delta Wing. Journal of Aircraft 45(6):2100–2110.
[206] Gursul I, Wang Z, Vardaki E (2007) Review of Flow Control Mechanisms of Leading-edge Vortices. Progress in Aerospace Sciences 43:246–270.
[207] Willert CE, Gharib M (1992) Three-dimensional Particle Imaging with a Single Camera. Experiments in Fluids 12:353–358.
[208] Pereira F, Gharib M, Dabiri D, Modarress D (2002) Defocusing Digital Particle Image Velocimetry: A 3-Component 3-Dimensional DPIV Measurement Technique. Application to Bubbly Flows. Experiments in Fluids 78–84.
[209] Motorcu AR, Gullu A (2006) Statistical Process Control in Machining, a Case Study for Machine Tool Capability and Process Capability. Materials and Design 27(5):364–372.
[210] Huang W, Kong Z (2008) Process Capability Sensitivity Analysis for Design Evaluation of Multi Station Assembly Systems. Proceedings of the IEEE International Conference on Automation Science and Engineering, Washington DC, USA, 400–405.
[211] Douglas CM, William HW (2008) An Overview of Six Sigma. International Statistical Review 76(3):329–346.
[212] Hahn GJ (2005) Six Sigma: 20 Key Lessons Learned. Quality and Reliability Engineering International 21(3):225–233.
[213] Shiu W, Apley D, Ceglarek D, Shi J (2003) Tolerance Allocation for Sheet Metal Assembly Using Beam-Based Model. Transactions of IIE Design and Manufacturing 35(4):329–342.
[214] Ceglarek D, Shi J (1996) Fixture Failure Diagnosis for Autobody Assembly Using Pattern Recognition. ASME Transactions Journal of Engineering for Industry 118(1):55–66.
[215] Ding Y, Ceglarek D, Shi J (2002) Fault Diagnosis of Multistage Manufacturing Assembly Processes by Using State Space Approach. ASME Transactions Journal of Manufacturing Science and Engineering 124(2):313–322.
[216] Rong Q, Shi J, Ceglarek D (2001) Adjusted Least Squares Approach for Diagnosis of Compliant Assemblies in the Presence of Ill-Conditioned Problems. ASME Transactions Journal of Manufacturing Science and Engineering 123(3):453–461.
[217] Liu G, Hu J (2005) Assembly Fixture Fault Diagnosis Using Designated Component Analysis. ASME Transactions Journal of Manufacturing Science and Engineering 127(2):358–368.
[218] Apley D, Lee HY (2003) Identifying Spatial Variation Patterns in Multivariate Manufacturing Processes: A Blind Separation Approach. Technometrics 45(3):220–234.
[219] Ceglarek D, Prakash, Tripathi A, Kong Z (2007) Diagnosis of Product Failures in Ill-Conditioned Multi-Station Assembly Systems Using Enhanced PLS Method. Proceedings of the 40th CIRP International Manufacturing Systems Seminar, Liverpool, UK.
[220] Maisano DA, Jamshidi J, Franceschini F, Maropoulos PG, Mastrogiacomo L, Mileham AR, Owen GW (2009) A Comparison of Two Distributed Large-volume Measurement Systems: The Mobile Spatial Co-ordinate Measuring System and the Indoor Global Positioning System. Proceedings of the IMechE Part B Journal of Engineering Manufacture 223(5):511–521.
[221] Jahangirian M, Eldabi T, Naseer A, Stergioulas LK, Young T (2010) Simulation in Manufacturing and Business: A Review. European Journal of Operational Research 203(1):1–13.
[222] Robinson S (2004) Simulation – The Practice of Model Development and Use. Wiley.
[223] Jamshidi J, Kayani A, Iravani P, Maropoulos PG, Summers MD (2010) Manufacturing and Assembly Automation by Integrated Metrology Systems for Aircraft Wing Fabrication. Proceedings of the IMechE Part B Journal of Engineering Manufacture 224(1):25–36.
[224] Zhong Y, Shirinzadeh B (2005) Virtual Factory for Manufacturing Process Visualization. Complexity International 12:1–12.
[225] Johansson B, Johnsson J, Kinnander A (2003) Information Structure to Support Discrete Event Simulation in Manufacturing Systems. Proceedings of the 2003 Winter Simulation Conference, LA, USA.
[226] Huang GQ, Wright PK, Newman ST (2009) Wireless Manufacturing: A Literature Review, Recent Developments and Case Studies. International Journal of Computer Integrated Manufacturing 22(7):1–16.
[227] Huang GQ, Zhang YF, Dai QY, Ho O, Xu FJ (2009) Agent-based Workflow Management for RFID-enabled Real-time Reconfigurable Manufacturing. In: Collaborative Design and Planning for Digital Manufacturing. Springer, London, pp. 341–364.
[228] Zhang Y, Pingyu J, Huang G (2008) RFID-based Smart Kanbans for Just-in-Time Manufacturing. International Journal of Materials and Product Technology 33(1/2):170–184.
[229] Intermec Technologies Corporation (2007) White Paper on Supply Chain RFID: How It Works and Why It Pays.
[230] Lee YM, Cheng F, Leung YT (2004) Exploring the Impact of RFID on Supply Chain Dynamics. Proceedings of the Winter Simulation Conference, 1145–1152.
[231] Jun HB, Shin JH, Kim YS, Kiritsis D, Xirouchakis P (2009) A Framework for RFID Applications in Product Lifecycle Management. International Journal of Computer Integrated Manufacturing 22(7):595–615.
[232] Cheung W, Shaefer D (2009) Product Lifecycle Management: State-of-the-Art and Future Perspectives. In: Cruz-Cunha M (Ed.) Enterprise Information Systems for Business Integration in SMEs: Technological, Organizational, and Social Dimensions, 37–55.
[233] Stark J (2005) Product Lifecycle Management: 21st Century Paradigm for Product Realisation. Springer-Verlag, London.
[234] Ameri F, Dutta D (2005) Product Lifecycle Management: Closing the Knowledge Loops. Computer-Aided Design and Applications 2(5):577–590.
[235] Abramovici M, Sieg O (2002) Status and Development Trends of Product Lifecycle Management Systems. Proceedings of the IPPD 2002, Wroclaw.
[236] Rachuri S, Subrahmanian E, Bouras A, Fenves SJ, Foufou S, Sriram RD (2008) Information Sharing and Exchange in the Context of Product Lifecycle Management: Role of Standards. Computer-Aided Design 40(7):789–800.
[237] Peak RS, Lubell J, Srinivasan V, Waterbury SC (2004) STEP, XML, and UML: Complementary Technologies. Journal of Computing and Information Science in Engineering 4(4):379–390.
[238] Ming XG, Yan JQ, Wang XH, Li SN, Lu WF, Peng QJ, Ma YS (2008) Collaborative Process Planning and Manufacturing in Product Lifecycle Management. Computers in Industry 59(2–3):154–166.
[239] Gielingh W (2008) An Assessment of the Current State of Product Data Technologies. Computer-Aided Design 40(7):750–759.
[240] Pickering SG, Brace CJ (2007) Automated Data Processing and Metric Generation for Driveability Analysis. Proceedings of the IMechE Part D Journal of Automobile Engineering 221(4):429–441.
[241] Zhao F, Xu X, Xie S (2009) Computer-Aided Inspection Planning – The State of the Art. Computers in Industry 60(7):453–466.
[242] Beaman J, Morse E (2010) Experimental Evaluation of Software Estimates of Task Specific Measurement Uncertainty for CMMs. Precision Engineering 34(1):28–33.
[243] Germani M, Mandorli F, Mengoni M, Raffaeli R (2010) CAD-based Environment to Bridge the Gap between Product Design and Tolerance Control. Precision Engineering 34(1):7–15.