DESIGN FOR SIX SIGMA & ROBUST DESIGN OF PRODUCTS AND PROCESSES FOR QUALITY
Gülser Köksal, EM507, METU, Ankara, 2009

OUTLINE
• DFSS
  – DMAIC vs. DFSS
  – Different DFSS roadmaps
  – DMADV roadmap
  – How to implement DFSS
• Robust design examples
• A review of design of experiments and orthogonal arrays
• Taguchi's robust design approach
  – Loss functions
  – Signal-to-noise ratios
• Robust design case study: cookie recipe design

Slides 3-16 are selected from the following presentation:

DFSS vs. DMAIC
• DFSS: to design new products or processes, or to improve the designs of existing ones, in order to satisfy customer requirements.
• DMAIC: to improve existing processes in order to satisfy customer requirements.
• Six Sigma Process Management: to achieve the business results by managing the processes efficiently.

DMAIC
• Define: define the problem, with its outputs and potential inputs.
• Measure: analyze the existing process. Is the process measured correctly? If so, what is the capability of the process?
• Analyze: identify the important factors that cause the variation of the process. Where and when do the defects occur?
• Improve: optimize the output by optimizing the inputs. To reach a six sigma process, what should the level of each factor be?
• Control: which controls should be put in place to keep the process running at six sigma?
The Define, Measure and Analyze phases characterize the process; Improve and Control optimize it.

Improvement Strategies
Compare the customer requirements with the process capability. If the gap is small, use DMAIC (iterative improvement); if not, go for DFSS (fundamental redesign).

DFSS (fundamental redesign)
• Design a new product / process
• Broad approach
• Blank-sheet-of-paper approach
• High risk
• Longer time span
• Addresses many CTQs
• Goal: quantum leap

DMAIC (iterative improvement)
• Fix an existing process
• Narrow focus
• Use the current process model
• Low risk
• Shorter time span
• Addresses few CTQs
• Goal: improvement

When to Go for DFSS
• Changing customer expectations: by the time the current problems are solved, new problems will have appeared.
• Technology development: new technologies allow meeting all customer requirements at lower cost, or gaining a competitive edge.
• Next generation: the existing product's remaining lifetime is very short; a successor will be needed soon.
• System limits: the performance gap is due to system / business model configurations that cannot be changed, or the available technology does not allow meeting the CTQs.
• Process entirely broken: the existing process is unable to meet many CTQs; too many successive DMAIC projects would be required.

Influence on cost, cycle time and quality
In both manufacturing and transactional businesses, design accounts for roughly 70-80% of the influence on cost, cycle time and quality; the manufacturing / transaction stage itself accounts for only about 20-30%.

Different DFSS Methodologies
Several roadmaps have been proposed. They are very similar to each other, and the underlying tools are the same.

DFSS Methodology: DMADV
• Define the project goals and customer requirements.
• Measure and determine customer needs and specifications; benchmark competitors and the industry.
• Analyze the process options to meet the customer needs.
• Design (in detail) the process to meet the customer needs.
• Verify the design performance and its ability to meet customer needs.

DFSS Methodology: DCCDI
• Define the project goals.
• Customer analysis is carried out.
• Concept ideas are developed, reviewed and selected.
• Design is performed to meet the customer and business specifications.
• Implementation is completed to develop and commercialize the product / service.

DFSS Methodology: IDOV
• Identify the customer and the specifications (CTQs).
• Design translates the customer CTQs into functional requirements and into solution alternatives.
• Optimize uses advanced statistical tools and modeling to predict and optimize the design and performance.
• Validate makes sure that the design developed will meet the customer CTQs.
DFSS Methodology: DMADV (detailed)
• Define the project goals, customer requirements, and opportunities.
• Measure in detail customer needs and priorities, market conditions, and benchmark competitors.
• Analyze the data collected, prioritize CTQs, and determine the relations between CTQs and parts / processes.
• Develop the concept, innovative solutions, and optimal solutions for product and process design.
• Validate the solutions and implement.

DFSS Methodology: DMADV Tools
Tools used across the Define, Measure, Analyze, Develop and Validate phases include: project management, QFD, benchmarking, value analysis, financial analysis, SIPOC, IPDS, FMEA, TRIZ, design scorecards, MSA, basic statistical techniques, DOE, optimization, simulation, robust design, tolerance design, reliability engineering, and design for manufacture and assembly.

All methodologies are similar
The DMADV phases map directly onto the IDOV phases: Define and Measure correspond to Identify, Analyze to Design, Develop to Optimize, and Validate to Verify; the phase descriptions are essentially the same.

How is it implemented?
• 2 weeks of DFSS training.
• Six Sigma Black Belt (BB) or Green Belt (GB) knowledge is required for participation.
• 2 project groups and 1 project per group.
• Between and after the training weeks there is substantial MBB coaching (2-3 days per project-month).
• Training tracks (diagram): Six Sigma Black Belt (BB Weeks 1-4) plus Design for Six Sigma (DFSS Weeks 1-2); Six Sigma Green Belt (GB Weeks 1-2) with DFSS Weeks 1-2; or a combined Six Sigma / DFSS Black Belt training program, 5 weeks of training in total.
• Black Belts work on the design project; team members may participate in a common project.

Robust Design Problem
To make system outputs insensitive to variation in inputs, process and environmental factors.
(Diagram: a product or process with controllable inputs X1-X6 (control factors), outputs Y1-Y3, and uncontrollable noise factors W1-W3.)

Robust Product Design Example: cake mix
Making the system output robust to environmental usage conditions.
• Controllable inputs: sugar, flour, egg, milk, oil, baking powder.
• Outputs (customer requirements): taste, texture.
• Noise factors (uncontrollable): oven type, altitude above sea level.
A robust cake mix recipe reduces variability in taste and texture.

Robust Product Design Example: pendulum
Making the system output robust to component variability by utilizing the second-degree (nonlinear) relationship between the system output (period) and the input (pendulum length): what should the pendulum length be to minimize variation in the period? A simulation sketch of this idea follows below.
(Figure: period versus pendulum length curve.)
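The following is a minimal Monte Carlo sketch of the pendulum idea (not from the original slides). It assumes the textbook relation T = 2π√(L/g), which is the second-degree relationship L ∝ T² referred to above; the nominal lengths and the length tolerance used are illustrative assumptions only. It shows that the same length variation produces less period variation at the longer nominal length, because the curve is flatter there.

```python
import numpy as np

# Minimal sketch: how length variation is transmitted into period variation.
# Assumes the simple-pendulum relation T = 2*pi*sqrt(L/g); the nominal
# lengths and the length standard deviation are illustrative assumptions.
g = 9.81          # gravitational acceleration, m/s^2
sigma_L = 0.005   # assumed std. dev. of the manufactured length, m

rng = np.random.default_rng(1)

def period_std(nominal_length, n=100_000):
    """Monte Carlo estimate of the period's std. dev. at a given nominal length."""
    lengths = rng.normal(nominal_length, sigma_L, size=n)
    periods = 2 * np.pi * np.sqrt(lengths / g)
    return periods.std()

for L in (0.25, 1.00):   # a short and a long nominal pendulum, in meters
    print(f"L = {L:.2f} m -> std(period) ~ {1000 * period_std(L):.1f} ms")
# dT/dL = pi / sqrt(g*L) shrinks as L grows, so the longer pendulum
# transmits less of the length variation into the period.
```

In Taguchi's terms, the nominal length is a control factor whose setting changes how strongly the noise (length variation) is transmitted to the response.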
Robust Process Design Example: humidity control
Making the system robust to process variability: what should the amount of steam blown and the amount of water sprayed into the closed system be in order to generate a humidity level of 20%?
(Figure: contour plot of humidity (10%, 20%, 30%) versus water amount (W1, W2) and steam amount (S1, S2).)

Guidelines for Robust Design through Statistical Experimentation
1. Choose the control factors and their levels.
2. Identify the uncontrollable (noise) factors and decide how they will be simulated.
3. Select the response variable(s) and determine the performance measures (mean, standard deviation, SNR, etc.).
4. Set up the experimental layout (choose appropriate design array(s)).
5. Conduct the experiments and collect the data.
6. Analyze the data (effects, ANOVA, regression).
7. Choose the optimal control factor levels and predict the performance measure at these levels.
8. Confirm the optimal levels by experimentation.

Orthogonal Arrays
Array | No. of rows | Max. no. of factors | Max. no. of factors at 2 levels | at 3 levels | at 4 levels | at 5 levels
L4    |  4 |  3 |  3 |  - |  - |  -
L8    |  8 |  7 |  7 |  - |  - |  -
L9    |  9 |  4 |  - |  4 |  - |  -
L12   | 12 | 11 | 11 |  - |  - |  -
L16   | 16 | 15 | 15 |  - |  - |  -
L'16  | 16 |  5 |  - |  - |  5 |  -
L18   | 18 |  8 |  1 |  7 |  - |  -
L25   | 25 |  6 |  - |  - |  - |  6
L27   | 27 | 13 |  - | 13 |  - |  -
L32   | 32 | 31 | 31 |  - |  - |  -
L'32  | 32 | 10 |  1 |  - |  9 |  -
L36   | 36 | 23 | 11 | 12 |  - |  -
L'36  | 36 | 16 |  3 | 13 |  - |  -
L50   | 50 | 12 |  1 |  - |  - | 11
L54   | 54 | 26 |  1 | 25 |  - |  -
L64   | 64 | 63 | 63 |  - |  - |  -
L'64  | 64 | 21 |  - |  - | 21 |  -
L81   | 81 | 40 |  - | 40 |  - |  -

Orthogonal Array Construction Example
One factor with 2 levels and six factors with 3 levels (this fits the L18 array, which accommodates one 2-level factor and up to seven 3-level factors).

Quality (Consumer) Loss
The quality of a product is measured by estimating "the total loss to the customers due to variation in the product's functions" (Taguchi, 1989). For ideal quality the loss is zero; the higher the loss, the lower the quality.
(Figure: quadratic loss curve over the quality characteristic X, with LSL, target µ = T and USL marked.)
With target T and loss coefficient b:
Quality loss: $L(x) = b\,(x - T)^2$
Average quality loss: $\bar{L} = b\,[\sigma^2 + (\mu - T)^2]$

Smaller-the-Better Response
• Loss function: $L(y) = \frac{A}{\Delta^2}\, y^2$, with average loss $\bar{L} = \frac{A}{\Delta^2}\,(\bar{y}^2 + s^2)$
• Signal-to-noise ratio: $\mathrm{SNR} = -10\log_{10}\!\left(\frac{1}{n}\sum_{i=1}^{n} y_i^2\right) \approx -10\log_{10}(\bar{y}^2 + s^2)$
• Examples: gas or energy consumption, noise, radiation.

Larger-the-Better Response
• Loss function: $L(y) = A\,\Delta^2\, \frac{1}{y^2}$, with average loss $\bar{L} \approx \frac{A\,\Delta^2}{\bar{y}^2}\left(1 + \frac{3 s^2}{\bar{y}^2}\right)$
• Signal-to-noise ratio: $\mathrm{SNR} = -10\log_{10}\!\left(\frac{1}{n}\sum_{i=1}^{n} \frac{1}{y_i^2}\right) \approx -10\log_{10}\!\left[\frac{1}{\bar{y}^2}\left(1 + \frac{3 s^2}{\bar{y}^2}\right)\right]$
• Examples: mechanical power, strength, wear-out resistance.

Nominal-the-Best Response
• Loss function: $L(y) = \frac{A}{\Delta^2}\,(y - T)^2$, with average loss $\bar{L} = \frac{A}{\Delta^2}\,[s^2 + (\bar{y} - T)^2]$
• Signal-to-noise ratios (two-step optimization): first minimize the variance, $\mathrm{SNR} = -10\log_{10} s^2$; then bring the mean to the target, tracking the mean sensitivity $10\log_{10}(n\,\bar{y}^2)$.
• Examples: dimension (mm), strength, voltage (V).

A Robust Design Experiment Layout
Each row i = 1, ..., m of the inner (control factor) array specifies one combination of control factor levels; it is crossed with every column j = 1, ..., n of the outer (noise factor) array, giving observations y_i1, y_i2, ..., y_in. For each inner-array row the performance measures ȳ_i, s_i² and SNR_i are then computed.

Cookie Recipe Robust Design (a larger-the-better robust design problem)
Objective: to find the control factor levels that maximize cookie chewiness under the uncontrollable effects of the noise factors.
Control factors and levels:
• A: cooking temperature (low, high)
• B: syrup content (low, high)
• C: cooking time (short, long)
• D: cooking pan (solid, mesh)
• E: shortening type (corn, coconut)
Noise factors and levels:
• Z1: cookie position (side, middle)
• Z2: temperature at test (low, high)
(Source: W.J. Kolarik, 1995, Creating Quality, McGraw-Hill)

The experimental design layout and the data collected
Chewiness is measured for each inner-array run under the four outer-array noise conditions (Z1, Z2) = (side, low), (side, high), (middle, low), (middle, high). For each run the following performance measures are computed:
$\bar{y} = \frac{1}{4}\sum_{i=1}^{4} y_i$,  $s^2 = \frac{1}{3}\sum_{i=1}^{4}(y_i - \bar{y})^2$,  $\log_e s$,  $\mathrm{SNR} = -10\log_{10}\!\left(\frac{1}{4}\sum_{i=1}^{4}\frac{1}{y_i^2}\right)$
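As a small illustration of how these per-run performance measures would be computed (not part of the original case study; the four chewiness readings used below are hypothetical), the following Python sketch implements the formulas above for one inner-array run:

```python
import numpy as np

def row_performance(y):
    """Per-run performance measures used in the cookie study:
    mean, standard deviation (n-1 divisor), log_e(s), and the
    larger-the-better signal-to-noise ratio in dB."""
    y = np.asarray(y, dtype=float)
    ybar = y.mean()
    s = y.std(ddof=1)                                # s^2 uses the 1/(n-1) divisor
    snr = -10.0 * np.log10(np.mean(1.0 / y**2))      # larger-the-better SNR
    return ybar, s, np.log(s), snr

# Hypothetical chewiness readings for one inner-array run, under the four
# outer-array noise conditions (side/low, side/high, middle/low, middle/high):
ybar, s, log_s, snr = row_performance([12.0, 15.0, 18.0, 14.0])
print(f"ybar = {ybar:.2f}, s = {s:.2f}, ln(s) = {log_s:.2f}, SNR = {snr:.2f} dB")
```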
Response Tables
Taguchi Analysis: y1; y2; y3; y4 versus A; B; C; D; E
The following terms cannot be estimated and were removed: A*B, A*C, A*E, B*C, B*D, B*E, C*D, C*E, D*E.

Response Table for Signal to Noise Ratios (larger is better)
Level      A      B      C      D      E
1      22.18  12.69  17.40  16.03  16.74
2      15.70  25.19  20.48  21.85  21.13
Delta   6.49  12.50   3.08   5.82   4.39
Rank       2      1      5      3      4

Response Table for Means
Level       A       B       C       D       E
1      20.250   8.750  12.750  13.750  16.250
2      14.000  25.500  21.500  20.500  18.000
Delta   6.250  16.750   8.750   6.750   1.750
Rank        4       1       2       3       5

Response Table for Standard Deviations
Level      A      B      C      D      E
1      4.774  5.437  5.756  6.132  6.665
2      7.781  7.117  6.798  6.422  5.889
Delta  3.007  1.680  1.042  0.289  0.776
Rank       1      2      3      5      4

Marginal Average (Main Effect) Plots
(Figure: Main Effects Plot (data means) for SN ratios, one panel per factor A-E, levels 1 and 2; signal-to-noise: larger is better.)
Variables A, B, D and E have significant effects on the SNR. C does not seem to be significant, but let us check this with ANOVA as well.

Interaction Plots
(Figure: Interaction Plot (data means) for SN ratios, factors A and D; signal-to-noise: larger is better.)
Only the A*D interaction could be estimated, and it seems to be insignificant.

ANOVA for SNR
Analysis of Variance for SN ratios
Source           DF   Seq SS    Adj SS    Adj MS       F      P
A                 1   84.126    84.126    84.126   28.18  0.034
B                 1  312.689   312.689   312.689  104.74  0.009
C                 1   18.981    18.981    18.981    6.36  0.128
D                 1   67.643    67.643    67.643   22.66  0.041
E                 1   38.553    38.553    38.553   12.91  0.069
Residual Error    2    5.970     5.970     2.985
Total             7  527.962
(In Minitab: Stat > DOE > Taguchi > Analyze Taguchi Design; under Analysis, choose 'Fit linear model for Signal to Noise ratios'.)

Predict Results at the Optimal Levels
(In Minitab: Stat > DOE > Taguchi > Predict Taguchi Results.)
Taguchi Analysis: y1; y2; y3; y4 versus A; B; C; D; E
Factor levels for predictions: A = 1, B = 2, C = 2, D = 2, E = 2
Predicted values: S/N ratio = 35.0761, mean = 37.25, StDev = 5.89143, Log(StDev) = 1.83763
Conduct confirmation experiments at these levels!
The prediction uses the additive model
$\hat{E}(\mathrm{SNR}) = \bar{T} + (\bar{A}_1 - \bar{T}) + (\bar{B}_2 - \bar{T}) + (\bar{C}_2 - \bar{T}) + (\bar{D}_2 - \bar{T}) + (\bar{E}_2 - \bar{T})$,
where $\bar{T}$ is the overall mean SNR and $\bar{A}_1, \bar{B}_2, \ldots$ are the level means from the SNR response table.
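As a minimal sketch of this additive prediction (not produced by Minitab; it simply re-uses the rounded level means from the SNR response table above, so the result differs slightly from the 35.0761 dB reported), the following Python snippet reproduces the predicted S/N ratio:

```python
# Additive-model prediction of the SNR at the chosen levels A1, B2, C2, D2, E2,
# using the (rounded) level means from the SNR response table above.
level_means = {            # (level 1 mean, level 2 mean) of the SNR, in dB
    "A": (22.18, 15.70),
    "B": (12.69, 25.19),
    "C": (17.40, 20.48),
    "D": (16.03, 21.85),
    "E": (16.74, 21.13),
}
chosen = {"A": 1, "B": 2, "C": 2, "D": 2, "E": 2}   # optimal factor levels

# Overall mean SNR = grand average of all the level means
T = sum(sum(means) for means in level_means.values()) / (2 * len(level_means))

# E_hat(SNR) = T + sum over factors of (chosen level mean - T)
pred = T + sum(level_means[f][lvl - 1] - T for f, lvl in chosen.items())
print(f"Overall mean T = {T:.2f} dB, predicted SNR = {pred:.2f} dB")   # ~35.07 dB
```

The confirmation experiments then check whether the realized SNR is close to this prediction, i.e., whether the additive (no-interaction) model is adequate.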