ESD.77 – Multidisciplinary System Design Optimization
Robust Design
[Title-slide figure: the adaptive one-factor-at-a-time procedure for robust design, shown in full near the end of this deck.]
Dan Frey
Associate Professor of Mechanical Engineering and Engineering Systems
Research Overview
[Overview figure: research areas include Outreach to K-12, Concept Design, Complex Systems, Methodology Validation, and Adaptive Experimentation and Robust Design, which is the focus of this lecture.]
Outline
• Introduction
– History
– Motivation
• Recent research
– Adaptive experimentation
– Robust design
“An experiment is simply a question put to nature …
The chief requirement is simplicity: only one question
should be asked at a time.”
Russell, E. J., 1926, “Field experiments: How they are made and what
they are,” Journal of the Ministry of Agriculture 32:989-1001.
“To call in the statistician after the
experiment is done may be no more
than asking him to perform a postmortem examination: he may be able
to say what the experiment died of.”
- Fisher, R. A., Indian Statistical Congress, Sankhya, 1938.
Estimation of Factor Effects
[Cube plot: a 2^3 full factorial with observations (1), (a), (b), (c), (ab), (ac), (bc), (abc) at the corners; axes A, B, C each run from - to +.]
Say the independent experimental error of the observations (a), (ab), et cetera is \sigma_\varepsilon.
We define the main effect estimate A to be
A \equiv \frac{1}{4}\left[(abc) + (ab) + (ac) + (a) - (b) - (c) - (bc) - (1)\right]
The standard deviation of the estimate is
\sigma_A = \frac{1}{4}\sqrt{8}\,\sigma_\varepsilon = \frac{1}{\sqrt{2}}\,\sigma_\varepsilon
A factor of two improvement in efficiency as compared to “single question methods”.
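As a quick numeric check of the two formulas above, here is a minimal Python sketch (not part of the original slides; the true effect size and error level are arbitrary choices) that estimates A from simulated 2^3 full-factorial observations and confirms that the standard deviation of the estimate comes out near \sigma_\varepsilon/\sqrt{2}.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_eps = 1.0      # standard deviation of the independent experimental error
true_A = 2.0         # assumed true main effect of factor A (illustrative)

def main_effect_estimate(y):
    """A = 1/4 [(abc)+(ab)+(ac)+(a) - (b)-(c)-(bc)-(1)] for a 2^3 full factorial."""
    return 0.25 * (y['abc'] + y['ab'] + y['ac'] + y['a']
                   - y['b'] - y['c'] - y['bc'] - y['(1)'])

labels = ['(1)', 'a', 'b', 'c', 'ab', 'ac', 'bc', 'abc']
estimates = []
for _ in range(20000):
    # response = (true_A / 2) * x_A + error, where x_A = +1 when the label contains 'a'
    y = {lab: true_A / 2 * (+1 if 'a' in lab else -1) + rng.normal(0.0, sigma_eps)
         for lab in labels}
    estimates.append(main_effect_estimate(y))

print("mean of A estimates:", np.mean(estimates))   # close to true_A
print("std of A estimates :", np.std(estimates))    # close to sigma_eps / sqrt(2) = 0.707
```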
Fractional Factorial Experiments
“It will sometimes be advantageous
deliberately to sacrifice all possibility of
obtaining information on some points, these
being confidently believed to be unimportant
… These comparisons to be sacrificed will be
deliberately confounded with certain elements
of the soil heterogeneity… Some additional
care should, however, be taken…”
Fisher, R. A., 1926, “The Arrangement of Field Experiments,”
Journal of the Ministry of Agriculture of Great Britain, 33: 503-513.
Fractional Factorial Experiments
[Cube plot: a 2^{3-1}_{III} fractional factorial; the four runs sit at alternating corners of the A-B-C cube, each axis running from - to +.]
Fractional Factorial Experiments
2^{7-4} design (aka “orthogonal array”):

Trial    A    B    C    D    E    F    G    FG = -A
1       -1   -1   -1   -1   -1   -1   -1      +1
2       -1   -1   -1   +1   +1   +1   +1      +1
3       -1   +1   +1   -1   -1   +1   +1      +1
4       -1   +1   +1   +1   +1   -1   -1      +1
5       +1   -1   +1   -1   +1   -1   +1      -1
6       +1   -1   +1   +1   -1   +1   -1      -1
7       +1   +1   -1   -1   +1   +1   -1      -1
8       +1   +1   -1   +1   -1   -1   +1      -1

Every factor is at each level an equal number of times (balance).
High replication numbers provide precision in effect estimation.
Resolution III: two-factor interactions are confounded with main effects; for example, the FG column equals -A.
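The balance, orthogonality, and aliasing claims are easy to verify mechanically; a short Python sketch, hard-coding the array exactly as tabulated above, is given below.

```python
import numpy as np

# The 2^(7-4)_III array exactly as tabulated (rows = trials 1..8, columns = A..G).
X = np.array([
    [-1, -1, -1, -1, -1, -1, -1],
    [-1, -1, -1, +1, +1, +1, +1],
    [-1, +1, +1, -1, -1, +1, +1],
    [-1, +1, +1, +1, +1, -1, -1],
    [+1, -1, +1, -1, +1, -1, +1],
    [+1, -1, +1, +1, -1, +1, -1],
    [+1, +1, -1, -1, +1, +1, -1],
    [+1, +1, -1, +1, -1, -1, +1],
])
A, B, C, D, E, F, G = X.T

# Balance: each factor is at each level an equal number of times.
print("column sums (all zero):", X.sum(axis=0))

# Orthogonality: X^T X is 8 times the identity.
print("X^T X == 8*I:", np.array_equal(X.T @ X, 8 * np.eye(7, dtype=int)))

# Aliasing: the two-factor interaction FG is confounded with -A (Resolution III).
print("F*G == -A:", np.array_equal(F * G, -A))
```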
Robust Parameter Design
Robust Parameter Design … is a statistical /
engineering methodology that aims at reducing
the performance variation of a system (i.e. a
product or process) by choosing the setting of
its control factors to make it less sensitive to
noise variation.
Wu, C. F. J. and M. Hamada, 2000, Experiments: Planning, Analysis, and
Parameter Design Optimization, John Wiley & Sons, NY.
Cross (or Product) Arrays
Control factors A-G are assigned to the 2^{7-4}_{III} inner array shown on the previous slide; noise factors a, b, c are assigned to a 2^{3-1}_{III} outer array. Every inner (control) run is repeated at each of the four outer (noise) conditions.

Outer array, 2^{3-1}_{III}:

 a    b    c
-1   -1   -1
-1   +1   +1
+1   -1   +1
+1   +1   -1

Taguchi, G., 1976, System of Experimental Design.
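To make the crossing concrete, the sketch below (Python; the response function is a made-up placeholder, not anything from the slides) evaluates every inner (control) run at all four outer (noise) conditions and reports the mean response and the transmitted variance for each control setting.

```python
import numpy as np

# Inner (control) array: the 2^(7-4)_III design from the previous slide, columns A..G.
inner = np.array([
    [-1, -1, -1, -1, -1, -1, -1],
    [-1, -1, -1, +1, +1, +1, +1],
    [-1, +1, +1, -1, -1, +1, +1],
    [-1, +1, +1, +1, +1, -1, -1],
    [+1, -1, +1, -1, +1, -1, +1],
    [+1, -1, +1, +1, -1, +1, -1],
    [+1, +1, -1, -1, +1, +1, -1],
    [+1, +1, -1, +1, -1, -1, +1],
])

# Outer (noise) array: the 2^(3-1)_III design on noise factors a, b, c.
outer = np.array([
    [-1, -1, -1],
    [-1, +1, +1],
    [+1, -1, +1],
    [+1, +1, -1],
])

def response(control, noise):
    """Hypothetical response used only to illustrate the bookkeeping.
    A real study would replace this with measurements or a simulation."""
    A, B, C, D, E, F, G = control
    a, b, c = noise
    return 10 + 2*A - B + 0.5*D*a + 1.5*b*(1 - 0.8*C) + 0.3*c

# Cross the arrays: every inner run is evaluated at all four noise conditions.
for i, ctrl in enumerate(inner, start=1):
    y = np.array([response(ctrl, nz) for nz in outer])
    print(f"inner run {i}: mean = {y.mean():6.3f}, "
          f"transmitted variance = {y.var(ddof=1):6.3f}")
```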
[Flowchart: an eight-step robust parameter design process.]
Step 1 - Identify Project and Team
Step 2 - Formulate Engineered System: Ideal Function / Quality Characteristic(s)
Step 3 - Formulate Engineered System: Parameters
Step 4 - Assign Control Factors to Inner Array
Step 5 - Assign Noise Factors to Outer Array
Step 6 - Conduct Experiment and Collect Data
Step 7 - Analyze Data and Select Optimal Design
Step 8 - Predict and Confirm

Step 1 Summary: Form a cross-functional team of experts; clearly define the project objective; assign roles and responsibilities to team members; identify product quality issues; isolate the boundary conditions.
Step 2 Summary: Select a response function(s) and a signal parameter(s); determine whether the problem is static or dynamic; determine the S/N function.
Step 3 Summary: Select control factor(s); rank control factors; select noise factor(s).
Step 4 Summary: Determine control factor levels; calculate the DOF; determine if there are any interactions; select the appropriate orthogonal array.
Majority View on “One at a Time”
One way of thinking of the great advances of the
science of experimentation in this century is as
the final demise of the “one factor at a time”
method, although it should be said that there are
still organizations which have never heard of
factorial experimentation and use up many man
hours wandering a crooked path.
Logothetis, N., and Wynn, H.P., 1994, Quality Through Design: Experimental
Design, Off-line Quality Control and Taguchi’s Contributions, Clarendon
Press, Oxford.
Minority Views on “One at a Time”
“…the factorial design has certain deficiencies … It devotes
observations to exploring regions that may be of no
interest…These deficiencies … suggest that an efficient
design for the present purpose ought to be sequential;
that is, ought to adjust the experimental program at
each stage in light of the results of prior stages.”
Friedman, Milton, and L. J. Savage, 1947, “Planning Experiments
Seeking Maxima”, in Techniques of Statistical Analysis, pp. 365-372.
“Some scientists do their experimental work in single steps.
They hope to learn something from each run … they see
and react to data more rapidly …If he has in fact found out
a good deal by his methods, it must be true that the effects
are at least three or four times his average random error
per trial.” Cuthbert Daniel, 1973, “One-at-a-Time Plans”, Journal of the
American Statistical Association, vol. 68, no. 342, pp. 353-360.
My Observations of Industry
• Farming equipment company has reliability problems
• Large blocks of robustness experiments had been
planned at outset of the design work
• More than 50% were not finished
• Reasons given
– Unforeseen changes
– Resource pressure
– Satisficing
“Well, in the third experiment, we
found a solution that met all our
needs, so we cancelled the rest
of the experiments and moved on
to other tasks…”
More Observations of Industry
• Time for design (concept to market) is going down
• Fewer physical experiments are being conducted
• Greater reliance on computation / CAE
• Poor answers in computer modeling are common
– Right model → Inaccurate answer
– Right model → No answer whatsoever
– Not-so-right model → Inaccurate answer
• Unmodeled effects
• Bugs in coding the model
Human Subjects Experiment
• Hypothesis: Engineers using a flawed simulation
are more likely to detect the flaw while using
OFAT than while using a more complex design.
• Method: Between-subjects experiment with
human subjects (engineers) performing parameter
design with OFAT vs. designed experiment.
Treatment: Design Space Sampling Method

Adaptive OFAT: one factor changes in each trial; later rows are filled in as required to adapt.

Trial   A   B   C   D   E   F   G
1       -   -   -   -   -   -   -
2       +   -   -   -   -   -   -
3-8     (filled in as required to adapt)

Plackett-Burman L8: four factors change between any two trials.

Trial   A   B   C   D   E   F   G
1       -   -   -   -   -   -   -
2       +   -   +   -   -   +   +
3       +   +   +   -   +   -   -
4       -   +   -   -   +   +   +
5       +   -   -   +   +   -   +
6       -   -   +   +   +   +   -
7       -   +   +   +   -   -   +
8       +   +   -   +   -   +   -

Using a 2^7 system avoids the possible confounding factor of number of trials.
Increasing the number of factors likely means increasing discrepancy in detection.
Larger effect sizes require fewer test subjects for given Type I and II errors.
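Both the balance of the L8 and the claim that exactly four factors change between any two trials can be checked directly; a short Python sketch, with the design hard-coded from the table above:

```python
import numpy as np
from itertools import combinations

# Plackett-Burman L8 treatment, rows = trials 1..8, columns = A..G (+/-1 coding).
L8 = np.array([
    [-1, -1, -1, -1, -1, -1, -1],
    [+1, -1, +1, -1, -1, +1, +1],
    [+1, +1, +1, -1, +1, -1, -1],
    [-1, +1, -1, -1, +1, +1, +1],
    [+1, -1, -1, +1, +1, -1, +1],
    [-1, -1, +1, +1, +1, +1, -1],
    [-1, +1, +1, +1, -1, -1, +1],
    [+1, +1, -1, +1, -1, +1, -1],
])

# Number of factors that change between every pair of trials.
changes = {(i + 1, j + 1): int(np.sum(L8[i] != L8[j]))
           for i, j in combinations(range(8), 2)}
print("factors changed between trial pairs:", sorted(set(changes.values())))  # -> [4]

# Balance: every factor is run at each level four times.
print("column sums:", L8.sum(axis=0))  # -> all zeros
```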
Parameter Design of Catapult to Hit Target
• Modeled on XPult™
• Commonly used in DOE demonstrations
• Extended to a 2^7 system
by introducing
– Arm material
– Air relative humidity
– Air temperature
Control Factors and Settings

Control Factor        Nominal Setting           Alternate Setting
Relative Humidity     25%                       75%
Pullback              30 degrees                40 degrees
Type of Ball          Orange (Large-ball TT)    White (regulation TT)
Arm Material          Magnesium                 Aluminum
Launch Angle          60 degrees                45 degrees
Rubber Bands          3                         2
Ambient Temperature   72 °F                     32 °F
[Pareto chart: effect size (bars, left axis) and cumulative percent (line, right axis) for the seven control factors.]
• Arm material selected for its moderate effect size
• Computer simulation equations are “correct”, but the intentional mistake is that arm material properties are reversed
• Control factor tied directly to simulation mistake
• Control factor ordering is not random, to prevent variance due to learning effect
• “Bad” control factor placed in 4th column in both designs
Results of Human Subjects Experiment
• Pilot with N = 8
• Study with N = 55 (1 withdrawal)
• External validity high
– 50 full time engineers and 5 engineering students
– experience ranged from 6 mo. to 40+ yr.
• Outcome measured by subject debriefing at end
Method   Detected   Not detected   Detection Rate (95% CI)
OFAT     14         13             (0.3195, 0.7133)
PB L8    1          26             (0.0009, 0.1897)
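The reported intervals are consistent with exact (Clopper-Pearson) binomial confidence limits. A minimal sketch, assuming that method, reproduces them:

```python
from scipy.stats import beta

def clopper_pearson(successes, n, alpha=0.05):
    """Exact (Clopper-Pearson) two-sided binomial confidence interval."""
    lo = beta.ppf(alpha / 2, successes, n - successes + 1) if successes > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, successes + 1, n - successes) if successes < n else 1.0
    return lo, hi

for method, detected, total in [("OFAT", 14, 27), ("PB L8", 1, 27)]:
    lo, hi = clopper_pearson(detected, total)
    print(f"{method}: {detected}/{total} detected, 95% CI = ({lo:.4f}, {hi:.4f})")
```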
Adaptive OFAT Experimentation
• Do an experiment
• Change one factor
• If there is an improvement, retain the change
• If the response gets worse, go back to the previous state
• Stop after you’ve changed every factor
[Cube figure: the path of an adaptive OFAT search through the corners of the A-B-C factor space.]
Frey, D. D., F. Engelhardt, and E. Greitzer, 2003, “A Role for One Factor at a Time
Experimentation in Parameter Design”, Research in Engineering Design 14(2): 65-74.
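A direct transcription of this procedure into code may make it concrete. The sketch below is illustrative (the demo response function and its coefficients are invented for the example, not taken from the slides); it runs adaptive OFAT on a black-box response that is to be maximized.

```python
import numpy as np

def adaptive_ofat(y, n_factors, x_start=None, rng=None):
    """Adaptive one-factor-at-a-time search over two-level factors coded -1/+1.

    y          -- black-box response to maximize (called n_factors + 1 times)
    n_factors  -- number of two-level factors
    x_start    -- optional starting point; defaults to a random corner
    """
    rng = rng or np.random.default_rng()
    x = np.array(x_start) if x_start is not None else rng.choice([-1, +1], n_factors)
    best = y(x)                      # do an experiment at the starting point
    for i in range(n_factors):       # change one factor at a time
        x[i] = -x[i]
        trial = y(x)
        if trial > best:             # improvement: retain the change
            best = trial
        else:                        # response got worse: go back to previous state
            x[i] = -x[i]
    return x, best                   # stop after every factor has been changed once

# Example with a made-up response containing an interaction (illustrative only).
def demo_response(x):
    return 3*x[0] - 2*x[1] + 1.5*x[0]*x[2] + np.random.normal(0, 0.5)

x_star, y_star = adaptive_ofat(demo_response, n_factors=3, rng=np.random.default_rng(1))
print("selected settings:", x_star, "observed best response:", y_star)
```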
Empirical Evaluation of
Adaptive OFAT Experimentation
• Meta-analysis of 66 responses from
published, full factorial data sets
• When experimental error is <25% of the
combined factor effects OR interactions
are >25% of the combined factor
effects, adaptive OFAT provides more
improvement on average than fractional
factorial DOE.
Frey, D. D., F. Engelhardt, and E. Greitzer, 2003, “A Role for One Factor at a Time
Experimentation in Parameter Design”, Research in Engineering Design 14(2): 65-74.
Detailed Results
Entries are OFAT/FF; cells are shaded gray where OFAT > FF.

                        Strength of Experimental Error
Interaction Strength     0      0.1    0.2    0.3    0.4    0.5    0.6    0.7    0.8    0.9    1
Mild                   100/99  99/98  98/98  96/96  94/94  89/92  86/88  81/86  77/82  73/79  69/75
Moderate                96/90  95/90  93/89  90/88  86/86  83/84  80/81  76/81  72/77  69/74  64/70
Strong                  86/67  85/64  82/62  79/63  77/63  72/64  71/63  67/61  64/58  62/55  56/50
Dominant                80/39  79/36  77/34  75/37  72/37  70/35  69/35  64/34  63/31  61/35  59/35
A Mathematical Model of Adaptive OFAT

O_0 = y(\tilde{x}_1, \tilde{x}_2, \ldots, \tilde{x}_n)   (initial observation)
O_1 = y(-\tilde{x}_1, \tilde{x}_2, \ldots, \tilde{x}_n)   (observation with the first factor toggled)
x_1^* = \tilde{x}_1 \,\mathrm{sign}\{O_0 - O_1\}   (first factor set)

Repeat for all remaining factors, i = 2 \ldots n:
O_i = y(x_1^*, \ldots, x_{i-1}^*, -\tilde{x}_i, \tilde{x}_{i+1}, \ldots, \tilde{x}_n)
x_i^* = \tilde{x}_i \,\mathrm{sign}\{\max(O_0, O_1, \ldots, O_{i-1}) - O_i\}

The process ends after n+1 observations with E\left[y(x_1^*, x_2^*, \ldots, x_n^*)\right].

Frey, D. D., and H. Wang, 2006, “Adaptive One-Factor-at-a-Time Experimentation and Expected Value of Improvement”, Technometrics 48(3):418-431.
A Mathematical Model of a Population of Engineering Systems

y(x_1, x_2, \ldots, x_n) = \sum_{i=1}^{n} \beta_i x_i + \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} \beta_{ij} x_i x_j + \varepsilon_k   (system response)

\beta_i \sim N(0, \sigma_{ME}^2)   (main effects)
\beta_{ij} \sim N(0, \sigma_{INT}^2)   (two-factor interactions)
\varepsilon_k \sim N(0, \sigma_{\varepsilon}^2)   (experimental error)

y_{\max} \equiv the largest response within the space of discrete, coded, two-level factors x_i \in \{-1, +1\}.

Model adapted from Chipman, H., M. Hamada, and C. F. J. Wu, 1997, “A Bayesian Variable Selection Approach for Analyzing Designed Experiments with Complex Aliasing”, Technometrics 39(4):372-381.
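To see what this model implies, the following Python sketch (illustrative; the sigma values and the number of Monte Carlo draws are arbitrary choices) samples systems from the hierarchical model, runs adaptive one-factor-at-a-time on each, and estimates the ratio E[y(x*)]/E[y_max], one reasonable reading of the normalization used in the "final outcome" slides later in the deck.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
n = 7                                            # number of two-level factors
sigma_me, sigma_int, sigma_eps = 1.0, 0.3, 0.1   # arbitrary illustrative values

corners = np.array(list(product([-1, +1], repeat=n)))   # all 2^n candidate designs

def sample_system():
    """Draw one system from the hierarchical model on the slide."""
    beta = rng.normal(0, sigma_me, n)                    # main effects
    B = np.triu(rng.normal(0, sigma_int, (n, n)), k=1)   # two-factor interactions
    def y(x, noisy=True):
        val = beta @ x + x @ B @ x
        return val + rng.normal(0, sigma_eps) if noisy else val
    return y

def adaptive_ofat(y, x0):
    x, best = x0.copy(), y(x0)
    for i in range(n):
        x[i] = -x[i]
        trial = y(x)
        if trial > best:
            best = trial
        else:
            x[i] = -x[i]
    return x

attained, y_max = [], []
for _ in range(500):
    y = sample_system()
    x_star = adaptive_ofat(y, rng.choice([-1, +1], n))
    true_vals = np.array([y(c, noisy=False) for c in corners])
    attained.append(y(x_star, noisy=False))
    y_max.append(true_vals.max())                 # y_max for this sampled system

print("E[y(x*)] / E[y_max] ~", np.mean(attained) / np.mean(y_max))
```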
Probability of Exploiting an Effect
• The ith main effect is said to be “exploited” if \beta_i x_i^* > 0
• The two-factor interaction between the ith and jth factors is said to be “exploited” if \beta_{ij} x_i^* x_j^* > 0
• The probabilities and conditional probabilities of exploiting effects provide insight into the mechanisms by which a method provides improvements
The Expected Value of the Response after the First Step

E\left[y(x_1^*, \tilde{x}_2, \ldots, \tilde{x}_n)\right] = E\left[\beta_1 x_1^*\right] + (n-1)\,E\left[\beta_{1j} x_1^* \tilde{x}_j\right]

E\left[\beta_1 x_1^*\right] = \sqrt{\frac{2}{\pi}}\,\frac{\sigma_{ME}^2}{\sqrt{\sigma_{ME}^2 + (n-1)\sigma_{INT}^2 + \tfrac{1}{2}\sigma_{\varepsilon}^2}}

E\left[\beta_{1j} x_1^* \tilde{x}_j\right] = \sqrt{\frac{2}{\pi}}\,\frac{\sigma_{INT}^2}{\sqrt{\sigma_{ME}^2 + (n-1)\sigma_{INT}^2 + \tfrac{1}{2}\sigma_{\varepsilon}^2}}

[Figure: the main-effect term E[\beta_1 x_1^*] and the two-factor-interaction term E[\beta_{1j} x_1^* \tilde{x}_j] plotted against \sigma_{INT}/\sigma_{ME} for n = 7, comparing Theorem 1 with simulation at \sigma_{\varepsilon}/\sigma_{ME} = 0.1, 1, and 10.]
Probability of Exploiting the First Main Effect

\Pr\left(\beta_1 x_1^* > 0\right) = \frac{1}{2} + \frac{1}{\pi}\sin^{-1}\!\left(\frac{\sigma_{ME}}{\sqrt{\sigma_{ME}^2 + (n-1)\sigma_{INT}^2 + \tfrac{1}{2}\sigma_{\varepsilon}^2}}\right)

If interactions are small and error is not too large, OFAT will tend to exploit main effects.

[Figure: \Pr(\beta_1 x_1^* > 0) versus \sigma_{INT}/\sigma_{ME}, comparing Theorem 2 with simulation at \sigma_{\varepsilon}/\sigma_{ME} = 0.1, 1, and 10.]
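A quick Monte Carlo check of this expression (as reconstructed above; the parameter values are arbitrary) under the hierarchical model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 7
sigma_me, sigma_int, sigma_eps = 1.0, 0.4, 1.0   # arbitrary illustrative values

# Closed form as reconstructed above.
denom = np.sqrt(sigma_me**2 + (n - 1) * sigma_int**2 + 0.5 * sigma_eps**2)
p_theory = 0.5 + np.arcsin(sigma_me / denom) / np.pi

# Simulation: x1* = x1~ * sign(O0 - O1); the effect is "exploited" when beta_1 * x1* > 0.
trials = 200_000
beta1 = rng.normal(0, sigma_me, trials)
# O0 - O1 = 2*beta_1*x1~ + 2*sum_j beta_1j*x1~*xj~ + eps_0 - eps_1; take x1~ = +1 w.l.o.g.
inter = rng.normal(0, sigma_int, (trials, n - 1)).sum(axis=1)
noise = rng.normal(0, sigma_eps, trials) - rng.normal(0, sigma_eps, trials)
d = 2 * beta1 + 2 * inter + noise
p_sim = np.mean(beta1 * np.sign(d) > 0)

print(f"theory {p_theory:.4f}  vs  simulation {p_sim:.4f}")
```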
The Expected Value of the Response After the Second Step

E\left[y(x_1^*, x_2^*, \tilde{x}_3, \ldots, \tilde{x}_n)\right] = 2\,E\left[\beta_1 x_1^*\right] + 2(n-2)\,E\left[\beta_{1j} x_1^* \tilde{x}_j\right] + E\left[\beta_{12} x_1^* x_2^*\right]

where E\left[\beta_2 x_2^*\right] = E\left[\beta_1 x_1^*\right] and E\left[\beta_{2j} x_2^* \tilde{x}_j\right] = E\left[\beta_{1j} x_1^* \tilde{x}_j\right], and

E\left[\beta_{12} x_1^* x_2^*\right] = \frac{2}{\pi}\left[\frac{\sigma_{INT}^2}{\sqrt{\sigma_{ME}^2 + (n-1)\sigma_{INT}^2 + \tfrac{1}{2}\sigma_{\varepsilon}^2}}\right]

[Figure: the main-effect term, the two-factor-interaction term, and E[\beta_{12} x_1^* x_2^*] plotted against \sigma_{INT}/\sigma_{ME}, comparing Theorem 3 with simulation at \sigma_{\varepsilon}/\sigma_{ME} = 0.1, 1, and 10.]
Probability of Exploiting the First Interaction

\Pr\left(\beta_{12} x_1^* x_2^* > 0\right) = \frac{1}{2} + \frac{1}{\pi}\tan^{-1}\!\left(\frac{\sigma_{INT}}{\sqrt{\sigma_{ME}^2 + (n-2)\sigma_{INT}^2 + \tfrac{1}{2}\sigma_{\varepsilon}^2}}\right)

Conditioned on \beta_{12} being the largest interaction:

\Pr\left(\beta_{12} x_1^* x_2^* > 0 \,\middle|\, \left|\beta_{12}\right| > \left|\beta_{ij}\right|\right) = \frac{1}{\pi}\binom{n}{2}\int_0^{\infty}\!\!\int_{-x_1}^{\infty}\left[\operatorname{erf}\!\left(\frac{x_1}{\sqrt{2}\,\sigma_{INT}}\right)\right]^{\binom{n}{2}-1}\frac{e^{-\frac{x_1^2}{2\sigma_{INT}^2}}\, e^{-\frac{x_2^2}{2\left(\sigma_{ME}^2 + (n-2)\sigma_{INT}^2 + \tfrac{1}{2}\sigma_{\varepsilon}^2\right)}}}{\sigma_{INT}\sqrt{\sigma_{ME}^2 + (n-2)\sigma_{INT}^2 + \tfrac{1}{2}\sigma_{\varepsilon}^2}}\, dx_2\, dx_1

[Figure, two panels: the unconditional probability (Theorem 5) and the conditional probability (Theorem 6) versus \sigma_{INT}/\sigma_{ME}, comparing theory with simulation at \sigma_{\varepsilon}/\sigma_{ME} = 0.1, 1, and 10.]
And it Continues

[Figure: E[y(x_1^*, \ldots, x_k^*, \tilde{x}_{k+1}, \ldots, \tilde{x}_n)]/y_{\max} versus the number of factors k changed so far (k = 0 \ldots 7), comparing Eqn. 20 with simulation for \sigma_{INT} = 0.25\,\sigma_{ME} and \sigma_{\varepsilon} = 0.5\,\sigma_{ME}; the main-effect and two-factor-interaction contributions are shown separately.]

We can prove that the probability of exploiting interactions is sustained, \Pr(\beta_{ij} x_i^* x_j^* > 0) \ge \Pr(\beta_{12} x_1^* x_2^* > 0). Further, we can now prove that the exploitation probability is a function of j only and increases monotonically.
Final Outcome

[Figure, two panels: E[y(x_1^*, x_2^*, \ldots, x_n^*)]/y_{\max} versus \sigma_{INT}/\sigma_{ME} for adaptive OFAT (left) and a resolution III design (right), comparing Eqns. 20 and 21 with simulation at \sigma_{\varepsilon}/\sigma_{ME} = 0.1, 1, and 10.]

Final Outcome

[Figure: the same comparison for \sigma_{\varepsilon}/\sigma_{ME} = 1, with an annotation at \sigma_{INT}/\sigma_{ME} \approx 0.25, approximately where adaptive OFAT begins to outperform the resolution III design.]
Adaptive “One Factor at a Time” for Robust Design
• Run a resolution III design on the noise factors
• Change one control factor
• Again, run a resolution III design on the noise factors; if there is an improvement in transmitted variance, retain the change
• If the response gets worse, go back to the previous state
• Stop after you’ve changed every factor once
[Cube figure: the adaptive OFAT path through the control-factor space A-B-C, with a small noise array (a, b, c) run at each visited corner.]
Frey, D. D., and N. Sudarsanam, 2007, “An Adaptive One-factor-at-a-time Method for Robust Parameter Design:
Comparison with Crossed Arrays via Case Studies,” accepted to ASME Journal of Mechanical Design.
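Read as pseudocode, the procedure sketches out as follows (Python; the response function and its coefficients are placeholders invented for illustration). Each candidate control setting is scored by running the 2^{3-1} noise array and computing the transmitted variance; a control-factor change is retained only if that variance improves.

```python
import numpy as np

# 2^(3-1)_III outer array on noise factors a, b, c (from the crossed-array slide).
outer = np.array([[-1, -1, -1],
                  [-1, +1, +1],
                  [+1, -1, +1],
                  [+1, +1, -1]])

def transmitted_variance(response, control):
    """Score a control setting by running the resolution III noise array."""
    y = np.array([response(control, noise) for noise in outer])
    return y.var(ddof=1)

def aofat_robust(response, n_controls, rng=None):
    """Adaptive OFAT on the control factors, minimizing transmitted variance."""
    rng = rng or np.random.default_rng()
    x = rng.choice([-1, +1], n_controls)
    best = transmitted_variance(response, x)        # run the noise array once
    for i in range(n_controls):                     # change one control factor
        x[i] = -x[i]
        trial = transmitted_variance(response, x)   # run the noise array again
        if trial < best:                            # improvement in transmitted variance
            best = trial                            #   -> retain the change
        else:
            x[i] = -x[i]                            # worse -> go back to previous state
    return x, best                                  # stop after changing every factor once

# Placeholder response: control factors damp or amplify the noise (illustrative only).
def response(control, noise):
    a, b, c = noise
    return 5 + control[0] + (1.0 - 0.4*control[1])*a + (0.8 + 0.3*control[2])*b + 0.2*c

x_star, tv = aofat_robust(response, n_controls=3, rng=np.random.default_rng(2))
print("robust control settings:", x_star, "transmitted variance:", tv)
```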
Sheet Metal Spinning
[Image of the sheet metal spinning process. Image by MIT OpenCourseWare.]

Results for Three Methods of Robust Design Applied to the Sheet Metal Spinning Model
[Figure: maximum and average smaller-the-better signal-to-noise ratio (dB) versus the standard deviation of pure experimental error (mm^2) for three methods: aOFAT x 2^{3-1} informed, the 2^{6-3} x 2^{3-1} crossed array, and aOFAT x 2^{3-1} random; annotations mark the largest noise effect and the largest control effect. Image by MIT OpenCourseWare.]
Paper Airplane

Results for Three Methods of Robust Design Applied to the Paper Airplane Physical Experiment
[Figure: maximum and average signal-to-noise ratio versus the standard deviation of pure experimental error (inches) for three methods: aOFAT x 2^{3-1} informed, the L9 x 2^{3-1} crossed array, and aOFAT x 2^{3-1} random; annotations mark the largest control factor effect and the combined effects of noise. The experiment sheet (MIT Design of Experiments Exercise v2.0) defines Parameter A: Weight Position (A1-A3), Parameter B: Stabilizer Flaps (B1 up, B2 flat, B3 down), Parameter C: Nose Length, and Parameter D: Wing Angle (D1-D3). Image by MIT OpenCourseWare.]

L9 inner array:

Expt. #   Weight A   Stabilizer B   Nose C   Wing D
1         A1         B1             C1       D1
2         A1         B2             C2       D2
3         A1         B3             C3       D3
4         A2         B1             C2       D3
5         A2         B2             C3       D1
6         A2         B3             C1       D2
7         A3         B1             C3       D2
8         A3         B2             C1       D3
9         A3         B3             C2       D1
Results Across Four Case Studies

Method used: (1) fractional array 2^{k-p}_{III} x fractional array 2^{k-p}_{III} (crossed); (2) aOFAT x 2^{k-p}_{III}, informed; (3) aOFAT x 2^{k-p}_{III}, random.

Case                    Error     Crossed fractional   aOFAT informed   aOFAT random
Sheet metal spinning    Low ε     51%                  75%              56%
                        High ε    36%                  57%              52%
Op amp                  Low ε     99%                  99%              98%
                        High ε    98%                  88%              87%
Paper airplane          Low ε     43%                  81%              68%
                        High ε    41%                  68%              51%
Freight transport       Low ε     94%                  100%             100%
                        High ε    88%                  85%              85%
Mean of four cases      Low ε     74%                  91%              84%
                        High ε    66%                  70%              64%
Range of four cases     Low ε     43% to 99%           75% to 100%      56% to 100%
                        High ε    36% to 88%           57% to 88%       51% to 87%

Image by MIT OpenCourseWare.
Frey, D. D., and N. Sudarsanam, 2007, “An Adaptive One-factor-at-a-time Method for Robust Parameter Design: Comparison with Crossed Arrays via Case Studies,” accepted to ASME Journal of Mechanical Design.
Ensembles of aOFATs

[Figure, two panels: an ensemble of 4 aOFATs compared with a 2^{7-2} fractional factorial array (left), and an ensemble of 8 aOFATs compared with a 2^{7-1} fractional factorial array (right), using the hierarchical probability model (HPM); in both panels the expected value of the largest control factor is 16.]
Conclusions
• A new model and theorems show that
– Adaptive OFAT plans exploit two-factor interactions, especially when they are large
– Adaptive OFAT plans provide around 80% of the benefits achievable via parameter design
• Adaptive OFAT can be “crossed” with factorial designs, which proves to be highly effective
Frey, D. D., and N. Sudarsanam, 2007, “An Adaptive One-factor-at-a-time Method for Robust Parameter Design:
Comparison with Crossed Arrays via Case Studies,” accepted to ASME Journal of Mechanical Design.
MIT OpenCourseWare
http://ocw.mit.edu
ESD.77 / 16.888 Multidisciplinary System Design Optimization
Spring 2010
For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.