Class 5 Outline

Experiment Design and Output Analysis:
1. Theory and Construction of Confidence Intervals
2. Monte-Carlo and Terminating Simulations
3. Non-Terminating Simulations (Steady State)
Experimental Design Issues
• Static Simulation (Monte-Carlo): how many replications should be run?
• Dynamic Simulation (Discrete-Event):
  – Terminating: how many replications should be run?
  – Non-terminating: how many replications should be run? And for each replication: when to start collecting data? How much data should be collected?
How Many Replications?
• Simulation output: Y1, Y2, Y3, …, Yn
• We (typically) want to estimate E[Y]!
1. What is the accuracy of the estimator Y(n) = (Y1 + Y2 + Y3 + … + Yn) / n ?
2. How large should n (the number of independent replications) be in order to guarantee a given estimation accuracy?
What is a “Fractile”?
• Let α be a number in (0,1) e.g. 0.05
• z1-α/2 is the (1-α/2)-th fractile of a standard
normal distribution, and is defined as:
PN0, 1 ≤ z 1−/2   1 − /2
(Figure: standard normal density; the central area between −z1−α/2 and z1−α/2 is 1 − α = 0.95, and each tail beyond ±z1−α/2 has area α/2 = 0.025.)
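As a numerical sanity check, the fractile can be computed directly; the sketch below assumes SciPy is available (an assumption, not something the slides prescribe):

```python
# Minimal sketch: computing the (1 - alpha/2) fractile of the standard normal.
# Assumes SciPy is available (an assumption, not part of the slides).
from scipy.stats import norm

alpha = 0.05
z = norm.ppf(1 - alpha / 2)   # inverse CDF: P(N(0,1) <= z) = 1 - alpha/2
print(z)                      # roughly 1.96 for alpha = 0.05
```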
Statistical Estimation Theory
• Define:
$\bar{Y}(n) = \frac{1}{n}\sum_{i=1}^{n} Y_i \qquad\text{and}\qquad S^2(n) = \frac{1}{n-1}\sum_{i=1}^{n}\left(Y_i - \bar{Y}(n)\right)^2$
• A version of the CLT says that when n → ∞:
$\frac{\bar{Y}(n) - E[Y]}{\sqrt{S^2(n)/n}} \;\longrightarrow\; N(0,1)$
• For smaller values of n (n < 30), a good approximation is:
$\frac{\bar{Y}(n) - E[Y]}{\sqrt{S^2(n)/n}} \;\sim\; t_{n-1}$
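A minimal sketch of these two estimators in plain Python, with made-up replication outputs:

```python
# Sketch: sample mean Y(n) and sample variance S^2(n) from n replications.
# The data below are hypothetical, just to illustrate the formulas.
import math

Y = [10.2, 9.8, 11.1, 10.5, 9.9, 10.7]            # Y1, ..., Yn
n = len(Y)

Y_bar = sum(Y) / n                                 # sample mean Y(n)
S2 = sum((y - Y_bar) ** 2 for y in Y) / (n - 1)    # sample variance S^2(n)

# Standardized quantity from the CLT: (Y(n) - E[Y]) / sqrt(S^2(n)/n)
# is approximately N(0,1) for large n (t with n-1 d.f. for small n).
std_error = math.sqrt(S2 / n)
print(Y_bar, S2, std_error)
```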
So What?
• From the CLT:
$\frac{\bar{Y}(n) - E[Y]}{\sqrt{S^2(n)/n}} \;\longrightarrow\; N(0,1)$
• So that when n is large, we can write:
$P\left(-z_{1-\alpha/2} \;\le\; \frac{\bar{Y}(n) - E[Y]}{\sqrt{S^2(n)/n}} \;\le\; z_{1-\alpha/2}\right) \approx 1 - \alpha$
• Re-arranging gives:
$P\left(\bar{Y}(n) - z_{1-\alpha/2}\sqrt{\frac{S^2(n)}{n}} \;\le\; E[Y] \;\le\; \bar{Y}(n) + z_{1-\alpha/2}\sqrt{\frac{S^2(n)}{n}}\right) \approx 1 - \alpha$
• This is the definition of a 100(1 − α)% confidence interval!
Building Confidence Intervals
• For n large (n > 30), the 100(1 − α)% confidence interval is:
$\bar{Y}(n) \;\pm\; z_{1-\alpha/2}\sqrt{\frac{S^2(n)}{n}}$
where $z_{1-\alpha/2}$ is the fractile of the std. normal distribution.
• For n small (n < 30), use:
$\bar{Y}(n) \;\pm\; t_{n-1,\,1-\alpha/2}\sqrt{\frac{S^2(n)}{n}}$
where $t_{n-1,\,1-\alpha/2}$ is the fractile of the t (Student) distribution with n − 1 d.f.
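A sketch of both interval constructions, assuming SciPy for the fractiles; the replication outputs below are made up:

```python
# Sketch: 100(1-alpha)% confidence interval for E[Y] from n replications.
# Uses the t fractile for small n and the normal fractile for large n.
import math
from scipy.stats import norm, t

def confidence_interval(Y, alpha=0.05):
    n = len(Y)
    Y_bar = sum(Y) / n
    S2 = sum((y - Y_bar) ** 2 for y in Y) / (n - 1)
    half = math.sqrt(S2 / n)
    if n > 30:
        q = norm.ppf(1 - alpha / 2)             # std. normal fractile
    else:
        q = t.ppf(1 - alpha / 2, df=n - 1)      # t fractile with n-1 d.f.
    return Y_bar - q * half, Y_bar + q * half

# Hypothetical replication outputs:
Y = [10.2, 9.8, 11.1, 10.5, 9.9, 10.7, 10.1, 10.4]
print(confidence_interval(Y))
```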
So, How Many Replications?
• W = UB − LB is the width of the confidence interval, centered around Y(n).
• A measure of the relative estimation error is thus W / Y(n).
• So a good termination criterion is to fix an estimation error β and run a number of replications n such that W / Y(n) < β.
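Reading this as a procedure: keep adding replications until the relative width criterion is met. A minimal sketch under stated assumptions — run_replication() is a hypothetical placeholder for one simulation run, and SciPy supplies the t fractile:

```python
# Sketch: run replications until the relative CI width W / Y(n) < beta.
# run_replication() is a hypothetical stand-in for one simulation run.
import math, random
from scipy.stats import t

def run_replication():
    return random.gauss(10.0, 1.0)               # placeholder output

def replicate_until_precise(beta=0.05, alpha=0.05, n0=10, n_max=10_000):
    Y = [run_replication() for _ in range(n0)]   # initial replications
    while True:
        n = len(Y)
        Y_bar = sum(Y) / n
        S2 = sum((y - Y_bar) ** 2 for y in Y) / (n - 1)
        q = t.ppf(1 - alpha / 2, df=n - 1)
        W = 2 * q * math.sqrt(S2 / n)            # CI width W = UB - LB
        if W / abs(Y_bar) < beta or n >= n_max:  # stop on precision or budget
            return Y_bar, W, n
        Y.append(run_replication())              # otherwise, one more replication

print(replicate_until_precise())
```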
Example
Suppose we build a simulation model to assess the weekly cost of a proposed supply chain inventory policy.
From the simulation data output (Spreadsheet “Confidence Interval” on SloanSpace), we want to estimate:
1. The average weekly cost C, with a CI
2. P(C > $2M), with a CI
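For the second quantity, one option is to treat the indicator 1{Ci > $2M} from each replication as the output Yi and build the confidence interval on its average. A sketch with made-up cost numbers (the actual data sit in the spreadsheet mentioned above):

```python
# Sketch: estimate P(C > $2M) with a CI by treating the indicator
# 1{C_i > 2e6} as the output Y_i of each replication.
# Hypothetical weekly-cost outputs, repeated to get n > 30 observations.
import math
from scipy.stats import norm

costs = [1.8e6, 2.3e6, 1.9e6, 2.1e6, 1.7e6, 2.4e6, 2.0e6, 1.6e6] * 5   # 40 obs
indicators = [1.0 if c > 2e6 else 0.0 for c in costs]

n = len(indicators)
p_hat = sum(indicators) / n                                  # estimate of P(C > 2M)
S2 = sum((y - p_hat) ** 2 for y in indicators) / (n - 1)
half = norm.ppf(0.975) * math.sqrt(S2 / n)                   # 95% half-width (n > 30)
print(p_hat, (p_hat - half, p_hat + half))
```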
Steady-State Analysis
• Number of experiment replications? (n)
• Warm-up period length? (L)
• Data collection period length? (m − L)
Steady-State Analysis
• Let Y1, Y2, Y3, …, Yt, … be a stochastic process (e.g. queue length at time index t). We want to estimate lim E[Yt] when t → ∞!
• Data Yij: j-th observation in the i-th replication
  1st replication: Y11, Y12, Y13, …, Y1j, …, Y1m
  2nd replication: Y21, Y22, Y23, …, Y2j, …, Y2m
  …
  last replication: Yn1, Yn2, Yn3, …, Ynj, …, Ynm
  (within each replication, Yi1 is the first observation and Yim the last observation)
Replication/Deletion Approach
( Yij : j-th observation in the i-th replication )
• For each replication i, instead of the estimator
  Yi(m) = (Yi1 + Yi2 + Yi3 + … + Yim) / m,
  consider the modified estimator
  Yi(m,L) = (YiL + Yi(L+1) + Yi(L+2) + … + Yim) / (m − L + 1)
• An estimator for lim E[Yt] when t → ∞ is then
  Y(n) = ( Y1(m,L) + Y2(m,L) + … + Yn(m,L) ) / n
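A minimal sketch of this estimator, assuming the data are held as a list of replications (each a list of observations); the numbers are made up:

```python
# Sketch: replication/deletion estimator for lim E[Y_t].
# data[i] holds the observations Y_i1, ..., Y_im of replication i+1 (0-indexed lists).
def replication_deletion_estimate(data, L):
    # Average observations L..m within each replication (warm-up deleted),
    # then average those per-replication means across replications.
    per_rep_means = [sum(rep[L - 1:]) / len(rep[L - 1:]) for rep in data]
    return sum(per_rep_means) / len(per_rep_means)

# Hypothetical data: 3 replications, 10 observations each, warm-up length L = 4.
data = [
    [2, 5, 8, 9, 10, 11, 10, 9, 10, 11],
    [1, 4, 7, 10, 9, 10, 11, 10, 9, 10],
    [3, 6, 9, 10, 11, 10, 9, 10, 11, 10],
]
print(replication_deletion_estimate(data, L=4))
```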
Experimental Data Plot
(Figure: queue length plotted against time for 10 replications, Series1 through Series10; the vertical axis runs from 0 to 90 and the time axis from 1 to 989.)
Welch’s Method for Warm-up
( Yij : j-th observation in the i-th replication )
• Compute the average across replications for each time point:
  Yt[n] = (Y1t + Y2t + Y3t + … + Ynt) / n
• Welch’s method is to plot the moving average process of Yt[n] based on various lags w:
  Yt[n,w] = (Yt−w[n] + … + Yt[n] + … + Yt+w[n]) / (2w + 1)
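A sketch of the two quantities Welch’s method plots, assuming the replication data are stored as a list of lists; the data here are synthetic, just to make the snippet runnable (the moving average is computed only for interior time points where the full window fits, the case the formula above covers):

```python
# Sketch: Welch's method quantities.
# data[i][t] is the observation at time index t+1 in replication i+1.
import random

def cross_replication_average(data):
    # Y_t[n] = (Y_1t + ... + Y_nt) / n for each time index t
    n = len(data)
    m = len(data[0])
    return [sum(data[i][t] for i in range(n)) / n for t in range(m)]

def moving_average(avg, w):
    # Y_t[n, w] = (Y_{t-w}[n] + ... + Y_{t+w}[n]) / (2w + 1),
    # computed only where the full window of 2w+1 points is available.
    m = len(avg)
    return [sum(avg[t - w:t + w + 1]) / (2 * w + 1) for t in range(w, m - w)]

# Synthetic data: 5 replications, 200 observations each, with a visible warm-up.
random.seed(0)
data = [[min(t, 50) + random.gauss(0, 5) for t in range(200)] for _ in range(5)]

avg = cross_replication_average(data)
smoothed = moving_average(avg, w=10)
print(smoothed[:5])   # plot these against time to pick the warm-up length L
```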
Welch’s Method for Warm-up
(Figure: the moving average of Yt[n] plotted against time for several lags.)