Chapter 8

Random-Variate Generation
Purpose & Overview

- Develop an understanding of generating samples from a specified distribution as input to a simulation model.
- Illustrate some widely used techniques for generating random variates:
  - Inverse-transform technique
  - Convolution technique
  - Acceptance-rejection technique
  - A special technique for the normal distribution
Inverse-transform Technique

- The concept:
  - For the cdf, set r = F(x).
  - Generate a sample R from Uniform(0,1).
  - Find the sample X by inverting the cdf: X = F⁻¹(R).
- Figure: on the cdf r = F(x), a sample r1 on the vertical axis maps to x1 = F⁻¹(r1) on the horizontal axis.
- Why it works:
  Pr(X ≤ x) = Pr(F⁻¹(R) ≤ x) = Pr(R ≤ F(x)) = F(x)
  (a minimal code sketch of this pattern follows below)
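A minimal Matlab sketch of the pattern (the handle name Finv is introduced here purely for illustration; the inverse cdf used is the exponential one derived on the next slide):

Finv = @(r) -log(1 - r);   % inverse cdf F^(-1)(r) of exp(lambda = 1)
R = rand(1);               % R ~ Uniform(0,1)
X = Finv(R);               % X = F^(-1)(R) has the desired distribution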
Exponential Distribution

[Inverse-transform]
- Exponential cdf:
  r = F(x) = 1 – e^(-λx),  for x ≥ 0
- To generate X1, X2, X3, …, generate R1, R2, R3, … and set
  Xi = F⁻¹(Ri) = -(1/λ) ln(1 – Ri)
- Figure: Inverse-transform technique for exp(λ = 1)
Exponential Distribution

[Inverse-transform]
- Example: Generate 200 variates Xi with distribution exp(λ = 1).
- Since R and (1 – R) both have the U(0,1) distribution, -ln(R) may be used in place of -ln(1 – R).
- Matlab Code
for i=1:200,
    expnum(i) = -log(rand(1));   % Xi = -(1/lambda) ln(Ri), with lambda = 1
end
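As a usage note, the same 200 variates can be drawn with a single vectorized call equivalent to the loop above:

expnum = -log(rand(1,200));   % 200 exp(lambda = 1) variates at once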
Uniform Distribution

[Inverse-transform]
- Uniform(a, b) cdf:
  r = F(x) = 0,                 x < a
             (x – a)/(b – a),   a ≤ x ≤ b
             1,                 x > b
- To generate X1, X2, X3, …, generate R1, R2, R3, … and set
  Xi = a + (b – a) Ri
Uniform Distribution

[Inverse-transform]
- Example: Generate 500 variates Xi with distribution Uniform(3, 8)
- Matlab Code
for i=1:500,
    uninum(i) = 3 + 5*rand(1);   % Xi = a + (b - a) Ri with a = 3, b = 8
end
Discrete Distribution
[Inverse-transform]
- All discrete distributions can be generated with the inverse-transform technique.
- Figure: cdf F(x) and pmf p(x) of a discrete random variable taking values a, b, c with probabilities p1, p2, p3; the cdf steps through p1, p1 + p2, p1 + p2 + p3, and a sample R1 is mapped to the smallest value whose cumulative probability reaches R1.
- General form:
  X = min{ x : F(x) ≥ R }
Discrete Distribution

[Inverse-transform]
- Example: Suppose the number of shipments, x, on the loading dock of IHW company is either 0, 1, or 2.
- Data - probability distribution:

  x    p(x)    F(x)
  0    0.50    0.50
  1    0.30    0.80
  2    0.20    1.00

- Method - given R, the generation scheme becomes:
  x = 0,  if R ≤ 0.5
      1,  if 0.5 < R ≤ 0.8
      2,  if 0.8 < R ≤ 1.0
- Consider R1 = 0.73:
  F(x_(i-1)) < R1 ≤ F(x_i)
  F(x_0) = 0.50 < 0.73 ≤ F(x_1) = 0.80
  Hence, X1 = 1  (a code sketch of this scheme follows below)
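A minimal Matlab sketch of the scheme above (the array names vals, cumprob, and shipnum are introduced here for illustration):

vals    = [0 1 2];               % possible numbers of shipments
cumprob = [0.50 0.80 1.00];      % cumulative distribution F(x)
for i=1:100,
    R = rand(1);
    shipnum(i) = vals(find(cumprob >= R, 1));   % X = min{x : F(x) >= R}
end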
Empirical Continuous Dist’n

[Inverse-transform]
- Used when a theoretical distribution is not applicable.
- With the collected empirical data:
  - Resample the observed data, or
  - Interpolate between observed data points to fill in the gaps
- For a small sample set (size n):
  - Arrange the data from smallest to largest:
    x(1) ≤ x(2) ≤ … ≤ x(n)
  - Assign the probability 1/n to each interval x(i-1) ≤ x ≤ x(i)
  - Then, for (i-1)/n < R ≤ i/n:
    X̂ = F̂⁻¹(R) = x(i-1) + a_i ( R – (i-1)/n )
    where  a_i = ( x(i) – x(i-1) ) / ( i/n – (i-1)/n ) = ( x(i) – x(i-1) ) / (1/n)
Empirical Continuous Dist’n

[Inverse-transform]
- Example: Suppose the data collected for 100 broken-widget repair times are:

  i    Interval (Hours)    Frequency    Relative Frequency    Cumulative Frequency, c_i    Slope, a_i
  1    0.25 ≤ x ≤ 0.5      31           0.31                  0.31                         0.81
  2    0.5 ≤ x ≤ 1.0       10           0.10                  0.41                         5.0
  3    1.0 ≤ x ≤ 1.5       25           0.25                  0.66                         2.0
  4    1.5 ≤ x ≤ 2.0       34           0.34                  1.00                         1.47

- Consider R1 = 0.83:
  c_3 = 0.66 < R1 ≤ c_4 = 1.00
  X1 = x(4-1) + a_4 (R1 – c_(4-1))
     = 1.5 + 1.47 (0.83 – 0.66)
     = 1.75
  (a code sketch of this lookup follows below)
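A minimal Matlab sketch of this table lookup (the array names xcell, cumfreq, slope, and reptime are introduced here for illustration; xcell(1) and cumfreq(1) hold x(0) = 0.25 and c_0 = 0):

xcell   = [0.25 0.5 1.0 1.5 2.0];     % interval endpoints x(0), ..., x(4)
cumfreq = [0 0.31 0.41 0.66 1.00];    % cumulative frequencies c_0, ..., c_4
slope   = [0.81 5.0 2.0 1.47];        % slopes a_1, ..., a_4
for j=1:100,
    R = rand(1);
    i = find(cumfreq >= R, 1) - 1;    % interval index with c_(i-1) < R <= c_i
    reptime(j) = xcell(i) + slope(i)*(R - cumfreq(i));   % x(i-1) + a_i (R - c_(i-1))
end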
Convolution Technique

- Use when X = Y1 + Y2 + … + Yn
- Example of application: the Erlang distribution
- Generate samples of Y1, Y2, …, Yn and then add these samples to get a sample of X.
Erlang Distribution

[Convolution]
- Example: Generate 500 variates Xi with an Erlang-3 distribution (mean k/λ), i.e., the sum of k = 3 independent exponential variates.
- Matlab Code
for i=1:500,
    erlnum(i) = -1/6*(log(rand(1)) + log(rand(1)) + log(rand(1)));   % sum of 3 exponential variates, each with rate lambda = 6
end
Acceptance-Rejection Technique

- Useful particularly when the inverse cdf does not exist in closed form; a.k.a. thinning
- Steps to generate X with pdf f(x):
  - Step 0: Identify a majorizing function g(x) and a pdf h(x) satisfying
      g(x) ≥ f(x) for all x,
      c = ∫ g(x) dx < ∞   (c is the efficiency parameter),
      h(x) = g(x) / c
  - Step 1: Generate Y with pdf h(x)
  - Step 2: Generate U ~ Uniform(0,1), independent of Y
  - Step 3: If U ≤ f(Y)/g(Y), set X = Y; else reject Y and repeat from Step 1
- Flowchart: generate Y and U; if the condition holds ("yes"), output X = Y; otherwise ("no"), generate again.
  (a minimal code sketch of these steps follows below)
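A minimal Matlab sketch of Steps 0-3 (the target pdf f(y) = 2y on [0,1], the majorizing function g(y) = 2, and the handle names are assumptions chosen here for illustration; this is not the triangular example on the next slides):

f = @(y) 2*y;                      % assumed target pdf on [0,1]
g = @(y) 2;                        % majorizing function, g(y) >= f(y) on [0,1]
hsample = @() rand(1);             % sampler for h(y) = g(y)/c = Uniform(0,1), with c = 2
accepted = 0;
while ~accepted,
    Y = hsample();                 % Step 1: Y ~ h
    U = rand(1);                   % Step 2: U ~ Uniform(0,1), independent of Y
    accepted = (U <= f(Y)/g(Y));   % Step 3: accept Y with probability f(Y)/g(Y)
end
X = Y;                             % the accepted value has pdf f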
Triangular Distribution
[Acceptance-Rejection]
- Flowchart: generate Y ~ U(0, 0.5) and U ~ U(0, 1); if the acceptance condition holds ("yes"), output X = Y; otherwise ("no"), generate a new pair.
Triangular Distribution
[Acceptance-Rejection]
- Matlab Code (for exactly 1000 accepted samples):
i=0;
while i<1000,
    Y = 0.5*rand(1);                               % Step 1: Y ~ U(0, 0.5)
    U = rand(1);                                   % Step 2: U ~ U(0, 1)
    if (Y<=0.25 & U<=4*Y) | (Y>0.25 & U<=2-4*Y)    % Step 3: U <= f(Y)/g(Y)
        i = i+1;
        X(i) = Y;                                  % accept
    end
end
- Figure: histogram of the 1000 accepted samples on [0, 0.5], consistent with the triangular shape peaking at 0.25.
Poisson Distribution
[Acceptance-Rejection]
- pmf:
  p(n) = P(N = n) = e^(-α) α^n / n!,   n = 0, 1, 2, …
- N can be interpreted as the number of arrivals from a Poisson arrival process during one unit of time
- Then, the interarrival times A1, A2, … in the process are exponentially distributed with rate α
- So N = n if and only if
  A1 + A2 + … + An ≤ 1 < A1 + A2 + … + An+1
Poisson Distribution

[Acceptance-Rejection]
- Substituting Ai = -(1/α) ln Ri into
  A1 + … + An ≤ 1 < A1 + … + An+1
  gives
  -(1/α) (ln R1 + … + ln Rn) ≤ 1 < -(1/α) (ln R1 + … + ln Rn+1)
  which is equivalent to
  R1 R2 … Rn+1 < e^(-α) ≤ R1 R2 … Rn
- Step 1. Set n = 0 and P = 1.
- Step 2. Generate a random number Rn+1 and let P = P · Rn+1.
- Step 3. If P < e^(-α), then accept N = n (see the sketch below). Otherwise, reject the current n, increase n by one, and return to Step 2.
- How many random numbers will be used on average to generate one Poisson variate?
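A minimal Matlab sketch of these steps for a single Poisson variate (the mean alpha = 2 is an assumed value for illustration):

alpha = 2;                   % assumed Poisson mean
n = 0;                       % Step 1: set n = 0 ...
P = 1;                       % ... and P = 1
while 1,
    P = P*rand(1);           % Step 2: P = P * R_(n+1)
    if P < exp(-alpha)       % Step 3: accept N = n ...
        N = n;
        break
    else
        n = n + 1;           % ... otherwise increase n and return to Step 2
    end
end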
Normal Distribution

[Special Technique]
- Approach for normal(0,1):
  - Consider two standard normal random variables, Z1 and Z2, plotted as a point in the plane. In polar coordinates:
    Z1 = B cos θ
    Z2 = B sin θ
  - B² = Z1² + Z2² has a chi-square distribution with 2 degrees of freedom, which is the same as Exp(λ = 1/2). Hence, B = (-2 ln R)^(1/2).
  - The angle θ is Uniform(0, 2π), and the radius B and angle θ are mutually independent.
  - This yields:
    Z1 = (-2 ln R1)^(1/2) cos(2π R2)
    Z2 = (-2 ln R1)^(1/2) sin(2π R2)
Normal Distribution

[Special Technique]
- Approach for normal(μ, σ²):
  - Generate Zi ~ N(0,1)
  - Xi = μ + σ Zi
- Approach for lognormal(μ, σ²):
  - Generate Xi ~ N(μ, σ²)
  - Yi = e^(Xi)
  (a code sketch for the lognormal case follows below)
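A minimal Matlab sketch for lognormal variates (the parameter values mu = 0, sigma = 1 and the array name lognum are assumptions for illustration; Z is generated with the polar relations from the previous slide):

mu = 0; sigma = 1;                        % assumed lognormal parameters
for i=1:500,
    R1 = rand(1);
    R2 = rand(1);
    Z = sqrt(-2*log(R1))*cos(2*pi*R2);    % Z ~ N(0,1)
    lognum(i) = exp(mu + sigma*Z);        % Y = e^X with X ~ N(mu, sigma^2)
end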
Normal Distribution
[Special Technique]
- Generate 1000 samples of Normal(7, 4), i.e., μ = 7 and σ² = 4 (so σ = 2)
- Matlab Code
for i=1:500,
    R1 = rand(1);
    R2 = rand(1);
    Z(2*i-1) = sqrt(-2*log(R1))*cos(2*pi*R2);   % each (R1, R2) pair yields two
    Z(2*i)   = sqrt(-2*log(R1))*sin(2*pi*R2);   % independent N(0,1) variates
end

for i=1:1000,
    Z(i) = 7 + 2*Z(i);                          % Xi = mu + sigma*Zi
end