Theorem The product of n mutually independent log normal random variables is log normal.
Proof (an elegant proof that uses the L property of the normal distribution) Let X1 ∼ lognormal(µ1, σ1²) and X2 ∼ lognormal(µ2, σ2²) (using the parametrization in the chart, µ = ln α and β = σ) be independent. Then X1 = e^{X1′} and X2 = e^{X2′} for independent normally distributed random variables X1′ ∼ N(µ1, σ1²) and X2′ ∼ N(µ2, σ2²). We have
$$Y = X_1 X_2 = e^{X_1' + X_2'}$$
and the result for n = 2 follows from the linear combination property (L) of the normal distribution. The general result follows by induction.
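The n = 2 case can also be sanity-checked numerically. The sketch below (a hypothetical Python/NumPy check, not part of the original argument; parameter values are arbitrary choices for illustration) simulates two independent lognormal variables and confirms that the log of their product has mean µ1 + µ2 and variance σ1² + σ2², exactly as property (L) predicts.

```python
# Hypothetical numerical check (not from the original document): the log of a
# product of independent lognormals should be normal with summed parameters.
import numpy as np

rng = np.random.default_rng(0)
mu1, s1 = 0.5, 1.2      # X1' ~ N(mu1, s1^2), illustrative values
mu2, s2 = -0.3, 0.8     # X2' ~ N(mu2, s2^2)
n = 1_000_000

x1 = rng.lognormal(mean=mu1, sigma=s1, size=n)   # X1 = e^{X1'}
x2 = rng.lognormal(mean=mu2, sigma=s2, size=n)   # X2 = e^{X2'}

log_y = np.log(x1 * x2)                          # ln Y = X1' + X2'

print(log_y.mean())   # ≈ mu1 + mu2 = 0.2
print(log_y.var())    # ≈ s1^2 + s2^2 = 2.08
```

The check relies only on the fact that taking logs turns the product into a sum of independent normals.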
Proof [A brutish, unfinished proof that remains ... UNDER CONSTRUCTION] Let X1 ∼ lognormal(α1, β1) and X2 ∼ lognormal(α2, β2) be independent log normal random variables. We can write their probability density functions as
$$f_{X_1}(x_1) = \frac{1}{x_1 \beta_1 \sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\ln(x_1/\alpha_1)/\beta_1\right)^2} \qquad x_1 > 0$$
and
$$f_{X_2}(x_2) = \frac{1}{x_2 \beta_2 \sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\ln(x_2/\alpha_2)/\beta_2\right)^2} \qquad x_2 > 0.$$
Since X1 and X2 are independent, the joint probability density function of X1 and X2 is
$$f_{X_1,X_2}(x_1, x_2) = \frac{1}{2\pi x_1 x_2 \beta_1 \beta_2}\, e^{-\frac{1}{2}\left[\left(\ln(x_1/\alpha_1)/\beta_1\right)^2 + \left(\ln(x_2/\alpha_2)/\beta_2\right)^2\right]} \qquad x_1 > 0,\; x_2 > 0.$$
Consider the 2 × 2 transformation
$$Y_1 = g_1(X_1, X_2) = X_1 X_2$$
and
$$Y_2 = g_2(X_1, X_2) = X_2,$$
which is a 1–1 transformation from X = {(x1, x2) | x1 > 0, x2 > 0} to Y = {(y1, y2) | y1 > 0, y2 > 0} with inverses
$$X_1 = g_1^{-1}(Y_1, Y_2) = \frac{Y_1}{Y_2} \qquad \text{and} \qquad X_2 = g_2^{-1}(Y_1, Y_2) = Y_2$$
and Jacobian
$$J = \begin{vmatrix} \dfrac{1}{Y_2} & -\dfrac{Y_1}{Y_2^2} \\[4pt] 0 & 1 \end{vmatrix} = \frac{1}{Y_2}.$$
Therefore by the transformation technique, the joint probability density function of Y1 and
Y2 is
$$\begin{aligned}
f_{Y_1,Y_2}(y_1, y_2) &= f_{X_1,X_2}\!\left(g_1^{-1}(y_1, y_2),\, g_2^{-1}(y_1, y_2)\right) |J| \\
&= \frac{1}{2\pi y_1 y_2 \beta_1 \beta_2}\, e^{-\frac{1}{2}\left[\left(\ln(y_1/(\alpha_1 y_2))/\beta_1\right)^2 + \left(\ln(y_2/\alpha_2)/\beta_2\right)^2\right]}
\end{aligned}$$
for y1 > 0, y2 > 0. The probability density function of Y1 is
$$\begin{aligned}
f_{Y_1}(y_1) &= \int_0^\infty f_{Y_1,Y_2}(y_1, y_2)\, dy_2 \\
&= \int_0^\infty \frac{1}{2\pi y_1 y_2 \beta_1 \beta_2}\, e^{-\frac{1}{2}\left[\left(\ln(y_1/(\alpha_1 y_2))/\beta_1\right)^2 + \left(\ln(y_2/\alpha_2)/\beta_2\right)^2\right]} dy_2 \\
&= \frac{1}{y_1 \sqrt{\beta_1^2 + \beta_2^2}\, \sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\ln\left(y_1/(\alpha_1 \alpha_2)\right)\big/\sqrt{\beta_1^2 + \beta_2^2}\right)^2} \qquad y_1 > 0,
\end{aligned}$$
which is the probability density function of a lognormal(α1 α2, √(β1² + β2²)) random variable. This integral can be computed in Maple with the statements below.
assume(alpha1 > 0);
assume(alpha2 > 0);
assume(beta1 > 0);
assume(beta2 > 0);
int(exp(-((log(y1/(alpha1 * y2)) / beta1) ^ 2 +
(log(y2/alpha2) / beta2) ^ 2) / 2) /
(2 * Pi * y1 * y2 * beta1 * beta2), y2 = 0 .. infinity);
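For readers without Maple, the same integral can be cross-checked numerically. The sketch below (an assumed Python/SciPy alternative, not from the original; parameter values are arbitrary) integrates the joint density over y2 and compares the marginal to the lognormal(α1 α2, √(β1² + β2²)) density, which is what the product of two independent lognormals must follow under the chart's parametrization.

```python
# Hypothetical SciPy cross-check (not from the original document) of the
# marginal integral for the product of two independent lognormals.
import numpy as np
from scipy.integrate import quad

a1, a2, b1, b2 = 1.5, 0.7, 0.6, 0.9   # illustrative alpha/beta values

def integrand(y2, y1):
    """Joint density f_{Y1,Y2}(y1, y2) after the change of variables."""
    return np.exp(-0.5 * ((np.log(y1 / (a1 * y2)) / b1) ** 2 +
                          (np.log(y2 / a2) / b2) ** 2)) / \
           (2 * np.pi * y1 * y2 * b1 * b2)

def lognormal_pdf(y, alpha, beta):
    """lognormal(alpha, beta) density, chart parametrization."""
    return np.exp(-0.5 * (np.log(y / alpha) / beta) ** 2) / \
           (y * beta * np.sqrt(2 * np.pi))

beta = np.hypot(b1, b2)               # sqrt(b1^2 + b2^2)
for y1 in (0.5, 1.0, 2.0):
    marginal, _ = quad(integrand, 0, np.inf, args=(y1,))
    print(marginal, lognormal_pdf(y1, a1 * a2, beta))
```

Each pair of printed values should agree to quadrature precision, matching the Maple result.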
Induction can be used with the above result to verify that the product of n mutually independent log normal random variables is log normal. Let X1, X2, ..., Xn be n mutually independent log normal random variables. Consider their product X1 X2 · · · Xn. By the result above, X1 X2 is log normal. Suppose we have demonstrated that $\prod_{i=1}^{k} X_i$ is a log normal random variable. Consider $\prod_{i=1}^{k+1} X_i$. Since X_{k+1} is also log normal and independent of $\prod_{i=1}^{k} X_i$, the product $\prod_{i=1}^{k+1} X_i$ is log normal by the result above. It follows by induction that X1 X2 · · · Xn must be log normal.
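The general result can likewise be illustrated numerically. The sketch below (a hypothetical Python/NumPy check, not part of the proof; parameter values are arbitrary) simulates n = 5 mutually independent lognormal variables and confirms that the log of their product has the summed mean and summed variance a lognormal product must have.

```python
# Hypothetical check (not from the original document): for n = 5 independent
# lognormals, ln(X1 * ... * Xn) should be normal with mean sum(mu_i) and
# variance sum(sigma_i^2).
import numpy as np

rng = np.random.default_rng(1)
mus = np.array([0.1, -0.4, 0.7, 0.0, 0.3])      # illustrative mu_i = ln(alpha_i)
sigmas = np.array([0.5, 1.0, 0.8, 0.3, 0.6])    # illustrative sigma_i = beta_i
n = 200_000

# Each row holds one draw of (X1, ..., X5); columns broadcast over mus/sigmas.
xs = rng.lognormal(mean=mus, sigma=sigmas, size=(n, 5))
log_prod = np.log(xs.prod(axis=1))

print(log_prod.mean())  # ≈ mus.sum() = 0.7
print(log_prod.var())   # ≈ (sigmas**2).sum() = 2.34
```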