0.1. Sums of Random Variables

Sums of independent random variables often arise in practice. In the following examples, we determine the distribution of a sum S = X1 + X2, where X1 and X2 are continuous random variables.
Example 1. Let X1 and X2 be independent and uniform, Xi ∼ UNIF(0, 1). Consider the transformation T = X1 and S = X1 + X2. Find the pdf of S.

Solution. The pdf of S is

$$ f_S(s) = \begin{cases} s, & 0 < s < 1 \\ 2 - s, & 1 \le s < 2 \end{cases} $$
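The triangular shape of f_S is easy to confirm numerically. Below is a minimal Monte Carlo sketch (my addition, assuming NumPy is available); it compares a windowed density estimate of S against the piecewise pdf above:

```python
# Monte Carlo check of Example 1: S = X1 + X2 with X1, X2 ~ UNIF(0, 1).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
s = rng.uniform(size=n) + rng.uniform(size=n)

# Estimate the density of S in a small window around each point and
# compare against the piecewise pdf f_S(s) = s (s < 1), 2 - s (s >= 1).
for g in (0.25, 0.5, 0.75, 1.25, 1.5, 1.75):
    exact = g if g < 1 else 2 - g
    simulated = np.mean(np.abs(s - g) < 0.005) / 0.01
    print(f"s={g:.2f}  exact={exact:.3f}  simulated={simulated:.3f}")
```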
Example 2. Let U and V be independent with fU(u) = e^{−u}, u > 0, and fV(v) = 2v, 0 < v < 1. If X = U + V, what is the pdf of X?

Solution.
(1) Consider the transformation X = U + V and Y = V. The pdf of X is

$$ f_X(x) = \begin{cases} \int_0^x 2y\,e^{-(x-y)}\,dy = \dots, & 0 < x < 1 \\ \int_0^1 2y\,e^{-(x-y)}\,dy = \dots, & 1 \le x < \infty \end{cases} $$

(2) Consider the transformation X = U + V and Y = U. The pdf of X is

$$ f_X(x) = \begin{cases} \int_0^x 2(x-y)\,e^{-y}\,dy = \dots, & 0 < x < 1 \\ \int_{x-1}^x 2(x-y)\,e^{-y}\,dy = \dots, & 1 \le x < \infty \end{cases} $$

Either way, the pdf of X works out to

$$ f_X(x) = \begin{cases} 2x + 2e^{-x} - 2, & 0 < x < 1 \\ 2e^{-x}, & x \ge 1 \end{cases} $$
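The closed form can be checked by simulation. In the sketch below, sampling V by inversion (F_V(v) = v², so V = √W with W ∼ UNIF(0, 1)) is an implementation choice, not something the notes prescribe:

```python
# Simulation check of Example 2: X = U + V, U ~ exponential(1), f_V(v) = 2v.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
u = rng.exponential(size=n)
v = np.sqrt(rng.uniform(size=n))   # inversion: F_V(v) = v^2 on (0, 1)
x = u + v

for t in (0.25, 0.5, 0.75, 1.5, 2.5):
    exact = 2*t + 2*np.exp(-t) - 2 if t < 1 else 2*np.exp(-t)
    simulated = np.mean(np.abs(x - t) < 0.005) / 0.01
    print(f"x={t:.2f}  exact={exact:.4f}  simulated={simulated:.4f}")
```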
Example 3. Let X1 and X2 be independent and uniform, Xi ∼ UNIF(0, 1). Consider the transformation Y1 = X1/X2 and Y2 = X1X2. Find the pdfs of Y1 and Y2.

Solution.
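A simulation can be used to check the derivation. The reference values below use the standard answers f_{Y1}(y) = 1/2 for 0 < y < 1 and 1/(2y²) for y ≥ 1, and f_{Y2}(y) = −ln y for 0 < y < 1; these closed forms are supplied here for the check, not taken from the notes:

```python
# Simulation sketch for Example 3: Y1 = X1/X2 and Y2 = X1*X2, Xi ~ UNIF(0, 1).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x1, x2 = rng.uniform(size=n), rng.uniform(size=n)
y1, y2 = x1 / x2, x1 * x2

for g in (0.5, 2.0):   # ratio: one point below 1, one above
    simulated = np.mean(np.abs(y1 - g) < 0.005) / 0.01
    reference = 0.5 if g < 1 else 1 / (2 * g**2)
    print(f"f_Y1({g}): simulated={simulated:.3f}  reference={reference:.3f}")

for g in (0.25, 0.5):  # product: reference pdf -ln(y) on (0, 1)
    simulated = np.mean(np.abs(y2 - g) < 0.005) / 0.01
    print(f"f_Y2({g}): simulated={simulated:.3f}  reference={-np.log(g):.3f}")
```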
Example 4. Let X1 and X2 be independent gamma variables with joint pdf

$$ f(x_1, x_2) = \frac{1}{\Gamma(\alpha)\Gamma(\beta)}\, x_1^{\alpha-1} x_2^{\beta-1} e^{-x_1-x_2}, \quad 0 < x_i < \infty. $$

Consider the transformation Y1 = X1 + X2 and Y2 = X1/(X1 + X2). The joint pdf of Y1 and Y2 is

$$ f_{Y_1,Y_2}(y_1, y_2) = \dots, \quad (y_1, y_2) \in \dots $$

And the pdf of Y1 is f_{Y_1}(y_1) = \dots
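For illustration (the parameter values and closed-form targets are mine, not the notes'): the standard result is Y1 ∼ GAM(1, α + β) and Y2 ∼ BETA(α, β), with Y1 and Y2 independent. A sketch assuming SciPy is available:

```python
# Simulation check for Example 4 with illustrative alpha = 2, beta = 3.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a, b, n = 2.0, 3.0, 200_000
x1, x2 = rng.gamma(a, size=n), rng.gamma(b, size=n)
y1, y2 = x1 + x2, x1 / (x1 + x2)

# Kolmogorov-Smirnov p-values: large values are consistent with the targets.
print("Y1 vs GAM(1, a+b):", stats.kstest(y1[:5000], stats.gamma(a + b).cdf).pvalue)
print("Y2 vs BETA(a, b): ", stats.kstest(y2[:5000], stats.beta(a, b).cdf).pvalue)
print("corr(Y1, Y2):", np.corrcoef(y1, y2)[0, 1])  # near 0, as independence predicts
```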
Example 5. Let X1, X2, and X3 be independent gamma variables, Xi ∼ GAM(1, αi), i = 1, 2, 3. Consider the transformation

$$ Y_i = \frac{X_i}{\sum_{j=1}^{3} X_j}, \quad i = 1, 2, \qquad Y_3 = \sum_{j=1}^{3} X_j. $$

The joint pdf of Y1, Y2, and Y3 is

$$ f_{Y_1,Y_2,Y_3}(y_1, y_2, y_3) = \dots, \quad (y_1, y_2, y_3) \in \dots $$

And the pdf of Y3 is f_{Y_3}(y_3) = \dots
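Again for illustration (values and targets are mine): the standard result is that (Y1, Y2) has a Dirichlet(α1, α2, α3) distribution, independent of Y3 ∼ GAM(1, α1 + α2 + α3). A minimal simulation sketch:

```python
# Simulation sketch for Example 5 with illustrative alpha = (1.0, 2.0, 3.0).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alphas = (1.0, 2.0, 3.0)
x = np.column_stack([rng.gamma(a, size=200_000) for a in alphas])
y3 = x.sum(axis=1)
y1, y2 = x[:, 0] / y3, x[:, 1] / y3

# Dirichlet marginal means are alpha_i / sum(alpha): here 1/6 and 2/6.
print("E[Y1], E[Y2]:", y1.mean(), y2.mean())
print("Y3 vs GAM(1, 6):", stats.kstest(y3[:5000], stats.gamma(sum(alphas)).cdf).pvalue)
```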
A technique based on moment generating functions is usually much more convenient than transformations for determining the distribution of a sum of independent random variables.
Theorem 6. If X1, X2, ..., Xn are independent random variables with MGFs M_{X_i}(t), then the MGF of Y = \sum_{i=1}^{n} X_i is

$$ M_Y(t) = M_{X_1}(t) \cdots M_{X_n}(t). $$

Proof. Notice that e^{tY} = e^{t(X_1 + \cdots + X_n)} = e^{tX_1} \cdots e^{tX_n}, so, using independence,

$$ M_Y(t) = E\left[e^{tY}\right] = E\left[e^{t(X_1+\cdots+X_n)}\right] = E\left[e^{tX_1}\right] \cdots E\left[e^{tX_n}\right] = M_{X_1}(t) \cdots M_{X_n}(t). $$
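The factorization is easy to observe numerically. In this sketch the choice of an exponential and a uniform summand is arbitrary; any independent pair with finite MGFs near 0 would do:

```python
# Numerical illustration of Theorem 6: the empirical MGF of Y = X1 + X2
# approximately equals the product of the individual empirical MGFs.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x1 = rng.exponential(size=n)
x2 = rng.uniform(size=n)

for t in (0.1, 0.3, 0.5):
    lhs = np.mean(np.exp(t * (x1 + x2)))                     # M_Y(t)
    rhs = np.mean(np.exp(t * x1)) * np.mean(np.exp(t * x2))  # M_X1(t) M_X2(t)
    print(f"t={t}: M_Y(t)~{lhs:.4f}  M_X1(t)*M_X2(t)~{rhs:.4f}")
```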
Example
tY
7. Let X1 , X2 , ..., Xk be independent binomial random variables with
respective parameters
ni
and
p, Xi ∼ BIN (ni , p),
and let
Y =
k
P
Xi .
i=1
It follows that
Thus,
MY (t) = ...
Y ∼ ...
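For a numerical check (the parameter values are illustrative, and the target Y ∼ BIN(n1 + ... + nk, p) is the standard conclusion the blanks point to):

```python
# Simulation check for Example 7 with illustrative n = (3, 5, 7) and p = 0.4.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ns, p = (3, 5, 7), 0.4
y = sum(rng.binomial(n, p, size=1_000_000) for n in ns)

for k in (2, 6, 10):
    print(f"P(Y={k}): simulated={np.mean(y == k):.4f}  "
          f"BIN({sum(ns)}, {p}) pmf={stats.binom.pmf(k, sum(ns), p):.4f}")
```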
Example 8. Let X1, X2, ..., Xn be independent Poisson-distributed random variables with respective means µi, Xi ∼ POI(µi), and let Y = \sum_{i=1}^{n} X_i. It follows that

M_Y(t) = \dots

Thus, Y ∼ \dots
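Similarly (illustrative values; the target Y ∼ POI(µ1 + ... + µn) is the standard conclusion):

```python
# Simulation check for Example 8 with illustrative mu = (0.5, 1.0, 2.5).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mus = (0.5, 1.0, 2.5)
y = sum(rng.poisson(m, size=1_000_000) for m in mus)

for k in (1, 4, 8):
    print(f"P(Y={k}): simulated={np.mean(y == k):.4f}  "
          f"POI({sum(mus)}) pmf={stats.poisson.pmf(k, sum(mus)):.4f}")
```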
Example 9. Let X1, X2, ..., Xn be independent gamma-distributed random variables with respective shape parameters κ1, κ2, ..., κn and common scale parameter θ, Xi ∼ GAM(θ, κi) for i = 1, 2, ..., n, and let Y = \sum_{i=1}^{n} X_i. It follows that

M_Y(t) = \dots

Thus, Y ∼ \dots
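A quick check with illustrative shapes and scale (the target Y ∼ GAM(θ, κ1 + ... + κn) is the standard conclusion):

```python
# Simulation check for Example 9 with kappa = (0.5, 1.5, 2.0) and theta = 2.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
kappas, theta = (0.5, 1.5, 2.0), 2.0
y = sum(rng.gamma(k, theta, size=200_000) for k in kappas)

# KS test against GAM(theta, sum of kappas): a large p-value is consistent.
target = stats.gamma(sum(kappas), scale=theta)
print("p-value:", stats.kstest(y[:5000], target.cdf).pvalue)
```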
Example 10. Let X1, X2, ..., Xn be independent normally distributed random variables, Xi ∼ N(µi, σi²), and let Y = \sum_{i=1}^{n} X_i. It follows that

M_Y(t) = \dots

Thus, Y ∼ \dots
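And for the normal case (illustrative parameters; the standard conclusion is Y ∼ N(µ1 + ... + µn, σ1² + ... + σn²)):

```python
# Simulation check for Example 10 with illustrative (mu_i, sigma_i) pairs.
import numpy as np

rng = np.random.default_rng(0)
params = [(0.0, 1.0), (1.0, 0.5), (-2.0, 2.0)]  # (mu_i, sigma_i)
y = sum(rng.normal(m, s, size=500_000) for m, s in params)

print("mean:", y.mean(), " theory:", sum(m for m, _ in params))
print("var: ", y.var(),  " theory:", sum(s**2 for _, s in params))
```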
This includes the special case of a random sample X1, X2, ..., Xn from a normally distributed population, say Xi ∼ N(µ, σ²), with µ = µi and σ² = σi² for all i = 1, 2, ..., n. In this case, \sum_{i=1}^{n} X_i ∼ N(nµ, nσ²). It also follows readily that the sample mean, X̄ = \frac{1}{n}\sum_{i=1}^{n} X_i, is normally distributed, X̄ ∼ N(µ, σ²/n).
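The sample-mean result can be seen in simulation as well (µ = 1, σ = 2, n = 25 below are illustrative):

```python
# Check of Xbar ~ N(mu, sigma^2 / n) for a normal random sample.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 1.0, 2.0, 25
xbar = rng.normal(mu, sigma, size=(200_000, n)).mean(axis=1)

print("mean:", xbar.mean(), " theory:", mu)
print("var: ", xbar.var(),  " theory:", sigma**2 / n)
```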