CHAPTER 13
Moment generating functions
13.1. Definition and examples
Definition (Moment generating function)
The moment generating function (MGF) of a random variable X is a function m_X(t) defined by
m_X(t) = Ee^{tX},
provided the expectation is finite. In the discrete case m_X(t) is equal to
∑_x e^{tx} p(x),
and in the continuous case
∫_{−∞}^{∞} e^{tx} f(x) dx.
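As a computational aside (a minimal sketch added here, not part of the original text), the MGF can be estimated by Monte Carlo as the sample mean of e^{tX}; the Bernoulli(0.3) choice, the value of t, and the sample size are all illustrative assumptions, and the exact answer anticipates Example 13.1 below.

import numpy as np

rng = np.random.default_rng(0)

def mgf_estimate(samples, t):
    # Monte Carlo estimate of m_X(t) = E[e^{tX}]: the sample mean of e^{t X_i}
    return np.exp(t * samples).mean()

# Illustrative choice: X ~ Bernoulli(p) with p = 0.3
p, t = 0.3, 0.5
X = rng.binomial(n=1, p=p, size=100_000)
print(mgf_estimate(X, t))      # Monte Carlo estimate of Ee^{tX}
print(p * np.exp(t) + 1 - p)   # exact Bernoulli MGF pe^t + 1 - p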
Let us compute the moment generating function for some of the distributions we have been
working with.
Example 13.1 (Bernoulli).
m_X(t) = e^{0·t}(1 − p) + e^{1·t}p = pe^t + 1 − p.
Example 13.2 (Binomial).
Using independence,
Ee^{tX} = Ee^{t∑X_i} = E ∏ e^{tX_i} = ∏ Ee^{tX_i} = (pe^t + (1 − p))^n,
where the X_i are independent Bernoulli random variables. Equivalently,
∑_{k=0}^{n} e^{tk} C(n, k) p^k (1 − p)^{n−k} = ∑_{k=0}^{n} C(n, k) (pe^t)^k (1 − p)^{n−k} = (pe^t + (1 − p))^n
by the Binomial formula.
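As a quick sanity check (a sketch added as an aside; the fixed value n = 5 is an arbitrary choice), sympy can carry out the finite sum and confirm it collapses to (pe^t + 1 − p)^n:

import sympy as sp

t, p = sp.symbols('t p')
n = 5  # fixed illustrative value of n
# sum_{k=0}^{n} C(n,k) (p e^t)^k (1-p)^{n-k}
total = sum(sp.binomial(n, k) * (p * sp.exp(t))**k * (1 - p)**(n - k)
            for k in range(n + 1))
print(sp.expand(total - (p * sp.exp(t) + 1 - p)**n))  # prints 0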
Example 13.3 (Poisson).
Ee^{tX} = ∑_{k=0}^{∞} e^{tk} e^{−λ} λ^k / k! = e^{−λ} ∑_{k=0}^{∞} (λe^t)^k / k! = e^{−λ} e^{λe^t} = e^{λ(e^t − 1)}.
Example 13.4 (Exponential).
Ee^{tX} = ∫_0^∞ e^{tx} λe^{−λx} dx = λ/(λ − t)
if t < λ, and ∞ if t ≥ λ.
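A small numerical check of the exponential computation (a sketch added here; the values of λ and t are arbitrary, subject to t < λ), comparing the integral against λ/(λ − t):

import numpy as np
from scipy.integrate import quad

lam, t = 2.0, 1.5  # illustrative values with t < lambda
# numerically integrate e^{tx} * lambda * e^{-lambda x} over [0, infinity)
val, err = quad(lambda x: np.exp(t * x) * lam * np.exp(-lam * x), 0, np.inf)
print(val)              # ~4.0
print(lam / (lam - t))  # exact value lambda/(lambda - t) = 4.0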
Example 13.5 (Standard normal).
Suppose Z ∼ N(0, 1), then
m_Z(t) = (1/√(2π)) ∫_{−∞}^{∞} e^{tx} e^{−x²/2} dx = e^{t²/2} (1/√(2π)) ∫_{−∞}^{∞} e^{−(x−t)²/2} dx = e^{t²/2}.
Example 13.6 (General normal).
Suppose X ∼ N(µ, σ²), then we can write X = µ + σZ, and therefore
m_X(t) = Ee^{tX} = E e^{tµ} e^{tσZ} = e^{tµ} m_Z(tσ) = e^{tµ} e^{(tσ)²/2} = e^{tµ + t²σ²/2}.
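As an aside (an added sketch; the parameter values are arbitrary assumptions), a Monte Carlo estimate of Ee^{tX} for X ∼ N(µ, σ²) can be compared against the closed form e^{tµ + t²σ²/2}:

import numpy as np

rng = np.random.default_rng(1)
mu, sigma, t = 1.0, 2.0, 0.3  # illustrative parameter values
X = rng.normal(mu, sigma, size=1_000_000)    # samples of X ~ N(mu, sigma^2)
print(np.exp(t * X).mean())                  # Monte Carlo estimate of Ee^{tX}
print(np.exp(t * mu + t**2 * sigma**2 / 2))  # closed form e^{t mu + t^2 sigma^2 / 2}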
Proposition 13.1
Suppose X_1, ..., X_n are n independent random variables, and the random variable Y is defined by Y = X_1 + ... + X_n. Then
m_Y(t) = m_{X_1}(t) · ... · m_{X_n}(t).
Proof. By independence of X_1, ..., X_n and Proposition 12.1 we have
m_Y(t) = E[e^{tX_1} · ... · e^{tX_n}] = Ee^{tX_1} · ... · Ee^{tX_n} = m_{X_1}(t) · ... · m_{X_n}(t).
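The factorization can be observed empirically (a sketch added as an aside; the choice of Exp(1) and Poisson(3), the value of t, and the sample size are illustrative assumptions):

import numpy as np

rng = np.random.default_rng(2)
t = 0.4  # illustrative value of t
X = rng.exponential(scale=1.0, size=500_000)  # X ~ Exp(1)
Y = rng.poisson(lam=3.0, size=500_000)        # Y ~ Poisson(3), independent of X
lhs = np.exp(t * (X + Y)).mean()              # estimate of m_{X+Y}(t)
rhs = np.exp(t * X).mean() * np.exp(t * Y).mean()  # m_X(t) * m_Y(t)
print(lhs, rhs)  # approximately equal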
Proposition 13.2
Suppose for two random variables X and Y we have m_X(t) = m_Y(t) < ∞ for all t in an interval; then X and Y have the same distribution.
We will not prove this, but this statement is essentially the uniqueness of the Laplace transform L. Recall that the Laplace transform of a function f(x) defined for all positive real numbers is given for s > 0 by
(Lf)(s) := ∫_0^∞ f(x) e^{−sx} dx.
Thus if X is a continuous random variable with a PDF such that f_X(x) = 0 for x < 0, then
m_X(t) = ∫_0^∞ e^{tx} f_X(x) dx = (Lf_X)(−t).
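For instance (an added sketch; the exponential density is just an illustrative test case), the Laplace transform of f_X(x) = λe^{−λx} is λ/(λ + s), so m_X(t) = (Lf_X)(−t) = λ/(λ − t), matching Example 13.4:

import sympy as sp

x, s = sp.symbols('x s', positive=True)
lam = sp.symbols('lambda', positive=True)
# Laplace transform of the exponential density f_X(x) = lambda * e^{-lambda x}
F = sp.laplace_transform(lam * sp.exp(-lam * x), x, s)
print(F[0])  # lambda/(lambda + s); substituting s = -t recovers lambda/(lambda - t)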
Proposition 13.1 allows us to show some of the properties of sums of independent random variables that we proved or stated before.
Example 13.7 (Sums of independent normal random variables).
If X ∼ N(a, b²) and Y ∼ N(c, d²), and X and Y are independent, then by Proposition 13.1
m_{X+Y}(t) = e^{at + b²t²/2} e^{ct + d²t²/2} = e^{(a+c)t + (b² + d²)t²/2},
which is the moment generating function for N(a + c, b² + d²). Therefore Proposition 13.2 implies that X + Y ∼ N(a + c, b² + d²).
Example 13.8 (Sums of independent Poisson random variables).
Similarly, if X and Y are independent Poisson random variables with parameters a and b, respectively, then
m_{X+Y}(t) = m_X(t)m_Y(t) = e^{a(e^t − 1)} e^{b(e^t − 1)} = e^{(a+b)(e^t − 1)},
which is the moment generating function of a Poisson with parameter a + b; therefore X + Y is a Poisson random variable with parameter a + b.
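The Poisson conclusion can be checked empirically (an added sketch; the parameters a, b and the sample size are illustrative assumptions) by comparing the empirical distribution of X + Y against the Pois(a + b) PMF:

import numpy as np
from math import exp, factorial

rng = np.random.default_rng(3)
a, b, N = 1.5, 2.5, 200_000  # illustrative parameters and sample size
S = rng.poisson(a, N) + rng.poisson(b, N)  # X + Y with X, Y independent Poissons
for k in range(5):
    empirical = (S == k).mean()
    exact = exp(-(a + b)) * (a + b)**k / factorial(k)  # Pois(a+b) PMF at k
    print(k, round(empirical, 4), round(exact, 4))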
One problem with the moment generating function is that it might be infinite. One way to get around this, at the cost of considerable work, is to use the characteristic function φ_X(t) = Ee^{itX}, where i = √−1. This is always finite, and is the analogue of the Fourier transform.
Definition (Joint MGF)
The joint moment generating function of X and Y is
m_{X,Y}(s, t) = Ee^{sX + tY}.
If X and Y are independent, then m_{X,Y}(s, t) = m_X(s)m_Y(t) by Proposition 12.1. We will not prove this, but the converse is also true: if m_{X,Y}(s, t) = m_X(s)m_Y(t) for all s and t, then X and Y are independent.
13.2. Further examples and applications
Example 13.9.
Suppose that the MGF of X is given by
m(t) = e^{3(e^t − 1)}.
Find P(X = 0).
Solution: We can match this MGF to a known MGF of one of the distributions we considered and then apply Proposition 13.2. Observe that m(t) = e^{3(e^t − 1)} = e^{λ(e^t − 1)}, where λ = 3. Thus X ∼ Poisson(3), and therefore
P(X = 0) = e^{−λ} λ^0/0! = e^{−3}.
This example is an illustration of why m_X(t) is called the moment generating function. Namely, we can use it to find all the moments of X by differentiating m(t) and then evaluating at t = 0. Note that
m′(t) = (d/dt) E[e^{tX}] = E[(d/dt) e^{tX}] = E[Xe^{tX}].
Now evaluate at t = 0 to get
m′(0) = E[Xe^{0·X}] = E[X].
Similarly,
m″(t) = (d/dt) E[Xe^{tX}] = E[X² e^{tX}],
so that
m″(0) = E[X² e^0] = E[X²].
Continuing to differentiate the MGF, we have the following proposition.
Proposition 13.3 (Moments from MGF)
For all n > 0 we have
E[X^n] = m^{(n)}(0).
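This differentiation can be done mechanically (a sketch added as an aside; the Poisson(λ) MGF from Example 13.3 is used as the test case):

import sympy as sp

t = sp.symbols('t')
lam = sp.symbols('lambda', positive=True)
m = sp.exp(lam * (sp.exp(t) - 1))  # Poisson MGF from Example 13.3
print(sp.simplify(sp.diff(m, t).subs(t, 0)))   # lambda             = E[X]
print(sp.expand(sp.diff(m, t, 2).subs(t, 0)))  # lambda**2 + lambda = E[X^2]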
Example 13.10.
Suppose X is a discrete random variable and has the MGF
m_X(t) = (1/7)e^{2t} + (3/7)e^{3t} + (2/7)e^{5t} + (1/7)e^{8t}.
What is the PMF of X? Find EX.
Solution: This does not match any of the known MGFs directly. Reading off from the MGF we guess
(1/7)e^{2t} + (3/7)e^{3t} + (2/7)e^{5t} + (1/7)e^{8t} = ∑_{i=1}^{4} e^{tx_i} p(x_i);
then we can take p(2) = 1/7, p(3) = 3/7, p(5) = 2/7 and p(8) = 1/7. Note that these add up to 1, so this is indeed a PMF.
To find E[X] we can use Proposition 13.3 by taking the derivative of the moment generating function as follows:
m′(t) = (2/7)e^{2t} + (9/7)e^{3t} + (10/7)e^{5t} + (8/7)e^{8t},
so that
E[X] = m′(0) = 2/7 + 9/7 + 10/7 + 8/7 = 29/7.
Example 13.11.
Suppose X has the MGF
m_X(t) = (1 − 2t)^{−1/2} for t < 1/2.
Find the first and second moments of X.
Solution: We have
m′_X(t) = −(1/2)(1 − 2t)^{−3/2}(−2) = (1 − 2t)^{−3/2},
m″_X(t) = −(3/2)(1 − 2t)^{−5/2}(−2) = 3(1 − 2t)^{−5/2}.
So that
EX = m′_X(0) = (1 − 2·0)^{−3/2} = 1,
EX² = m″_X(0) = 3(1 − 2·0)^{−5/2} = 3.
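The same differentiation can be verified mechanically (an added sketch using sympy):

import sympy as sp

t = sp.symbols('t')
m = (1 - 2*t)**sp.Rational(-1, 2)   # the MGF from Example 13.11
print(sp.diff(m, t).subs(t, 0))     # 1, the first moment
print(sp.diff(m, t, 2).subs(t, 0))  # 3, the second moment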
13.3. Exercises
Exercise 13.1.
Suppose that you have a fair 4-sided die, and let X be the random variable representing the value of the number rolled.
(a) Write down the moment generating function for X.
(b) Use this moment generating function to compute the first and second moments of X.
Exercise 13.2.
Let X be a random variable whose probability density function is given by
f_X(x) = e^{−2x} + (1/2)e^{−x} for x > 0, and f_X(x) = 0 otherwise.
(a) Write down the moment generating function for X.
(b) Use this moment generating function to compute the first and second moments of X.
Exercise 13.3.
Suppose that a mathematician determines that the revenue the UConn Dairy Bar makes in a week is a random variable, X, with moment generating function
m_X(t) = 1/(1 − 2500t)^4.
Find the standard deviation of the revenue the UConn Dairy Bar makes in a week.
Exercise 13.4.
Let X and Y be two independent random variables with respective moment generating functions
m_X(t) = 1/(1 − 5t) if t < 1/5,   m_Y(t) = 1/(1 − 5t)² if t < 1/5.
Find E[(X + Y)²].
Exercise 13.5.
Suppose X and Y are independent Poisson random variables with parameters λ_x, λ_y, respectively. Find the distribution of X + Y.
Exercise 13.6.
True or False? Justify your answer. If X ∼ Exp(λ_x) and Y ∼ Exp(λ_y), then X + Y ∼ Exp(λ_x + λ_y).
13.4. Selected solutions
Solution to Exercise 13.1(A):
m_X(t) = E[e^{tX}] = (1/4)e^{1·t} + (1/4)e^{2·t} + (1/4)e^{3·t} + (1/4)e^{4·t}
= (1/4)(e^{1·t} + e^{2·t} + e^{3·t} + e^{4·t}).
Solution to Exercise 13.1(B): We have
m′_X(t) = (1/4)(e^{1·t} + 2e^{2·t} + 3e^{3·t} + 4e^{4·t}),
m″_X(t) = (1/4)(e^{1·t} + 4e^{2·t} + 9e^{3·t} + 16e^{4·t}),
so
EX = m′_X(0) = (1/4)(1 + 2 + 3 + 4) = 5/2
and
EX² = m″_X(0) = (1/4)(1 + 4 + 9 + 16) = 15/2.
Solution to Exercise 13.2(A): For t < 1 we have
m_X(t) = E[e^{tX}] = ∫_0^∞ e^{tx} (e^{−2x} + (1/2)e^{−x}) dx
= [ (1/(t − 2)) e^{(t−2)x} + (1/(2(t − 1))) e^{(t−1)x} ]_{x=0}^{x=∞}
= 0 − 1/(t − 2) + 0 − 1/(2(t − 1))
= 1/(2 − t) + 1/(2(1 − t))
= (4 − 3t)/(2(2 − t)(1 − t)).
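This integral can be double-checked symbolically (an added sketch; sympy's conds='none' option is used to suppress the convergence condition t < 1, which we assume):

import sympy as sp

x = sp.symbols('x', positive=True)
t = sp.symbols('t')
f = sp.exp(-2*x) + sp.Rational(1, 2) * sp.exp(-x)  # the PDF from Exercise 13.2
m = sp.integrate(sp.exp(t*x) * f, (x, 0, sp.oo), conds='none')  # valid for t < 1
print(sp.simplify(m - (1/(2 - t) + 1/(2*(1 - t)))))  # prints 0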
Solution to Exercise 13.2(B): We have
m′_X(t) = 1/(2 − t)² + 1/(2(1 − t)²),
m″_X(t) = 2/(2 − t)³ + 1/(1 − t)³,
and so
EX = m′_X(0) = 3/4
and
EX² = m″_X(0) = 5/4.
Solution to Exercise 13.3: We have SD(X) = √Var(X) and Var(X) = EX² − (EX)². Therefore we can compute
m′(t) = 4(2500)(1 − 2500t)^{−5},
m″(t) = 20(2500)²(1 − 2500t)^{−6},
EX = m′(0) = 10,000,
EX² = m″(0) = 125,000,000,
Var(X) = 125,000,000 − 10,000² = 25,000,000,
SD(X) = √25,000,000 = 5,000.
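These derivative evaluations can be confirmed symbolically (an added sketch):

import sympy as sp

t = sp.symbols('t')
m = (1 - 2500*t)**(-4)             # the MGF from Exercise 13.3
EX = sp.diff(m, t).subs(t, 0)      # first moment m'(0)
EX2 = sp.diff(m, t, 2).subs(t, 0)  # second moment m''(0)
print(EX, EX2, sp.sqrt(EX2 - EX**2))  # 10000 125000000 5000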
Solution to Exercise 13.4: First recall that if we let W = X + Y, then using that X and Y are independent we see that
m_W(t) = m_{X+Y}(t) = m_X(t)m_Y(t) = 1/(1 − 5t)³.
Recall that E[W²] = m″_W(0), which we can find from
m′_W(t) = 15/(1 − 5t)⁴,
m″_W(t) = 300/(1 − 5t)⁵,
thus
E[W²] = m″_W(0) = 300/(1 − 0)⁵ = 300.
Solution to Exercise 13.5: Since X ∼ Pois(λ_x) and Y ∼ Pois(λ_y), then
m_X(t) = e^{λ_x(e^t − 1)}, m_Y(t) = e^{λ_y(e^t − 1)}.
Then, by independence,
m_{X+Y}(t) = m_X(t)m_Y(t) = e^{λ_x(e^t − 1)} e^{λ_y(e^t − 1)} = e^{(λ_x + λ_y)(e^t − 1)}.
Thus X + Y ∼ Pois(λ_x + λ_y).
Solution to Exercise 13.6: We will use Proposition 13.2. Namely, we first find the MGF of X + Y and compare it to the MGF of a random variable V ∼ Exp(λ_x + λ_y). The MGF of V is
m_V(t) = (λ_x + λ_y)/(λ_x + λ_y − t) for t < λ_x + λ_y.
By independence of X and Y,
m_{X+Y}(t) = m_X(t)m_Y(t) = (λ_x/(λ_x − t)) · (λ_y/(λ_y − t)),
but
(λ_x + λ_y)/(λ_x + λ_y − t) ≠ (λ_x/(λ_x − t)) · (λ_y/(λ_y − t)),
and hence the statement is false.
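The mismatch of the two MGFs can also be seen numerically (an added sketch; the parameter values, the value of t, and the sample size are illustrative assumptions, with t < min(λ_x, λ_y)):

import numpy as np

rng = np.random.default_rng(4)
lx, ly, t, N = 1.0, 2.0, 0.3, 300_000  # illustrative parameters
S = rng.exponential(1/lx, N) + rng.exponential(1/ly, N)  # X + Y, independent
print(np.exp(t * S).mean())           # Monte Carlo estimate of m_{X+Y}(t), ~1.68
print((lx/(lx - t)) * (ly/(ly - t)))  # product of the individual MGFs, ~1.68
print((lx + ly)/(lx + ly - t))        # Exp(lx + ly) MGF at t, ~1.11: clearly different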