Fundamental Tools - Probability Theory III
MSc Financial Mathematics, The University of Warwick
September 30, 2015

Outline:
Jointly distributed random variables
Sum of independent random variables
Generating functions
Joint distribution function
For two random variables X and Y, we define their joint distribution function by

    F_{XY}(x, y) = P(X \le x, Y \le y),    -\infty < x, y < \infty.

Two important classes:

X and Y are jointly discrete if both X and Y are discrete. It is convenient to work with their joint probability mass function

    p_{XY}(x, y) = P(X = x, Y = y).

X and Y are jointly continuous if there exists a non-negative function f_{XY} : \mathbb{R}^2 \to \mathbb{R} such that

    P(X \le x, Y \le y) = F_{XY}(x, y) = \int_{u=-\infty}^{x} \int_{v=-\infty}^{y} f_{XY}(u, v)\, dv\, du.

We call f_{XY} the joint probability density function of X and Y.
Recovering marginal distribution
Given a joint probability distribution/mass/density function of (X, Y), we can recover the corresponding marginal characteristics of X as follows:

Marginal distribution function:

    F_X(x) = P(X \le x) = P(X \le x, Y < \infty) = F_{XY}(x, \infty).

Marginal probability mass function (if (X, Y) are jointly discrete):

    p_X(x) = P(X = x) = P(X = x, Y \in S_Y) = \sum_y p_{XY}(x, y).

Marginal probability density function (if (X, Y) are jointly continuous):

    f_X(x) = \frac{d}{dx} F_X(x) = \frac{d}{dx} F_{XY}(x, \infty) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dy.
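As a concrete illustration (a minimal sketch, not from the slides; the 2x3 joint pmf below is invented), the discrete marginals are just row and column sums of the joint pmf:

```python
import numpy as np

# Invented joint pmf of (X, Y): rows index values of X, columns values of Y.
p_xy = np.array([[0.10, 0.20, 0.10],
                 [0.25, 0.15, 0.20]])
assert np.isclose(p_xy.sum(), 1.0)  # a valid pmf sums to 1

p_x = p_xy.sum(axis=1)  # marginal pmf of X: sum out y
p_y = p_xy.sum(axis=0)  # marginal pmf of Y: sum out x
print(p_x)  # [0.4 0.6]
print(p_y)  # [0.35 0.35 0.3]
```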
Working with jointly continuous random variables
From the definition, the joint distribution and density functions of (X, Y) are related via

    f_{XY}(x, y) = \frac{\partial^2}{\partial x\, \partial y} F_{XY}(x, y).

In the univariate case, we compute probabilities involving a continuous random variable via a simple integral:

    P(X \in A) = \int_A f(x)\, dx.

In the bivariate case, probabilities are computed via double integration,

    P((X, Y) \in A) = \iint_A f_{XY}(x, y)\, dx\, dy,

and the calculation is not necessarily straightforward.
Example 1
Let (X, Y) be a pair of jointly continuous random variables with joint density function f(x, y) = 1 on (x, y) \in [0, 1]^2 (and zero elsewhere). Find P(X - Y < 1/2).

Sketch of solution
Define the set A = \{(x, y) \in [0, 1]^2 : x - y \le 1/2\}. Then

    P(X - Y < 1/2) = P((X, Y) \in A) = \iint_A f(x, y)\, dx\, dy = \iint_A dx\, dy = \text{the area of } A.

By sketching the set A, the area is found to be 7/8.
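A quick numerical cross-check (a sketch, not part of the slides): since f is the uniform density on the unit square, the probability can be estimated by simulating independent U[0, 1] coordinates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x, y = rng.random(n), rng.random(n)  # (X, Y) uniform on the unit square
print((x - y < 0.5).mean())          # Monte Carlo estimate, close to 7/8 = 0.875
```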
Example 2
Let (X, Y) be a pair of jointly continuous random variables with joint density function f(x, y) = e^{-y} on 0 < x < y < \infty (and zero elsewhere). Verify that the given f is a well-defined joint density function. Find the marginal density function of X.

Sketch of solution
To show that f_{XY} is a well-defined joint density function, one needs to show f \ge 0 and \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XY}(x, y)\, dx\, dy = 1. The first property is obvious. To show the latter, define A = \{(x, y) : 0 < x < y < \infty\}. Then

    \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XY}(x, y)\, dx\, dy = \iint_A e^{-y}\, dx\, dy = \int_0^{\infty} \int_0^{y} e^{-y}\, dx\, dy = \int_0^{\infty} y e^{-y}\, dy = 1.

The marginal density of X is, for x > 0,

    f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dy = \int_x^{\infty} e^{-y}\, dy = \left[ -e^{-y} \right]_x^{\infty} = e^{-x}.
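A numerical sanity check of both claims (a sketch, not from the slides), using scipy to integrate the density over the wedge 0 < x < y and to evaluate the marginal at a test point:

```python
import numpy as np
from scipy import integrate

f = lambda y, x: np.exp(-y)  # dblquad integrates its first argument (y) innermost

# Total mass: x from 0 to infinity, y from x to infinity; should be ~1.
total, _ = integrate.dblquad(f, 0, np.inf, lambda x: x, lambda x: np.inf)
print(total)

# Marginal of X at x0: integrate e^{-y} over y in (x0, inf); should match e^{-x0}.
x0 = 1.3
marginal, _ = integrate.quad(lambda y: np.exp(-y), x0, np.inf)
print(marginal, np.exp(-x0))
```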
Expected value involving joint distribution
Let (X, Y) be a pair of random variables. For a given function g(\cdot, \cdot), the expected value of the random variable g(X, Y) is given by

    E(g(X, Y)) = \sum_{x, y} g(x, y)\, p_{XY}(x, y)    (jointly discrete);
    E(g(X, Y)) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\, f_{XY}(x, y)\, dx\, dy    (jointly continuous).

Two important specifications of g(\cdot, \cdot):

Set g(x, y) = x + y. One obtains E(X + Y) = E(X) + E(Y).

Set g(x, y) = xy. This leads to the covariance between X and Y, defined by

    Cov(X, Y) = E(XY) - E(X)E(Y),

and the correlation, defined by

    corr(X, Y) = \frac{Cov(X, Y)}{\sqrt{var(X)\, var(Y)}}.
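The discrete case is easy to mirror in code (a sketch, not from the slides, reusing the invented joint pmf from above):

```python
import numpy as np

x_vals, y_vals = np.array([0.0, 1.0]), np.array([0.0, 1.0, 2.0])
p_xy = np.array([[0.10, 0.20, 0.10],
                 [0.25, 0.15, 0.20]])

xx, yy = np.meshgrid(x_vals, y_vals, indexing="ij")
e_xy = (xx * yy * p_xy).sum()            # E(XY), i.e. g(x, y) = xy
e_x = (x_vals * p_xy.sum(axis=1)).sum()  # E(X) from the marginal of X
e_y = (y_vals * p_xy.sum(axis=0)).sum()  # E(Y) from the marginal of Y

cov = e_xy - e_x * e_y
var_x = (x_vals**2 * p_xy.sum(axis=1)).sum() - e_x**2
var_y = (y_vals**2 * p_xy.sum(axis=0)).sum() - e_y**2
print(cov, cov / np.sqrt(var_x * var_y))  # covariance and correlation
```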
Conditional distributions
If (X, Y) are jointly discrete, the conditional probability mass function of X given Y = y is

    p_{X|Y}(x|y) = \frac{P(X = x, Y = y)}{P(Y = y)} = \frac{p_{XY}(x, y)}{p_Y(y)}.

If (X, Y) are jointly continuous, the conditional probability density function of X given Y = y is

    f_{X|Y}(x|y) = \frac{f_{XY}(x, y)}{f_Y(y)},

which can be interpreted as follows:

    P(x \le X \le x + \delta x \mid y \le Y \le y + \delta y)
        = \frac{P(x \le X \le x + \delta x,\ y \le Y \le y + \delta y)}{P(y \le Y \le y + \delta y)}
        \approx \frac{f_{XY}(x, y)\, \delta x\, \delta y}{f_Y(y)\, \delta y}
        = f_{X|Y}(x|y)\, \delta x.
Conditional probability and expectation
With the conditional probability mass/density function, we can work out conditional probabilities and expectations as follows:

Conditional probability:

    P(X \in A \mid Y = y) = \sum_{x \in A} p_{X|Y}(x|y)    (jointly discrete);
    P(X \in A \mid Y = y) = \int_A f_{X|Y}(x|y)\, dx    (jointly continuous).

Conditional expectation:

    E(X \mid Y = y) = \sum_x x\, p_{X|Y}(x|y)    (jointly discrete);
    E(X \mid Y = y) = \int_{-\infty}^{\infty} x\, f_{X|Y}(x|y)\, dx    (jointly continuous).
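In the discrete case these are one-liners once the joint pmf is in hand (a sketch, not from the slides, again with the invented pmf):

```python
import numpy as np

x_vals = np.array([0.0, 1.0])
p_xy = np.array([[0.10, 0.20, 0.10],
                 [0.25, 0.15, 0.20]])

j = 1                                          # condition on the event {Y = y_j}
p_x_given_y = p_xy[:, j] / p_xy[:, j].sum()    # p_{X|Y}(x | y_j)
print(p_x_given_y)                             # [0.571..., 0.428...]
print((x_vals * p_x_given_y).sum())            # E(X | Y = y_j)
```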
Example
Let the joint density function of X and Y be f_{XY}(x, y) = \frac{e^{-x/y} e^{-y}}{y} on 0 < x, y < \infty (and zero elsewhere). Find the conditional density f_{X|Y}, and compute P(X > 1 \mid Y = y) and E(X \mid Y = y).

Sketch of solution
We first work out f_Y by integrating x out of f_{XY}: for y > 0,

    f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dx = \int_0^{\infty} \frac{e^{-x/y} e^{-y}}{y}\, dx = e^{-y}.

Then f_{X|Y}(x|y) = \frac{f_{XY}(x, y)}{f_Y(y)} = \frac{e^{-x/y}}{y} for x > 0. Hence

    P(X > 1 \mid Y = y) = \int_1^{\infty} f_{X|Y}(x|y)\, dx = \int_1^{\infty} \frac{e^{-x/y}}{y}\, dx = e^{-1/y}

and

    E(X \mid Y = y) = \int_0^{\infty} x\, f_{X|Y}(x|y)\, dx = \int_0^{\infty} \frac{x e^{-x/y}}{y}\, dx = y.
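A numerical check of both answers at a fixed y (a sketch, not from the slides): the conditional density e^{-x/y}/y is the Exp(1/y) density, so scipy's exponential distribution (with scale = y) can confirm the results.

```python
import numpy as np
from scipy import stats

y = 2.0
cond = stats.expon(scale=y)           # f_{X|Y}(x|y) = exp(-x/y)/y
print(cond.sf(1.0), np.exp(-1 / y))   # P(X > 1 | Y = y) vs e^{-1/y}
print(cond.mean(), y)                 # E(X | Y = y) vs y
```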
Independent random variables
We say that X and Y are independent random variables if

    P(X \in A, Y \in B) = P(X \in A) P(Y \in B)

for any Borel sets A, B \in \mathcal{B}(\mathbb{R}).

The following are equivalent conditions for X and Y being independent:

E(f(X) g(Y)) = E(f(X)) E(g(Y)) for all (bounded measurable) functions f, g.

p_{XY}(x, y) = p_X(x) p_Y(y), or equivalently p_{X|Y}(x|y) = p_X(x), in case (X, Y) are jointly discrete.

f_{XY}(x, y) = f_X(x) f_Y(y), or equivalently f_{X|Y}(x|y) = f_X(x), in case (X, Y) are jointly continuous.
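For a jointly discrete pair, the factorization condition is easy to test directly (a sketch, not from the slides; the invented pmf used above is in fact not independent):

```python
import numpy as np

p_xy = np.array([[0.10, 0.20, 0.10],
                 [0.25, 0.15, 0.20]])
p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)

# Independence would mean p_xy equals the outer product of the marginals.
print(np.allclose(p_xy, np.outer(p_x, p_y)))  # False for this pmf
```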
Independence and zero correlation
If X and Y are independent, then E(XY) = E(X)E(Y). This leads to:

1. Cov(X, Y) = 0, which also implies the correlation between X and Y is zero.
2. var(X + Y) = var(X) + var(Y).

The converse is not true in general. Most importantly, zero correlation does not imply independence; see the problem sheet (and the sketch below).
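A standard counterexample, sketched numerically (not from the slides): take X uniform on {-1, 0, 1} and Y = X^2; then Cov(X, Y) = 0 although Y is a function of X.

```python
import numpy as np

x = np.array([-1.0, 0.0, 1.0])  # X uniform on {-1, 0, 1}
p = np.full(3, 1 / 3)
y = x**2                        # Y = X^2, clearly dependent on X

e_x, e_y = (p * x).sum(), (p * y).sum()
cov = (p * x * y).sum() - e_x * e_y
print(cov)  # 0.0: uncorrelated, yet not independent
```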
Sum of independent random variables
We are interested in the following question: suppose X and Y are two independent random variables. What, then, is the distribution of X + Y?

The procedure is easier in the discrete case.

Example: suppose X ~ Poi(\lambda_1) and Y ~ Poi(\lambda_2) are independent. Then for k = 0, 1, 2, ...

    p_{X+Y}(k) = P(X + Y = k) = \sum_{j=0}^{k} P(X = j) P(Y = k - j)
               = \sum_{j=0}^{k} \frac{e^{-\lambda_1} \lambda_1^j}{j!} \cdot \frac{e^{-\lambda_2} \lambda_2^{k-j}}{(k-j)!}
               = \frac{e^{-(\lambda_1+\lambda_2)}}{k!} \sum_{j=0}^{k} \binom{k}{j} \lambda_1^j \lambda_2^{k-j}
               = \frac{e^{-(\lambda_1+\lambda_2)}}{k!} (\lambda_1 + \lambda_2)^k.

This is the pmf of a Poi(\lambda_1 + \lambda_2) random variable.
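The discrete convolution above is exactly what np.convolve computes, so the identity can be checked numerically (a sketch, not from the slides):

```python
import numpy as np
from scipy import stats

lam1, lam2, kmax = 1.5, 2.3, 40
k = np.arange(kmax + 1)
p1, p2 = stats.poisson.pmf(k, lam1), stats.poisson.pmf(k, lam2)

conv = np.convolve(p1, p2)[: kmax + 1]       # pmf of X + Y by convolution
target = stats.poisson.pmf(k, lam1 + lam2)   # claimed Poi(lam1 + lam2) pmf
print(np.allclose(conv, target))             # True
```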
Sum of independent random variables
Assume X and Y are independent and jointly continuous, and let Z = X + Y. Then the CDF of Z is

    F_Z(z) = P(Z \le z) = P(X + Y \le z) = \iint_{x+y \le z} f_X(x) f_Y(y)\, dx\, dy
           = \int_{y=-\infty}^{\infty} \left( \int_{x=-\infty}^{z-y} f_X(x)\, dx \right) f_Y(y)\, dy
           = \int_{y=-\infty}^{\infty} F_X(z - y) f_Y(y)\, dy.

Differentiation w.r.t. z gives the density of Z as

    f_Z(z) = \int_{y=-\infty}^{\infty} f_X(z - y) f_Y(y)\, dy.
Example
Let X and Y be two independent U[0, 1] random variables. Find the density function of Z = X + Y.

Sketch of solution
We have f_X(x) = 1 on x \in [0, 1] and f_Y(y) = 1 on y \in [0, 1]. Obviously Z can only take values in [0, 2]. The density of Z is given by

    f_Z(z) = \int_{y=-\infty}^{\infty} f_X(z - y) f_Y(y)\, dy = \int_A dy,

where A = \{y : 0 \le y \le 1,\ z - 1 \le y \le z\}. Thus:

When 0 \le z \le 1, A = [0, z], and then f_Z(z) = \int_0^z dy = z.
When 1 < z \le 2, A = [z-1, 1], and then f_Z(z) = \int_{z-1}^1 dy = 2 - z.

The density function is zero elsewhere.
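A simulation cross-check of the triangular shape (a sketch, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
z = rng.random(1_000_000) + rng.random(1_000_000)  # Z = X + Y, X, Y ~ U[0,1] indep.

# Compare an empirical density estimate with the triangular density at a few points.
h = 0.01
for z0 in (0.25, 0.75, 1.5):
    empirical = (np.abs(z - z0) < h).mean() / (2 * h)
    exact = z0 if z0 <= 1 else 2 - z0
    print(z0, round(empirical, 3), exact)
```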
Probability generating function
The probability generating function (pgf) is defined only for a discrete random variable X taking values in the non-negative integers {0, 1, 2, ...}. It is defined as

    G_X(s) = E(s^X) = \sum_{k=0}^{\infty} s^k p_X(k).

Viewing G_X as a Taylor expansion in s,

    G_X(s) = p_X(0) + p_X(1) s + p_X(2) s^2 + \cdots,

we can deduce p_X(n) = \frac{G_X^{(n)}(0)}{n!}, i.e. G_X uniquely determines the pmf of X. In other words, if the probability generating functions of X and Y are equal, then X and Y have the same distribution.

If X and Y are independent,

    G_{X+Y}(s) = E(s^X s^Y) = E(s^X) E(s^Y) = G_X(s) G_Y(s).

Hence we can study the distribution of X + Y via G_X(s) G_Y(s).
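The pmf-recovery formula p_X(n) = G_X^{(n)}(0)/n! can be exercised symbolically (a sketch, not from the slides, using the Poisson pgf derived on a later slide):

```python
import sympy as sp

s, lam = sp.symbols("s lambda", positive=True)
G = sp.exp(lam * (s - 1))  # pgf of Poi(lambda)

# Recover p_X(n) = G^(n)(0) / n! for the first few n.
for n in range(4):
    p_n = sp.diff(G, s, n).subs(s, 0) / sp.factorial(n)
    print(n, sp.simplify(p_n))  # exp(-lambda) * lambda**n / n!
```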
Moments calculation from probability generating function
Given G_X, we can derive the moments of X:

    G_X^{(1)}(s) = E\left(\frac{d}{ds} s^X\right) = E(X s^{X-1})
        \implies E(X) = G_X^{(1)}(1),

    G_X^{(2)}(s) = E\left(\frac{d^2}{ds^2} s^X\right) = E(X(X-1) s^{X-2})
        \implies E(X(X-1)) = G_X^{(2)}(1),

    G_X^{(3)}(s) = E\left(\frac{d^3}{ds^3} s^X\right) = E(X(X-1)(X-2) s^{X-3})
        \implies E(X(X-1)(X-2)) = G_X^{(3)}(1).
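Combining the first two factorial moments gives the variance, var(X) = G_X''(1) + G_X'(1) - (G_X'(1))^2; a symbolic sketch (not from the slides) with the Poisson pgf:

```python
import sympy as sp

s, lam = sp.symbols("s lambda", positive=True)
G = sp.exp(lam * (s - 1))  # pgf of Poi(lambda)

e_x = sp.diff(G, s, 1).subs(s, 1)          # E(X) = G'(1)
e_xx1 = sp.diff(G, s, 2).subs(s, 1)        # E(X(X-1)) = G''(1)
var_x = sp.simplify(e_xx1 + e_x - e_x**2)  # var(X)
print(e_x, var_x)                          # lambda, lambda
```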
Example
Find the pgf of Poi(\lambda). If X ~ Poi(\lambda_1) and Y ~ Poi(\lambda_2) are independent, what is the distribution of X + Y?

Sketch of solution
For N ~ Poi(\lambda),

    G(s) = E(s^N) = \sum_{k=0}^{\infty} s^k \frac{e^{-\lambda} \lambda^k}{k!} = e^{-\lambda} \sum_{k=0}^{\infty} \frac{(s\lambda)^k}{k!} = e^{-\lambda} e^{s\lambda} = e^{\lambda(s-1)}.

Now if X ~ Poi(\lambda_1) and Y ~ Poi(\lambda_2) are independent, the pgf of X + Y is G_X(s) G_Y(s). Then

    G_{X+Y}(s) = e^{\lambda_1(s-1)} e^{\lambda_2(s-1)} = e^{(\lambda_1+\lambda_2)(s-1)},

which is the pgf of a Poi(\lambda_1 + \lambda_2) distribution. Hence we conclude that X + Y has a Poi(\lambda_1 + \lambda_2) distribution, by the unique correspondence between pgf and probability mass function.
Moment generating function (mgf)
The moment generating function (mgf) can be defined for a general random variable via

    m_X(t) = E(e^{tX}) = \sum_x e^{tx} p_X(x)    (if X is discrete);
    m_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} f_X(x)\, dx    (if X is continuous).

Consider the n-th derivative of m_X(t):

    m_X^{(n)}(t) = \frac{d^n}{dt^n} E(e^{tX}) = E\left(\frac{d^n}{dt^n} e^{tX}\right) = E(X^n e^{tX}),

from which we obtain E(X^n) = m_X^{(n)}(0) on setting t = 0.

An mgf also uniquely determines the underlying distribution: if X and Y have the same mgf, then they must have the same distribution.

If X and Y are independent, the mgf of X + Y is

    m_{X+Y}(t) = E(e^{tX} e^{tY}) = E(e^{tX}) E(e^{tY}) = m_X(t) m_Y(t).

Hence we can study the distribution of X + Y via m_X(t) m_Y(t), just as with the pgf.
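The relation E(X^n) = m_X^{(n)}(0) is easy to exercise symbolically (a sketch, not from the slides, using the normal mgf derived on the next slide):

```python
import sympy as sp

t, mu, sigma = sp.symbols("t mu sigma", real=True)
m = sp.exp(mu * t + sigma**2 * t**2 / 2)  # mgf of N(mu, sigma^2)

e1 = sp.diff(m, t, 1).subs(t, 0)    # E(X)   = m'(0)
e2 = sp.diff(m, t, 2).subs(t, 0)    # E(X^2) = m''(0)
print(e1, sp.simplify(e2 - e1**2))  # mu and sigma**2
```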
Example
Find the moment generating function of N(\mu, \sigma^2). If X ~ N(\mu_1, \sigma_1^2) and Y ~ N(\mu_2, \sigma_2^2) are independent, what is the distribution of X + Y?

Sketch of solution

    m_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} \frac{1}{\sqrt{2\pi}\sigma} \exp\left( -\frac{(x-\mu)^2}{2\sigma^2} \right) dx
           = \exp\left( \mu t + \frac{1}{2}\sigma^2 t^2 \right) \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\sigma} \exp\left( -\frac{(x-(\mu+\sigma^2 t))^2}{2\sigma^2} \right) dx
           = \exp\left( \mu t + \frac{1}{2}\sigma^2 t^2 \right),

where the last step uses the fact that the remaining integrand is the N(\mu + \sigma^2 t, \sigma^2) density, which integrates to 1.

We obtain the mgf of X + Y using the fact m_{X+Y}(t) = m_X(t) m_Y(t):

    m_{X+Y}(t) = \exp\left( \mu_1 t + \frac{1}{2}\sigma_1^2 t^2 \right) \exp\left( \mu_2 t + \frac{1}{2}\sigma_2^2 t^2 \right)
               = \exp\left( (\mu_1+\mu_2) t + \frac{1}{2}(\sigma_1^2+\sigma_2^2) t^2 \right),

which is the mgf of N(\mu_1+\mu_2, \sigma_1^2+\sigma_2^2). Thus X + Y ~ N(\mu_1+\mu_2, \sigma_1^2+\sigma_2^2), by the unique correspondence between mgf and probability distribution.
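A simulation cross-check of the conclusion (a sketch, not from the slides):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
mu1, s1, mu2, s2 = 0.5, 1.0, -1.0, 2.0
z = rng.normal(mu1, s1, 1_000_000) + rng.normal(mu2, s2, 1_000_000)

print(z.mean(), mu1 + mu2)   # sample mean vs mu1 + mu2
print(z.var(), s1**2 + s2**2)  # sample variance vs sigma1^2 + sigma2^2
# Kolmogorov-Smirnov test against the claimed N(mu1+mu2, sigma1^2+sigma2^2):
print(stats.kstest(z[:10_000], "norm", args=(mu1 + mu2, np.hypot(s1, s2))).pvalue)
```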