Chapter 2
Multivariate Distributions
Math 6203
Fall 2009
Instructor: Ayona Chatterjee
Random Vector
• Given a random experiment with sample space C, consider two random variables X1 and X2 that assign to each element c of C one and only one ordered pair of numbers X1(c) = x1 and X2(c) = x2. Then (X1, X2) is called a random vector.
• The space of (X1, X2) is the set of ordered pairs D = {(x1, x2) : x1 = X1(c) and x2 = X2(c) for some c in C}.
Cumulative Distribution Function
• The joint cumulative distribution function of (X1, X2) is denoted by FX1,X2(x1, x2) and is given by FX1,X2(x1, x2) = P[X1 ≤ x1, X2 ≤ x2].
• A random vector (X1, X2) is discrete if its space D is finite or countable.
• A random vector (X1, X2 ) with space D is
continuous if its cdf FX1,X2 (x1, x2) is continuous.
Probability Mass Function
• For discrete random variables X1 and X2, the
joint pmf is defined as
$$p_{X_1,X_2}(x_1, x_2) = P[X_1 = x_1, X_2 = x_2]$$
Note
* $0 \le p_{X_1,X_2}(x_1, x_2) \le 1$
* $\sum_{(x_1, x_2) \in D} p_{X_1,X_2}(x_1, x_2) = 1$
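As a concrete illustration (my own hypothetical example, not from the slides), here is a minimal Python sketch with an assumed joint pmf on a 2×2 support; it checks the two properties above and reads off one event probability.

import numpy as np

# Hypothetical joint pmf p(x1, x2) on the support {0, 1} x {0, 1};
# rows index x1, columns index x2 (values chosen only for illustration).
p = np.array([[0.10, 0.30],
              [0.20, 0.40]])

# Every entry lies in [0, 1] and the entries sum to 1 over the space D.
assert np.all((p >= 0) & (p <= 1))
assert np.isclose(p.sum(), 1.0)

# Example event probability: P(X1 = 1, X2 = 0).
print("P(X1=1, X2=0) =", p[1, 0])   # 0.2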
Probability Density Function
• For a continuous random vector,
$$F_{X_1,X_2}(x_1, x_2) = \int_{-\infty}^{x_1} \int_{-\infty}^{x_2} f_{X_1,X_2}(w_1, w_2)\, dw_2\, dw_1$$
$$\frac{\partial^2 F_{X_1,X_2}(x_1, x_2)}{\partial x_1\, \partial x_2} = f_{X_1,X_2}(x_1, x_2)$$
Note
* $f_{X_1,X_2}(x_1, x_2) \ge 0$
* $\iint_D f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 = 1$
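A minimal numerical sketch (the pdf is my own assumption, not from the slides): take the hypothetical joint pdf f(x1, x2) = x1 + x2 on the unit square and verify that it integrates to 1.

from scipy.integrate import dblquad

# Hypothetical joint pdf on 0 < x1 < 1, 0 < x2 < 1 (zero elsewhere).
def f(x1, x2):
    return x1 + x2

# dblquad integrates func(y, x) with x over [a, b] and y over [gfun(x), hfun(x)];
# here we integrate over the unit square and expect total probability 1.
total, _ = dblquad(lambda x2, x1: f(x1, x2), 0, 1, lambda x1: 0, lambda x1: 1)
print("total probability:", round(total, 6))   # ~1.0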
Marginals
• The marginal distributions can be obtained
from the joint probability density function.
• For a discrete and a continuous random vector, respectively, the marginals can be obtained as below:
$$p_{X_1}(x_1) = \sum_{x_2} p_{X_1,X_2}(x_1, x_2)$$
$$f_{X_1}(x_1) = \int_{-\infty}^{\infty} f_{X_1,X_2}(x_1, x_2)\, dx_2$$
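Continuing the hypothetical 2×2 joint pmf from the earlier sketch (assumed values, for illustration only), the marginals are just row and column sums.

import numpy as np

# Hypothetical joint pmf: rows index x1 in {0, 1}, columns index x2 in {0, 1}.
p = np.array([[0.10, 0.30],
              [0.20, 0.40]])

p_x1 = p.sum(axis=1)   # sum over x2 -> marginal pmf of X1: [0.4, 0.6]
p_x2 = p.sum(axis=0)   # sum over x1 -> marginal pmf of X2: [0.3, 0.7]
print("marginal of X1:", p_x1)
print("marginal of X2:", p_x2)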
Expectation
• Suppose (X1, X2) is of the continuous type and let Y = g(X1, X2) for some real-valued function g. Then E(Y) exists if
$$\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} |g(x_1, x_2)|\, f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 < \infty,$$
in which case
$$E(Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x_1, x_2)\, f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2.$$
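A small sketch under the same assumption as before (the hypothetical pdf f(x1, x2) = x1 + x2 on the unit square): compute E(Y) for g(x1, x2) = x1*x2; analytically the value is 1/3.

from scipy.integrate import dblquad

# Hypothetical joint pdf on the unit square (an assumption for illustration).
def f(x1, x2):
    return x1 + x2

# g defines Y = g(X1, X2); here g(x1, x2) = x1 * x2.
def g(x1, x2):
    return x1 * x2

# E(Y) is the integral of g * f over the support.
ey, _ = dblquad(lambda x2, x1: g(x1, x2) * f(x1, x2),
                0, 1, lambda x1: 0, lambda x1: 1)
print("E(X1*X2) =", round(ey, 4))   # ~0.3333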
Theorem
• Let (X1, X2) be a random vector. Let Y1 = g1(X1, X2) and Y2 = g2(X1, X2) be random variables whose expectations exist. Then for any real numbers k1 and k2,
E(k1Y1 + k2Y2) = k1E(Y1) + k2E(Y2).
Note
$$E[g(X_2)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x_2)\, f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 = \int_{-\infty}^{\infty} g(x_2)\, f_{X_2}(x_2)\, dx_2$$
Moment Generating Function
• Let X = (X1, X2)' be a random vector. If E[e^(t1X1 + t2X2)] exists for |t1| < h1 and |t2| < h2, where h1 and h2 are positive, the mgf of X is given by
$$M_{X_1,X_2}(\mathbf{t}) = E[e^{\mathbf{t}'\mathbf{X}}], \qquad \text{where } \mathbf{t} = (t_1, t_2)'.$$
• $M_{X_1,X_2}(t_1, 0)$ is the mgf of $X_1$, and $M_{X_1,X_2}(0, t_2)$ is the mgf of $X_2$.
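A brief numerical sketch (using the same hypothetical 2×2 pmf as before, purely for illustration): evaluate the joint mgf at a point and confirm that setting t2 = 0 reproduces the mgf of X1.

import numpy as np

# Hypothetical joint pmf of (X1, X2) on {0, 1} x {0, 1}.
p = np.array([[0.10, 0.30],
              [0.20, 0.40]])
x1_vals = np.array([0, 1])
x2_vals = np.array([0, 1])

def joint_mgf(t1, t2):
    # M(t1, t2) = E[exp(t1*X1 + t2*X2)] = sum over the support.
    return sum(np.exp(t1 * x1 + t2 * x2) * p[i, j]
               for i, x1 in enumerate(x1_vals)
               for j, x2 in enumerate(x2_vals))

p_x1 = p.sum(axis=1)                              # marginal pmf of X1
mgf_x1 = lambda t1: (p_x1 * np.exp(t1 * x1_vals)).sum()

print(joint_mgf(0.5, 0.0), mgf_x1(0.5))           # the two values agree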
2.3 Conditional Distributions and Expectations
• So far we know
– How to find marginals given the joint distribution.
• Now
– Look at the conditional distribution: the distribution of one of the random variables when the other takes a specific value.
Conditional pmf
• We define, for x2 in the support S_X2 of X2,
$$p_{X_2|X_1}(x_2 | x_1) = \frac{P(X_1 = x_1, X_2 = x_2)}{P(X_1 = x_1)} = \frac{p_{X_1,X_2}(x_1, x_2)}{p_{X_1}(x_1)}$$
• Here we assume pX1(x1) > 0.
• Thus the conditional pmf is the joint pmf divided by the marginal pmf.
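Using the same hypothetical 2×2 joint pmf (an assumption, for illustration only), the conditional pmf of X2 given X1 = x1 is the joint row divided by the corresponding marginal value; each conditional pmf sums to 1.

import numpy as np

# Hypothetical joint pmf: rows index x1 in {0, 1}, columns index x2 in {0, 1}.
p = np.array([[0.10, 0.30],
              [0.20, 0.40]])

p_x1 = p.sum(axis=1)                 # marginal pmf of X1 (assumed > 0)
p_x2_given_x1 = p / p_x1[:, None]    # row i holds p(x2 | X1 = i)

print(p_x2_given_x1)                 # [[0.25, 0.75], [0.333, 0.667]]
print(p_x2_given_x1.sum(axis=1))     # each row sums to 1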
Conditional pdf
• Let fX1,X2(x1, x2) be the joint pdf and fX1(x1) and fX2(x2) be the marginals of X1 and X2, respectively. Then, provided fX1(x1) > 0, the conditional pdf of X2 given X1 = x1 is
$$f_{X_2|X_1}(x_2 | x_1) = \frac{f_{X_1,X_2}(x_1, x_2)}{f_{X_1}(x_1)} = f_{2|1}(x_2 | x_1).$$
Note
* $f_{2|1}(x_2 | x_1) \ge 0$
* $\int_{-\infty}^{\infty} f_{2|1}(x_2 | x_1)\, dx_2 = 1$
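A quick numerical sketch under the same assumed pdf f(x1, x2) = x1 + x2 on the unit square: the conditional pdf of X2 given X1 = x1 works out to (x1 + x2)/(x1 + 1/2), and it should integrate to 1 in x2.

from scipy.integrate import quad

x1 = 0.3                                   # an arbitrary fixed value of X1

# Conditional pdf f(x2 | x1) for the assumed joint pdf x1 + x2 on (0,1)^2;
# the marginal of X1 is f_X1(x1) = x1 + 1/2.
def f_cond(x2):
    return (x1 + x2) / (x1 + 0.5)

total, _ = quad(f_cond, 0, 1)
print("integral of f(x2 | x1=0.3):", round(total, 6))   # ~1.0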
Conditional Expectation and Variance
If u(X2) is a function of X2, the conditional expectation of u(X2), given that X1 = x1, if it exists, is given by
$$E[u(X_2) \mid x_1] = \int_{-\infty}^{\infty} u(x_2)\, f_{2|1}(x_2 | x_1)\, dx_2$$
$$\operatorname{var}(X_2 \mid x_1) = E(X_2^2 \mid x_1) - [E(X_2 \mid x_1)]^2$$
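A compact sketch with the hypothetical 2×2 pmf (assumed values, for illustration): compute E(X2 | X1 = x1) and var(X2 | X1 = x1) directly from the conditional pmf.

import numpy as np

# Hypothetical joint pmf: rows index x1 in {0, 1}, columns index x2 in {0, 1}.
p = np.array([[0.10, 0.30],
              [0.20, 0.40]])
x2_vals = np.array([0, 1])

p_cond = p / p.sum(axis=1, keepdims=True)        # p(x2 | x1), row by row
e_x2_given_x1 = p_cond @ x2_vals                 # E(X2 | X1 = x1)
e_x2sq_given_x1 = p_cond @ x2_vals**2            # E(X2^2 | X1 = x1)
var_x2_given_x1 = e_x2sq_given_x1 - e_x2_given_x1**2

print("E(X2 | X1):", e_x2_given_x1)              # [0.75, 0.6667]
print("var(X2 | X1):", var_x2_given_x1)          # [0.1875, 0.2222]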
Theorem
• Let (X1, X2) be a random vector such that the
variance of X2 is finite. Then
– E[E(X2 |X1)]=E(X2)
– Var[E(X2 |X1 )]≤ var(X2 )
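A numeric check of the two identities above, again on the hypothetical 2×2 pmf (assumed purely for illustration).

import numpy as np

# Hypothetical joint pmf: rows index x1 in {0, 1}, columns index x2 in {0, 1}.
p = np.array([[0.10, 0.30],
              [0.20, 0.40]])
x2_vals = np.array([0, 1])

p_x1 = p.sum(axis=1)                             # marginal pmf of X1
p_x2 = p.sum(axis=0)                             # marginal pmf of X2
e_x2 = p_x2 @ x2_vals
var_x2 = p_x2 @ x2_vals**2 - e_x2**2

e_x2_given_x1 = (p / p_x1[:, None]) @ x2_vals    # E(X2 | X1 = x1) for each x1
mean_of_cond = p_x1 @ e_x2_given_x1              # E[E(X2 | X1)]
var_of_cond = p_x1 @ e_x2_given_x1**2 - mean_of_cond**2

print(mean_of_cond, e_x2)                        # both 0.7
print(var_of_cond, "<=", var_x2)                 # ~0.0017 <= 0.21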
2.4 The Correlation Coefficient
Let X and Y have means μ1, μ2 and standard deviations σ1, σ2. Then
$$\operatorname{Cov}(X, Y) = E[(X - \mu_1)(Y - \mu_2)] = E(XY) - E(X)E(Y)$$
$$\rho = \frac{\operatorname{Cov}(X, Y)}{\sigma_1 \sigma_2}$$
Here ρ is called the correlation coefficient of X and Y, and Cov(X, Y) is the covariance between X and Y.
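An illustrative sketch (hypothetical pmf, assumed values): compute Cov(X, Y) and ρ from the definitions above for a discrete pair.

import numpy as np

# Hypothetical joint pmf of (X, Y): rows index x in {0, 1}, columns y in {0, 1}.
p = np.array([[0.10, 0.30],
              [0.20, 0.40]])
x_vals = np.array([0, 1])
y_vals = np.array([0, 1])

px, py = p.sum(axis=1), p.sum(axis=0)
ex, ey = px @ x_vals, py @ y_vals
exy = sum(p[i, j] * x_vals[i] * y_vals[j]
          for i in range(2) for j in range(2))

cov = exy - ex * ey                              # E(XY) - E(X)E(Y)
sx = np.sqrt(px @ x_vals**2 - ex**2)
sy = np.sqrt(py @ y_vals**2 - ey**2)
rho = cov / (sx * sy)
print("Cov(X, Y) =", round(cov, 4), " rho =", round(rho, 4))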
The Correlation Coefficient
• Note that -1 ≤ ρ≤ 1.
• For the bivariate case
– If ρ = 1, the graph of the line y = a + bx (b > 0)
contains all the probability of the distribution of X
and Y.
– For ρ = -1, the above is true for the line y = a + bx
with b < 0.
– For the non-extreme cases, ρ can be viewed as a measure of the intensity of the concentration of the probability of X and Y about a line y = a + bx.
Theorem
• Suppose (X, Y) has a joint distribution with the variances of X and Y finite and positive. Denote the means and variances of X and Y by μ1, μ2 and σ1², σ2², respectively, and let ρ be the correlation coefficient between X and Y. If E(Y|X) is linear in X, then
$$E(Y \mid X) = \mu_2 + \rho \frac{\sigma_2}{\sigma_1}(X - \mu_1)$$
$$E[\operatorname{var}(Y \mid X)] = \sigma_2^2 (1 - \rho^2)$$
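A simulation-based sketch (my own illustrative example, not from the slides): for a bivariate normal pair, E(Y|X) is linear in X, so a least-squares fit of Y on X should recover slope ρσ2/σ1 and intercept μ2 − ρ(σ2/σ1)μ1.

import numpy as np

rng = np.random.default_rng(0)

# Assumed parameters for the illustration.
mu1, mu2 = 1.0, 2.0
s1, s2, rho = 1.5, 0.8, 0.6

cov = np.array([[s1**2,         rho * s1 * s2],
                [rho * s1 * s2, s2**2        ]])
x, y = rng.multivariate_normal([mu1, mu2], cov, size=200_000).T

slope, intercept = np.polyfit(x, y, 1)           # least-squares line y ≈ a + b x
print("fitted slope:", round(slope, 3), " theory:", round(rho * s2 / s1, 3))
print("fitted intercept:", round(intercept, 3),
      " theory:", round(mu2 - rho * s2 / s1 * mu1, 3))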
2.5 Independent Random Variables
• If the conditional pdf f2|1 (x2|x1) does not depend
upon x1 then the marginal pdf of X2 equals the
conditional pdf f2|1 (x2|x1) .
• Let the random variables X and Y have joint pdf
f(x,y) and the marginals fx (x) and fy (y)
respectively. The random variables X and Y are
said to be independent if and only if
– f(x,y)= fx (x) fy (y)
– A similar definition can be written for discrete random variables.
– Random variables that are not independent are said
to be dependent.
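A small sketch (hypothetical numbers) showing the factorization check for a discrete pair: the joint pmf equals the outer product of the marginals exactly when X and Y are independent.

import numpy as np

# Hypothetical joint pmf constructed to be independent: outer product of marginals.
px = np.array([0.4, 0.6])
py = np.array([0.3, 0.7])
p_indep = np.outer(px, py)

# A dependent example with the same marginals but a different joint.
p_dep = np.array([[0.10, 0.30],
                  [0.20, 0.40]])

print(np.allclose(p_indep, np.outer(p_indep.sum(axis=1), p_indep.sum(axis=0))))  # True
print(np.allclose(p_dep,   np.outer(p_dep.sum(axis=1),   p_dep.sum(axis=0))))    # False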
Theorem
• Let the random variables X and Y have support
S1 and S2, respectively and have the joint pdf
f(x,y). Then X and Y are independent if and
only if f(x,y) can be written as a product of a
nonnegative function of x and a nonnegative
function of y. That is, f(x,y) = g(x)h(y), where g(x) > 0 for x in S1 and h(y) > 0 for y in S2.
Note
• In general, X and Y must be dependent if the space of positive probability density of X and Y is bounded by a curve that is neither a horizontal nor a vertical line.
• Example: f(x,y) = 8xy, 0 < x < y < 1.
– S = {(x,y) : 0 < x < y < 1}. This is not a product space, so X and Y are dependent (see the sketch below).
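A quick check of the example above: for f(x,y) = 8xy on 0 < x < y < 1 the marginals are fX(x) = 4x(1 − x²) and fY(y) = 4y³ (obtained by integrating the joint pdf over the triangular support), and their product differs from the joint pdf, so X and Y are dependent.

def f(x, y):
    # Joint pdf from the slide: 8xy on the triangle 0 < x < y < 1, zero elsewhere.
    return 8 * x * y if 0 < x < y < 1 else 0.0

def fX(x):
    return 4 * x * (1 - x**2)        # integral of 8xy over y in (x, 1)

def fY(y):
    return 4 * y**3                  # integral of 8xy over x in (0, y)

# At (0.5, 0.25) the joint pdf is 0 (the point is outside the support), but the
# product of the marginals is positive, so f(x, y) != fX(x) fY(y): X, Y dependent.
print(f(0.5, 0.25), fX(0.5) * fY(0.25))   # 0.0 vs 0.09375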
Theorems
• Let (X, Y) have the joint cdf F(x,y) and let X and Y have the marginal cdfs Fx(x) and Fy(y), respectively. Then X and Y are independent if and only if
– F(x,y) = Fx(x)Fy(y)
• The random variables X and Y are independent if and only if the following condition holds:
– P(a < X ≤ b, c < Y ≤ d) = P(a < X ≤ b)P(c < Y ≤ d)
– for every a < b and c < d, where a, b, c, and d are constants.
Theorems
• Suppose X and Y are independent and that E[u(X)] and E[v(Y)] exist. Then
– E[u(X)v(Y)] = E[u(X)]E[v(Y)]
• Suppose the joint mgf M(t1, t2) exists for the random variables X and Y. Then X and Y are independent if and only if
– M(t1, t2) = M(t1, 0)M(0, t2)
• That is, the joint mgf is the product of the marginal mgfs.
Note
• If X and Y are independent then the
correlation coefficient is zero.
• However, a zero correlation coefficient does not imply independence (see the sketch below).
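A classic counterexample, sketched numerically (the specific distribution is my choice, not from the slides): let X be uniform on {−1, 0, 1} and Y = X². Then Cov(X, Y) = E(X³) − E(X)E(X²) = 0, yet Y is a function of X, so the pair is clearly dependent.

import numpy as np

x_vals = np.array([-1, 0, 1])
px = np.array([1/3, 1/3, 1/3])        # X uniform on {-1, 0, 1}
y_vals = x_vals**2                    # Y = X^2, completely determined by X

ex = px @ x_vals                      # 0
ey = px @ y_vals                      # 2/3
exy = px @ (x_vals * y_vals)          # E(X^3) = 0
print("Cov(X, Y) =", exy - ex * ey)   # 0.0, yet X and Y are dependent

# Dependence: P(Y = 1 | X = 0) = 0, while P(Y = 1) = 2/3.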