CIS 2033, based on Dekking et al., A Modern Introduction to Probability and Statistics, 2007.
Slides by Michael Maurizi. Instructor: Longin Jan Latecki.
C9: Joint Distributions and Independence
9.1 – Joint Distributions of Discrete Random Variables

Joint distribution: the combined distribution of two or more random variables defined on the same sample space Ω.

Joint distribution of two discrete random variables: the joint distribution of two discrete random variables X and Y is obtained from the probabilities of all possible values of the pair (X, Y).

Joint probability mass function p of two discrete random variables X and Y:

$p : \mathbb{R}^2 \to [0, 1]$, $\quad p_{XY}(a, b) = p(a, b) = P(X = a, Y = b)$ for $-\infty < a, b < \infty$.

Joint distribution function F of two random variables X and Y: in a joint pmf table, F(a, b) is the sum of the entries in the box that cell (a, b) forms with the upper-left corner of the table:

$F : \mathbb{R}^2 \to [0, 1]$, $\quad F(a, b) = P(X \le a, Y \le b)$ for $-\infty < a, b < \infty$.
9.1 – Joint Distributions of Discrete Random Variables

Example: joint distribution of S and M, where S = the sum of two dice and M = the maximum of two dice.

Quick Exercise 9.1: list the elements of the event {S = 7, M = 4} and compute its probability.
 a \ b     1      2      3      4      5      6    pS(a)
   2     1/36    0      0      0      0      0     1/36
   3      0     2/36    0      0      0      0     2/36
   4      0     1/36   2/36    0      0      0     3/36
   5      0      0     2/36   2/36    0      0     4/36
   6      0      0     1/36   2/36   2/36    0     5/36
   7      0      0      0     2/36   2/36   2/36   6/36
   8      0      0      0     1/36   2/36   2/36   5/36
   9      0      0      0      0     2/36   2/36   4/36
  10      0      0      0      0     1/36   2/36   3/36
  11      0      0      0      0      0     2/36   2/36
  12      0      0      0      0      0     1/36   1/36
 pM(b)   1/36   3/36   5/36   7/36   9/36  11/36     1
Answer: the only possibilities with the sum equal to 7 and the maximum equal to 4 are the combinations (3, 4) and (4, 3). Each has probability 1/36, so P(S = 7, M = 4) = 2/36.
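As a sanity check, the joint pmf table above can be generated by brute-force enumeration of the 36 equally likely outcomes of two dice. A minimal Python sketch (the variable names are illustrative, not from the text):

```python
from fractions import Fraction
from itertools import product

# Tally the joint pmf of S = sum and M = maximum over all 36
# equally likely outcomes of two fair dice.
pmf = {}  # maps (s, m) -> P(S = s, M = m)
for d1, d2 in product(range(1, 7), repeat=2):
    key = (d1 + d2, max(d1, d2))
    pmf[key] = pmf.get(key, Fraction(0)) + Fraction(1, 36)

# Quick Exercise 9.1: P(S = 7, M = 4) should be 2/36 = 1/18.
print(pmf[(7, 4)])  # 1/18
```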
9.1 – Marginal Distributions of Discrete Random Variables

Marginal distribution: obtained by adding up the rows or columns of a joint probability mass function table; the marginal pmfs are literally written in the margins of the table. Let p(a, b) be the joint pmf of the random variables S and M. The marginal pmfs are then given by

$P(S = a) = p_S(a) = \sum_b p(a, b)$ and $P(M = b) = p_M(b) = \sum_a p(a, b)$.

Example: joint distribution of S and M, where S = the sum of two dice and M = the maximum of two dice.
(See the joint pmf table of S and M above; the marginals pS(a) and pM(b) appear in its right column and bottom row.)
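Computing the marginals programmatically is just a row or column sum of the joint pmf. A small self-contained sketch (names are mine):

```python
from fractions import Fraction
from itertools import product

# Joint pmf of S = sum and M = maximum of two fair dice.
pmf = {}
for d1, d2 in product(range(1, 7), repeat=2):
    key = (d1 + d2, max(d1, d2))
    pmf[key] = pmf.get(key, Fraction(0)) + Fraction(1, 36)

# Marginals: sum the joint pmf over the other variable.
p_S, p_M = {}, {}
for (s, m), p in pmf.items():
    p_S[s] = p_S.get(s, Fraction(0)) + p  # P(S = s) = sum over m
    p_M[m] = p_M.get(m, Fraction(0)) + p  # P(M = m) = sum over s

print(p_S[9])  # 1/9  (= 4/36, matching the pS(a) column)
print(p_M[6])  # 11/36 (matching the pM(b) row)
```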
9.1 – Joint Distributions of Discrete Random Variables: Examples

Example: joint distribution of S and M, where S = the sum of two dice and M = the maximum of two dice. Using the marginal formula $P(S = a) = p_S(a) = \sum_b p(a, b)$:

$P(S = 9) = p_S(9) = \sum_b p(9, b) = \frac{2}{36} + \frac{2}{36} = \frac{4}{36}$.

Exercise: compute the joint distribution function $F(5, 3)$, where $F(a, b) = P(S \le a, M \le b)$.
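Worked answer: summing the entries of the table above with a ≤ 5 and b ≤ 3 gives

$F(5, 3) = P(S \le 5, M \le 3) = \frac{1}{36} + \frac{2}{36} + \frac{1}{36} + \frac{2}{36} + \frac{2}{36} = \frac{8}{36}$.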
In general, for discrete random variables X and Y: $P(X = a) = p_X(a) = \sum_b p(a, b)$ and $P(Y = b) = p_Y(b) = \sum_a p(a, b)$.
9.4 – Independent Random Variables

Tests for independence: two random variables X and Y are independent if and only if every event involving X is independent of every event involving Y. Equivalently, for all possible a and b,

$P(X \le a, Y \le b) = P(X \le a)\,P(Y \le b)$, or

$P(X = a, Y = b) = P(X = a)\,P(Y = b)$ (discrete case), or

$F(a, b) = F_X(a)\,F_Y(b)$, or

$f(a, b) = f_X(a)\,f_Y(b)$ (continuous case).
Example 3.6 (Baron book, p. 48): a program consists of two modules. The number of errors X in the first module and the number of errors Y in the second module have the joint distribution P(0, 0) = P(0, 1) = P(1, 0) = 0.2, P(1, 1) = P(1, 2) = P(1, 3) = 0.1, P(0, 2) = P(0, 3) = 0.05. Find (a) the marginal distributions of X and Y, (b) the probability of no errors in the first module, and (c) the distribution of the total number of errors in the program. Also, (d) find out whether errors in the two modules occur independently.
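A short Python sketch of the computation (variable names are mine, not Baron's); it tabulates the marginals, P(X = 0), the pmf of the total X + Y, and checks the independence criterion p(a, b) = pX(a) pY(b):

```python
# Joint pmf of errors X (module 1) and Y (module 2) from Example 3.6.
joint = {(0, 0): 0.2, (0, 1): 0.2, (1, 0): 0.2,
         (1, 1): 0.1, (1, 2): 0.1, (1, 3): 0.1,
         (0, 2): 0.05, (0, 3): 0.05}

p_X, p_Y, p_total = {}, {}, {}
for (a, b), p in joint.items():
    p_X[a] = p_X.get(a, 0) + p                   # (a) marginal of X
    p_Y[b] = p_Y.get(b, 0) + p                   # (a) marginal of Y
    p_total[a + b] = p_total.get(a + b, 0) + p   # (c) pmf of X + Y

print(p_X[0])   # (b) P(no errors in first module) = 0.5
print(p_Y)      # (a) ~{0: 0.4, 1: 0.3, 2: 0.15, 3: 0.15}
print(p_total)  # (c) ~{0: 0.2, 1: 0.4, 2: 0.15, 3: 0.15, 4: 0.1}
# (d) Independence fails: P(X=0, Y=1) = 0.2, but p_X[0]*p_Y[1] = 0.15.
print(abs(joint[(0, 1)] - p_X[0] * p_Y[1]) < 1e-9)  # False
```

So the errors in the two modules do not occur independently.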
Table 4.2 (Baron book): joint and marginal distributions in the discrete and continuous cases.
9.2 – Joint Distributions of Continuous Random Variables

Joint continuous distribution: like an ordinary continuous random variable, it assigns probabilities to ranges of values. X and Y have a joint continuous distribution if there exists a function $f : \mathbb{R}^2 \to \mathbb{R}$ satisfying:

$P(a_1 \le X \le b_1, a_2 \le Y \le b_2) = \int_{a_1}^{b_1}\!\int_{a_2}^{b_2} f(x, y)\,dy\,dx$ for all $a_1 \le b_1$ and $a_2 \le b_2$,

$f(x, y) \ge 0$ for all x and y, and

$\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} f(x, y)\,dx\,dy = 1$.

Marginal distribution function of X: $F_X(a) = P(X \le a) = F(a, \infty) = \lim_{b \to \infty} F(a, b)$.

Marginal distribution function of Y: $F_Y(b) = P(Y \le b) = F(\infty, b) = \lim_{a \to \infty} F(a, b)$.
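As a numerical illustration (a toy density of my own, not from the book): let f(x, y) = x + y on the unit square and 0 elsewhere. A sketch using scipy's double integration checks the total mass and one rectangle probability:

```python
from scipy.integrate import dblquad

# Toy joint density: f(x, y) = x + y on [0, 1] x [0, 1], 0 elsewhere.
# dblquad integrates its first argument (here y) innermost.
f = lambda y, x: x + y

# Total probability mass must equal 1.
total, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: 1)
print(total)  # ~1.0

# P(0 <= X <= 1/2, 0 <= Y <= 1/2) = 1/8 by direct integration.
prob, _ = dblquad(f, 0, 0.5, lambda x: 0, lambda x: 0.5)
print(prob)   # ~0.125
```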
9.2 – Joint Distributions of Continuous Random Variables

Joint distribution function: F(a, b) can be constructed from f(x, y), and vice versa:

$F(a, b) = \int_{-\infty}^{a}\!\int_{-\infty}^{b} f(x, y)\,dy\,dx$ and $f(x, y) = \frac{\partial^2}{\partial x\,\partial y} F(x, y)$.

Marginal probability density function: integrate out the unwanted random variable to get the marginal density:

$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy$ and $f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx$.
The marginal distribution function of X is obtained analogously. We can also determine fY(y) directly from f(x, y), as in Quick Exercise 9.5, which does this for y between 1 and 2.
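To make the F ↔ f relationship concrete, here is a symbolic check with sympy on the same toy density f(x, y) = x + y on the unit square (again my example, not the book's): integrating f gives F, differentiating F twice recovers f, and integrating out y gives the marginal density of X.

```python
import sympy as sp

x, y, a, b = sp.symbols('x y a b')
f = x + y  # toy joint density on the unit square

# F(a, b): integrate f over [0, a] x [0, b] (for 0 <= a, b <= 1).
F = sp.integrate(f, (x, 0, a), (y, 0, b))
print(sp.expand(F))                # a**2*b/2 + a*b**2/2

# Differentiating F twice recovers the density at (a, b).
print(sp.diff(F, a, b))            # a + b

# Marginal density of X: integrate out y over its range [0, 1].
print(sp.integrate(f, (y, 0, 1)))  # x + 1/2
```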
9.3 – More than Two Random Variables

Given n random variables X1, X2, ..., Xn defined on the same sample space, the joint distribution function and the joint probability mass function are defined, for $-\infty < a_1, a_2, \ldots, a_n < \infty$, by:

Joint distribution function: $F(a_1, a_2, \ldots, a_n) = P(X_1 \le a_1, X_2 \le a_2, \ldots, X_n \le a_n)$.

Joint probability mass function: $p(a_1, a_2, \ldots, a_n) = P(X_1 = a_1, X_2 = a_2, \ldots, X_n = a_n)$.
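For more than two discrete variables, a joint pmf is conveniently stored as an n-dimensional array, and any marginal is a sum over the axes of the variables being removed. A minimal numpy sketch with a made-up 2×2×2 joint pmf:

```python
import numpy as np

# Made-up joint pmf of three binary random variables X1, X2, X3:
# p[i, j, k] = P(X1 = i, X2 = j, X3 = k); entries sum to 1.
p = np.array([[[0.10, 0.05], [0.15, 0.10]],
              [[0.20, 0.10], [0.05, 0.25]]])
assert abs(p.sum() - 1.0) < 1e-12

# Marginal pmf of X1: sum out X2 and X3 (axes 1 and 2).
print(p.sum(axis=(1, 2)))  # [0.4 0.6]

# Joint marginal of (X1, X3): sum out X2 (axis 1).
print(p.sum(axis=1))
```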
9.4 – Independent Random Variables

Tests for independence extend to more than two random variables: X1, X2, ..., Xn are independent if and only if $P(X_1 \le a_1, \ldots, X_n \le a_n) = P(X_1 \le a_1) \cdots P(X_n \le a_n)$ for all $a_1, \ldots, a_n$.
9.5 – Propagation of Independence

Independence after a change of variable: if each of several independent random variables is transformed by its own function, the resulting random variables are again independent. That is, if X1, ..., Xn are independent and h1, ..., hn are functions, then h1(X1), ..., hn(Xn) are independent.
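A quick Monte Carlo sanity check of this property (only suggestive, not a proof; the distributions and transforms are my own choices): draw independent X and Y, transform each separately, and test the factorization of the joint distribution function at one point.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Independent inputs: X ~ Uniform(0, 1), Y ~ N(0, 1).
X = rng.uniform(size=n)
Y = rng.normal(size=n)

# Transform each variable by its own function.
U = X**2         # h1(X)
V = np.exp(Y)    # h2(Y)

# Empirically, P(U <= s, V <= t) should match P(U <= s) * P(V <= t).
s, t = 0.25, 1.0
joint = np.mean((U <= s) & (V <= t))
product = np.mean(U <= s) * np.mean(V <= t)
print(joint, product)  # both ~0.25, since each marginal probability is 0.5
```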