Joint Probability Distributions
Given two random variables X and Y that are
defined on the same probability space,
the Joint Distribution for X and Y defines the
probability of events defined in terms of
both X and Y.

In the case of only two random variables, this
is called a Bivariate Distribution.

The concept generalizes to any number of
random variables, giving a Multivariate
Distribution.
Example

Consider the roll of a die. Let A = 1 if the number is even (2, 4, or 6) and A = 0 otherwise. Furthermore, let B = 1 if the number is prime (2, 3, or 5) and B = 0 otherwise. Find the Joint Distribution of A and B.

P(A = 0, B = 0) = P({1}) = 1/6
P(A = 0, B = 1) = P({3, 5}) = 2/6
P(A = 1, B = 0) = P({4, 6}) = 2/6
P(A = 1, B = 1) = P({2}) = 1/6
Example

              Y = 0      Y = 1      Row Total
X = 0         1/6        2/6        3/6 = 1/2
X = 1         2/6        1/6        3/6 = 1/2
Column Total  3/6 = 1/2  3/6 = 1/2  1
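The table above can be reproduced by enumerating the six equally likely die outcomes; a minimal Python sketch:

```python
from fractions import Fraction
from collections import Counter

# Enumerate the six equally likely die outcomes and tabulate (A, B),
# where A = 1 if the number is even and B = 1 if the number is prime.
counts = Counter()
for roll in range(1, 7):
    a = 1 if roll % 2 == 0 else 0
    b = 1 if roll in (2, 3, 5) else 0
    counts[(a, b)] += 1

joint = {ab: Fraction(n, 6) for ab, n in counts.items()}
for (a, b), p in sorted(joint.items()):
    print(f"P(A={a}, B={b}) = {p}")
```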
Joint Probability Distributions
Discrete Joint Probability Distribution

• If X and Y are Independent:
  f(x, y) = f(x) · f(y)

• If X and Y are Dependent (sampling without replacement, as in the ball-drawing example later), the joint distribution is multivariate hypergeometric:
  f(x, y) = C(N1, x) C(N2, y) C(N − N1 − N2, n − x − y) / C(N, n)
  where N is the population size, N1 and N2 are the counts of the two types of interest, and n is the sample size.
Discrete Joint Probability Distribution

A coin is tossed and a die is rolled: X ∈ {Head, Tail}, Y ∈ {1, …, 6}, and f(x, y) = 1/12 for every pair.

              Y = 1   Y = 2   Y = 3   Y = 4   Y = 5   Y = 6   Row Totals
X = Head      1/12    1/12    1/12    1/12    1/12    1/12    1/2
X = Tail      1/12    1/12    1/12    1/12    1/12    1/12    1/2
Column Totals 1/6     1/6     1/6     1/6     1/6     1/6     1
Marginal Probability Distributions

Marginal Probability Distribution: the individual probability distribution of a random variable.
Discrete Joint Probability Distribution

A coin is tossed and a die is rolled.

              Y = 1   Y = 2   Y = 3   Y = 4   Y = 5   Y = 6   Marginal Probability of X
X = Head      1/12    1/12    1/12    1/12    1/12    1/12    1/2
X = Tail      1/12    1/12    1/12    1/12    1/12    1/12    1/2
Marginal
Probability
of Y          1/6     1/6     1/6     1/6     1/6     1/6     1
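A short Python sketch that builds this joint pmf, recovers both marginals by summing over the other variable, and confirms independence cell by cell:

```python
from fractions import Fraction

# Joint pmf of the coin/die experiment: every (x, y) pair has probability 1/12.
outcomes_x = ["H", "T"]
outcomes_y = range(1, 7)
f = {(x, y): Fraction(1, 12) for x in outcomes_x for y in outcomes_y}

# Marginals: sum the joint pmf over the other variable.
fX = {x: sum(f[(x, y)] for y in outcomes_y) for x in outcomes_x}
fY = {y: sum(f[(x, y)] for x in outcomes_x) for y in outcomes_y}

print(fX)   # each 1/2
print(fY)   # each 1/6
# Independence holds: f(x, y) = fX(x) * fY(y) in every cell.
print(all(f[(x, y)] == fX[x] * fY[y] for x in outcomes_x for y in outcomes_y))
```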
Joint Probability Distributions

Continuous Two-Dimensional Distribution

F(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f_XY(u, v) du dv,   F(x, y) ≥ 0

P(a1 < X ≤ b1, a2 < Y ≤ b2) = ∫_{a2}^{b2} ∫_{a1}^{b1} f(x, y) dx dy
Joint Probability Distributions

Example: Let X be the time until a computer server connects to your machine, and Y the time until the server authorizes you as a valid user. Each of these random variables measures the wait from a common starting time, and X < Y. Assume that the joint probability density function for X and Y is

f_XY(x, y) = 6 × 10⁻⁶ exp(−0.001x − 0.002y),   0 < x < y

Find the probability that X < 1000 and Y < 2000.
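Integrating the density over {0 < x < 1000, x < y < 2000} by hand gives the closed form P = (1 − e⁻³) − 3e⁻⁴(1 − e⁻¹) ≈ 0.915 (my own working from the stated density). The sketch below evaluates that closed form and cross-checks it with a crude midpoint Riemann sum:

```python
import math

# P(X < 1000, Y < 2000) for f(x, y) = 6e-6 * exp(-0.001x - 0.002y), 0 < x < y.
# Integrating y from x to 2000, then x from 0 to 1000:
p_exact = (1 - math.exp(-3)) - 3 * math.exp(-4) * (1 - math.exp(-1))
print(round(p_exact, 4))  # ≈ 0.915

# Cross-check with a midpoint Riemann sum over {0 < x < 1000, x < y < 2000}.
n = 1000
dx = 1000 / n
dy = 2000 / n
p_num = 0.0
for i in range(n):
    x = (i + 0.5) * dx
    for j in range(n):
        y = (j + 0.5) * dy
        if y > x:
            p_num += 6e-6 * math.exp(-0.001 * x - 0.002 * y) * dx * dy
print(round(p_num, 3))
```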
Joint Probability Distributions

Functions of Random Variables

• Z = g(X, Y)
• If we roll two dice and X and Y are the numbers the dice turn up in a trial, then Z = X + Y is the sum of those two numbers.

Discrete Distribution Function:
F(z) = P(Z ≤ z) = Σ_{g(x, y) ≤ z} f(x, y)

Continuous Distribution Function:
F(z) = P(Z ≤ z) = ∫∫_{g(x, y) ≤ z} f(x, y) dx dy
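For the two-dice example, the distribution of Z = X + Y can be tabulated directly from the definition above:

```python
from fractions import Fraction
from collections import defaultdict

# Distribution of Z = X + Y for two fair dice:
# sum f(x, y) = 1/36 over all pairs with x + y = z.
fZ = defaultdict(Fraction)
for x in range(1, 7):
    for y in range(1, 7):
        fZ[x + y] += Fraction(1, 36)

# Discrete distribution function F(z) = P(Z <= z).
F = {z: sum(p for s, p in fZ.items() if s <= z) for z in range(2, 13)}
print(fZ[7])   # 1/6
print(F[12])   # 1
```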
Addition of Means

The Mean (Expectation) of a sum of random variables equals the sum of the Means (Expectations):
E(X1 + X2 + … + Xn) = E(X1) + E(X2) + … + E(Xn)
Multiplication of Means

The Mean (Expectation) of a product of Independent random variables equals the product of the Means (Expectations):
E(X1 X2 … Xn) = E(X1) E(X2) … E(Xn)
Independence

Two continuous random variables X and Y are said to be Independent if
f_XY(x, y) = f_X(x) f_Y(y) for all x and y.
Addition of Variances

For the sum of two random variables,
σ² = σ1² + σ2² + 2σXY
where σXY = Covariance of X and Y = E(XY) − E(X) E(Y).
If X and Y are independent, then E(XY) = E(X) E(Y), so σXY = 0 and
σ² = σ1² + σ2²
The Variance of the sum of Independent random variables therefore equals the sum of the Variances of these variables.
Problem 1
Let f(x, y) = k when 8 ≤ x ≤ 12 and
0 ≤ y ≤ 2 and zero elsewhere. Find k.
Find P(X ≤ 11 and 1 ≤ Y ≤ 1.5) and
P(9 ≤ X ≤ 13, and Y ≤ 1).
Problem 3

Let f(x, y) = k if x > 0, y > 0, x + y < 3, and zero otherwise. Find k.
Find P(X + Y ≤ 1) and P(Y > X).
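One way to sanity-check answers to Problems 1 and 3: for a constant density, k is the reciprocal of the area of the support, and probabilities are k times the area of the event region. A sketch (the specific fractions are my own working, not given in the text):

```python
from fractions import Fraction

# Problem 1: f(x, y) = k on the rectangle 8 <= x <= 12, 0 <= y <= 2.
# The density must integrate to 1, so k = 1 / (area of the rectangle).
k1 = Fraction(1, (12 - 8) * (2 - 0))        # 1/8
# For a constant density, probabilities are k * (area of the event region).
p_a = k1 * (11 - 8) * Fraction(1, 2)        # P(X <= 11, 1 <= Y <= 1.5): 3 x 0.5 box
p_b = k1 * (12 - 9) * 1                     # P(9 <= X <= 13, Y <= 1): X capped at 12
print(k1, p_a, p_b)

# Problem 3: f(x, y) = k on the triangle x > 0, y > 0, x + y < 3 (area 9/2).
k3 = Fraction(2, 9)
p_c = k3 * Fraction(1, 2)                   # P(X + Y <= 1): triangle of area 1/2
p_d = Fraction(1, 2)                        # P(Y > X): by symmetry of the region
print(k3, p_c, p_d)
```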
Problem 7

What are the mean thickness and standard deviation of transformer cores, each consisting of 50 layers of sheet metal and 49 insulating paper layers, if the metal sheets have mean thickness 0.5 mm each with a standard deviation of 0.05 mm, and the paper layers have mean thickness 0.05 mm each with a standard deviation of 0.02 mm?
Problem 9
A 5-gear Assembly is put together with
spacers between the gears. The mean
thickness of the gears is 5.020 cm with a
standard deviation of 0.003 cm. The mean
thickness of spacers is 0.040 cm with a
standard deviation of 0.002 cm. Find the
mean and standard deviation of the
assembled units consisting of 5 randomly
selected gears and 4 randomly selected
spacers.
Problem 11

Show that the random variables with densities f(x, y) = x + y and g(x, y) = (x + 1/2)(y + 1/2), if 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1, and zero otherwise, have the same Marginal Distributions.
Problem 13
An electronic device consists of two components.
Let X and Y [months] be the length of time until
failure of the first and second components,
respectively. Assume that X and Y have the
Probability Density
-0.1(x + y)
f(x, y) = 0.01e
If x > 0 and y > 0 and zero otherwise.
a. Are X and Y dependent or independent?
b. Find densities of Marginal Distribution.
c. What is the probability that the first component has a
lifetime of 10 months or longer?
Problem 15

Find P(X > Y) when (X, Y) has the Probability Density
f(x, y) = 0.25 e^(−0.5(x + y)) if x ≥ 0, y ≥ 0, and zero otherwise.
Problem 17
Let (X, Y) have the Probability Function
f(0, 0) = f(1, 1) = 1/8
f(0, 1) = f(1, 0) = 3/8
Are X and Y independent?
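Problem 17 can be settled by comparing each joint probability with the product of the marginals; a minimal sketch:

```python
from fractions import Fraction

# Problem 17: check independence directly from the joint probability function.
f = {(0, 0): Fraction(1, 8), (1, 1): Fraction(1, 8),
     (0, 1): Fraction(3, 8), (1, 0): Fraction(3, 8)}

fX = {x: f[(x, 0)] + f[(x, 1)] for x in (0, 1)}   # marginal of X: 1/2, 1/2
fY = {y: f[(0, y)] + f[(1, y)] for y in (0, 1)}   # marginal of Y: 1/2, 1/2

# Independence would require f(x, y) = fX(x) fY(y) = 1/4 in every cell,
# but f(0, 0) = 1/8, so X and Y are dependent.
print(all(f[(x, y)] == fX[x] * fY[y] for x in (0, 1) for y in (0, 1)))  # False
```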
Marginal Probability Distributions

Example: For the random variables in the previous example, calculate the probability that Y exceeds 2000 milliseconds.
Conditional Probability Distributions

When two random variables are defined in a random experiment, knowledge of one can change the probabilities of the other.
Conditional Mean and Variance

Example: From the previous example, calculate P(Y = 1 | X = 3), E(Y | 1), and V(Y | 1).

P(Y = 1 | X = 3) = P(X = 3, Y = 1) / P(X = 3)
                 = f_XY(3, 1) / f_X(3) = 0.25 / 0.55 ≈ 0.454

E(Y | 1) = Σ_y y f_{Y|1}(y)
         = 1(0.05) + 2(0.1) + 3(0.1) + 4(0.75) = 3.55

V(Y | 1) = Σ_y (y − μ_{Y|1})² f_{Y|1}(y)
         = (1 − 3.55)²(0.05) + (2 − 3.55)²(0.1) + (3 − 3.55)²(0.1) + (4 − 3.55)²(0.75)
         ≈ 0.748
Independence

In some random experiments, knowledge of the values of X does not change any of the probabilities associated with the values of Y.
If two random variables are independent, then f_XY(x, y) = f_X(x) f_Y(y) for all x and y.
Multiple Discrete Random Variables

• Joint Probability Distributions
• Multinomial Probability Distribution

Joint Probability Distributions

In some cases, more than two random variables are defined in a random experiment.

• Marginal probability mass function
Joint Probability Distributions

• Mean and Variance
• Conditional Probability Distributions
• Independence
Multinomial Probability Distribution

A joint probability distribution for multiple discrete random variables; it is a useful extension of the binomial distribution.
Multinomial Probability Distribution

Example: Of the 20 bits received, what is the probability that 14 are Excellent, 3 are Good, 2 are Fair, and 1 is Poor? Assume that the classifications of individual bits are independent events and that the probabilities of E, G, F, and P are 0.6, 0.3, 0.08, and 0.02, respectively.

One sequence of 20 bits that produces the specified numbers of bits in each class can be represented as EEEEEEEEEEEEEEGGGFFP.

P(EEEEEEEEEEEEEEGGGFFP) = 0.6¹⁴ × 0.3³ × 0.08² × 0.02¹ ≈ 2.708 × 10⁻⁹

The number of such sequences (permutations of similar objects) = 20! / (14! 3! 2! 1!) = 2,325,600

P(14 E's, 3 G's, 2 F's, 1 P) = 2,325,600 × 2.708 × 10⁻⁹ ≈ 0.0063
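The same computation, written out with factorials:

```python
import math

# Multinomial probability of 14 E's, 3 G's, 2 F's, 1 P among 20 independent bits.
counts = {"E": 14, "G": 3, "F": 2, "P": 1}
probs = {"E": 0.6, "G": 0.3, "F": 0.08, "P": 0.02}

# Multinomial coefficient 20! / (14! 3! 2! 1!).
n = sum(counts.values())
coeff = math.factorial(n)
for c in counts.values():
    coeff //= math.factorial(c)

# Probability of any one specific sequence, times the number of sequences.
p_one_sequence = math.prod(probs[k] ** counts[k] for k in counts)
p_total = coeff * p_one_sequence
print(coeff)              # 2325600
print(round(p_total, 4))  # 0.0063
```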
Two Continuous Random Variables

• Joint Probability Distributions
• Marginal Probability Distributions
• Conditional Probability Distributions
• Independence
Conditional Probability Distributions

Example: For the random variables in the previous example, determine the conditional probability density function for Y given that X = x:

f_{Y|x}(y) = f_XY(x, y) / f_X(x)   for f_X(x) > 0

Determine P(Y > 2000 | x = 1500).
Conditional Probability Distributions

• Mean and Variance

Example: For the random variables in the previous example, determine the conditional mean for Y given that x = 1500.
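For the server example, integrating out y gives f_X(x) = 0.003 e^(−0.003x), so the conditional density of Y given X = x is exponential in (y − x) with rate 0.002 (my own derivation from the stated joint density). A sketch of both requested quantities:

```python
import math

# For f(x, y) = 6e-6 exp(-0.001x - 0.002y) on 0 < x < y, integrating out y gives
# fX(x) = 0.003 exp(-0.003x), so the conditional density is
#   fY|x(y) = f(x, y) / fX(x) = 0.002 exp(-0.002 (y - x)),  y > x.
x = 1500

# P(Y > 2000 | x = 1500) = integral of fY|x from 2000 to infinity.
p = math.exp(-0.002 * (2000 - x))
print(round(p, 4))   # 0.3679 (= 1/e)

# Y - x is exponential with rate 0.002, so E(Y | x) = x + 1/0.002.
cond_mean = x + 1 / 0.002
print(cond_mean)     # 2000.0
```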
Independence

Example: Let the random variables X and Y denote the lengths of two dimensions of a machined part, respectively. Assume that X and Y are independent random variables, and that the distribution of X is normal with mean 10.5 mm and variance 0.0025 mm², and that the distribution of Y is normal with mean 3.2 mm and variance 0.0036 mm².

Determine the probability that 10.4 < X < 10.6 and 3.15 < Y < 3.25.

Because X and Y are independent,
P(10.4 < X < 10.6, 3.15 < Y < 3.25) = P(10.4 < X < 10.6) · P(3.15 < Y < 3.25)
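Because the joint probability factors, each factor is a one-dimensional normal probability; a sketch using the error function:

```python
import math

def normal_cdf(z: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def interval_prob(lo: float, hi: float, mu: float, var: float) -> float:
    """P(lo < W < hi) for W ~ N(mu, var)."""
    sd = math.sqrt(var)
    return normal_cdf((hi - mu) / sd) - normal_cdf((lo - mu) / sd)

# X ~ N(10.5, 0.0025), Y ~ N(3.2, 0.0036); by independence the joint
# probability is the product of the two marginal probabilities.
p = interval_prob(10.4, 10.6, 10.5, 0.0025) * interval_prob(3.15, 3.25, 3.2, 0.0036)
print(round(p, 3))  # ≈ 0.568
```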
Multiple Continuous Random Variables

• Marginal Probability
• Mean and Variance
• Independence
Covariance and Correlation

When two or more random variables are defined on a probability space, it is useful to describe how they vary together and to measure the relationship between the variables.
Covariance

Covariance is a measure of the linear relationship between random variables. It is defined through the expected value of a function of two random variables, h(X, Y).
Covariance

σXY = E[(Y − μY)(X − μX)]
    = ∫∫ (x − μX)(y − μY) f_XY(x, y) dx dy
    = ∫∫ [xy − μX y − x μY + μX μY] f_XY(x, y) dx dy        (1)

Now

∫∫ μX y f_XY(x, y) dx dy = μX ∫∫ y f_XY(x, y) dx dy        (2)

From E(h(Y)) = ∫∫ h(y) f_XY(x, y) dx dy, taking h(y) = y gives

E(Y) = ∫∫ y f_XY(x, y) dx dy = μY

Substituting into (2),

∫∫ μX y f_XY(x, y) dx dy = μX μY,  and similarly  ∫∫ x μY f_XY(x, y) dx dy = μX μY

Substituting into (1),

E[(Y − μY)(X − μX)] = ∫∫ xy f_XY(x, y) dx dy − μX μY − μX μY + μX μY
                    = ∫∫ xy f_XY(x, y) dx dy − μX μY
                    = E(XY) − μX μY

(All double integrals run over −∞ < x < ∞ and −∞ < y < ∞.)
Covariance

Example: For the discrete random variables X and Y with the joint distribution shown in the figure, determine σXY and ρXY.
Correlation

The correlation is a measure of the linear relationship between random variables, and it is easier to interpret than the covariance.

For independent random variables, ρXY = 0.
Correlation

Example: Two random variables have the joint density
f_XY(x, y) = xy / 16
Calculate the covariance and correlation between X and Y.
Bivariate Normal Distribution

• Correlation
• Marginal distributions
• Dependence
Bivariate Normal Distribution

Conditional probability:

μ_{Y|x} = μY + ρ (σY / σX)(x − μX)

σ²_{Y|x} = σY² (1 − ρ²)
Bivariate Normal Distribution

Example: Suppose that the X and Y dimensions of an injection-molded part have a bivariate normal distribution with σx = 0.04, σy = 0.08, μx = 3.00, μy = 7.70, ρ = 0.8.
Find P(2.95 < X < 3.05, 7.60 < Y < 7.80).
Bivariate Normal Distribution

Example: Let X and Y be the milliliters of acid and base needed for equivalence, respectively. Assume X and Y have a bivariate normal distribution with σx = 5, σy = 2, μx = 120, μy = 100, ρ = 0.6. Find:

• The covariance between X and Y
• The marginal probability distribution of X
• P(X < 116)
• The conditional probability distribution of X given Y = 102
• P(X < 116 | Y = 102)
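A sketch working through all five parts with the bivariate normal formulas above (the numeric values are my own evaluation):

```python
import math

def normal_cdf(z: float) -> float:
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu_x, mu_y = 120.0, 100.0
sd_x, sd_y = 5.0, 2.0
rho = 0.6

# Covariance between X and Y: sigma_XY = rho * sigma_X * sigma_Y.
cov = rho * sd_x * sd_y
print(cov)                            # 6.0

# Marginally, X ~ N(120, 25).
p1 = normal_cdf((116 - mu_x) / sd_x)
print(round(p1, 4))                   # P(X < 116) ≈ 0.2119

# Conditional distribution of X given Y = 102 (bivariate normal formulas):
mu_cond = mu_x + rho * (sd_x / sd_y) * (102 - mu_y)   # 123.0
sd_cond = sd_x * math.sqrt(1 - rho ** 2)              # 4.0
p2 = normal_cdf((116 - mu_cond) / sd_cond)
print(mu_cond, sd_cond, round(p2, 4))  # P(X < 116 | Y = 102) ≈ 0.0401
```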
Linear Combination of Random Variables

• Mean and Variance
Linear Combination of Random Variables

Example: A semiconductor product consists of 3 layers. The variances in thickness of the first, second, and third layers are 25, 40, and 30 nm², respectively. What is the variance of the thickness of the final product?

Let X1, X2, X3, and X be random variables that denote the thicknesses of the respective layers and of the final product. Assuming the layer thicknesses are independent,
V(X) = V(X1) + V(X2) + V(X3) = 25 + 40 + 30 = 95 nm²
Discrete Joint Probability Distribution

Given a bag containing 3 black balls, 2 blue balls, and 3 green balls, a random sample of 4 balls is selected. Find the Joint Probability Distribution of X and Y, where X is the number of black balls and Y is the number of blue balls in the sample.

f(x, y)        y = 0    y = 1    y = 2    Row Totals
x = 0          0        2/70     3/70     5/70
x = 1          3/70     18/70    9/70     30/70
x = 2          9/70     18/70    3/70     30/70
x = 3          3/70     2/70     0        5/70
Column Totals  15/70    40/70    15/70    1
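The whole table can be generated from the hypergeometric formula; a minimal sketch:

```python
from fractions import Fraction
from math import comb

# Joint pmf of X (black) and Y (blue) when 4 balls are drawn without replacement
# from 3 black, 2 blue, and 3 green balls (multivariate hypergeometric):
#   f(x, y) = C(3, x) C(2, y) C(3, 4 - x - y) / C(8, 4),  with C(8, 4) = 70.
def f(x: int, y: int) -> Fraction:
    g = 4 - x - y                      # green balls in the sample
    if g < 0 or g > 3:
        return Fraction(0)
    return Fraction(comb(3, x) * comb(2, y) * comb(3, g), comb(8, 4))

table = {(x, y): f(x, y) for x in range(4) for y in range(3)}
print(table[(1, 1)])                   # 18/70 = 9/35
print(sum(table.values()))             # 1
```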
Marginal Probability Distributions

f(x, y)        y = 0    y = 1    y = 2    Marginal Probability of X
x = 0          0        2/70     3/70     5/70
x = 1          3/70     18/70    9/70     30/70
x = 2          9/70     18/70    3/70     30/70
x = 3          3/70     2/70     0        5/70
Marginal
Probability
of Y           15/70    40/70    15/70    1
Independence
Quiz # 4

Given a bag containing 3 black balls, 2 blue balls, and 3 green balls, a random sample of 4 balls is selected. Find the Joint Probability Distribution of X and Y, where X is the number of black balls and Y is the number of blue balls in the sample.