Two Dimensional Random Variables
2.1 Introduction: So far, we have been concerned with the properties of a single random variable defined on a given sample space. In real-life situations, however, we often deal with two or more random variables defined on the same sample space. For example, in tossing a coin, one observer may be interested in the occurrence of heads in every trial, while another may check for tails simultaneously. Similarly, in a manufacturing industry, a quality inspector may measure both the length and the width of every unit produced in order to check whether the product conforms to the set specifications; this too is a situation where a two-dimensional random variable is useful.
Definition 2.1: Let S be the sample space of a random experiment, and let X and Y be two random variables defined on S. Then the pair (X, Y) is called a two dimensional random variable, a bi-variate random variable, or a random vector.
Example 1: Consider the random experiment of tossing two fair coins once. Let X denote the number of heads obtained and Y denote the number of tails obtained. Then the random variable (X, Y) is bi-variate. Note that X and Y are discrete random variables.
Example 2: An electronic component has two fuses. Let the time at which fuse one blows be a random variable X and the time at which fuse two blows be another random variable Y. Then (X, Y) is a two dimensional random variable. Note that X and Y are continuous random variables.
2.2. Two Dimensional Discrete Random Variable
If the possible values (x_i, y_j), i = 1, 2, ..., m; j = 1, 2, ..., n, of (X, Y) are finite or countably infinite, then (X, Y) is called a two dimensional discrete random variable.
2.3. Probability Function of (X, Y) or Joint Probability Mass Function of a Discrete Random Variable
Let (X, Y) be a two dimensional discrete random variable on a sample space S, where X can take the values x_1, x_2, ..., x_m and Y can take the values y_1, y_2, ..., y_n, with
P(X = x_i, Y = y_j) = P(x_i, y_j) = p_ij, called the probability of x_i and y_j.
The function p_ij is called the joint probability mass function (pmf) of the random variables X and Y if it satisfies the following conditions:
(i) p_ij ≥ 0 for all i and j, and
(ii) Σ_i Σ_j p_ij = 1.
General Joint Probability Table

 X \ Y   y_1    y_2   ...   y_j   ...   y_n   | Total P(X = x_i)
 x_1     p_11   p_12  ...   p_1j  ...   p_1n  | p_1•
 x_2     p_21   p_22  ...   p_2j  ...   p_2n  | p_2•
 ...     ...    ...   ...   ...   ...   ...   | ...
 x_i     p_i1   p_i2  ...   p_ij  ...   p_in  | p_i•
 ...     ...    ...   ...   ...   ...   ...   | ...
 x_m     p_m1   p_m2  ...   p_mj  ...   p_mn  | p_m•
 Total   p_•1   p_•2  ...   p_•j  ...   p_•n  | 1
 P(Y = y_j)

Here p_i• denotes the i-th row total and p_•j the j-th column total.
The set of all triplets {(x_i, y_j, p_ij); i = 1, 2, ..., m; j = 1, 2, ..., n} is called the joint probability distribution of (X, Y).
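As a quick concreteness check, the two pmf conditions can be verified mechanically once the joint pmf is stored as a mapping from value pairs to probabilities. A minimal Python sketch, using the coin-tossing example above (X = number of heads, Y = number of tails in two tosses of a fair coin):

```python
from fractions import Fraction as F

# Joint pmf stored as a dict mapping (x_i, y_j) to p_ij.
# Two fair coins: X = number of heads, Y = number of tails.
pmf = {(0, 2): F(1, 4), (1, 1): F(1, 2), (2, 0): F(1, 4)}

# Condition (i): every p_ij is non-negative.
assert all(p >= 0 for p in pmf.values())

# Condition (ii): the probabilities sum to 1.
assert sum(pmf.values()) == 1
print("valid joint pmf")
```

Using exact fractions avoids floating-point rounding when checking that the probabilities sum to exactly 1.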
2.4. Marginal Probabilities
(i) Marginal Probability Function of X:
Suppose the joint distribution of two random variables X and Y is given. Then the probability distribution of X is determined as follows:
P(X = x_i) = P(X = x_i, Y = y_1) + P(X = x_i, Y = y_2) + ...
           = p_i1 + p_i2 + ... = Σ_j p_ij = p_i•,
which is called the marginal probability function of X.
The collection of pairs {(x_i, p_i•); i = 1, 2, ...} is called the marginal probability distribution of X.
(ii) Marginal Probability Function of Y:
Suppose the joint distribution of two random variables X and Y is given. Then the probability distribution of Y is determined as follows:
P(Y = y_j) = P(X = x_1, Y = y_j) + P(X = x_2, Y = y_j) + ...
           = p_1j + p_2j + ... = Σ_i p_ij = p_•j,
which is called the marginal probability function of Y.
The collection of pairs {(y_j, p_•j); j = 1, 2, ...} is called the marginal probability distribution of Y.
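The row-and-column summations above translate directly into code. A small Python sketch (the 2x2 table used here is the machine-malfunction pmf from a later worked example, kept exact with fractions):

```python
from collections import defaultdict
from fractions import Fraction as F

# Joint pmf of a later worked example (machine malfunctions X,
# emergency calls Y), with probabilities kept exact as fractions.
pmf = {(1, 1): F(9, 20), (1, 2): F(1, 20),   # 0.45, 0.05
       (2, 1): F(2, 20), (2, 2): F(8, 20)}   # 0.10, 0.40

def marginals(pmf):
    """Return (P(X = x), P(Y = y)) by summing the joint pmf over the other variable."""
    px, py = defaultdict(F), defaultdict(F)
    for (x, y), p in pmf.items():
        px[x] += p   # p_i. : sum over j
        py[y] += p   # p_.j : sum over i
    return dict(px), dict(py)

px, py = marginals(pmf)
print(px)  # P(X=1) = P(X=2) = 1/2
print(py)  # P(Y=1) = 11/20, P(Y=2) = 9/20
```

One pass over the table fills both marginals at once, mirroring the row and column totals of the general joint probability table.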
2.5. Conditional Probabilities
(i) Conditional Probability of X given Y:
Suppose X and Y are two random variables with joint probability mass function p_ij = P(X = x_i, Y = y_j) = P(x_i, y_j). Then
P(X = x_i | Y = y_j) = P(X = x_i, Y = y_j) / P(Y = y_j) = P(x_i, y_j) / p_•j = p_ij / p_•j
is called the conditional probability function of X given Y = y_j.
The collection of pairs {(x_i, p_ij / p_•j); i = 1, 2, ...} is called the conditional probability distribution of X given Y = y_j.
(ii) Conditional Probability of Y given X:
Suppose X and Y are two random variables with joint probability mass function p_ij = P(X = x_i, Y = y_j) = P(x_i, y_j). Then
P(Y = y_j | X = x_i) = P(X = x_i, Y = y_j) / P(X = x_i) = P(x_i, y_j) / p_i• = p_ij / p_i•
is called the conditional probability function of Y given X = x_i.
The collection of pairs {(y_j, p_ij / p_i•); j = 1, 2, ...} is called the conditional probability distribution of Y given X = x_i.
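The division by the marginal p_•j can be sketched the same way. A short Python illustration; the table here is the k(2x + y) pmf with k = 1/18 from a later worked example:

```python
from fractions import Fraction as F

# Conditional pmf P(X = x | Y = y0) = p_x,y0 / p_.y0, illustrated with
# P(X = x, Y = y) = (2x + y)/18 for x, y = 1, 2.
pmf = {(x, y): F(2 * x + y, 18) for x in (1, 2) for y in (1, 2)}

def conditional_x_given_y(pmf, y0):
    # Marginal p_.y0 : sum of the column Y = y0.
    p_col = sum(p for (x, y), p in pmf.items() if y == y0)
    # Divide each cell of that column by the column total.
    return {x: p / p_col for (x, y), p in pmf.items() if y == y0}

cond = conditional_x_given_y(pmf, 2)
print(cond)  # {1: Fraction(2, 5), 2: Fraction(3, 5)}
```

Note that the conditional probabilities again sum to 1, as any probability distribution must.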
2.6 Joint Distribution Function of the Discrete Random Variable
The joint cumulative distribution function, or briefly the joint distribution function, of a two dimensional discrete random variable (X, Y) is defined by
F_XY(x, y) = F(x, y) = P(X ≤ x, Y ≤ y) = Σ_{y_j ≤ y} Σ_{x_i ≤ x} p_ij.
The joint distribution function F(x, y) has the following properties:
(i) F(-∞, y) = 0 = F(x, -∞) and F(∞, ∞) = 1
(ii) P{a < X ≤ b, Y ≤ y} = F(b, y) - F(a, y)
(iii) P{X ≤ x, c < Y ≤ d} = F(x, d) - F(x, c)
(iv) P{a < X ≤ b, c < Y ≤ d} = F(b, d) - F(a, d) - F(b, c) + F(a, c)
2.7. Independent Random Variables
Two random variables X and Y are said to be independent if
P(X = x_i, Y = y_j) = P(X = x_i) · P(Y = y_j), that is, p_ij = p_i• · p_•j for all i and j;
otherwise, they are said to be dependent.
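A brute-force independence test simply compares every cell with the product of its marginals. A Python sketch:

```python
from fractions import Fraction as F
from itertools import product

def is_independent(pmf):
    """Check p_ij == p_i. * p_.j for every pair of values of a dict-based joint pmf."""
    xs = sorted({x for x, _ in pmf})
    ys = sorted({y for _, y in pmf})
    px = {x: sum(pmf.get((x, y), F(0)) for y in ys) for x in xs}
    py = {y: sum(pmf.get((x, y), F(0)) for x in xs) for y in ys}
    return all(pmf.get((x, y), F(0)) == px[x] * py[y] for x, y in product(xs, ys))

# Independent case: two separately tossed fair coins; every cell is (1/2)(1/2) = 1/4.
fair = {(x, y): F(1, 4) for x in (0, 1) for y in (0, 1)}
print(is_independent(fair))   # True

# Dependent case: the coin example at the start of the chapter
# (X = heads, Y = tails in two tosses): p_{0,2} = 1/4 but p_0. * p_.2 = (1/4)(1/4).
coins = {(0, 2): F(1, 4), (1, 1): F(1, 2), (2, 0): F(1, 4)}
print(is_independent(coins))  # False
```

Missing cells are treated as probability 0, which is why the check runs over the full product of the observed X and Y values.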
Example 1: The joint probability function of two discrete random variables X and Y is given by
f(x, y) = c(2x + y) for x = 0, 1, 2 and y = 0, 1, 2, 3, and f(x, y) = 0 otherwise.
(a) Find the value of c. (b) Find P(X ≥ 1, Y ≤ 2).
Solution: (a) Given f(x, y) = c(2x + y); x = 0, 1, 2; y = 0, 1, 2, 3, tabulate the probabilities:

 X \ Y   0     1     2     3    | Total
 0       0     c     2c    3c   | 6c
 1       2c    3c    4c    5c   | 14c
 2       4c    5c    6c    7c   | 22c
 Total   6c    9c    12c   15c  | 42c

By definition, we know that Σ_i Σ_j p_ij = 1, i.e., 42c = 1, so c = 1/42.
Therefore, the joint probability distribution of (X, Y) is

 X \ Y   0      1      2      3     | Total
 0       0      1/42   2/42   3/42  | 6/42
 1       2/42   3/42   4/42   5/42  | 14/42
 2       4/42   5/42   6/42   7/42  | 22/42
 Total   6/42   9/42   12/42  15/42 | 1

(b) P(X ≥ 1, Y ≤ 2) = P(X = 1, Y = 0) + P(X = 1, Y = 1) + P(X = 1, Y = 2)
                    + P(X = 2, Y = 0) + P(X = 2, Y = 1) + P(X = 2, Y = 2)
                    = 2/42 + 3/42 + 4/42 + 4/42 + 5/42 + 6/42 = 24/42 = 4/7.
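The normalization and the probability in part (b) can be cross-checked with exact rational arithmetic; a short Python sketch:

```python
from fractions import Fraction as F

# Re-deriving the example above: f(x, y) = c(2x + y) on x = 0, 1, 2 and y = 0, 1, 2, 3.
total = sum(2 * x + y for x in range(3) for y in range(4))
c = F(1, total)
print(c)          # 1/42

# P(X >= 1, Y <= 2): sum the six cells with x in {1, 2} and y in {0, 1, 2}.
p = sum(c * (2 * x + y) for x in (1, 2) for y in (0, 1, 2))
print(p)          # 4/7  (= 24/42)
```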
Example 2: Let X denote the number of times (1 or 2) a certain numerical control machine will malfunction on any given day, and let Y denote the number of times a technician is called on an emergency call. Their joint probability distribution is given as

 X \ Y   1      2
 1       0.45   0.05
 2       0.10   0.40

Find the marginal probability functions of X and Y.
Solution: (i) Marginal probability function of X:
P(X = 1) = P(X = 1, Y = 1) + P(X = 1, Y = 2) = 0.45 + 0.05 = 0.50.
P(X = 2) = P(X = 2, Y = 1) + P(X = 2, Y = 2) = 0.10 + 0.40 = 0.50.
The probability distribution of X is

 X = x      1      2
 P(X = x)   0.50   0.50

(ii) Marginal probability function of Y:
P(Y = 1) = P(X = 1, Y = 1) + P(X = 2, Y = 1) = 0.45 + 0.10 = 0.55.
P(Y = 2) = P(X = 1, Y = 2) + P(X = 2, Y = 2) = 0.05 + 0.40 = 0.45.
The probability distribution of Y is

 Y = y      1      2
 P(Y = y)   0.55   0.45
Example 3: The joint probability mass function of a bi-variate random variable (X, Y) is given by
P(X = x, Y = y) = k(2x + y) for x = 1, 2 and y = 1, 2, and 0 otherwise,
where k is a constant. Find (i) the value of k and (ii) P(X = x | Y = 2).
Solution: (i) The joint probability matrix with the marginal probabilities of X and Y can be written as

 X \ Y        1     2     | P(X = x_i)
 1            3k    4k    | 7k
 2            5k    6k    | 11k
 P(Y = y_j)   8k    10k   | 18k

By definition, we know that Σ_i Σ_j p_ij = 1. That is, 18k = 1, which implies k = 1/18.
(ii) We know that P(X = x | Y = 2) = P(X = x, Y = 2) / P(Y = 2).
Therefore, P(X = 1 | Y = 2) = P(X = 1, Y = 2) / P(Y = 2) = 4k/10k = 2/5,
and P(X = 2 | Y = 2) = P(X = 2, Y = 2) / P(Y = 2) = 6k/10k = 3/5.
Example 4: The joint probability distribution of the number X of cars and the number Y of buses per signal cycle at a proposed left-turn lane is displayed in the accompanying joint probability table:

 Y \ X   1      2     3
 1       1/12   1/6   0
 2       0      1/9   1/5
 3       1/18   1/4   2/15

(i) Evaluate the marginal distributions of X and Y.
(ii) Find the conditional distribution of X given Y = 2.
(iii) Find the conditional distribution of Y given X = 3.
(iv) Find P(X ≤ 2, Y = 3).
Solution: The joint probability distribution table with the marginal probabilities of X and Y can be written as

 Y \ X      1      2      3     | P(Y = y)
 1          1/12   1/6    0     | 1/4
 2          0      1/9    1/5   | 14/45
 3          1/18   1/4    2/15  | 79/180
 P(X = x)   5/36   19/36  1/3   | 1

(i) Marginal distribution of X:
P(X = 1) = P(X = 1, Y = 1) + P(X = 1, Y = 2) + P(X = 1, Y = 3) = 1/12 + 0 + 1/18 = 5/36
P(X = 2) = P(X = 2, Y = 1) + P(X = 2, Y = 2) + P(X = 2, Y = 3) = 1/6 + 1/9 + 1/4 = 19/36
P(X = 3) = P(X = 3, Y = 1) + P(X = 3, Y = 2) + P(X = 3, Y = 3) = 0 + 1/5 + 2/15 = 1/3
Marginal distribution of Y:
P(Y = 1) = P(X = 1, Y = 1) + P(X = 2, Y = 1) + P(X = 3, Y = 1) = 1/12 + 1/6 + 0 = 1/4
P(Y = 2) = P(X = 1, Y = 2) + P(X = 2, Y = 2) + P(X = 3, Y = 2) = 0 + 1/9 + 1/5 = 14/45
P(Y = 3) = P(X = 1, Y = 3) + P(X = 2, Y = 3) + P(X = 3, Y = 3) = 1/18 + 1/4 + 2/15 = 79/180
(ii) Conditional distribution of X given Y = 2:
P(X = x | Y = 2) = P(X = x, Y = 2) / P(Y = 2), where x = 1, 2, 3.
P(X = 1 | Y = 2) = P(X = 1, Y = 2) / P(Y = 2) = 0
P(X = 2 | Y = 2) = P(X = 2, Y = 2) / P(Y = 2) = (1/9) / (14/45) = 5/14
P(X = 3 | Y = 2) = P(X = 3, Y = 2) / P(Y = 2) = (1/5) / (14/45) = 9/14
(iii) Conditional distribution of Y given X = 3:
P(Y = y | X = 3) = P(X = 3, Y = y) / P(X = 3), where y = 1, 2, 3.
P(Y = 1 | X = 3) = P(X = 3, Y = 1) / P(X = 3) = 0
P(Y = 2 | X = 3) = P(X = 3, Y = 2) / P(X = 3) = (1/5) / (1/3) = 3/5
P(Y = 3 | X = 3) = P(X = 3, Y = 3) / P(X = 3) = (2/15) / (1/3) = 2/5
(iv) P(X ≤ 2, Y = 3) = P(X = 1, Y = 3) + P(X = 2, Y = 3) = 1/18 + 1/4 = 11/36.
Example 5: A service station has both self-service and full-service islands. On each island, there is a single regular unleaded pump with two hoses. Let X denote the number of hoses being used on the self-service island at a particular time, and let Y denote the number of hoses in use on the full-service island at that time. The joint probability mass function of X and Y appears in the accompanying tabulation:

 X \ Y   0      1      2
 0       0.10   0.04   0.02
 1       0.08   0.20   0.06
 2       0.06   0.14   0.30

Show that the random variables X and Y are not independent.
Solution: If X and Y are independent then, by definition, p_ij = p_i• · p_•j for all i and j.
From the above table, we have p_00 = 0.10, p_0• = 0.10 + 0.04 + 0.02 = 0.16, and p_•0 = 0.10 + 0.08 + 0.06 = 0.24.
Now p_0• · p_•0 = 0.16 × 0.24 = 0.0384 ≠ 0.10 = p_00, that is, p_0• · p_•0 ≠ p_00.
Therefore, X and Y are not independent.
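The counterexample cell can be checked numerically; a short Python sketch of the table above:

```python
# The (0, 0) cell already violates p_ij = p_i. * p_.j,
# so X and Y cannot be independent.
pmf = {(0, 0): 0.10, (0, 1): 0.04, (0, 2): 0.02,
       (1, 0): 0.08, (1, 1): 0.20, (1, 2): 0.06,
       (2, 0): 0.06, (2, 1): 0.14, (2, 2): 0.30}

p0_dot = sum(pmf[(0, y)] for y in range(3))   # row marginal p_0.  = 0.16
p_dot0 = sum(pmf[(x, 0)] for x in range(3))   # column marginal p_.0 = 0.24
print(round(p0_dot * p_dot0, 4))              # 0.0384, not 0.10 = p_00
```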
Exercise 2.1.
1. The joint probability mass function of a bi-variate discrete random variable (X, Y) is given by the following table:

 X \ Y   1     2
 1       0.1   0.2
 2       0.1   0.3
 3       0.2   0.1

Find the marginal probability mass functions of X and Y.
Solution: The marginal probability of X is
P(x) = 0.3 for x = 1; 0.4 for x = 2; 0.3 for x = 3,
and the marginal probability of Y is
P(y) = 0.4 for y = 1; 0.6 for y = 2.

2. The joint distribution of X and Y is given by f(x, y) = (x + y)/21; x = 1, 2, 3; y = 1, 2. Find the marginal distributions.
Solution: The marginal distribution of X is
P(x) = 5/21 for x = 1; 7/21 for x = 2; 9/21 for x = 3,
and the marginal distribution of Y is
P(y) = 9/21 for y = 1; 12/21 for y = 2.
3. The joint probability distribution of two random variables X and Y is given below:

 Y \ X   1      2      3      4
 1       4/36   2/36   5/36   1/36
 2       1/36   3/36   1/36   2/36
 3       3/36   3/36   1/36   1/36
 4       2/36   1/36   1/36   5/36

Find (a) the marginal distributions of X and Y;
(b) the conditional distribution of X given Y = 1 and that of Y given X = 2.
Solution: (a) The marginal distributions of X and Y are
P(x) = 10/36 for x = 1; 9/36 for x = 2; 8/36 for x = 3; 9/36 for x = 4; 0 otherwise,
and
P(y) = 12/36 for y = 1; 7/36 for y = 2; 8/36 for y = 3; 9/36 for y = 4; 0 otherwise.
(b) The conditional distributions of X given Y = 1 and of Y given X = 2 are
P(X = x | Y = 1) = 1/3 for x = 1; 1/6 for x = 2; 5/12 for x = 3; 1/12 for x = 4; 0 otherwise,
and
P(Y = y | X = 2) = 2/9 for y = 1; 1/3 for y = 2; 1/3 for y = 3; 1/9 for y = 4; 0 otherwise.
4. The following is the joint distribution of X and Y, where X represents the number of girls and Y represents the number of runs:

 X \ Y   1      2      3
 0       0.14   0      0
 1       0.26   0.13   0
 2       0.11   0      0.24
 3       0      0.12   0

Find i) the marginal probability of one girl;
ii) the conditional probability of one girl given a run of two;
iii) whether X and Y are independent.
Solution: i) P(X = 1) = 0.26 + 0.13 + 0 = 0.39.
ii) P(X = 1 | Y = 2) = P(X = 1, Y = 2)/P(Y = 2) = 0.13/0.25 = 0.52.
iii) Since P(X = 1 | Y = 2) = 0.52 ≠ 0.39 = P(X = 1), X and Y are not independent.
2.8. Two Dimensional Continuous Random Variable
If (X, Y) can assume all values in a specified region R of the xy-plane, then (X, Y) is called a two dimensional continuous random variable.
2.9. Joint Probability Density Function
If (X, Y) is a two dimensional continuous random variable such that
P{x - dx/2 ≤ X ≤ x + dx/2 and y - dy/2 ≤ Y ≤ y + dy/2} = f(x, y) dx dy,
then the function f(x, y) is said to be a joint probability density function if it satisfies the following conditions:
(i) f(x, y) ≥ 0 for -∞ < x < ∞, -∞ < y < ∞
(ii) ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = 1
2.10. Marginal Probability Density Functions
Let (X, Y) be a continuous bivariate random variable with joint probability density function f(x, y). The marginal probability density functions of X and Y, f(x) and f(y) respectively, are defined as
(i) f(x) = f_X(x) = ∫_{-∞}^{∞} f(x, y) dy for -∞ < x < ∞, and
(ii) f(y) = f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx for -∞ < y < ∞.
2.11. Conditional Probabilities
(i) Conditional Probability Density Function of X given Y:
If (X, Y) is a continuous bi-variate random variable with joint probability density function f(x, y), then the conditional probability density function of X given Y = y is
f(x | y) = f(x, y) / f_Y(y), provided f_Y(y) > 0.
(ii) Conditional Probability Density Function of Y given X:
If (X, Y) is a continuous bi-variate random variable with joint probability density function f(x, y), then the conditional probability density function of Y given X = x is
f(y | x) = f(x, y) / f_X(x), provided f_X(x) > 0.
2.12. Joint Distribution Function of the Continuous Random Variable
The joint cumulative distribution function, or briefly the joint distribution function, of a two dimensional continuous random variable (X, Y) is defined as follows:
F_XY(x, y) = F(x, y) = P(X ≤ x, Y ≤ y) = ∫_{-∞}^{x} ∫_{-∞}^{y} f(u, v) dv du.
The joint distribution function F(x, y) has the following properties:
(i) F(-∞, y) = 0 = F(x, -∞) and F(∞, ∞) = 1
(ii) P{a < X ≤ b, Y ≤ y} = F(b, y) - F(a, y)
(iii) P{X ≤ x, c < Y ≤ d} = F(x, d) - F(x, c)
(iv) P{a < X ≤ b, c < Y ≤ d} = F(b, d) - F(a, d) - F(b, c) + F(a, c)
(v) At points of continuity of f(x, y),
∂²F(x, y)/∂x∂y = f(x, y).
2.13. Independent Random Variables
Two random variables X and Y are said to be independent if f(x, y) = f_X(x) · f_Y(y); otherwise they are said to be dependent.
Example 6: A candy company distributes boxes of chocolates with a mixture of creams, toffees and nuts coated in both light and dark chocolate. For a randomly selected box, let X and Y respectively be the proportions, and suppose that the joint density function is
f(x, y) = (2/5)(2x + 3y) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.
Check whether f(x, y) is a joint probability density function.
Solution: Consider
∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = ∫_0^1 ∫_0^1 (2/5)(2x + 3y) dx dy
= ∫_0^1 [ 2x²/5 + 6xy/5 ]_{x=0}^{1} dy
= ∫_0^1 (2/5 + 6y/5) dy
= [ 2y/5 + 3y²/5 ]_0^1 = 2/5 + 3/5 = 1.
Since f(x, y) ≥ 0 on its support and integrates to 1,
f(x, y) = (2/5)(2x + 3y), 0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
is a joint probability density function of the random variables X and Y.
Example 7: The co-ordinates of a laser on a circular target are given by the random vector (X, Y) with the probability density function
f(x, y) = Cx²(8 - y), x ≤ y ≤ 2x, 0 ≤ x ≤ 2. Find C.
Solution: By definition, we know that ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dy dx = 1.
That is, ∫_0^2 ∫_x^{2x} Cx²(8 - y) dy dx = 1.
This implies that C ∫_0^2 x² [ 8y - y²/2 ]_{y=x}^{2x} dx = 1.
That is, C ∫_0^2 (8x³ - 3x⁴/2) dx = 1.
Evaluating, C(32 - 48/5) = 1, which implies that C = 5/112.
Example 8: Let X be the proportion of persons who will respond to one kind of mail-order solicitation and let Y be the proportion of persons who will respond to a second type of mail-order solicitation. Suppose the joint probability density function of the two-dimensional random variable (X, Y) is given by
f(x, y) = xy² + x²/8, 0 ≤ x ≤ 2, 0 ≤ y ≤ 1.
Compute P(X ≥ 1), P(Y < 1/2) and P(X ≥ 1 | Y < 1/2).
Solution:
(i) P(X ≥ 1) = ∫_0^1 ∫_1^2 (xy² + x²/8) dx dy = ∫_0^1 [ x²y²/2 + x³/24 ]_{x=1}^{2} dy
= ∫_0^1 (3y²/2 + 7/24) dy = 1/2 + 7/24 = 19/24.
(ii) P(Y < 1/2) = ∫_0^{1/2} ∫_0^2 (xy² + x²/8) dx dy = ∫_0^{1/2} (2y² + 1/3) dy = 1/12 + 1/6 = 1/4.
(iii) P(X ≥ 1, Y < 1/2) = ∫_0^{1/2} ∫_1^2 (xy² + x²/8) dx dy = ∫_0^{1/2} (3y²/2 + 7/24) dy = 1/16 + 7/48 = 5/24.
(iv) P(X ≥ 1 | Y < 1/2) = P(X ≥ 1, Y < 1/2) / P(Y < 1/2) = (5/24) / (1/4) = 5/6.
Example 9: The joint density of the two dimensional random variable (X, Y), where X is the unit temperature change and Y is the proportion of spectrum shift that a certain atomic particle produces, is
f(x, y) = k(x + y) for 0 ≤ x ≤ 2, 0 ≤ y ≤ 2, and 0 otherwise,
where k is a constant. Determine the value of k and compute the marginal density functions of X and Y.
Solution: (i) By definition, we know that ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = 1.
That is, ∫_0^2 ∫_0^2 k(x + y) dx dy = 1.
This implies that k ∫_0^2 (2 + 2y) dy = 1.
That is, k [ 2y + y² ]_0^2 = 8k = 1.
This implies that k = 1/8.
(ii) The marginal probability density function of X is given by
f_X(x) = ∫_{-∞}^{∞} f(x, y) dy = (1/8) ∫_0^2 (x + y) dy = (1/8)[ xy + y²/2 ]_0^2 = (x + 1)/4, 0 ≤ x ≤ 2.
The marginal probability density function of Y is given by
f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx = (1/8) ∫_0^2 (x + y) dx = (1/8)[ x²/2 + xy ]_0^2 = (y + 1)/4, 0 ≤ y ≤ 2.
Example 10: The fraction X of male runners and the fraction Y of female runners who compete in marathon races are described by the joint density function
f(x, y) = 2 for 0 ≤ y ≤ x ≤ 1, and 0 otherwise.
Obtain the marginal and conditional probability density functions.
Solution: The marginal probability density functions are:
(i) the marginal probability density function of X,
f_X(x) = ∫_{-∞}^{∞} f(x, y) dy = ∫_0^x 2 dy = [ 2y ]_0^x = 2x, 0 ≤ x ≤ 1;
(ii) the marginal probability density function of Y,
f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx = ∫_y^1 2 dx = [ 2x ]_y^1 = 2(1 - y), 0 ≤ y ≤ 1.
The conditional probability density functions are:
(i) the conditional probability density function of X given Y,
f(x | y) = f(x, y)/f_Y(y) = 2 / (2(1 - y)) = 1/(1 - y), y ≤ x ≤ 1;
(ii) the conditional probability density function of Y given X,
f(y | x) = f(x, y)/f_X(x) = 2/(2x) = 1/x, 0 ≤ y ≤ x.
Example 11: If the joint distribution function of X and Y is given by
F(x, y) = (1 - e^{-x})(1 - e^{-y}) for x > 0, y > 0, and 0 otherwise,
find the marginal densities of X and Y.
Solution: Given F(x, y) = (1 - e^{-x})(1 - e^{-y}) = 1 - e^{-x} - e^{-y} + e^{-(x+y)}.
The joint probability density function is given by
f(x, y) = ∂²F(x, y)/∂x∂y = ∂/∂x [ e^{-y} - e^{-(x+y)} ] = e^{-(x+y)}.
Therefore, f(x, y) = e^{-(x+y)} for x > 0, y > 0, and 0 otherwise.
The marginal probability density function of X is
f_X(x) = ∫_{-∞}^{∞} f(x, y) dy = ∫_0^{∞} e^{-(x+y)} dy = [ -e^{-(x+y)} ]_0^{∞} = e^{-x}, x > 0.
The marginal probability density function of Y is
f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx = ∫_0^{∞} e^{-(x+y)} dx = [ -e^{-(x+y)} ]_0^{∞} = e^{-y}, y > 0.
Example 12: A manufacturer has been using two different manufacturing processes to make computer memory chips. Let (X, Y) be a bi-variate random variable, where X denotes the time to failure of chips made by process A and Y denotes the time to failure of chips made by process B. Assume that the joint probability density function of (X, Y) is
f(x, y) = 8xy for 0 ≤ y ≤ x ≤ 1, and 0 otherwise.
Are X and Y independent?
Solution: X and Y are independent if f(x, y) = f_X(x) · f_Y(y).
The marginal probability density function of X is
f_X(x) = ∫_{-∞}^{∞} f(x, y) dy = ∫_0^x 8xy dy = 8x [ y²/2 ]_0^x = 4x³, 0 ≤ x ≤ 1.
Similarly, the marginal probability density function of Y is
f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx = ∫_y^1 8xy dx = 8y [ x²/2 ]_y^1 = 4y(1 - y²), 0 ≤ y ≤ 1.
Therefore, f_X(x) · f_Y(y) = (4x³) · 4y(1 - y²) = 16x³y(1 - y²) ≠ 8xy = f(x, y).
Since f(x, y) ≠ f_X(x) · f_Y(y), we conclude that X and Y are not independent.
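Independence can also be refuted by a single-point check; a short Python sketch evaluating both sides at an arbitrarily chosen interior point (x, y) = (0.8, 0.4):

```python
# At (x, y) = (0.8, 0.4) the joint density 8xy and the product of the
# marginals 4x^3 and 4y(1 - y^2) disagree, which is enough to rule out
# independence of X and Y.
x, y = 0.8, 0.4
joint = 8 * x * y                                # 8(0.8)(0.4)   = 2.56
prod = (4 * x ** 3) * (4 * y * (1 - y ** 2))     # 2.048 * 1.344 = 2.752512
print(round(joint, 6), round(prod, 6))           # 2.56 2.752512
```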
Example 13: An electronic component has two fuses. If an overload occurs, the time when fuse one blows is a random variable X and the time when fuse two blows is a random variable Y. The joint probability density function of the random vector (X, Y) is given by
f(x, y) = kxy e^{-(x² + y²)}, x > 0, y > 0.
Find the value of k and also prove that X and Y are independent.
Solution: By definition, we know that ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dy dx = 1.
That is, ∫_0^∞ ∫_0^∞ kxy e^{-(x² + y²)} dy dx = 1.
Since the integrand factors, this implies that
k ∫_0^∞ y e^{-y²} dy · ∫_0^∞ x e^{-x²} dx = 1 (separation principle).
Since ∫_0^∞ x e^{-x²} dx = 1/2, we have k(1/2)(1/2) = 1, so k = 4.
Hence f(x, y) = 4xy e^{-(x² + y²)}.
Now, the marginal probability density function of X is
f_X(x) = ∫_{-∞}^{∞} f(x, y) dy = ∫_0^∞ 4xy e^{-(x² + y²)} dy = 4x e^{-x²} ∫_0^∞ y e^{-y²} dy
= 4x e^{-x²} · (1/2) = 2x e^{-x²}, x > 0.
Similarly, the marginal probability density function of Y is
f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx = ∫_0^∞ 4xy e^{-(x² + y²)} dx = 2y e^{-y²}, y > 0.
Now f_X(x) · f_Y(y) = 2x e^{-x²} · 2y e^{-y²} = 4xy e^{-(x² + y²)} = f(x, y).
Since f(x, y) = f_X(x) · f_Y(y), we conclude that X and Y are independent.
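The value k = 4 can be confirmed numerically. Because the density factors, it is enough to approximate the one-dimensional integral ∫_0^∞ x e^{-x²} dx = 1/2 (a pure-Python midpoint-rule sketch; the truncation at x = 10 is an assumption justified by the e^{-x²} tail being negligible there):

```python
import math

# Midpoint-rule approximation of I = ∫_0^∞ x e^{-x^2} dx, truncated at x = 10.
# With I = 1/2, the normalization k * I * I = 1 forces k = 4.
def integral(n=100000, upper=10.0):
    h = upper / n
    return sum((i + 0.5) * h * math.exp(-((i + 0.5) * h) ** 2)
               for i in range(n)) * h

I = integral()
print(round(I, 6))          # 0.5
print(round(4 * I * I, 6))  # 1.0
```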
Exercise 2.2.
1. The two dimensional random variable (X, Y) has joint probability density function
f(x, y) = kx(x - y) for 0 < x < 2, -x < y < x, and 0 otherwise.
Find (a) the constant k and (b) the marginal density function of X.
Solution: (a) k = 1/8; (b) f(x) = x³/4, 0 < x < 2.

2. Find the marginal density functions of X and Y if f(x, y) = (2/5)(2x + 3y), 0 ≤ x ≤ 1, 0 ≤ y ≤ 1.
Solution: f(x) = (4x + 3)/5 and f(y) = (2 + 6y)/5.

3. Given that the joint probability density function of (X, Y) is
f(x, y) = c/(1 + x + y)³ for x > 0, y > 0, and 0 otherwise,
determine (a) the constant c, (b) the marginal density functions of X and Y,
(c) the conditional probability density function of X given Y, and (d) P(X < 5, 1 < Y < 2).
Solution: (a) c = 2; (b) f(x) = 1/(1 + x)², x > 0, and f(y) = 1/(1 + y)², y > 0;
(c) f(x | y) = 2(1 + y)²/(1 + x + y)³, x > 0; (d) P(X < 5, 1 < Y < 2) = 25/168.

4. The joint probability density function of the two dimensional random variable (X, Y) is
f(x, y) = (8/9)xy for 1 ≤ x ≤ y ≤ 2, and 0 otherwise.
(i) Find the marginal density functions of X and Y.
(ii) Find the conditional density function of Y given X = x.
Solution: (i) f(x) = (4x/9)(4 - x²), 1 ≤ x ≤ 2, and f(y) = (4y/9)(y² - 1), 1 ≤ y ≤ 2;
(ii) f(y | x) = 2y/(4 - x²), x ≤ y ≤ 2.

5. If the joint probability density function of the two dimensional random variable (X, Y) is given by
f(x, y) = k(6 - x - y) for 0 < x < 2, 2 < y < 4, and 0 otherwise,
then find (i) the value of k, (ii) P(X < 1, Y < 3), and
(iii) the conditional probability density function of Y given X.
Solution: (i) k = 1/8; (ii) P(X < 1, Y < 3) = 3/8; (iii) f(y | x) = (6 - x - y)/(6 - 2x).

6. Determine whether X and Y are independent if the joint probability density function of (X, Y) is
f(x, y) = 4xy for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.
Solution: X and Y are independent.

7. The joint probability density function of X and Y is
f(x, y) = e^{-(x+y)} for 0 < x < ∞, 0 < y < ∞, and 0 otherwise.
Are X and Y independent?
Solution: X and Y are independent.