Chapter #6 – Jointly Distributed Random Variables
Question #1: Two fair dice are rolled. Find the joint probability mass function of X and Y
when (a) X is the largest value obtained on any die and Y is the sum of the values; (b) X is
the value on the first die and Y is the larger of the two values; (c) X is the smallest and Y is
the largest value obtained on the dice. (The answers are based on the following table. Also
note that only the solution to part (a) is presented, as parts (b) and (c) are solved similarly.)
Sum of two dice:

First Die \ Second Die |  1   2   3   4   5   6
           1           |  2   3   4   5   6   7
           2           |  3   4   5   6   7   8
           3           |  4   5   6   7   8   9
           4           |  5   6   7   8   9  10
           5           |  6   7   8   9  10  11
           6           |  7   8   9  10  11  12
a) If X is the max on either die and Y is the sum of the dice, then the joint probability
   mass function p_XY(x, y) = P(X = x, Y = y) is given in the following table.

    Y\X  |   1     2     3     4     5     6   | P_Y(Y = y)
      2  | 1/36    0     0     0     0     0   |   1/36
      3  |   0   2/36    0     0     0     0   |   2/36
      4  |   0   1/36  2/36    0     0     0   |   3/36
      5  |   0     0   2/36  2/36    0     0   |   4/36
      6  |   0     0   1/36  2/36  2/36    0   |   5/36
      7  |   0     0     0   2/36  2/36  2/36  |   6/36
      8  |   0     0     0   1/36  2/36  2/36  |   5/36
      9  |   0     0     0     0   2/36  2/36  |   4/36
     10  |   0     0     0     0   1/36  2/36  |   3/36
     11  |   0     0     0     0     0   2/36  |   2/36
     12  |   0     0     0     0     0   1/36  |   1/36
P_X(X=x) | 1/36  3/36  5/36  7/36  9/36 11/36  |    1

   Note that the marginal probability mass function for X is given by summing the
   columns, while the marginal mass function for Y is given by summing the rows.
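As a quick sanity check (not part of the original solution), the table above can be verified by brute-force enumeration in Python:

```python
from collections import defaultdict
from fractions import Fraction

# Brute-force check of part (a): enumerate all 36 equally likely rolls with
# X = max of the two dice and Y = their sum.
joint = defaultdict(Fraction)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        joint[(max(d1, d2), d1 + d2)] += Fraction(1, 36)

# Column sums give the marginal of X: 1/36, 3/36, 5/36, 7/36, 9/36, 11/36.
p_x = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in range(1, 7)}
```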
Question #2: Suppose that 3 balls are chosen without replacement from an urn consisting
of 5 white and 8 red balls (13 total). Let X_i equal 1 if the ith ball selected is white, and let it
equal 0 otherwise. Give the joint probability mass function of (a) X1, X2; (b) X1, X2, X3.
a) We have that X1 and X2 each take values in {0, 1}. We then calculate
   p(0,0) = (8·7·11)/(13·12·11) = 14/39, p(1,0) = (5·8·11)/(13·12·11) = 10/39,
   p(0,1) = (8·5·11)/(13·12·11) = 10/39, and p(1,1) = (5·4·11)/(13·12·11) = 5/39.
b) We have that X1, X2 and X3 each take values in {0, 1}. We can therefore calculate
   p(0,0,0) = (8·7·6)/(13·12·11), p(1,0,0) = (5·8·7)/(13·12·11),
   p(0,1,0) = (8·5·7)/(13·12·11), p(0,0,1) = (8·7·5)/(13·12·11),
   p(1,1,0) = (5·4·8)/(13·12·11), p(1,0,1) = (5·8·4)/(13·12·11),
   p(0,1,1) = (8·5·4)/(13·12·11), and p(1,1,1) = (5·4·3)/(13·12·11).
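These values can be confirmed by enumerating the ordered draws directly (a verification sketch, not part of the original solution):

```python
from fractions import Fraction
from itertools import permutations

# Brute-force check: enumerate all ordered draws of 3 of the 13 balls
# (1 = white, 0 = red) and tally the outcomes of (X1, X2, X3).
balls = [1] * 5 + [0] * 8
counts = {}
for draw in permutations(range(13), 3):
    key = tuple(balls[i] for i in draw)
    counts[key] = counts.get(key, 0) + 1
p = {k: Fraction(v, 13 * 12 * 11) for k, v in counts.items()}
```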
Question #9: The joint probability density function of X and Y is given by
f(x, y) = (6/7)(x² + xy/2) where x ∈ (0,1) and y ∈ (0,2). (a) Verify that this is indeed a joint
density function. (b) Compute the marginal density functions of X and Y. (c) Compute
P(X > Y). (d) Find the probability P(Y > 1/2 | X < 1/2). (e) Find E(X). (f) Find E(Y).
a) We must verify that 1 = ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dy dx = ∫_0^1 ∫_0^2 (6/7)(x² + xy/2) dy dx =
   (6/7) ∫_0^1 [x²y + (x/4)y²]_0^2 dx = (6/7) ∫_0^1 (2x² + x) dx = (6/7)[(2/3)x³ + (1/2)x²]_0^1 =
   (6/7)(2/3 + 1/2) = (6/7)(7/6) = 1.
b) We must find the following integral: f_X(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^2 (6/7)(x² + xy/2) dy =
   (6/7)[x²y + (x/4)y²]_0^2 = (6/7)(2x² + x). Thus, f_X(x) = (6/7)(2x² + x) where x ∈ (0,1). Similarly,
   f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_0^1 (6/7)(x² + xy/2) dx = (6/7)[(1/3)x³ + (y/4)x²]_0^1 = (6/7)(1/3 + y/4), y ∈ (0,2).
c) We calculate P(X > Y) = ∬_{(x,y): x>y} f(x, y) dy dx = ∫_0^1 ∫_0^x (6/7)(x² + xy/2) dy dx =
   (6/7) ∫_0^1 [x²y + (x/4)y²]_0^x dx = (6/7) ∫_0^1 (x³ + x³/4) dx = (6/7) ∫_0^1 (5/4)x³ dx =
   (6/7)[(5/16)x⁴]_0^1 = 30/112 = 15/56.

d) We have P(Y > 1/2 | X < 1/2) = P(Y > 1/2 ∩ X < 1/2) / P(X < 1/2) =
   [∫_{1/2}^2 ∫_0^{1/2} (6/7)(x² + xy/2) dx dy] / [(6/7) ∫_0^{1/2} (2x² + x) dx], where the factor
   6/7 cancels. The numerator's inner integral is ∫_0^{1/2} (x² + xy/2) dx =
   [(1/3)x³ + (y/4)x²]_0^{1/2} = 1/24 + y/16, so the numerator becomes
   ∫_{1/2}^2 (1/24 + y/16) dy = [(1/24)y + (1/32)y²]_{1/2}^2 = 1/16 + 15/128 = 23/128, while the
   denominator is ∫_0^{1/2} (2x² + x) dx = 1/12 + 1/8 = 5/24. Therefore
   P(Y > 1/2 | X < 1/2) = (23/128)/(5/24) = 69/80.
e) Since f_X(x) = (6/7)(2x² + x) where x ∈ (0,1), we have E(X) = ∫_{−∞}^{∞} x·f_X(x) dx =
   (6/7) ∫_0^1 (2x³ + x²) dx = (6/7)[(1/2)x⁴ + (1/3)x³]_0^1 = (6/7)(1/2 + 1/3) = (6/7)(5/6) = 5/7.
f) Since f_Y(y) = (6/7)(1/3 + y/4) where y ∈ (0,2), we have E(Y) = ∫_{−∞}^{∞} y·f_Y(y) dy =
   (6/7) ∫_0^2 (y/3 + y²/4) dy = (6/7)[(1/6)y² + (1/12)y³]_0^2 = (6/7)(2/3 + 2/3) = (6/7)(4/3) = 8/7.
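The answers to (a), (c), (e) and (f) can be checked numerically with a midpoint Riemann sum over the rectangle (0,1) × (0,2) (a sanity-check sketch, not part of the original solution):

```python
# Midpoint Riemann sum over (0,1) x (0,2) for f(x,y) = (6/7)(x^2 + xy/2).
def f(x, y):
    return (6.0 / 7.0) * (x * x + x * y / 2.0)

n = 400
h, k = 1.0 / n, 2.0 / n
total = exp_x = exp_y = p_x_gt_y = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * k
        w = f(x, y) * h * k   # probability mass of this grid cell
        total += w
        exp_x += x * w
        exp_y += y * w
        if x > y:
            p_x_gt_y += w
```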
Question #13: A man and a woman agree to meet at a certain location at about 12:30 PM.
(a) If the man arrives at a time uniformly distributed between 12:15 and 12:45, and if the
woman independently arrives at a time uniformly distributed between 12:00 and 1 PM, find
the probability that the first to arrive waits no longer than 5 minutes. (b) What is the
probability that the man arrives first?
a) If we let X denote the arrival time of the man, then X ~ Unif(15, 45) with density
   f_X(x) = 1/30 when x ∈ (15, 45) and zero otherwise. Similarly, if we let Y denote the
   arrival time of the woman, then Y ~ Unif(0, 60) with density f_Y(y) = 1/60 when
   y ∈ (0, 60) and zero otherwise. Since X and Y are independent, the joint density
   function is f_XY(x, y) = f_X(x)·f_Y(y), which implies that f_XY(x, y) = (1/30)·(1/60) = 1/1800
   whenever (x, y) ∈ [15, 45] × [0, 60] and zero otherwise. Therefore, the probability that
   the first to arrive waits no longer than 5 minutes is given by P(|X − Y| ≤ 5) =
   P(−5 ≤ X − Y ≤ 5) = P(X − 5 ≤ Y ≤ X + 5) = ∫_{15}^{45} ∫_{x−5}^{x+5} (1/1800) dy dx =
   (1/1800) ∫_{15}^{45} [y]_{x−5}^{x+5} dx = (1/1800) ∫_{15}^{45} 10 dx = 300/1800 = 1/6.
b) The probability that the man arrives first is P(X < Y) = ∫_{15}^{45} ∫_x^{60} (1/1800) dy dx =
   (1/1800) ∫_{15}^{45} (60 − x) dx = (1/1800)[60x − (1/2)x²]_{15}^{45} = 900/1800 = 1/2.
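Both answers can be confirmed by a short Monte Carlo simulation (an illustrative sketch, not part of the original solution; arrival times are in minutes after 12:00):

```python
import random

# Monte Carlo sketch: X ~ Unif(15, 45) is the man's arrival time and
# Y ~ Unif(0, 60) the woman's, drawn independently.
random.seed(0)
trials = 200_000
wait_ok = man_first = 0
for _ in range(trials):
    x = random.uniform(15.0, 45.0)
    y = random.uniform(0.0, 60.0)
    wait_ok += (abs(x - y) <= 5.0)
    man_first += (x < y)
p_wait = wait_ok / trials     # should be near 1/6
p_first = man_first / trials  # should be near 1/2
```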
Question #17: Three points X1, X2 and X3 are selected at random on a line L. What is the
probability that X2 lies between X1 and X3?

• The probability is 1/3, since each of the three points is equally likely to be the middle one.
Question #21: Let the joint density function be f(x, y) = 24xy where (x, y) ∈ [0,1] × [0,1] and
0 ≤ x + y ≤ 1, while f(x, y) = 0 otherwise. (a) Show that f(x, y) is a joint probability
density function. (b) Find E(X). (c) Find E(Y).
a) We must verify that 1 = ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dy dx = ∫_0^1 ∫_0^{1−x} 24xy dy dx =
   ∫_0^1 [12xy²]_0^{1−x} dx = ∫_0^1 12x(1 − 2x + x²) dx = ∫_0^1 (12x − 24x² + 12x³) dx =
   [6x² − 8x³ + 3x⁴]_0^1 = 6 − 8 + 3 = 1.
b) We first find the marginal density for X: f_X(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^{1−x} 24xy dy =
   [12xy²]_0^{1−x} = 12x(1 − x)² = 12x − 24x² + 12x³. Then we have
   E(X) = ∫_{−∞}^{∞} x·f_X(x) dx = ∫_0^1 (12x² − 24x³ + 12x⁴) dx = [4x³ − 6x⁴ + (12/5)x⁵]_0^1 =
   4 − 6 + 12/5 = 2/5.

c) By the same reasoning as above (the density is symmetric in x and y), E(Y) = 2/5.
Question #29: The gross weekly sales at a certain restaurant is a normal random variable
with mean $2200 and standard deviation $230. What is the probability that (a) the total
gross sales over the next 2 weeks exceeds $5000; (b) weekly sales exceed $2000 in at least
2 of the next 3 weeks? What independence assumptions have you made?
a) If we let W_i be the random variable equal to the restaurant's sales in week i, then
   W_i ~ N(2200, 230²). We can then define the random variable X = W1 + W2 and,
   by Proposition 3.2, the variances (not the standard deviations) add, so that
   X ~ N(2200 + 2200, 230² + 230²) = N(4400, 2·230²); the standard deviation of X is
   230√2 ≈ 325.3. We therefore have P(X > 5000) = P((X − 4400)/(230√2) > (5000 − 4400)/(230√2)) =
   P(Z > 1.84) = 1 − Φ(1.84) ≈ 0.0326, where Z ~ N(0,1).
π‘Šπ‘– −2200
b) In any given week, we calculate that 𝑃(π‘Šπ‘– > 2000) = 𝑃 (
230
>
2000−2200
230
)=
𝑃(𝑍 > −0.87) = 1 − Φ(−0.87) = 0.808. The probability that weekly sales will
exceed this amount in at least 2 of the next 3 weeks can be calculated with the
binomial distribution where π‘Œ is the random variable equal to the number of weeks
that sales exceed $2,000. We therefore have 𝑃(π‘Œ ≥ 2) = 𝑃(π‘Œ = 2) + 𝑃(π‘Œ = 3) =
(32)(0.808)2 (0.192)1 + (33)(0.808)3 (0.192)0 = ∑3𝑖=2(3𝑖)(0.808)𝑛 (0.192)3−𝑖 = 0.9036.
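Both computations can be reproduced with the standard normal CDF built from `math.erf` (a verification sketch, not part of the original solution):

```python
from math import comb, erf, sqrt

# Standard normal CDF via the error function (no external libraries needed).
def Phi(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# (a) X = W1 + W2 ~ N(4400, 2 * 230**2): variances, not standard deviations, add.
p_a = 1.0 - Phi((5000.0 - 4400.0) / (230.0 * sqrt(2.0)))

# (b) p = P(W_i > 2000) in one week, then a Binomial(3, p) upper tail.
p = 1.0 - Phi((2000.0 - 2200.0) / 230.0)
p_b = comb(3, 2) * p**2 * (1.0 - p) + comb(3, 3) * p**3
```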
Question #34: Jay has two jobs to do, one after the other. Each attempt at job i takes one
hour and is successful with probability p_i. If p1 = 0.3 and p2 = 0.4, what is the probability
that it will take Jay more than 12 hours to be successful on both jobs?
• Assume that Jay's strategy is to attempt job 1 until he succeeds, and then move on to
  job 2. The probability that Jay never gets job 1 done in 12 attempts is 0.7¹². The
  probability that Jay fails job 1 in his first i attempts, succeeds on attempt i + 1, and
  then never gets job 2 done in the remaining 11 − i hours is (0.7)^i (0.3)(0.6)^{11−i}. Thus,
  the total probability is 0.7¹² + ∑_{i=0}^{11} (0.7)^i (0.3)(0.6)^{11−i}.
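The expression above is easy to evaluate directly, and it can be cross-checked against the complementary event (a verification sketch, not part of the original solution):

```python
# Direct evaluation of the expression above: P(more than 12 hours needed).
p_more_than_12 = 0.7**12 + sum(0.7**i * 0.3 * 0.6**(11 - i) for i in range(12))

# Independent cross-check: P(both jobs done within 12 hours), summing over the
# hour t1 at which job 1 succeeds and the hour t2 at which job 2 succeeds.
p_done = sum(0.7**(t1 - 1) * 0.3 * 0.6**(t2 - t1 - 1) * 0.4
             for t1 in range(1, 12)
             for t2 in range(t1 + 1, 13))
```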
Question #38: Choose a number X at random from the set of numbers {1, 2, 3, 4, 5}. Now
choose a number at random from the subset no larger than X, that is, from {1, ..., X}. Call this
second number Y. (a) Find the joint mass function of X and Y. (b) Find the conditional mass
function of X given that Y = i; do it for i = 1, 2, 3, 4, 5. (c) Are X and Y independent? Why?
a) We can represent the joint probability mass function of X and Y either with a table or
   with an analytic formula. For the formula, we may write the joint PMF of these two
   random variables as p_XY(X = j, Y = i) = (1/5)(1/j) for 1 ≤ i ≤ j ≤ 5, and zero otherwise.
   The table below presents this joint PMF, where the row sums give p_Y(y), the probability
   mass function of Y, while the column sums give p_X(x), the probability mass function of X.

    Y\X  |  1     2     3     4     5   | P_Y(Y = i)
      1  | 1/5  1/10  1/15  1/20  1/25  |   0.46
      2  |  0   1/10  1/15  1/20  1/25  |   0.26
      3  |  0     0   1/15  1/20  1/25  |   0.16
      4  |  0     0     0   1/20  1/25  |   0.09
      5  |  0     0     0     0   1/25  |   0.04
P_X(X=i) | 1/5   1/5   1/5   1/5   1/5  |    1

b) We have that p_{X|Y}(X = j | Y = i) = p_XY(X = j, Y = i) / p_Y(Y = i) =
   (1/(5j)) / ∑_{k=i}^{5} 1/(5k) = (1/j) / ∑_{k=i}^{5} (1/k) = (j · ∑_{k=i}^{5} 1/k)^{−1} for j ≥ i. The
   construction of this PMF follows from the definition of conditional distributions.

c) No, because the distribution of Y depends on the value of X; for example,
   P(Y = 5) = 1/25 > 0, but P(Y = 5 | X = 1) = 0.
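The joint PMF and its marginals can be verified exactly with rational arithmetic (a sketch, not part of the original solution):

```python
from fractions import Fraction

# Joint PMF p(X=j, Y=i) = (1/5)(1/j) for i <= j, as derived in part (a).
joint = {(j, i): Fraction(1, 5 * j)
         for j in range(1, 6) for i in range(1, j + 1)}

# Row sums give the marginal of Y; column sums give the marginal of X.
p_y = {i: sum(p for (j, ii), p in joint.items() if ii == i) for i in range(1, 6)}
p_x = {j: sum(p for (jj, i), p in joint.items() if jj == j) for j in range(1, 6)}
```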
Question #41: The joint density function of X and Y is given by f_XY(x, y) = x e^{−x(y+1)}
whenever (x, y) ∈ (0, ∞) × (0, ∞) and f_XY(x, y) = 0 otherwise. (a) Find the conditional
density of X, given Y = y, and that of Y, given X = x. (b) Find the density function of Z = XY.
a) We have that f_{X|Y}(x|y) = f_XY(x, y)/f_Y(y) = x e^{−x(y+1)} / ∫_0^∞ x e^{−x(y+1)} dx =
   x e^{−x(y+1)} / (y + 1)^{−2} = (y + 1)² x e^{−x(y+1)} for x > 0, and that
   f_{Y|X}(y|x) = f_XY(x, y)/f_X(x) = x e^{−x(y+1)} / ∫_0^∞ x e^{−x(y+1)} dy =
   x e^{−x(y+1)} / e^{−x} = x e^{−xy} for y > 0.
b) We calculate the CDF of Z as F_Z(a) = P(XY ≤ a) = ∫_0^∞ ∫_0^{a/x} x e^{−x(y+1)} dy dx =
   ∫_0^∞ e^{−x}(1 − e^{−a}) dx = 1 − e^{−a}, and then differentiate to obtain
   f_Z(a) = (d/da)(1 − e^{−a}) = e^{−a} for a > 0. That is, Z = XY is exponentially
   distributed with rate 1.
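A simulation sketch (not part of the original solution) confirms this: the marginal of X works out to f_X(x) = e^{−x}, so X ~ Exp(1), and by part (a) the conditional Y | X = x ~ Exp(rate x); the product XY should then match F_Z(a) = 1 − e^{−a}:

```python
import math
import random

# Sample (X, Y) hierarchically and check P(XY <= 1) against F_Z(1) = 1 - e^-1.
random.seed(1)
trials = 100_000
below_one = 0
for _ in range(trials):
    x = random.expovariate(1.0)   # X ~ Exp(1), the marginal of X
    y = random.expovariate(x)     # Y | X = x ~ Exp(rate x), from part (a)
    below_one += (x * y <= 1.0)
frac = below_one / trials
```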
Question #52: Let X and Y denote the coordinates of a point uniformly chosen in the circle
of radius 1 centered at the origin. That is, their joint density is f_XY(x, y) = 1/π whenever
x² + y² ≤ 1 and f_XY(x, y) = 0 otherwise. Find the joint density function of the polar
coordinates R = (X² + Y²)^{1/2} and Θ = tan^{−1}(Y/X).
• We have f_XY(x, y) and wish to find f_RΘ(r, θ), where the random variables X and Y
  have been transformed into R and Θ by the functions g1(x, y) = √(x² + y²) and
  g2(x, y) = tan^{−1}(y/x). From the change of variables formula, we know that
  x = r cos(θ) and y = r sin(θ), so we are able to uniquely solve for x and y in terms of
  r and θ. The next step is to compute the Jacobian of this transformation, which is

  J(x, y) = det [ ∂g1/∂x  ∂g1/∂y ; ∂g2/∂x  ∂g2/∂y ]
          = det [ x/√(x² + y²)  y/√(x² + y²) ; −y/(x² + y²)  x/(x² + y²) ]
          = 1/√(x² + y²) = 1/r.

  We then use the formula f_{Y1Y2}(y1, y2) = f_{X1X2}(h1(y1, y2), h2(y1, y2)) · 1/|J(x1, x2)|
  to find that f_RΘ(r, θ) = (1/π) · r = r/π whenever 0 ≤ r ≤ 1 and 0 < θ < 2π. Note that we
  could also compute the Jacobian of the inverse transformation,
  J(r, θ) = det [ ∂h1/∂r  ∂h1/∂θ ; ∂h2/∂r  ∂h2/∂θ ], and multiply by |J(r, θ)| directly,
  since the two Jacobians are reciprocals of one another.
Question #55: X and Y have joint density function f(x, y) = 1/(x²y²) when x ≥ 1 and y ≥ 1 (i.e.
whenever (x, y) ∈ [1, ∞) × [1, ∞)) and f(x, y) = 0 otherwise. (a) Compute the joint density
function of U = XY and V = X/Y. (b) What are the marginal densities?
a) We have f_XY(x, y) and wish to find f_UV(u, v), where the random variables X and Y
   have been transformed into U and V by the functions g1(x, y) = xy and g2(x, y) = x/y.
   We can solve for x and y uniquely in terms of u and v, finding x = √(uv) and
   y = √(u/v) = √u/√v. The next step is to compute the Jacobian, which is given by

   J(x, y) = det [ ∂g1/∂x  ∂g1/∂y ; ∂g2/∂x  ∂g2/∂y ] = det [ y  x ; 1/y  −x/y² ]
           = −x/y − x/y = −2x/y = −2v.

   We can then find the joint density f_UV(u, v) = f_XY(√(uv), √u/√v) · 1/|−2v| =
   1/((uv)(u/v)) · 1/(2v) = 1/(2vu²) whenever √(uv) ≥ 1 and √u/√v ≥ 1, which reduces to
   u ≥ 1 and 1/u ≤ v ≤ u.
b) We have that f_U(u) = ∫_{−∞}^{∞} f_UV(u, v) dv = ∫_{1/u}^{u} 1/(2vu²) dv =
   (1/(2u²))[ln v]_{1/u}^{u} = ln(u)/u² whenever u ≥ 1. For v ≥ 1 we have
   f_V(v) = ∫_{−∞}^{∞} f_UV(u, v) du = ∫_v^∞ 1/(2vu²) du = 1/(2v²), and similarly for
   0 < v < 1 we have f_V(v) = ∫_{1/v}^∞ 1/(2vu²) du = 1/2.
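Both marginals should integrate to 1, which can be checked numerically (a sketch, not part of the original solution; the upper limit 10⁴ truncates the tails, so the checks are approximate):

```python
import math

# Midpoint rule over [a, b]; used to check that the marginals integrate to 1:
# ∫_1^∞ ln(u)/u^2 du = 1 and ∫_0^1 (1/2) dv + ∫_1^∞ 1/(2v^2) dv = 1.
def midpoint(f, a, b, n=200_000):
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

i_u = midpoint(lambda u: math.log(u) / u**2, 1.0, 1e4)
i_v = 0.5 + midpoint(lambda v: 1.0 / (2.0 * v * v), 1.0, 1e4)
```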
Question #56: If X and Y are independent and identically distributed uniform random
variables on (0, 1), compute the joint density functions of each of the following random
variables: (a) U = X + Y and V = X/Y; (b) U = X and V = X/Y; (c) U = X + Y and V = X/(X + Y).
a) We have f_X(x) = 1 and f_Y(y) = 1 if x, y ∈ (0,1) and zero otherwise. The
   independence of X and Y implies that their joint probability density function is
   f_XY(x, y) = f_X(x)f_Y(y) = 1 if 0 < x < 1 and 0 < y < 1. The transformations are then
   u = x + y and v = x/y, so y = u − x = u − vy, implying that y = u/(1 + v) and
   x = vy = uv/(1 + v). This allows us to find

   J = det [ ∂x/∂u  ∂x/∂v ; ∂y/∂u  ∂y/∂v ] = det [ v/(1+v)  u/(1+v)² ; 1/(1+v)  −u/(1+v)² ]
     = −uv/(1+v)³ − u/(1+v)³ = −u/(1+v)²,

   so we can calculate f_UV(u, v) = f_XY(uv/(1+v), u/(1+v)) |J| = u/(1 + v)². To find
   the bounds, we have 0 < x < 1, so 0 < uv/(1+v) < 1 → 0 < uv < 1 + v → 0 < u < (1 + v)/v,
   and 0 < y < 1, so 0 < u/(1+v) < 1 → 0 < u < 1 + v (with v > 0).
π‘₯
π‘₯
𝑒
b) We have 𝑒 = π‘₯ so π‘₯ = 𝑒 and 𝑣 = 𝑦 so 𝑦 = 𝑣 = 𝑣 , which allows us to calculate the
πœ•π‘₯
Jacobian as 𝐽 =
𝑑𝑒𝑑 [πœ•π‘’
πœ•π‘¦
πœ•π‘’
𝑒
πœ•π‘₯
πœ•π‘£
]
πœ•π‘¦
1
= 𝑑𝑒𝑑 [ 1
𝑣
πœ•π‘£
0
−𝑒]
𝑣2
𝑒
= − 𝑣2 . We then have that π‘“π‘ˆπ‘‰ (𝑒, 𝑣) =
𝑒
π‘“π‘‹π‘Œ (𝑒, 𝑣 ) |𝐽| = 𝑣2 if 0 < π‘₯ < 1 → 0 < 𝑒 < 1 and 0 < 𝑦 < 1 → 0 <
𝑒
𝑣
< 1 → 0 < 𝑒 < 𝑣.
𝑒
Combining the bounds gives us that π‘“π‘ˆπ‘‰ (𝑒, 𝑣) = 𝑣2 whenever 0 < 𝑒 < 𝑣 < 1.
π‘₯
c) We have 𝑒 = π‘₯ + 𝑦 and 𝑣 = π‘₯+𝑦 so π‘₯ = 𝑣(π‘₯ + 𝑦) = 𝑒𝑣 and 𝑦 = 𝑒 − π‘₯ = 𝑒 − 𝑒𝑣 =
πœ•π‘₯
𝑒(1 − 𝑣), which implies that the Jacobian is 𝐽 =
𝑑𝑒𝑑 [πœ•π‘’
πœ•π‘¦
πœ•π‘’
πœ•π‘₯
πœ•π‘£
]
πœ•π‘¦
= 𝑑𝑒𝑑 [
𝑣
1−𝑣
𝑒
]=
−𝑒
πœ•π‘£
−𝑒𝑣 − 𝑒(1 − 𝑣) = −𝑒𝑣 − 𝑒 + 𝑒𝑣 = −𝑒. Thus, π‘“π‘ˆπ‘‰ (𝑒, 𝑣) = π‘“π‘‹π‘Œ (𝑒𝑣, 𝑒(1 − 𝑣))|𝐽| = 𝑒
1
whenever we have that the bounds are 0 < π‘₯ < 1 → 0 < 𝑒𝑣 < 1 → 0 < 𝑣 < 𝑒 and
1
0 < 𝑦 < 1 → 0 < 𝑒(1 − 𝑣) < 1 → 0 < 𝑒 < 1−𝑣.
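One consequence of part (c) that is easy to check by simulation (a sketch, not part of the original solution): with f_UV(u, v) = u on its support, P(U ≤ 1, V ≤ 1/2) = ∫_0^{1/2} ∫_0^1 u du dv = (1/2)(1/2) = 1/4.

```python
import random

# Monte Carlo check of P(X + Y <= 1, X/(X+Y) <= 1/2) against the value 1/4
# implied by f_UV(u, v) = u.
random.seed(3)
trials = 200_000
hits = 0
for _ in range(trials):
    x, y = random.random(), random.random()
    u, v = x + y, x / (x + y)
    hits += (u <= 1.0 and v <= 0.5)
frac = hits / trials
```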