Midterm 2 - Math 5010 - Spring 2016
Name: Firas Rassoul-Agha
Solve the following 5 problems.
Note that there is also a 6th Extra Credit problem.
You have to clearly explain your solution.
The answer carries no points. Only the work does.
CALCULATORS ARE NOT ALLOWED.
Problem 1: Let X and Y be two continuous random variables with joint density given by
$$f(x, y) = \begin{cases} 2e^{-(x+y)} & \text{if } 0 \le x \le y, \\ 0 & \text{otherwise.} \end{cases}$$
Find the marginal densities of X and Y.
Solution: The marginal of X is given by
$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy = \int_x^{\infty} 2e^{-(x+y)}\, dy = 2e^{-x} \int_x^{\infty} e^{-y}\, dy = 2e^{-2x},$$
when x ≥ 0, and $f_X(x) = 0$ for x < 0. The marginal of Y is given by
$$f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx = \int_0^{y} 2e^{-(x+y)}\, dx = 2e^{-y} \int_0^{y} e^{-x}\, dx = 2e^{-y}(1 - e^{-y}),$$
when y ≥ 0, and $f_Y(y) = 0$ for y < 0.
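As a quick sanity check of these marginals (not part of the exam solution; a plain-Python sketch), one can compare the closed forms against midpoint Riemann sums of the joint density:

```python
import math

def f(x, y):
    # Joint density: 2*exp(-(x+y)) on 0 <= x <= y, else 0.
    return 2.0 * math.exp(-(x + y)) if 0 <= x <= y else 0.0

def fX(x):
    # Closed-form marginal of X derived above.
    return 2.0 * math.exp(-2.0 * x) if x >= 0 else 0.0

def fY(y):
    # Closed-form marginal of Y derived above.
    return 2.0 * math.exp(-y) * (1.0 - math.exp(-y)) if y >= 0 else 0.0

def marginal_X_numeric(x, n=100_000, top=20.0):
    # Midpoint Riemann sum over y, truncated at top (the tail is negligible).
    h = top / n
    return sum(f(x, (k + 0.5) * h) for k in range(n)) * h

def marginal_Y_numeric(y, n=100_000, top=20.0):
    h = top / n
    return sum(f((k + 0.5) * h, y) for k in range(n)) * h

for t in (0.3, 1.0, 2.5):
    assert abs(marginal_X_numeric(t) - fX(t)) < 1e-3
    assert abs(marginal_Y_numeric(t) - fY(t)) < 1e-3
print("numerical marginals agree with the closed forms")
```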
Problem 2: Let (X, Y) be continuous random variables with joint density $f(x, y) = (x + y)/8$ for 0 ≤ x ≤ 2, 0 ≤ y ≤ 2, and $f(x, y) = 0$ elsewhere. Find the probability that $X^2 + Y \le 1$.
Solution: The probability in question equals the integral of f(x, y) over the region $\{(x, y) : x^2 + y \le 1\}$. Since f(x, y) is nonzero only when both 0 ≤ x ≤ 2 and 0 ≤ y ≤ 2, we need to integrate (x + y)/8 below the parabola $y = 1 - x^2$, but only in the first quadrant. In other words, the probability in question equals
$$\begin{aligned}
\int_0^1 \int_0^{1-x^2} \frac{x+y}{8}\, dy\, dx
&= \int_0^1 \int_0^{1-x^2} \frac{x}{8}\, dy\, dx + \int_0^1 \int_0^{1-x^2} \frac{y}{8}\, dy\, dx \\
&= \int_0^1 \frac{x(1-x^2)}{8}\, dx + \int_0^1 \frac{(1-x^2)^2}{16}\, dx \\
&= \int_0^1 \frac{2x - 2x^3 + 1 - 2x^2 + x^4}{16}\, dx \\
&= \frac{1 - 1/2 + 1 - 2/3 + 1/5}{16}
= \frac{30 - 15 + 30 - 20 + 6}{16 \times 2 \times 3 \times 5}
= \frac{31}{480} \approx 6.5\%.
\end{aligned}$$
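The arithmetic and the final value can be double-checked numerically (a sketch in plain Python, not part of the exam solution): exact fractions for the two one-dimensional integrals, and a midpoint Riemann sum over the region itself.

```python
from fractions import Fraction

# Exact arithmetic: ∫0^1 x(1-x^2)/8 dx = 1/32 and ∫0^1 (1-x^2)^2/16 dx = 1/30.
exact = Fraction(1, 32) + Fraction(1, 30)
assert exact == Fraction(31, 480)

# Midpoint Riemann sum of (x+y)/8 over {x^2 + y <= 1} inside the unit square.
n = 1000
h = 1.0 / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if x * x + y <= 1.0:
            total += (x + y) / 8.0
total *= h * h
assert abs(total - 31.0 / 480.0) < 1e-3
print(f"P(X^2 + Y <= 1) = 31/480 = {float(exact):.4f}")
```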
Problem 3: Let Z be a random variable with standard normal distribution, i.e. Z has pdf $f(z) = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}$. Find the pdf of $X = Z^2$. Do you recognize this distribution? (You can peek at the pdf in Problem 4.)
Solution: First note that X ≥ 0 because it is a square. So we know that $f_X(x) = 0$ if x < 0. Assume now x > 0 (it does not really matter what the value of $f_X$ is at the one point x = 0). To find $f_X(x)$ we apply the change of variables formula
$$f_X(x) = f_Z(z) \cdot \left|\frac{dz}{dx}\right|,$$
where z solves $x = z^2$. But this equation has two solutions, $\sqrt{x}$ and $-\sqrt{x}$, both of which are admissible (since Z can be negative or positive). Therefore, we apply the above to each of the solutions and add the two contributions. We thus have
$$f_X(x) = \frac{1}{\sqrt{2\pi}}\, e^{-(\sqrt{x})^2/2} \left|(\sqrt{x})'\right| + \frac{1}{\sqrt{2\pi}}\, e^{-(-\sqrt{x})^2/2} \left|(-\sqrt{x})'\right| = \frac{1}{\sqrt{2\pi}}\, x^{-1/2} e^{-x/2}.$$
The variable part $x^{-1/2} e^{-x/2}$ is of the form $x^{\alpha-1} e^{-\lambda x}$ where α = 1/2 and λ = 1/2. So X is a Gamma(1/2, 1/2) random variable.
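This identification can be verified numerically (a hedged sketch, not part of the exam solution): the derived pdf should coincide with the Gamma(1/2, 1/2) density, since $\Gamma(1/2) = \sqrt{\pi}$, and it should also match a finite difference of the cdf $P(Z^2 \le x) = P(|Z| \le \sqrt{x})$.

```python
import math

def fX(x):
    # pdf of X = Z^2 derived above: x^{-1/2} e^{-x/2} / sqrt(2*pi), for x > 0.
    return x ** -0.5 * math.exp(-x / 2.0) / math.sqrt(2.0 * math.pi)

def gamma_pdf(x, alpha, lam):
    # Gamma(alpha, lam) pdf in the rate parameterization of Problem 4.
    return lam ** alpha / math.gamma(alpha) * x ** (alpha - 1.0) * math.exp(-lam * x)

# The derived pdf coincides with Gamma(1/2, 1/2), since Gamma(1/2) = sqrt(pi).
for x in (0.1, 0.5, 1.0, 3.0):
    assert abs(fX(x) - gamma_pdf(x, 0.5, 0.5)) < 1e-12

def cdf_X(x):
    # P(Z^2 <= x) = P(|Z| <= sqrt(x)) = erf(sqrt(x/2)) for a standard normal Z.
    return math.erf(math.sqrt(x / 2.0))

# The pdf should be the derivative of the cdf (central finite difference).
h = 1e-6
assert abs((cdf_X(1.0 + h) - cdf_X(1.0 - h)) / (2 * h) - fX(1.0)) < 1e-4
print("pdf of Z^2 matches Gamma(1/2, 1/2)")
```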
Problem 4: Let α, β, λ be positive parameters. Consider two independent random variables X and Y with Gamma(α, λ) and Gamma(β, λ) distributions. This means the two have pdfs
$$f_X(x) = \frac{\lambda^\alpha}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x} \text{ when } x > 0 \text{ and } 0 \text{ otherwise},$$
$$f_Y(y) = \frac{\lambda^\beta}{\Gamma(\beta)}\, y^{\beta-1} e^{-\lambda y} \text{ when } y > 0 \text{ and } 0 \text{ otherwise}.$$
Compute the pdf of V = X + Y. Do you recognize this distribution?
Hint: At some point you will need to compute an integral of the form $\int_0^v u^{\alpha-1}(v-u)^{\beta-1}\, du$. Change variables from u to s via u = vs. This will lead you to showing that this integral in fact equals a constant times $v^{\alpha+\beta-1}$.
Remark: This can be done using moment generating functions, and it is a good exercise to work it out that way. But at this point we have not yet seen these objects, so we use the transformation method.
Solution: We need a two-to-two transformation involving V = X + Y and such that inverting the transformation will not be too difficult. One easy way to achieve this is to set U = X. Then the inverse of the transformation is given by
$$X = U \quad \text{and} \quad Y = V - U.$$
We will use the formula
$$f_{U,V}(u, v) = f_{X,Y}(x, y) \times |\det J|.$$
The Jacobian matrix J is given by
$$J = \begin{bmatrix} \dfrac{\partial u}{\partial x} & \dfrac{\partial u}{\partial y} \\[4pt] \dfrac{\partial v}{\partial x} & \dfrac{\partial v}{\partial y} \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 1 & 1 \end{bmatrix},$$
the determinant of which equals one. Since X and Y are independent we know
that
$$f_{X,Y}(x, y) = f_X(x) f_Y(y) = \frac{\lambda^\alpha}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x} \times \frac{\lambda^\beta}{\Gamma(\beta)}\, y^{\beta-1} e^{-\lambda y}$$
when x > 0 and y > 0, and zero otherwise.
The joint pdf of (U, V) then equals
$$\begin{aligned}
f_{U,V}(u, v) &= \frac{\lambda^\alpha}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x} \times \frac{\lambda^\beta}{\Gamma(\beta)}\, y^{\beta-1} e^{-\lambda y} \times 1 \\
&= \frac{\lambda^\alpha}{\Gamma(\alpha)}\, u^{\alpha-1} e^{-\lambda u} \times \frac{\lambda^\beta}{\Gamma(\beta)}\, (v-u)^{\beta-1} e^{-\lambda(v-u)} \\
&= \frac{\lambda^{\alpha+\beta}}{\Gamma(\alpha)\Gamma(\beta)}\, u^{\alpha-1} (v-u)^{\beta-1} e^{-\lambda v}.
\end{aligned}$$
This holds when u > 0 and v − u > 0. Otherwise, $f_{U,V}(u, v) = 0$. Since we want the pdf of V we have to integrate the u out of the above joint pdf to get
$$f_V(v) = \int_{-\infty}^{\infty} f_{U,V}(u, v)\, du = \frac{\lambda^{\alpha+\beta}}{\Gamma(\alpha)\Gamma(\beta)}\, e^{-\lambda v} \int_0^v u^{\alpha-1}(v-u)^{\beta-1}\, du.$$
To finish we need to compute the integral. The hint suggests the change of variables s = u/v. This turns the integral into
$$\int_0^v u^{\alpha-1}(v-u)^{\beta-1}\, du = v \int_0^1 (sv)^{\alpha-1}(v-sv)^{\beta-1}\, ds.$$
Note how the boundaries of the integral changed (since s = 0 when u = 0 and s = 1 when u = v). Also, the factor v outside the integral is there because u = sv means du = v ds.
Factoring v out of (v − sv) and then factoring the v's outside the integral we get
$$v \int_0^1 (sv)^{\alpha-1}(v-sv)^{\beta-1}\, ds = v^{\alpha+\beta-1} \int_0^1 s^{\alpha-1}(1-s)^{\beta-1}\, ds.$$
The remaining integral depends only on α and β, not on v. So we see that
$$f_V(v) = \text{constant} \times v^{\alpha+\beta-1} e^{-\lambda v}$$
when v > 0, and $f_V(v) = 0$ for v < 0. This says that V is a Gamma(α + β, λ) random variable. The constant equals
$$\frac{\lambda^{\alpha+\beta}}{\Gamma(\alpha)\Gamma(\beta)} \int_0^1 s^{\alpha-1}(1-s)^{\beta-1}\, ds,$$
and since this should equal $\lambda^{\alpha+\beta}/\Gamma(\alpha+\beta)$ we deduce the extra fact that
$$\int_0^1 s^{\alpha-1}(1-s)^{\beta-1}\, ds = \frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha+\beta)}.$$
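Both conclusions, the Beta-integral identity and the fact that the convolution of two Gamma densities with a common rate is again a Gamma density, can be checked numerically (an illustrative plain-Python sketch, with the parameter values chosen arbitrarily):

```python
import math

def gamma_pdf(x, alpha, lam):
    # Gamma(alpha, lam) pdf in the rate parameterization used above.
    return lam ** alpha / math.gamma(alpha) * x ** (alpha - 1.0) * math.exp(-lam * x)

def beta_numeric(a, b, n=100_000):
    # Midpoint Riemann sum for ∫_0^1 s^{a-1} (1-s)^{b-1} ds.
    h = 1.0 / n
    return sum(((k + 0.5) * h) ** (a - 1.0) * (1.0 - (k + 0.5) * h) ** (b - 1.0)
               for k in range(n)) * h

# Beta identity: the integral equals Gamma(a)Gamma(b)/Gamma(a+b).
assert abs(beta_numeric(2.0, 3.0)
           - math.gamma(2.0) * math.gamma(3.0) / math.gamma(5.0)) < 1e-6

def convolution_numeric(v, a, b, lam, n=100_000):
    # Midpoint Riemann sum for the convolution ∫_0^v f_X(u) f_Y(v-u) du.
    h = v / n
    return sum(gamma_pdf((k + 0.5) * h, a, lam) * gamma_pdf(v - (k + 0.5) * h, b, lam)
               for k in range(n)) * h

# The density of V = X + Y should be the Gamma(a + b, lam) pdf.
a, b, lam, v = 2.0, 3.0, 1.5, 2.0
assert abs(convolution_numeric(v, a, b, lam) - gamma_pdf(v, a + b, lam)) < 1e-6
print("Beta identity and Gamma convolution verified numerically")
```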
Problem 5: If $Z_1$ and $Z_2$ are two independent standard normal random variables, what is the distribution of $Z_1^2 + Z_2^2$?
Hint: Put together your answers to Problems 3 and 4.
Solution: From Problem 3 we know that $Z_1^2$ and $Z_2^2$ are both Gamma(1/2, 1/2) random variables. Since $Z_1$ and $Z_2$ are independent, so are $Z_1^2$ and $Z_2^2$. But from Problem 4 we know that the sum of two independent Gamma(1/2, 1/2) random variables gives a Gamma(1, 1/2) random variable. Hence, $Z_1^2 + Z_2^2$ is a Gamma(1, 1/2) random variable (which is an Exponential(1/2) random variable).
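A small Monte Carlo sanity check of this conclusion (not part of the exam solution): if $V = Z_1^2 + Z_2^2$ is Exponential(1/2), then $P(V \le t) = 1 - e^{-t/2}$, which simulation should reproduce.

```python
import math
import random

random.seed(0)  # fixed seed so the check is reproducible
N = 200_000
t = 2.0
# Empirical frequency of {Z1^2 + Z2^2 <= t} for independent standard normals.
hits = sum(random.gauss(0, 1) ** 2 + random.gauss(0, 1) ** 2 <= t for _ in range(N))
estimate = hits / N
# Exponential(1/2) cdf at t; the Monte Carlo error is on the order of 1/sqrt(N).
theory = 1.0 - math.exp(-t / 2.0)
assert abs(estimate - theory) < 0.01
print(f"P(V <= {t}) estimate: {estimate:.3f}, theory: {theory:.3f}")
```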
Extra Credit: (no partial credit) If $Z_1, Z_2, \ldots, Z_n$ are independent standard normal random variables, what is the distribution of $Z_1^2 + Z_2^2 + \cdots + Z_n^2$? This random variable appears frequently in statistics and is called a Chi Square random variable with n degrees of freedom.
Solution: By the same reasoning as for Problem 5 we get that $Z_1^2 + Z_2^2 + \cdots + Z_n^2$ is a Gamma(n/2, 1/2) random variable.