MA22S6 Homework 5 Solutions

Question 1

We are told that Cov(X, Y ) = E[(X − E(X))(Y − E(Y ))]. Given that expectation is a
linear operator (i.e., E(X + Y ) = E(X) + E(Y )) we may rewrite this as
Cov(X, Y ) = E[(X − E(X))(Y − E(Y ))]
= E[XY − XE(Y ) − Y E(X) + E(X)E(Y )]
= E(XY ) − E(X)E(Y ) − E(Y )E(X) + E(X)E(Y )
= E(XY ) − E(X)E(Y ) as required.
If X and Y are independent then E(XY) = E(X)E(Y), i.e., the expression above equals
0. A covariance of zero means X and Y are uncorrelated; hence independence implies that X and Y are uncorrelated.
We know Var[X + Y ] = Var[X] + Var[Y ] + 2Cov[X, Y ]. Suppose T = Y + Z. Then
Var[X + T ] = Var[X] + Var[T ] + 2Cov[X, T ]
= Var[X] + Var[Y + Z] + 2Cov[X, Y + Z]
= Var[X] + Var[Y ] + Var[Z] + 2Cov[Y, Z] + 2Cov[X, Z] + 2Cov[X, Y ]
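The two identities above can be checked numerically. The following is a sketch (not part of the original solution) using simulated data and population (ddof = 0) moments; the variables and coefficients are arbitrary choices for illustration.

```python
# Numerical sanity check of Cov(X, Y) = E[XY] - E[X]E[Y] and of the
# three-variable variance expansion, on simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)   # deliberately correlated with x
z = rng.normal(size=n)

def cov(a, b):
    """Population covariance via the shortcut formula E[AB] - E[A]E[B]."""
    return np.mean(a * b) - np.mean(a) * np.mean(b)

# Shortcut formula agrees with numpy's definitional estimate E[(A-EA)(B-EB)].
assert np.isclose(cov(x, y), np.cov(x, y, ddof=0)[0, 1])

# Var[X+Y+Z] = Var X + Var Y + Var Z + 2Cov(X,Y) + 2Cov(X,Z) + 2Cov(Y,Z)
lhs = np.var(x + y + z)
rhs = (np.var(x) + np.var(y) + np.var(z)
       + 2 * (cov(x, y) + cov(x, z) + cov(y, z)))
assert np.isclose(lhs, rhs)
```

Both identities hold exactly for sample moments as well, so the assertions succeed up to floating-point rounding.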
Question 2
What is E(X)? We know E(X) = ∑_x x P(X = x). The marginal distribution of X is

x          1     2     3
P(X = x)   1/4   1/2   1/4

We find E(X) = 1 × 1/4 + 2 × 1/2 + 3 × 1/4 = 2. The marginal of Y is

y          1     2     3
P(Y = y)   1/4   1/2   1/4
We find E(Y) = 1 × 1/4 + 2 × 1/2 + 3 × 1/4 = 2. We know E(XY) = ∑_x ∑_y x y P(X = x, Y = y), giving

E(XY) = 1 × 0 + 2 × 1/4 + 3 × 0 + 2 × 1/4 + 4 × 0 + 6 × 1/4 + 3 × 0 + 6 × 1/4 + 9 × 0 = 4.
As E(XY ) = E(X)E(Y ), X and Y are uncorrelated. They are dependent as, for instance,
P(X = 1, Y = 1) = 0 ≠ P(X = 1)P(Y = 1) = 1/4 × 1/4 = 1/16.
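As a quick check of these numbers, here is a short sketch that encodes the joint pmf as a 3×3 array (rows indexed by x ∈ {1, 2, 3}, columns by y ∈ {1, 2, 3}) and recomputes the quantities above.

```python
# Verify the Question 2 computations from the joint pmf table.
import numpy as np

p = np.array([[0, 1/4, 0],      # P(X=1, Y=y) for y = 1, 2, 3
              [1/4, 0, 1/4],    # P(X=2, Y=y)
              [0, 1/4, 0]])     # P(X=3, Y=y)
vals = np.array([1, 2, 3])

px = p.sum(axis=1)               # marginal of X: [1/4, 1/2, 1/4]
py = p.sum(axis=0)               # marginal of Y: [1/4, 1/2, 1/4]
ex = (vals * px).sum()           # E[X] = 2
ey = (vals * py).sum()           # E[Y] = 2
exy = (np.outer(vals, vals) * p).sum()   # E[XY] = 4

print(ex, ey, exy)               # 2.0 2.0 4.0
print(exy - ex * ey)             # 0.0, so X and Y are uncorrelated
print(p[0, 0], px[0] * py[0])    # 0.0 vs 0.0625, so X and Y are dependent
```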
Question 3
Recall E[aX + b] = aE[X] + b and Var[aX + b] = a2 Var[X]. We have
ρ(rX + s, tY + u) = Cov(rX + s, tY + u) / √(Var[rX + s] Var[tY + u])

= (E[(rX + s)(tY + u)] − E[rX + s] E[tY + u]) / √(Var[rX + s] Var[tY + u])

= (E[rtXY + ruX + stY + su] − (rE[X] + s)(tE[Y] + u)) / √(r² Var[X] t² Var[Y])

= (rtE[XY] + ruE[X] + stE[Y] + su − rtE[X]E[Y] − ruE[X] − stE[Y] − su) / (∣rt∣ √(Var[X] Var[Y]))

= (rtE[XY] − rtE[X]E[Y]) / (∣rt∣ √(Var[X] Var[Y]))

= rt(E[XY] − E[X]E[Y]) / (∣rt∣ √(Var[X] Var[Y]))

= sign(rt) · (E[XY] − E[X]E[Y]) / √(Var[X] Var[Y])

= sign(rt) ρ(X, Y),
where sign(x) is the signum of x, which equals x/∣x∣ for x ≠ 0. We have ρ(X, Y) = 1 when
Y = X and ρ(X, Y) = −1 when Y = −X.
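The affine-invariance identity just derived holds for sample correlations as well (the same algebra goes through with sample moments), so it can be illustrated directly on simulated data. The constants r, s, t, u below are arbitrary, chosen with rt < 0 to exercise the sign flip.

```python
# Illustrate rho(rX + s, tY + u) = sign(rt) * rho(X, Y) on simulated data.
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
x = rng.normal(size=n)
y = 0.7 * x + rng.normal(size=n)   # positively correlated with x

def rho(a, b):
    """Sample Pearson correlation coefficient."""
    return np.corrcoef(a, b)[0, 1]

r, s, t, u = 2.0, 5.0, -3.0, 1.0   # arbitrary constants with rt < 0
lhs = rho(r * x + s, t * y + u)
rhs = np.sign(r * t) * rho(x, y)
assert np.isclose(lhs, rhs)        # holds up to floating-point rounding
```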
Question 4
For a probability distribution to be valid it must integrate (or, in the discrete case, sum) to
1 over its range of possible values. In this case we require
∫x ∫y f (x, y) dx dy = 1
∫x ∫y c dx dy = 1
where the integral is over the unit disk, which has area π and therefore we get cπ = 1 and
thus c = 1/π. If one wants to calculate this directly: on the unit disk the limits of x are
±1 and those of y are then ±√(1 − x²), giving

∫_{−1}^{1} ∫_{−√(1−x²)}^{√(1−x²)} c dy dx = 2c ∫_{−1}^{1} √(1 − x²) dx = 2c ∫_{−π/2}^{π/2} cos²(u) du = 2c · π/2 = cπ,   (1)
where we have used the substitution x = sin u, dx = cos(u) du, with new integration
limits u = arcsin(±1) = ±π/2 (note that sin u is strictly monotonically increasing and
thus invertible on [−π/2, π/2]). Alternatively, one may use polar coordinates: as we
cover the full unit disc of radius 1 we end up with 0 ≤ φ ≤ 2π and 0 ≤ r ≤ 1, with
x = r cos φ and y = r sin φ. Recall that dx dy = r dr dφ. We now have
∫_0^{2π} ∫_0^1 c r dr dφ = 2πc ∫_0^1 r dr = 2πc · (r²/2)∣_0^1 = πc.   (2)

Hence we have fX,Y(x, y) = 1/π if (x, y) is inside the unit disk and fX,Y(x, y) = 0 otherwise.
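The normalisation cπ = 1 can also be confirmed numerically. The following sketch approximates the area of the unit disk with a midpoint Riemann sum on a grid (the grid size is an arbitrary choice), confirming that a constant density integrates to c times the disk area π.

```python
# Midpoint Riemann sum for the area of the unit disk, so that
# integral of c over the disk = c * area ~ c * pi, forcing c = 1/pi.
import numpy as np

n = 1000
xs = (np.arange(n) + 0.5) / n * 2 - 1     # cell midpoints in (-1, 1)
X, Y = np.meshgrid(xs, xs)
inside = X**2 + Y**2 <= 1                  # indicator of the unit disk
cell = (2 / n) ** 2                        # area of one grid cell
area = inside.sum() * cell                 # approximates pi
print(area, np.pi)
```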
The marginal distribution of X is

fX(x) = ∫_y fX,Y(x, y) dy
      = ∫_{−√(1−x²)}^{√(1−x²)} (1/π) dy
      = (y/π)∣_{−√(1−x²)}^{√(1−x²)}
      = √(1 − x²)/π − (−√(1 − x²)/π)
      = (2/π)√(1 − x²).   (3)

By symmetry we see fY(y) = (2/π)√(1 − y²).
X and Y are independent if fX,Y(x, y) = fX(x)fY(y). We see here that fX(x)fY(y) = (4/π²)√(1 − x²)√(1 − y²) ≠ fX,Y(x, y), so they are dependent.
To investigate correlation we examine E[XY] − E[X]E[Y] and check whether it equals 0. First we
note that

E[X] = ∫_{−1}^{1} x fX(x) dx = ∫_{−1}^{1} x · (2/π)√(1 − x²) dx = 0,   (4)

as the integrand is odd under x → −x and is integrated over an interval which is symmetric
around zero. For the same reason E[Y] = 0. Hence we only need to calculate
E[XY] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x y fX,Y(x, y) dy dx = (1/π) ∫_{−1}^{1} x {∫_{−√(1−x²)}^{√(1−x²)} y dy} dx = 0,   (5)

because the inner y-integral vanishes, being over an odd function of y with the integration
limits symmetric around zero. We conclude that X and Y are uncorrelated despite the
fact that they are dependent!
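This conclusion is easy to see in simulation. The sketch below draws points uniformly on the unit disk by rejection sampling (uniform on the square, keeping points inside the disk); the sample correlation is near zero, while the conditional spread of Y visibly shrinks as ∣x∣ grows, exposing the dependence. The cutoffs 0.1 and 0.9 are arbitrary illustration choices.

```python
# Uniform points on the unit disk: uncorrelated but dependent coordinates.
import numpy as np

rng = np.random.default_rng(2)
pts = rng.uniform(-1, 1, size=(200_000, 2))
pts = pts[(pts**2).sum(axis=1) <= 1]       # keep points inside the disk
x, y = pts[:, 0], pts[:, 1]

print(np.corrcoef(x, y)[0, 1])             # close to 0: uncorrelated
# Dependence: Y ranges over +-sqrt(1 - x^2), so its spread shrinks
# near the edge of the disk.
print(np.std(y[np.abs(x) < 0.1]))          # wide spread near the centre
print(np.std(y[np.abs(x) > 0.9]))          # narrow spread near the edge
```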