LECTURE 11
Derived distributions; convolution; covariance and correlation

• Readings: Finish Section 4.1; Section 4.2

A general formula

• Let Y = g(X), with g strictly monotonic
[Figure: graph of y = g(x); the interval [x, x + δ] on the x-axis maps to the interval [y, y + δ·|(dg/dx)(x)|] on the y-axis, where (dg/dx)(x) is the slope of g at x.]

• Event x ≤ X ≤ x + δ is the same as
  g(x) ≤ Y ≤ g(x + δ)
  or (approximately)
  g(x) ≤ Y ≤ g(x) + δ|(dg/dx)(x)|

• Hence,

  δ fX(x) = δ fY(y) |(dg/dx)(x)|,   where y = g(x)

  so that fY(y) = fX(x) / |(dg/dx)(x)|
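As a quick numerical sanity check of this formula, here is a minimal sketch; the choice X ∼ Uniform(0, 1) with g(x) = exp(x) is an illustrative assumption, not part of the lecture. The formula then gives fY(y) = 1/y on [1, e], which a histogram of simulated values should reproduce.

    import numpy as np

    # Illustrative assumption: X ~ Uniform(0, 1), g(x) = exp(x) (strictly increasing).
    rng = np.random.default_rng(0)
    x_samples = rng.uniform(0.0, 1.0, size=200_000)
    y_samples = np.exp(x_samples)                      # Y = g(X)

    # PDF from the formula: f_Y(y) = f_X(g^{-1}(y)) / |g'(g^{-1}(y))| = 1/y on [1, e].
    y_grid = np.linspace(1.05, 2.65, 5)
    f_y_formula = 1.0 / y_grid

    # Empirical density estimate from a histogram of the simulated Y values.
    hist, edges = np.histogram(y_samples, bins=200, range=(1.0, np.e), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    f_y_empirical = np.interp(y_grid, centers, hist)

    print(np.round(f_y_formula, 3))     # 1/y at the grid points
    print(np.round(f_y_empirical, 3))   # should be close to the row above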
Example

• X, Y with joint PDF fX,Y(x, y) = 1 on the unit square 0 ≤ x, y ≤ 1

• Find the PDF of Z = g(X, Y) = Y/X

  FZ(z) = P(Y ≤ zX) = z/2,              for 0 ≤ z ≤ 1
  (the region {y ≤ zx} inside the unit square is a triangle of area z/2)

  FZ(z) = 1 − P(Y > zX) = 1 − 1/(2z),   for z ≥ 1
  (the region {y > zx} inside the unit square is a triangle of area 1/(2z))

  Differentiating: fZ(z) = 1/2 for 0 ≤ z ≤ 1, and fZ(z) = 1/(2z²) for z ≥ 1
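A short Monte Carlo check of this CDF (a sketch; the sample size and the test points are arbitrary choices):

    import numpy as np

    # Z = Y/X with X, Y independent Uniform(0, 1).
    rng = np.random.default_rng(1)
    x = rng.uniform(size=1_000_000)
    y = rng.uniform(size=1_000_000)
    z = y / x

    for z0 in (0.25, 0.5, 1.0, 2.0, 4.0):
        empirical = np.mean(z <= z0)
        formula = z0 / 2 if z0 <= 1 else 1 - 1 / (2 * z0)
        print(f"z = {z0}: empirical {empirical:.4f}, formula {formula:.4f}")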
The distribution of X + Y

• W = X + Y; X, Y independent

  pW(w) = P(X + Y = w)
        = Σ_x P(X = x) P(Y = w − x)   (sum over all pairs (x, y) with x + y = w)
        = Σ_x pX(x) pY(w − x)

[Figure: the pairs (x, y) with x + y = w, e.g. (0,3), (1,2), (2,1), (3,0) for w = 3, lie on a line of slope −1 in the (x, y) plane.]

• Mechanics (illustrated in the sketch below):
  – Put the PMFs on top of each other
  – Flip the PMF of Y
  – Shift the flipped PMF by w (to the right if w > 0)
  – Cross-multiply and add
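To make the mechanics concrete, here is a minimal sketch (the two example PMFs are arbitrary assumptions, not from the lecture) that computes pW by the cross-multiply-and-add recipe:

    # Convolution of two PMFs given as {value: probability} dictionaries.
    def pmf_of_sum(p_x, p_y):
        p_w = {}
        for x, px in p_x.items():
            for y, py in p_y.items():      # every pair (x, y) with x + y = w contributes
                w = x + y
                p_w[w] = p_w.get(w, 0.0) + px * py
        return p_w

    # Illustrative assumption: a fair 0/1 coin and a three-point PMF.
    p_x = {0: 0.5, 1: 0.5}
    p_y = {0: 0.2, 1: 0.3, 2: 0.5}

    print({w: round(p, 3) for w, p in pmf_of_sum(p_x, p_y).items()})
    # {0: 0.1, 1: 0.25, 2: 0.4, 3: 0.25} -- probabilities sum to 1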
The continuous case

• W = X + Y; X, Y independent

• fW|X(w | x) = fY(w − x)

• fW,X(w, x) = fX(x) fW|X(w | x) = fX(x) fY(w − x)

• fW(w) = ∫_{−∞}^{∞} fX(x) fY(w − x) dx
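A quick numerical check of the convolution integral (a sketch, assuming X and Y uniform on [0, 1], for which the sum is known to have the triangular density on [0, 2]):

    import numpy as np

    # f_W(w) = ∫ f_X(x) f_Y(w - x) dx, evaluated by a simple Riemann sum.
    def f_uniform(t):
        return np.where((t >= 0.0) & (t <= 1.0), 1.0, 0.0)

    def f_w(w, n=10_000):
        x = np.linspace(-1.0, 3.0, n)      # integration grid covering the support
        dx = x[1] - x[0]
        return float(np.sum(f_uniform(x) * f_uniform(w - x)) * dx)

    for w in (0.5, 1.0, 1.5):
        print(w, round(f_w(w), 3))         # triangular density: about 0.5, 1.0, 0.5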
The sum of independent normal r.v.'s

• X ∼ N(0, σx²), Y ∼ N(0, σy²), independent

• Let W = X + Y

  fW(w) = ∫_{−∞}^{∞} fX(x) fY(w − x) dx
        = (1/(2πσxσy)) ∫_{−∞}^{∞} e^{−x²/2σx²} e^{−(w−x)²/2σy²} dx
        (algebra) = c e^{−γw²}   for appropriate constants c, γ

• Conclusion: W is normal
  – mean = 0, variance = σx² + σy²
  – same argument for the nonzero-mean case
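As a numerical illustration of the conclusion (a sketch; the particular parameters σx = 1, σy = 2 are assumptions made for the example), simulate W = X + Y and compare its sample mean and variance with 0 and σx² + σy²:

    import numpy as np

    # Illustrative assumption: sigma_x = 1, sigma_y = 2, both means zero.
    rng = np.random.default_rng(2)
    sigma_x, sigma_y = 1.0, 2.0
    x = rng.normal(0.0, sigma_x, size=1_000_000)
    y = rng.normal(0.0, sigma_y, size=1_000_000)
    w = x + y

    print(round(w.mean(), 3))    # close to 0
    print(round(w.var(), 3))     # close to sigma_x**2 + sigma_y**2 = 5
    # Normality can be eyeballed by comparing a histogram of w with the N(0, 5) density.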
Two independent normal r.v.s

• X ∼ N(µx, σx²), Y ∼ N(µy, σy²), independent

  fX,Y(x, y) = fX(x) fY(y)
             = (1/(2πσxσy)) exp{ −(x − µx)²/2σx² − (y − µy)²/2σy² }

• PDF is constant on the ellipse where

  (x − µx)²/2σx² + (y − µy)²/2σy²   is constant

• Ellipse is a circle when σx = σy

[Figure: scatter plot of sample (x, y) pairs.]
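A small check of the "constant on the ellipse" statement (a sketch; the parameter values are arbitrary assumptions): evaluate fX,Y at several points with the same value of the quadratic form and confirm the density is the same.

    import numpy as np

    # Arbitrary assumed parameters for illustration.
    mu_x, mu_y, sigma_x, sigma_y = 1.0, -2.0, 1.5, 0.5

    def f_xy(x, y):
        # Joint PDF of two independent normals.
        q = (x - mu_x) ** 2 / (2 * sigma_x ** 2) + (y - mu_y) ** 2 / (2 * sigma_y ** 2)
        return np.exp(-q) / (2 * np.pi * sigma_x * sigma_y)

    # Points on the ellipse where the quadratic form equals 1/2:
    # x = mu_x + sigma_x*cos(t), y = mu_y + sigma_y*sin(t).
    for t in (0.0, 1.0, 2.5, 4.0):
        x = mu_x + sigma_x * np.cos(t)
        y = mu_y + sigma_y * np.sin(t)
        print(round(f_xy(x, y), 6))   # the same value for every t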
Covariance

• cov(X, Y) = E[(X − E[X]) · (Y − E[Y])]

• Zero-mean case: cov(X, Y) = E[XY]

• cov(X, Y) = E[XY] − E[X]E[Y]

• independent ⇒ cov(X, Y) = 0   (converse is not true)

• var( Σ_{i=1}^{n} Xi ) = Σ_{i=1}^{n} var(Xi) + Σ_{(i,j): i≠j} cov(Xi, Xj)
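The identities above are easy to check numerically; here is a minimal sketch (the joint distribution used, Y = 0.5·X + noise, is an arbitrary assumption for illustration):

    import numpy as np

    rng = np.random.default_rng(3)
    n = 1_000_000
    x = rng.normal(2.0, 1.0, size=n)                # arbitrary example r.v.
    y = 0.5 * x + rng.normal(0.0, 0.3, size=n)      # Y depends on X, so cov(X, Y) != 0

    # cov(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y]
    cov_def = np.mean((x - x.mean()) * (y - y.mean()))
    cov_alt = np.mean(x * y) - x.mean() * y.mean()
    print(round(cov_def, 3), round(cov_alt, 3))     # both close to 0.5*var(X) = 0.5

    # n = 2 case of the variance-of-a-sum formula:
    # var(X + Y) = var(X) + var(Y) + 2 cov(X, Y)
    print(round(np.var(x + y), 3), round(np.var(x) + np.var(y) + 2 * cov_def, 3))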
Correlation coefficient

• Dimensionless version of covariance:

  ρ = E[ ((X − E[X])/σX) · ((Y − E[Y])/σY) ] = cov(X, Y) / (σX σY)

• −1 ≤ ρ ≤ 1

• |ρ| = 1  ⇔  (X − E[X]) = c(Y − E[Y])   (linearly related)

• Independent ⇒ ρ = 0   (converse is not true)

[Figure: scatter plots of sample (x, y) pairs.]
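A short numerical illustration of ρ (a sketch; the linear and independent examples are assumptions chosen to exercise the properties above):

    import numpy as np

    def corr(x, y):
        # Sample version of rho = cov(X, Y) / (sigma_X * sigma_Y).
        return np.mean((x - x.mean()) * (y - y.mean())) / (x.std() * y.std())

    rng = np.random.default_rng(4)
    n = 1_000_000
    x = rng.normal(size=n)

    print(round(corr(x, 3.0 * x - 1.0), 3))           # exactly linear: rho = 1
    print(round(corr(x, -2.0 * x + 5.0), 3))          # linear, negative slope: rho = -1
    print(round(corr(x, rng.normal(size=n)), 3))      # independent: rho close to 0
    print(round(corr(x, x + rng.normal(size=n)), 3))  # partial dependence: rho near 0.71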
MIT OpenCourseWare
http://ocw.mit.edu
6.041SC Probabilistic Systems Analysis and Applied Probability
Fall 2013
For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.