Math-UA.233: Theory of Probability
Lecture 20
Tim Austin
From last time... 1

Suppose T is a change of variables between two regions of the two-dimensional plane, carrying region A onto region B, with inverse T^{-1} carrying B back onto A.

In coordinates,

(y_1, y_2) = T(x_1, x_2) = (g_1(x_1, x_2), g_2(x_1, x_2)),
(x_1, x_2) = T^{-1}(y_1, y_2) = (h_1(y_1, y_2), h_2(y_1, y_2)).

The Jacobian of T at the point (x_1, x_2):

J(x_1, x_2) = \det \begin{pmatrix} \partial g_1/\partial x_1 & \partial g_1/\partial x_2 \\ \partial g_2/\partial x_1 & \partial g_2/\partial x_2 \end{pmatrix} = \frac{\partial g_1}{\partial x_1}\frac{\partial g_2}{\partial x_2} - \frac{\partial g_1}{\partial x_2}\frac{\partial g_2}{\partial x_1}.
From last time... 2

Theorem
Suppose (X_1, X_2) are jointly continuous, and this random point lands in A with probability 1. Then the transformed RVs

(Y_1, Y_2) = (g_1(X_1, X_2), g_2(X_1, X_2))

are also jointly continuous, the point (Y_1, Y_2) lies in B with probability 1, and

f_{Y_1, Y_2}(y_1, y_2) = f_{X_1, X_2}(x_1, x_2) \, |J(x_1, x_2)|^{-1},

where x_i = h_i(y_1, y_2) for i = 1, 2.

A simple and common special case: when T is a linear transformation given by an invertible matrix, say M:

\begin{pmatrix} y_1 \\ y_2 \end{pmatrix} = M \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}.

If (Y_1, Y_2) is the image of (X_1, X_2) under this change of variables, then the previous formula becomes

f_{Y_1, Y_2}(y_1, y_2) = \frac{1}{|\det M|} f_{X_1, X_2}(x_1, x_2), where (x_1, x_2)^T = M^{-1} (y_1, y_2)^T.
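As a quick numerical sanity check of the linear special case, the sketch below (our own construction, not from Ross) applies f_Y(y) = f_X(M^{-1} y) / |det M| with M a rotation matrix. Since the standard bivariate normal density depends only on the distance from the origin, a rotation should leave it unchanged, and the formula confirms this.

```python
import math

def std_normal_pdf2(x1, x2):
    # joint PDF of two independent N(0,1) RVs
    return math.exp(-(x1**2 + x2**2) / 2) / (2 * math.pi)

# A hypothetical linear map: rotation by 30 degrees (so det M = 1)
t = math.pi / 6
M = [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]
det_M = M[0][0] * M[1][1] - M[0][1] * M[1][0]
# inverse of a 2x2 matrix
Minv = [[ M[1][1] / det_M, -M[0][1] / det_M],
        [-M[1][0] / det_M,  M[0][0] / det_M]]

def f_Y(y1, y2):
    # change-of-variables formula: f_Y(y) = f_X(M^{-1} y) / |det M|
    x1 = Minv[0][0] * y1 + Minv[0][1] * y2
    x2 = Minv[1][0] * y1 + Minv[1][1] * y2
    return std_normal_pdf2(x1, x2) / abs(det_M)

# rotation invariance: f_Y agrees with f_X at every point
assert abs(f_Y(0.7, -1.2) - std_normal_pdf2(0.7, -1.2)) < 1e-12
```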
Example (Ross E.g. 6.7a)
Let (X_1, X_2) be jointly continuous with joint PDF f. Find the joint PDF of (X_1 + X_2, X_1 - X_2).
Example (Ross E.g. 6.7b)
Let X, Y be independent N(0, 1) RVs. Let (R, Θ) be the polar coordinates of the point (X, Y). Find their joint PDF.
This is a good place to introduce one of the most important joint distributions.

Definition (Ross E.g. 6.5d)
Fix a parameter -1 < ρ < 1. The random vector (X, Y) is a bivariate standard normal with correlation ρ if it is jointly continuous with joint PDF

f(x, y) = \frac{1}{2\pi\sqrt{1 - \rho^2}} \exp\left( -\frac{1}{2(1 - \rho^2)} \left( x^2 - 2\rho x y + y^2 \right) \right)

for -∞ < x, y < ∞.

See Ross E.g. 6.5d for the general bivariate normal, which has several other parameters.
The best way to make sense of this (compare Ross Subsec 7.8.1): change variables to

u = x and v = \frac{y - \rho x}{\sqrt{1 - \rho^2}}.

The determinant of this change is 1/\sqrt{1 - \rho^2}, and

\frac{1}{2(1 - \rho^2)} \left( x^2 - 2\rho x y + y^2 \right) = \frac{1}{2} (u^2 + v^2).

So the change-of-variables formula gives

f_{U,V}(u, v) = \frac{1}{2\pi} \exp\left( -\frac{1}{2}(u^2 + v^2) \right) = \frac{1}{\sqrt{2\pi}} e^{-u^2/2} \cdot \frac{1}{\sqrt{2\pi}} e^{-v^2/2}:

thus, U and V are independent N(0, 1) RVs.

Since U = X, this implies that the marginal distribution of X is standard normal. A similar change of variables gives the same conclusion for Y.
Distributions of sums of RVs (Ross Sec 6.3)

As remarked previously, determining the distribution of g(X, Y) can be difficult. We won't present a general solution.

But in the special case g(X, Y) = X + Y, things are simpler, and many tools are available. We've already met linearity of expectation, but it's often not hard to find the distribution (i.e. CDF, PMF or PDF, not just the expectation) of X + Y.

In particular, we can say a lot when X and Y are independent.

Let's start with the general formulas.
Proposition (Distribution of sum: discrete case)
If X and Y are discrete with possible values x_1, x_2, ... and y_1, y_2, ..., then

P\{X + Y = z\} = \sum_{\text{all } i, j \text{ such that } x_i + y_j = z} \underbrace{p(x_i, y_j)}_{\text{joint PMF of } X, Y}

for any real value z.

IDEA: Just another use of axiom 3.
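The proposition translates directly into code: group the joint PMF by the value of x_i + y_j. Below is a small sketch (the dice example is ours) that also confirms the result is a genuine PMF.

```python
from collections import defaultdict

def pmf_of_sum(joint_pmf):
    """P{X + Y = z}: sum the joint PMF over all (x_i, y_j) with x_i + y_j = z."""
    out = defaultdict(float)
    for (x, y), p in joint_pmf.items():
        out[x + y] += p
    return dict(out)

# Toy example (ours): two independent fair dice, so p(i, j) = 1/36 for each pair.
joint = {(i, j): 1 / 36 for i in range(1, 7) for j in range(1, 7)}
sum_pmf = pmf_of_sum(joint)

assert abs(sum_pmf[7] - 6 / 36) < 1e-12        # six pairs add up to 7
assert abs(sum(sum_pmf.values()) - 1) < 1e-12  # the values still sum to 1
```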
Example (Binomials; Ross E.g. 6.3f)
Let X and Y be independent binomial RVs with params (n, p) and (m, p). Then X + Y is binom(n + m, p).

IDEA: Remember what X and Y are counting.

SANITY CHECK: This gives

E[X + Y] = (n + m)p = np + mp = E[X] + E[Y] — as seen before.
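This example can be checked exactly with the discrete sum formula: convolving the two binomial PMFs reproduces the binom(n + m, p) PMF term by term (the parameter values below are arbitrary choices of ours).

```python
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, m, p = 5, 7, 0.3   # hypothetical parameters

# Discrete sum formula for independent X ~ binom(n, p), Y ~ binom(m, p):
# P{X + Y = z} = sum over k of P{X = k} P{Y = z - k}
for z in range(n + m + 1):
    conv = sum(binom_pmf(k, n, p) * binom_pmf(z - k, m, p)
               for k in range(max(0, z - m), min(n, z) + 1))
    assert abs(conv - binom_pmf(z, n + m, p)) < 1e-12
```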
Example (Poissons; Ross E.g. 6.3e)
Let X and Y be independent Poi(λ) and Poi(µ), respectively. Then X + Y is Poi(λ + µ).

IDEA: Compute directly, or deduce from Poisson approx to binomial.
There’s an analogous formula in the continuous case, but it takes a bit more work.

Proposition (Distribution of sum: continuous case)
If X, Y are jointly continuous with joint PDF f, then X + Y is continuous with PDF

f_{X+Y}(z) = \int_{-\infty}^{\infty} f_{X,Y}(x, z - x) \, dx.

IDEA: change variables to (X, X + Y) and then compute the marginal PDF of X + Y. See Ross p239 for the alternative of first finding the CDF of X + Y.

In particular, if X and Y are independent, then

f_{X+Y}(z) = \int_{-\infty}^{\infty} f_X(x) f_Y(z - x) \, dx.
Example (Ross E.g. 6.3a)
If X and Y are independent Unif(0, 1) RVs, find the PDF of X + Y.

ANSWER is called the tent distribution.
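A numerical sketch of this example (our own discretization): approximate the convolution integral by a Riemann sum and compare with the tent PDF, which rises as z on (0, 1] and falls as 2 - z on (1, 2).

```python
def f_sum(z, n_steps=100_000):
    # Riemann sum for the integral of f_X(x) f_Y(z - x) dx with X, Y ~ Unif(0,1):
    # both densities equal 1 on (0,1), so the integrand is 1 exactly when
    # 0 < x < 1 and 0 < z - x < 1.
    dx = 1.0 / n_steps
    total = 0.0
    for i in range(n_steps):
        x = (i + 0.5) * dx
        if 0 < z - x < 1:
            total += dx
    return total

def tent(z):
    # the "tent" PDF: z on (0, 1], 2 - z on (1, 2), and 0 elsewhere
    if 0 < z <= 1:
        return z
    if 1 < z < 2:
        return 2 - z
    return 0.0

for z in (0.25, 0.5, 1.0, 1.5, 1.75):
    assert abs(f_sum(z) - tent(z)) < 1e-3
```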
Example (Special case of Ross Prop 6.3.1)
If X and Y are independent Exp(λ) RVs, then X + Y is continuous with PDF

f_{X+Y}(z) = \begin{cases} \lambda^2 z e^{-\lambda z} & z > 0 \\ 0 & \text{otherwise.} \end{cases}

By induction on n, if X_1, ..., X_n are independent Exp(λ) RVs, then their sum has PDF

f_{X_1 + \cdots + X_n}(z) = \frac{1}{(n - 1)!} \lambda^n z^{n-1} e^{-\lambda z} for z > 0.

Any continuous RV with this PDF is called Gamma(n, λ) (so, for instance, Gamma(1, λ) = Exp(λ)).

See Ross Subsections 5.3.1 and 6.3.2 for much more on Gamma RVs.
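The induction step can be spot-checked numerically: convolving Exp(λ) with itself should give the Gamma(2, λ) PDF, and convolving once more should give Gamma(3, λ). (The rate λ = 2 and the evaluation point are arbitrary choices of ours.)

```python
import math

lam = 2.0  # hypothetical rate parameter

def exp_pdf(x):
    return lam * math.exp(-lam * x) if x > 0 else 0.0

def gamma_pdf(z, n):
    # Gamma(n, lam) PDF from the slide: lam^n z^(n-1) e^(-lam z) / (n-1)!
    return (lam**n * z**(n - 1) * math.exp(-lam * z) / math.factorial(n - 1)
            if z > 0 else 0.0)

def convolve_at(z, f, g, n_steps=200_000):
    # midpoint Riemann sum for the integral of f(x) g(z - x) dx over (0, z)
    dx = z / n_steps
    return sum(f((i + 0.5) * dx) * g(z - (i + 0.5) * dx)
               for i in range(n_steps)) * dx

z = 1.3
# Exp + Exp = Gamma(2, lam), and Gamma(2, lam) + Exp = Gamma(3, lam)
assert abs(convolve_at(z, exp_pdf, exp_pdf) - gamma_pdf(z, 2)) < 1e-4
assert abs(convolve_at(z, exp_pdf, lambda x: gamma_pdf(x, 2)) - gamma_pdf(z, 3)) < 1e-4
```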
The case of normal RVs is particularly important.

Proposition (Ross Prop 6.3.2)
If X_1, ..., X_n are independent normal RVs with respective parameters µ_i, σ_i^2 for i = 1, ..., n, then \sum_{i=1}^n X_i is N(µ_1 + ⋯ + µ_n, σ_1^2 + ⋯ + σ_n^2). (Beware that we add the squares of the σ_i's!)

IDEA:
Step 1: let n = 2, X_1 be N(0, 1) and X_2 be N(0, σ^2);
Step 2: get result for any two independent normals by scaling and translating;
Step 3: induction on n.
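A simulation sketch of this proposition (the parameters and sample size are our own choices): the sample mean and variance of X_1 + ⋯ + X_n should be close to µ_1 + ⋯ + µ_n and σ_1² + ⋯ + σ_n², respectively.

```python
import random
import statistics

random.seed(1)
params = [(1.0, 2.0), (-0.5, 1.0), (2.0, 0.5)]  # hypothetical (mu_i, sigma_i) pairs
n_trials = 100_000

sums = [sum(random.gauss(mu, sigma) for mu, sigma in params)
        for _ in range(n_trials)]

mu_total = sum(mu for mu, _ in params)             # 2.5
var_total = sum(sigma**2 for _, sigma in params)   # 4 + 1 + 0.25 = 5.25

# note: we add the squares of the sigma_i's, not the sigma_i's themselves
assert abs(statistics.mean(sums) - mu_total) < 0.05
assert abs(statistics.variance(sums) - var_total) < 0.1
```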
More on sums: Variance and covariance (Ross Secs 7.3 and 7.4)

First, a useful observation:

Proposition (Ross Prop 7.4.1)
If X and Y are independent, and g and h are any functions from reals to reals, then

E[g(X) h(Y)] = E[g(X)] E[h(Y)].

(Not true without independence!)
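For discrete RVs this proposition is a finite computation, so it can be verified directly. In the sketch below (the PMFs and the functions g, h are toy choices of ours), independence means the joint PMF is the product of the marginals, and the expectation factors exactly.

```python
# Toy independent discrete RVs (our choice): X uniform on {1,2,3}, Y uniform on {0,1}
px = {1: 1/3, 2: 1/3, 3: 1/3}
py = {0: 1/2, 1: 1/2}

def g(x):
    return x**2

def h(y):
    return 3*y + 1

# E[g(X)h(Y)], summing over the product joint PMF p(x, y) = px[x] * py[y]
lhs = sum(g(x) * h(y) * px[x] * py[y] for x in px for y in py)
# E[g(X)] * E[h(Y)]
rhs = sum(g(x) * px[x] for x in px) * sum(h(y) * py[y] for y in py)

assert abs(lhs - rhs) < 1e-12
```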
Now recall: the variance of a RV X is

Var(X) = E[(X - µ)^2], where µ = E[X].

It gives a useful measure of how ‘spread out’ X is.
Generalization to two RVs X and Y:

Definition
Let X and Y be RVs, and let µ_X = E[X] and µ_Y = E[Y]. The covariance of X and Y is

Cov(X, Y) = E[(X - µ_X)(Y - µ_Y)]

(provided this expectation makes sense, for discrete RVs, or continuous, or whatever).
First properties:

1. Symmetry: Cov(X, Y) = Cov(Y, X).
2. Applying with Y = X: Cov(X, X) = Var(X).
3. Like Var, Cov has a useful alternative formula: Cov(X, Y) = E[XY] - E[X] E[Y].
4. By that previous prop, if X and Y are independent then Cov(X, Y) = E[X - µ_X] E[Y - µ_Y] = 0.
Example (Ross E.g. 7.4d)
Let A and B be events, and let I_A and I_B be their indicator variables. Then

Cov(I_A, I_B) = P(A ∩ B) - P(A) P(B).

In particular,

Var(I_A) = P(A) - P(A)^2 = P(A)(1 - P(A)).
OBSERVATION FROM THIS EXAMPLE: in general, Cov can
be positive or negative.
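The indicator example can be verified on a tiny finite sample space (the events below are our own toy choices, picked so that the covariance comes out positive).

```python
from itertools import product

# Hypothetical sample space: two fair coin flips, each outcome with probability 1/4.
outcomes = list(product("HT", repeat=2))
prob = 1 / len(outcomes)

# A = "first flip is heads", B = "at least one flip is heads"
I_A = {w: 1 if w[0] == "H" else 0 for w in outcomes}
I_B = {w: 1 if "H" in w else 0 for w in outcomes}

def E(f):
    return sum(f[w] * prob for w in outcomes)

cov = E({w: I_A[w] * I_B[w] for w in outcomes}) - E(I_A) * E(I_B)

p_A, p_B = E(I_A), E(I_B)
p_AB = sum(prob for w in outcomes if I_A[w] and I_B[w])

assert abs(cov - (p_AB - p_A * p_B)) < 1e-12          # Cov(I_A, I_B) = P(A∩B) - P(A)P(B)
assert abs(E({w: I_A[w]**2 for w in outcomes}) - p_A**2
           - p_A * (1 - p_A)) < 1e-12                 # Var(I_A) = P(A)(1 - P(A))
assert cov > 0                                        # positive covariance for this pair
```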