Transformations of a Continuous Random Variable:

Continued: Joint Distributions and Related Concepts
▪ Last time: covered joint pmfs.
▪ Let A be a set of values for (X,Y); then
  P\bigl((X,Y) \in A\bigr) = \sum_{(x,y) \in A} p(x,y)
▪ Usual properties of pmf's still hold for the joint pmf:
  p(x,y) = P(X = x, Y = y) \ge 0 and \sum_{\text{all } (x,y)} p(x,y) = 1
▪ Marginal probability mass functions for X and Y are obtained by summation:
  p_X(x) = \sum_y p(x,y) and p_Y(y) = \sum_x p(x,y)
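The marginalization sums above can be sketched in a few lines of Python; the joint pmf values here are made up purely for illustration and are not from the notes:

```python
# Sketch: computing marginal pmfs from a joint pmf by summation.
# The joint pmf below is a small hypothetical example.
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

def marginal_x(joint):
    """p_X(x) = sum over y of p(x, y)."""
    pX = {}
    for (x, y), p in joint.items():
        pX[x] = pX.get(x, 0.0) + p
    return pX

def marginal_y(joint):
    """p_Y(y) = sum over x of p(x, y)."""
    pY = {}
    for (x, y), p in joint.items():
        pY[y] = pY.get(y, 0.0) + p
    return pY

pX = marginal_x(joint)   # ≈ {0: 0.3, 1: 0.7}
pY = marginal_y(joint)   # ≈ {0: 0.4, 1: 0.6}
```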
Joint distributions for continuous rvs
▪ For continuous rv's X and Y, the joint probability density function f(x,y) at a
point (x,y) is analogous to the joint pmf.
▪ Defn: f(x,y) is the joint pdf for X and Y if, for any set A of (x,y) values,
  P\bigl((X,Y) \in A\bigr) = \iint_A f(x,y)\,dx\,dy
▪ Again, the usual properties of pdf's still hold:
- f(x,y) \ge 0
- \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x,y)\,dx\,dy = 1
▪ Analogously to the discrete case, the marginal densities f_X(x) and f_Y(y) are given by
  f_X(x) = \int_{-\infty}^{\infty} f(x,y)\,dy and f_Y(y) = \int_{-\infty}^{\infty} f(x,y)\,dx
▪ Example (tire pressure):
- The front tires on a particular type of car are supposed to be filled to a pressure of 26 psi
- Suppose the actual air pressure in EACH tire is a random variable (X for the right
side; Y for the left side) with joint pdf
  f(x,y) = K(x^2 + y^2) for 20 \le x \le 30 and 20 \le y \le 30, and 0 otherwise
- Find the marginal distribution of X:
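A numerical sanity check for this example (a sketch using a simple midpoint rule, not part of the original notes): the normalizing constant K and the marginal f_X(x) can both be verified by quadrature against the hand calculation K = 3/380000 and f_X(x) = K(10x^2 + 19000/3).

```python
# Sketch: numerically finding K and the marginal f_X(x) for the tire-pressure
# pdf f(x, y) = K (x^2 + y^2) on 20 <= x, y <= 30, via a midpoint rule.

def dbl_integral(g, ax, bx, ay, by, n=400):
    """Midpoint-rule approximation of the double integral of g over a rectangle."""
    hx, hy = (bx - ax) / n, (by - ay) / n
    total = 0.0
    for i in range(n):
        x = ax + (i + 0.5) * hx
        for j in range(n):
            y = ay + (j + 0.5) * hy
            total += g(x, y)
    return total * hx * hy

# K must make the pdf integrate to 1 over the square [20, 30] x [20, 30].
mass = dbl_integral(lambda x, y: x**2 + y**2, 20, 30, 20, 30)
K = 1.0 / mass                      # analytically, K = 3/380000

# Marginal: f_X(x) = ∫_20^30 K (x^2 + y^2) dy = K (10 x^2 + 19000/3)
def f_X(x, n=4000):
    h = 10.0 / n
    return sum(K * (x**2 + (20 + (j + 0.5) * h) ** 2) * h for j in range(n))

x0 = 25.0
closed_form = K * (10 * x0**2 + 19000 / 3)
```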
Independence
▪ Defn: Two random variables X and Y are said to be independent if for every pair
of values (x,y):
- Discrete: p(x,y) = pX(x) * pY(y)
- Continuous: f(x,y) = fX(x) * fY(y)
▪ Example:
- Consider continuous rv's X and Y with joint pdf:
  f(x,y) = \frac{1}{2\pi \sigma_X \sigma_Y} \exp\!\left[-\frac{1}{2}\left(\frac{x^2}{\sigma_X^2} + \frac{y^2}{\sigma_Y^2}\right)\right] for -\infty < x, y < \infty
- Are X and Y independent?
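The joint pdf above factors into the product of two univariate normal densities, so X and Y are independent. A numerical spot-check (a sketch; the sigma values are arbitrary choices for illustration):

```python
import math

# Sketch: checking independence for the joint pdf above by verifying that
# f(x, y) equals f_X(x) * f_Y(y) at several points.  The sigma values are
# arbitrary illustrative choices.
sigma_X, sigma_Y = 1.5, 2.0

def f_joint(x, y):
    coef = 1.0 / (2 * math.pi * sigma_X * sigma_Y)
    return coef * math.exp(-0.5 * (x**2 / sigma_X**2 + y**2 / sigma_Y**2))

def normal_pdf(t, sigma):
    """Univariate N(0, sigma^2) density -- the marginal of each variable."""
    return math.exp(-0.5 * t**2 / sigma**2) / (sigma * math.sqrt(2 * math.pi))

# The joint density equals the product of the marginals at every point,
# which is exactly the definition of independence for continuous rv's.
for x in (-2.0, 0.0, 1.3):
    for y in (-1.0, 0.5, 2.2):
        assert abs(f_joint(x, y) - normal_pdf(x, sigma_X) * normal_pdf(y, sigma_Y)) < 1e-12
```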
Conditional probability
▪ Let X and Y be cts rv's with joint pdf f(x,y) and marginal distribution of X, f_X(x).
Then the conditional probability density of Y, given X = x, is defined as
  f_{Y|X}(y|x) = \frac{f(x,y)}{f_X(x)}
▪ Moreover, the conditional expected value of Y given X = x is defined as
  E(Y | X = x) = \int_{-\infty}^{\infty} y \, f_{Y|X}(y|x)\,dy
▪ If X and Y are discrete rv's, substitute the pmf's to get p_{Y|X}(y|x), the
conditional probability mass function of Y given X = x. Then
  E(Y | X = x) = \sum_y y \, p_{Y|X}(y|x)
▪ Example: cts rv's X and Y with joint pdf
  f(x,y) = \frac{2}{5}(2x + 3y) for 0 \le x \le 1 and 0 \le y \le 1, and 0 otherwise
- What is the marginal distribution of X?
- What is the conditional distribution of Y given X = 0.2?
- What is the conditional expected value of Y given X = 0.2?
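These three questions can be checked numerically; a sketch using a midpoint rule (the closed forms f_X(x) = (4x + 3)/5 and E(Y | X = 0.2) = 12/19 follow by integrating by hand):

```python
# Sketch: working the example numerically as a check on the hand calculation.
# f(x, y) = (2/5)(2x + 3y) on the unit square, 0 otherwise.

def f(x, y):
    return 0.4 * (2 * x + 3 * y)

def integrate(g, a, b, n=20000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# Marginal of X: f_X(x) = ∫_0^1 f(x, y) dy = (4x + 3)/5
x0 = 0.2
fX_x0 = integrate(lambda y: f(x0, y), 0, 1)      # ≈ (4 * 0.2 + 3)/5 = 0.76

# Conditional density of Y given X = 0.2: f(0.2, y) / f_X(0.2)
def f_cond(y):
    return f(x0, y) / fX_x0

# Conditional expectation: E(Y | X = 0.2) = ∫_0^1 y f_cond(y) dy = 12/19
E_Y_given_x0 = integrate(lambda y: y * f_cond(y), 0, 1)
```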
Expected value of a function of (X,Y)
▪ Suppose we are interested in calculating the expected value of a function, h(X,Y).
▪ Will consider 2 cases:
1) X and Y continuous rv's: Then
  E(h(X,Y)) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(x,y)\, f(x,y)\,dx\,dy
2) X and Y discrete rv's: Then
  E(h(X,Y)) = \sum_x \sum_y h(x,y)\, p(x,y)
▪ Example 1
- An instructor has given a short test consisting of two parts
- For a random student, let X = # points earned on the first part
and Y = # points earned on the second part
- Suppose that the joint pmf of X and Y is given by

  p(x,y) |  y=0   y=5   y=10
  -------+------------------
  x=0    |  .02   .06   .02
  x=5    |  .15   .15   .20
  x=10   |  .03   .14   .23
- If the score recorded in the grade book is the total number of points earned on
the two parts, what is the expected recorded score?
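The recorded score is h(X,Y) = X + Y, so its expectation follows directly from the discrete formula. A sketch (pmf keyed as (x, y): p; the entries are arranged so that E(X) = 6.5 and E(Y) = 6.25, matching the values quoted later in these notes):

```python
# Sketch: expected recorded score E(X + Y) for the two-part test.
joint = {
    (0, 0): .02, (0, 5): .06, (0, 10): .02,
    (5, 0): .15, (5, 5): .15, (5, 10): .20,
    (10, 0): .03, (10, 5): .14, (10, 10): .23,
}

# E(h(X, Y)) = sum over (x, y) of h(x, y) p(x, y), with h(x, y) = x + y
expected_score = sum((x + y) * p for (x, y), p in joint.items())
# expected_score ≈ 12.75, i.e. E(X) + E(Y) = 6.5 + 6.25
```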
▪ Example 2: Show that if X and Y are independent rv’s, then E(XY)=E(X)E(Y)
- Assume X and Y are continuous. Then independence says f(x,y)=fX(x)fY(y).
- Let h(X,Y)=XY. Want E(h(X,Y)).
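In the continuous case the double integral factors because the integrand separates; a sketch of the derivation:

```latex
E(XY) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\, f(x,y)\,dx\,dy
      = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\, f_X(x)\, f_Y(y)\,dx\,dy
      = \left(\int_{-\infty}^{\infty} x\, f_X(x)\,dx\right)
        \left(\int_{-\infty}^{\infty} y\, f_Y(y)\,dy\right)
      = E(X)\,E(Y)
```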
▪ Example 3: Suppose h(X,Y)=X+Y. What is E( h(X,Y) )?
▪ Defn: The conditional expected value of h(X,Y) given X = x is given by
  E(h(X,Y) | X = x) = E(h(x,Y) | X = x) = \int_{-\infty}^{\infty} h(x,y)\, f_{Y|X}(y|x)\,dy
▪ Example: Suppose h(X,Y)=X+Y again. What is an expression for
E( h(X,Y) | X=x)?
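One way to work toward an expression (a sketch, using the fact that a conditional density integrates to 1 in y):

```latex
E(X + Y \mid X = x) = \int_{-\infty}^{\infty} (x + y)\, f_{Y|X}(y|x)\,dy
                    = x \int_{-\infty}^{\infty} f_{Y|X}(y|x)\,dy
                      + \int_{-\infty}^{\infty} y\, f_{Y|X}(y|x)\,dy
                    = x + E(Y \mid X = x)
```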
Covariance
▪ Often two rv's X and Y are dependent.
▪ E.g., values of X that are large relative to their mean μ_X tend to occur with
values of Y that are large relative to their mean μ_Y.
▪ Alternately, values of X that are large relative to their mean tend to occur with
values of Y that are small relative to their mean.
▪ Defn: The covariance of X and Y is
  Cov(X,Y) = E[(X - \mu_X)(Y - \mu_Y)] = E(XY) - \mu_X \mu_Y
▪ Example
- An instructor has given a short test consisting of two parts
- For a random student, let X = # points earned on the first part
and Y = # points earned on the second part
- Suppose that the joint pmf of X and Y is given by

  p(x,y) |  y=0   y=5   y=10
  -------+------------------
  x=0    |  .02   .06   .02
  x=5    |  .15   .15   .20
  x=10   |  .03   .14   .23
- What is Cov(X,Y)?
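Using the shortcut formula Cov(X,Y) = E(XY) − E(X)E(Y), the computation can be sketched as (pmf keyed as (x, y): p):

```python
# Sketch: Cov(X, Y) = E(XY) - E(X)E(Y) for the test-score pmf.
joint = {
    (0, 0): .02, (0, 5): .06, (0, 10): .02,
    (5, 0): .15, (5, 5): .15, (5, 10): .20,
    (10, 0): .03, (10, 5): .14, (10, 10): .23,
}

E_X = sum(x * p for (x, y), p in joint.items())      # 6.5
E_Y = sum(y * p for (x, y), p in joint.items())      # 6.25
E_XY = sum(x * y * p for (x, y), p in joint.items())  # 43.75

cov = E_XY - E_X * E_Y                                # 3.125
```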
▪ Defn: The correlation of X and Y is
  \rho_{XY} = E\!\left[\left(\frac{X - \mu_X}{\sigma_X}\right)\left(\frac{Y - \mu_Y}{\sigma_Y}\right)\right] = \frac{\mathrm{Cov}(X,Y)}{\sigma_X \sigma_Y}
▪ Properties: (homework problems)
i)
ii)
▪ Example: Find the correlation between X and Y in the previous problem
- Joint pmf was

  p(x,y) |  y=0   y=5   y=10
  -------+------------------
  x=0    |  .02   .06   .02
  x=5    |  .15   .15   .20
  x=10   |  .03   .14   .23
- Had Cov(X,Y) = 3.125, E(X)=6.5 and E(Y)=6.25
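Only the standard deviations remain to be computed; a sketch using the given Cov(X,Y) = 3.125, E(X) = 6.5, and E(Y) = 6.25 (pmf keyed as (x, y): p):

```python
import math

# Sketch: correlation rho = Cov(X, Y) / (sigma_X * sigma_Y) for the pmf above.
joint = {
    (0, 0): .02, (0, 5): .06, (0, 10): .02,
    (5, 0): .15, (5, 5): .15, (5, 10): .20,
    (10, 0): .03, (10, 5): .14, (10, 10): .23,
}

E_X, E_Y, cov = 6.5, 6.25, 3.125
var_X = sum((x - E_X) ** 2 * p for (x, y), p in joint.items())   # 10.25
var_Y = sum((y - E_Y) ** 2 * p for (x, y), p in joint.items())   # 14.6875
rho = cov / (math.sqrt(var_X) * math.sqrt(var_Y))                # ≈ 0.255
```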
Sampling
▪ In chapter 1, looked at numerical/graphical summaries of samples (X1, X2, …, Xn)
from some population
▪ Can view each of the Xi’s as random variables
▪ Will be concerned with random samples:
- The Xi's are independent
- The Xi's have the same probability distribution
▪ Often called "iid sampling" (iid = independent and identically distributed)
▪ Recall the following definitions:
- A parameter is a numerical feature of a distribution or population
- A statistic is a function of data from a sample (e.g., sample mean, sample
median…)
▪ We use statistics to estimate parameters
- e.g. use sample mean (statistic) to estimate population mean (parameter)
▪ Question:
- Suppose we draw a random sample from some population and compute the
value of a statistic
- Draw another random sample of the same size and compute the value of the
statistic again.
- Would we expect the 2 values of the statistic to be equal?
▪ Another question:
- We use statistics to estimate parameters. Will the statistics be exactly equal to
the parameter?
▪ Defn: The probability distribution of a statistic is called the sampling distribution
of the statistic
▪ Example:
- A large (infinite) population is described by the probability distribution

  x       |  0    3   12
  P(X=x)  | .2   .3   .5
- If a random sample of size 2 is taken, what is the sampling distribution for the
sample mean?
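The sampling distribution can be found by enumerating all ordered pairs of draws; since the sample is iid, each pair (x1, x2) has probability P(X=x1)P(X=x2). A sketch:

```python
from itertools import product

# Sketch: sampling distribution of the sample mean for n = 2 iid draws from
# the distribution P(X=0) = .2, P(X=3) = .3, P(X=12) = .5.
pmf = {0: .2, 3: .3, 12: .5}

dist = {}
for x1, x2 in product(pmf, repeat=2):
    xbar = (x1 + x2) / 2
    # Independence: P(X1 = x1, X2 = x2) = P(X1 = x1) * P(X2 = x2)
    dist[xbar] = dist.get(xbar, 0.0) + pmf[x1] * pmf[x2]

# dist maps each possible value of the sample mean to its probability:
# {0.0: .04, 1.5: .12, 3.0: .09, 6.0: .20, 7.5: .30, 12.0: .25}
```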