 


Stat 401B "Laws of Expectation and Variance"

Consider a random variable

$$U = g(X, Y, \ldots, Z)$$

that is a function of the random variables $X, Y, \ldots, Z$. Beginning from a joint distribution for $X, Y, \ldots, Z$, simulation or appropriate mathematics can sometimes be used to determine the distribution of $U$. From this distribution, one can typically compute the mean and variance, E U and Var U.
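The simulation route can be sketched in a few lines of Python. Everything specific below is a stand-in for illustration, since the handout leaves $g$ and the joint distribution unspecified: a hypothetical $g(x, y, z) = xy + z$ with $X$ and $Y$ uniform on $(0, 1)$ and $Z$ standard normal.

```python
import random

random.seed(0)

# Hypothetical example (not from the handout): U = g(X, Y, Z) = X*Y + Z,
# with X, Y ~ Uniform(0, 1) independent and Z ~ Normal(0, 1).
def g(x, y, z):
    return x * y + z

# Simulate many realizations of U and summarize its distribution.
n = 100_000
u = [g(random.random(), random.random(), random.gauss(0, 1)) for _ in range(n)]

mean_u = sum(u) / n                                   # estimate of E U
var_u = sum((ui - mean_u) ** 2 for ui in u) / (n - 1)  # estimate of Var U

# For this g, exact values are E U = (1/2)(1/2) + 0 = 0.25 and
# Var U = Var(XY) + Var(Z) = 7/144 + 1, so the estimates can be checked.
```

The histogram of the simulated `u` values would likewise approximate the full distribution of $U$, not just its first two moments.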

When it is not so easy to find the distribution of $U$, we may also use the facts that for jointly discrete cases

$$E\,g(X, Y, \ldots, Z) = \sum_x \sum_y \cdots \sum_z g(x, y, \ldots, z)\, f(x, y, \ldots, z)$$

and for jointly continuous cases

$$E\,g(X, Y, \ldots, Z) = \int \int \cdots \int g(x, y, \ldots, z)\, f(x, y, \ldots, z)\, dx\, dy \cdots dz$$

There are also some special relationships that are helpful in manipulating means and variances of functions of random variables $U = g(X, Y, \ldots, Z)$. Some of these involve so-called "covariances" and "correlations" (that are measures of linear association). These are

$$\operatorname{Cov}(X, Y) = E\big[(X - E X)(Y - E Y)\big] \qquad (\text{Definition 1})$$

and

$$\operatorname{Corr}(X, Y) = \frac{\operatorname{Cov}(X, Y)}{\sqrt{\operatorname{Var} X \cdot \operatorname{Var} Y}} \qquad (\text{Definition 2})$$
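Both definitions are easy to evaluate from simulated draws. The joint distribution below is a made-up example (Y constructed as a noisy linear function of X) chosen so that the linear association, and hence the correlation, is strong.

```python
import random

random.seed(1)

# Hypothetical joint distribution: X ~ Uniform(0, 1) and Y = 2X + small noise.
n = 50_000
x = [random.random() for _ in range(n)]
y = [2 * xi + random.gauss(0, 0.1) for xi in x]

mx = sum(x) / n
my = sum(y) / n

# Definition 1: Cov(X, Y) = E[(X - EX)(Y - EY)]
cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / n

# Definition 2: Corr(X, Y) = Cov(X, Y) / sqrt(Var X * Var Y)
var_x = sum((xi - mx) ** 2 for xi in x) / n
var_y = sum((yi - my) ** 2 for yi in y) / n
corr = cov / (var_x * var_y) ** 0.5
```

Here the exact covariance is $\operatorname{Cov}(X, 2X + \varepsilon) = 2\operatorname{Var}X = 1/6$, and the correlation is close to (but not exactly) 1 because of the added noise.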

As it turns out, $\operatorname{Corr}(X, Y) = 1$ and $\operatorname{Corr}(X, Y) = -1$ occur exactly when the joint distribution of $X$ and $Y$ makes the two variables perfectly linearly related.

Two "Laws of Expectation" concern linear combinations of random variables and products of functions of independent random variables. They are the fact that for constants $a_0, a_1, a_2, \ldots, a_n$,

$$E(a_0 + a_1 X + a_2 Y + \cdots + a_n Z) = a_0 + a_1 E X + a_2 E Y + \cdots + a_n E Z \qquad (\text{Fact 1})$$

and the fact that for arbitrary functions $g_1, g_2, \ldots, g_n$ and independent $X, Y, \ldots, Z$,

$$E\big[g_1(X)\, g_2(Y) \cdots g_n(Z)\big] = E\,g_1(X) \cdot E\,g_2(Y) \cdots E\,g_n(Z) \qquad (\text{Fact 2})$$

(In general, the mean of a linear combination is the linear combination of the means, and for independent variables, the mean of a product is the product of the means.)
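Both facts can be checked numerically. The constants, functions, and distributions below are hypothetical choices made only for the check: independent $X, Y \sim \mathrm{Uniform}(0,1)$, constants $a_0 = 3$, $a_1 = 2$, $a_2 = -1$, and functions $g_1(x) = x^2$, $g_2(y) = e^y$.

```python
import math
import random

random.seed(2)

n = 200_000
x = [random.random() for _ in range(n)]  # independent draws
y = [random.random() for _ in range(n)]

# Fact 1: E(a0 + a1 X + a2 Y) = a0 + a1 E X + a2 E Y.
# The two sides agree up to floating-point rounding for sample averages.
lhs1 = sum(3 + 2 * xi - yi for xi, yi in zip(x, y)) / n
rhs1 = 3 + 2 * (sum(x) / n) - sum(y) / n

# Fact 2: E[g1(X) g2(Y)] = E g1(X) * E g2(Y) for independent X and Y.
# Here both sides estimate E(X^2) * E(e^Y) = (1/3)(e - 1).
lhs2 = sum(xi**2 * math.exp(yi) for xi, yi in zip(x, y)) / n
rhs2 = (sum(xi**2 for xi in x) / n) * (sum(math.exp(yi) for yi in y) / n)
```

Note that the Fact 1 check holds essentially exactly (it is an algebraic identity for averages), while the Fact 2 check agrees only up to simulation error, since independence of the samples is what makes the product of averages match the average of products.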

Fact 1 and some algebra lead to what is often a useful formula for a variance:

$$\operatorname{Var} X = E X^2 - (E X)^2 \qquad (\text{Fact 3})$$

(A variance is an expected square minus the square of a mean.) Fact 2 and Definition 1 show that independence of two variables produces 0 covariance and therefore (by Definition 2) correlation 0.

Algebra, the definitions of variance and covariance, and Fact 1 can be used to show that the variance of a linear combination of variables involves the coefficients, variances, and covariances, as

$$\operatorname{Var}(a_0 + a_1 X + a_2 Y + \cdots + a_n Z) = a_1^2 \operatorname{Var} X + a_2^2 \operatorname{Var} Y + \cdots + a_n^2 \operatorname{Var} Z + 2 a_1 a_2 \operatorname{Cov}(X, Y) + 2 a_1 a_n \operatorname{Cov}(X, Z) + \cdots + 2 a_2 a_n \operatorname{Cov}(Y, Z) + \cdots \qquad (\text{Fact 4})$$

(In the simplest case this is

$$\operatorname{Var}(a_0 + a_1 X + a_2 Y) = a_1^2 \operatorname{Var} X + a_2^2 \operatorname{Var} Y + 2 a_1 a_2 \operatorname{Cov}(X, Y).)$$
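The two-variable case of Fact 4 can be verified by simulation with a deliberately correlated pair. The pair below is a hypothetical construction (Y built from X plus independent noise, so $\operatorname{Cov}(X, Y) = 1$), with made-up constants $a_0 = 5$, $a_1 = 2$, $a_2 = 3$.

```python
import random

random.seed(4)

# Correlated pair: X ~ N(0, 1), Y = X + independent N(0, 1),
# so Var X = 1, Var Y = 2, Cov(X, Y) = 1.
n = 200_000
x = [random.gauss(0, 1) for _ in range(n)]
y = [xi + random.gauss(0, 1) for xi in x]

def mean(v):
    return sum(v) / len(v)

mx, my = mean(x), mean(y)
var_x = mean([(xi - mx) ** 2 for xi in x])
var_y = mean([(yi - my) ** 2 for yi in y])
cov_xy = mean([(xi - mx) * (yi - my) for xi, yi in zip(x, y)])

# Direct variance of U = 5 + 2X + 3Y ...
u = [5 + 2 * xi + 3 * yi for xi, yi in zip(x, y)]
mu = mean(u)
var_u_direct = mean([(ui - mu) ** 2 for ui in u])

# ... versus the Fact 4 combination of variances and the covariance.
var_u_fact4 = 2**2 * var_x + 3**2 * var_y + 2 * 2 * 3 * cov_xy
```

For these choices the exact value is $4(1) + 9(2) + 12(1) = 34$; the direct and Fact 4 computations agree essentially exactly because the identity holds for sample moments as well.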

Then for independent $X, Y, \ldots, Z$, Fact 4 reduces to

$$\operatorname{Var}(a_0 + a_1 X + a_2 Y + \cdots + a_n Z) = a_1^2 \operatorname{Var} X + a_2^2 \operatorname{Var} Y + \cdots + a_n^2 \operatorname{Var} Z \qquad (\text{Fact 5})$$

So-called "propagation of error" approximations to means and variances are derived by applying the formulas for linear combinations of independent variables to first-order multivariate Taylor approximations to a nonlinear $g(x, y, \ldots, z)$. For independent $X, Y, \ldots, Z$ this produces

$$E\,g(X, Y, \ldots, Z) \approx g(E X, E Y, \ldots, E Z) \qquad (\text{Approx 1})$$

and

$$\operatorname{Var}\,g(X, Y, \ldots, Z) \approx \left(\frac{\partial g}{\partial x}\right)^2 \operatorname{Var} X + \left(\frac{\partial g}{\partial y}\right)^2 \operatorname{Var} Y + \cdots + \left(\frac{\partial g}{\partial z}\right)^2 \operatorname{Var} Z \qquad (\text{Approx 2})$$

(where the partial derivatives are evaluated at the vector of means).
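A worked propagation-of-error example, with every specific chosen hypothetically for illustration: $g(x, y) = x/y$ with independent $X \sim N(10, 0.5^2)$ and $Y \sim N(5, 0.2^2)$. The spreads are small relative to the means, so the first-order Taylor linearization should be accurate, and a Monte Carlo run serves as a check.

```python
import random

random.seed(5)

mu_x, sd_x = 10.0, 0.5   # hypothetical E X and sd of X
mu_y, sd_y = 5.0, 0.2    # hypothetical E Y and sd of Y

# Approx 1: E g(X, Y) ≈ g(E X, E Y)
approx_mean = mu_x / mu_y

# Approx 2: Var g(X, Y) ≈ (∂g/∂x)^2 Var X + (∂g/∂y)^2 Var Y,
# with partials of g(x, y) = x/y evaluated at the means:
#   ∂g/∂x = 1/y  and  ∂g/∂y = -x/y^2.
dg_dx = 1 / mu_y
dg_dy = -mu_x / mu_y**2
approx_var = dg_dx**2 * sd_x**2 + dg_dy**2 * sd_y**2

# Monte Carlo check of both approximations
n = 200_000
g = [random.gauss(mu_x, sd_x) / random.gauss(mu_y, sd_y) for _ in range(n)]
mc_mean = sum(g) / n
mc_var = sum((gi - mc_mean) ** 2 for gi in g) / (n - 1)
```

Here `approx_mean` is 2 and `approx_var` is $(0.2)^2(0.25) + (-0.4)^2(0.04) = 0.0164$; the simulated mean and variance land very close to these, with the small remaining gap reflecting the higher-order terms the linearization drops.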
