3. Multivariate Normal Distribution

In this chapter, the following topics will be discussed:
- Definition
- Moment generating function and independence of normal variables
- Quadratic forms in normal variables

3.1 Definition

Intuition: Let $Y \sim N(\mu, \sigma^2)$. Then, the density function is

$$
f(y) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left[-\frac{(y-\mu)^2}{2\sigma^2}\right]
     = \frac{1}{(2\pi)^{1/2}} \frac{1}{[\mathrm{Var}(Y)]^{1/2}} \exp\!\left[-\frac{1}{2}(y-\mu)[\mathrm{Var}(Y)]^{-1}(y-\mu)\right].
$$

Definition (Multivariate Normal Random Variable): A random vector

$$
Y = \begin{bmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{bmatrix} \sim N(\mu, \Sigma)
$$

with $E(Y) = \mu$ and $V(Y) = \Sigma$ has the density function

$$
f(y) = f(y_1, y_2, \ldots, y_n)
     = \frac{1}{(2\pi)^{n/2}} \frac{1}{[\det(\Sigma)]^{1/2}} \exp\!\left[-\frac{1}{2}(y-\mu)^t \Sigma^{-1} (y-\mu)\right].
$$

Theorem:

$$
Q = (Y-\mu)^t \Sigma^{-1} (Y-\mu) \sim \chi^2_n .
$$

[proof:]

Since $\Sigma$ is positive definite, $\Sigma = T \Lambda T^t$, where $T$ is a real orthogonal matrix ($TT^t = T^tT = I$) and

$$
\Lambda = \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix}, \qquad \lambda_i > 0 .
$$

Then

$$
Q = (Y-\mu)^t \Sigma^{-1} (Y-\mu) = (Y-\mu)^t T \Lambda^{-1} T^t (Y-\mu) = X^t \Lambda^{-1} X,
$$

where $X = T^t(Y-\mu)$. Further,

$$
Q = X^t \Lambda^{-1} X
  = \begin{bmatrix} X_1 & X_2 & \cdots & X_n \end{bmatrix}
    \begin{bmatrix} \frac{1}{\lambda_1} & 0 & \cdots & 0 \\ 0 & \frac{1}{\lambda_2} & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & \frac{1}{\lambda_n} \end{bmatrix}
    \begin{bmatrix} X_1 \\ X_2 \\ \vdots \\ X_n \end{bmatrix}
  = \sum_{i=1}^n \frac{X_i^2}{\lambda_i}.
$$

Therefore, if we can prove that $X_i \sim N(0, \lambda_i)$ and that the $X_i$ are mutually independent, then $X_i/\sqrt{\lambda_i} \sim N(0,1)$ and

$$
Q = \sum_{i=1}^n \left(\frac{X_i}{\sqrt{\lambda_i}}\right)^2 \sim \chi^2_n .
$$

The joint density function of $X_1, X_2, \ldots, X_n$ is $g(x) = g(x_1, x_2, \ldots, x_n) = f(y)\,|J|$, where

$$
J = \det\!\left(\frac{\partial y_i}{\partial x_j}\right)
  = \det \begin{bmatrix}
      \frac{\partial y_1}{\partial x_1} & \frac{\partial y_1}{\partial x_2} & \cdots & \frac{\partial y_1}{\partial x_n} \\
      \frac{\partial y_2}{\partial x_1} & \frac{\partial y_2}{\partial x_2} & \cdots & \frac{\partial y_2}{\partial x_n} \\
      \vdots & & \ddots & \vdots \\
      \frac{\partial y_n}{\partial x_1} & \frac{\partial y_n}{\partial x_2} & \cdots & \frac{\partial y_n}{\partial x_n}
    \end{bmatrix}
  = \det(T),
$$

since $X = T^t(Y-\mu)$ is equivalent to $Y = TX + \mu$. Moreover,

$$
\det(TT^t) = \det(I) = 1 \;\Longrightarrow\; \det(T)\det(T^t) = [\det(T)]^2 = 1 \;\Longrightarrow\; |\det(T)| = 1 .
$$

Therefore, the density function of $X_1, X_2, \ldots, X_n$ is

$$
\begin{aligned}
g(x) = f(y)
 &= \frac{1}{(2\pi)^{n/2}} \frac{1}{[\det(\Sigma)]^{1/2}} \exp\!\left[-\frac{1}{2}(y-\mu)^t \Sigma^{-1}(y-\mu)\right] \\
 &= \frac{1}{(2\pi)^{n/2}} \frac{1}{[\det(\Sigma)]^{1/2}} \exp\!\left[-\frac{1}{2} x^t \Lambda^{-1} x\right] \\
 &= \frac{1}{(2\pi)^{n/2}} \frac{1}{[\det(\Sigma)]^{1/2}} \exp\!\left[-\frac{1}{2} \sum_{i=1}^n \frac{x_i^2}{\lambda_i}\right] \\
 &= \prod_{i=1}^n \frac{1}{\sqrt{2\pi\lambda_i}} \exp\!\left[-\frac{x_i^2}{2\lambda_i}\right],
\end{aligned}
$$

since $\det(\Sigma) = \det(T\Lambda T^t) = \det(\Lambda)\det(TT^t) = \det(\Lambda)\det(I) = \prod_{i=1}^n \lambda_i$.

Therefore, $X_i \sim N(0, \lambda_i)$ and the $X_i$ are mutually independent. ◆

3.2 Moment generating function and independence of normal random variables

Moment Generating Function of Multivariate Normal Random Variable: Let

$$
Y = \begin{bmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{bmatrix} \sim N(\mu, \Sigma), \qquad
t = \begin{bmatrix} t_1 \\ t_2 \\ \vdots \\ t_n \end{bmatrix}.
$$

Then, the moment generating function of $Y$ is

$$
M_Y(t) = M_Y(t_1, t_2, \ldots, t_n) = E\!\left[\exp(t^t Y)\right] = E\!\left[\exp(t_1 Y_1 + t_2 Y_2 + \cdots + t_n Y_n)\right]
       = \exp\!\left(t^t \mu + \tfrac{1}{2}\, t^t \Sigma t\right).
$$

Theorem: If $Y \sim N(\mu, \Sigma)$ and $C$ is a $p \times n$ matrix of rank $p$, then $CY \sim N(C\mu, C\Sigma C^t)$.

[proof:]

Let $X = CY$. Then, for $s \in \mathbb{R}^p$,

$$
M_X(s) = E\!\left[\exp(s^t X)\right] = E\!\left[\exp(s^t C Y)\right] = E\!\left[\exp\!\left((C^t s)^t Y\right)\right] = M_Y(C^t s)
       = \exp\!\left((C^t s)^t \mu + \tfrac{1}{2} (C^t s)^t \Sigma (C^t s)\right)
       = \exp\!\left(s^t C\mu + \tfrac{1}{2}\, s^t C\Sigma C^t s\right).
$$

Since $M_X(s)$ is the moment generating function of $N(C\mu, C\Sigma C^t)$, $CY \sim N(C\mu, C\Sigma C^t)$. ◆

Corollary: If $Y \sim N(\mu, \sigma^2 I)$, then $TY \sim N(T\mu, \sigma^2 I)$, where $T$ is an orthogonal matrix.

Theorem: If $Y \sim N(\mu, \Sigma)$, then the marginal distribution of any subset of the elements of $Y$ is also multivariate normal. That is, if

$$
Y = \begin{bmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{bmatrix} \sim N(\mu, \Sigma),
\qquad \text{then} \qquad
\begin{bmatrix} Y_{i_1} \\ Y_{i_2} \\ \vdots \\ Y_{i_m} \end{bmatrix} \sim N(\mu^*, \Sigma^*),
$$

where $m \le n$, $\{i_1, i_2, \ldots, i_m\} \subseteq \{1, 2, \ldots, n\}$,

$$
\mu^* = \begin{bmatrix} \mu_{i_1} \\ \mu_{i_2} \\ \vdots \\ \mu_{i_m} \end{bmatrix}, \qquad
\Sigma^* = \begin{bmatrix}
  \sigma^2_{i_1 i_1} & \sigma^2_{i_1 i_2} & \cdots & \sigma^2_{i_1 i_m} \\
  \sigma^2_{i_2 i_1} & \sigma^2_{i_2 i_2} & \cdots & \sigma^2_{i_2 i_m} \\
  \vdots & & \ddots & \vdots \\
  \sigma^2_{i_m i_1} & \sigma^2_{i_m i_2} & \cdots & \sigma^2_{i_m i_m}
\end{bmatrix},
$$

and $\sigma^2_{jk}$ denotes the $(j,k)$ element of $\Sigma$.

Theorem: $Y$ has a multivariate normal distribution if and only if $a^t Y$ is univariate normal for all real vectors $a$.

[proof:]

($\Leftarrow$) Suppose $E(Y) = \mu$, $V(Y) = \Sigma$, and $a^t Y$ is univariate normal for every $a$. Since

$$
E(a^t Y) = a^t E(Y) = a^t \mu, \qquad V(a^t Y) = a^t V(Y)\, a = a^t \Sigma a,
$$

we have $a^t Y \sim N(a^t \mu,\, a^t \Sigma a)$. For a univariate normal $Z \sim N(\theta, \tau^2)$, $M_Z(1) = E[\exp(Z)] = \exp(\theta + \tau^2/2)$. Therefore,

$$
E\!\left[\exp(a^t Y)\right] = M_{a^t Y}(1) = \exp\!\left(a^t \mu + \tfrac{1}{2}\, a^t \Sigma a\right) = M_Y(a).
$$

Since $M_Y(a) = \exp\!\left(a^t \mu + \tfrac{1}{2}\, a^t \Sigma a\right)$ is the moment generating function of $N(\mu, \Sigma)$, $Y$ has a multivariate normal distribution $N(\mu, \Sigma)$.

($\Rightarrow$) By the previous theorem, taking $C = a^t$, a $1 \times n$ matrix of rank 1. ◆
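To make the results of Sections 3.1 and 3.2 concrete, the following is a minimal Monte Carlo sketch (not part of the original notes) that checks two of them numerically: that $Q = (Y-\mu)^t\Sigma^{-1}(Y-\mu) \sim \chi^2_n$ and that $CY \sim N(C\mu, C\Sigma C^t)$. It assumes NumPy and SciPy are available; the particular choices of $\mu$, $\Sigma$, and $C$ are illustrative only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative parameters (hypothetical choices, not from the notes):
# any mean vector mu and positive definite covariance Sigma will do.
n = 3
mu = np.array([1.0, -2.0, 0.5])
A = rng.standard_normal((n, n))
Sigma = A @ A.T + n * np.eye(n)              # positive definite

# Draw a large sample Y ~ N(mu, Sigma); each row is one realization.
N = 200_000
Y = rng.multivariate_normal(mu, Sigma, size=N)

# Check: Q = (Y - mu)^t Sigma^{-1} (Y - mu) should follow chi^2_n.
D = Y - mu
Q = np.einsum("ij,jk,ik->i", D, np.linalg.inv(Sigma), D)
print("mean of Q (theory: n = %d):" % n, Q.mean())
print("KS distance from chi^2_n:", stats.kstest(Q, stats.chi2(n).cdf).statistic)

# Check: for a p x n matrix C of full row rank, CY ~ N(C mu, C Sigma C^t).
C = np.array([[1.0, 0.0, 1.0],
              [0.0, 2.0, -1.0]])             # p = 2, rank 2
X = Y @ C.T                                  # rows are samples of CY
print("empirical mean of CY:", X.mean(axis=0), "  theory:", C @ mu)
print("empirical cov of CY:\n", np.cov(X, rowvar=False))
print("theoretical C Sigma C^t:\n", C @ Sigma @ C.T)
```

If the two results hold, the Kolmogorov–Smirnov distance should be near zero (it shrinks roughly like $1/\sqrt{N}$), and the empirical mean and covariance of $CY$ should be close to $C\mu$ and $C\Sigma C^t$ for a sample of this size.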
3.3 Quadratic forms in normal variables

Theorem: If $Y \sim N(0, \sigma^2 I)$ and $P$ is an $n \times n$ symmetric matrix of rank $r$, then

$$
Q = \frac{Y^t P Y}{\sigma^2}
$$

is distributed as $\chi^2_r$ if and only if $P^2 = P$ (i.e., $P$ is idempotent).

[proof:]

($\Leftarrow$) Suppose $P^2 = P$ and $\mathrm{rank}(P) = r$. Then $P$ has $r$ eigenvalues equal to 1 and $n - r$ eigenvalues equal to 0. Thus, without loss of generality,

$$
P = T \Lambda T^t = T \begin{bmatrix} I_r & 0 \\ 0 & 0 \end{bmatrix} T^t,
$$

where $T$ is an orthogonal matrix. Then, with $Z = T^t Y$,

$$
Q = \frac{Y^t P Y}{\sigma^2} = \frac{Y^t T \Lambda T^t Y}{\sigma^2} = \frac{Z^t \Lambda Z}{\sigma^2}
  = \frac{Z_1^2 + Z_2^2 + \cdots + Z_r^2}{\sigma^2}.
$$

Since $Y \sim N(0, \sigma^2 I)$, $Z = T^t Y \sim N(T^t 0,\, T^t T \sigma^2) = N(0, \sigma^2 I)$. Hence $Z_1, Z_2, \ldots, Z_n$ are i.i.d. normal random variables with mean 0 and common variance $\sigma^2$. Therefore,

$$
Q = \frac{Z_1^2 + Z_2^2 + \cdots + Z_r^2}{\sigma^2}
  = \left(\frac{Z_1}{\sigma}\right)^2 + \left(\frac{Z_2}{\sigma}\right)^2 + \cdots + \left(\frac{Z_r}{\sigma}\right)^2
  \sim \chi^2_r .
$$

($\Rightarrow$) Since $P$ is symmetric with rank $r$, $P = T \Lambda T^t$, where $T$ is an orthogonal matrix and $\Lambda$ is a diagonal matrix whose nonzero elements are the eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_r$. Let $Z = T^t Y$. Since $Y \sim N(0, \sigma^2 I)$,

$$
Z = T^t Y \sim N(T^t 0,\, T^t T \sigma^2) = N(0, \sigma^2 I).
$$

That is, $Z_1, Z_2, \ldots, Z_n$ are independent normal random variables with mean 0 and variance $\sigma^2$. Then,

$$
Q = \frac{Y^t P Y}{\sigma^2} = \frac{Y^t T \Lambda T^t Y}{\sigma^2} = \frac{Z^t \Lambda Z}{\sigma^2}
  = \sum_{i=1}^r \frac{\lambda_i Z_i^2}{\sigma^2}.
$$

The moment generating function of $Q = \sum_{i=1}^r \lambda_i Z_i^2 / \sigma^2$ is

$$
\begin{aligned}
E\!\left[\exp\!\left(t \sum_{i=1}^r \frac{\lambda_i Z_i^2}{\sigma^2}\right)\right]
 &= \prod_{i=1}^r E\!\left[\exp\!\left(\frac{t \lambda_i Z_i^2}{\sigma^2}\right)\right] \\
 &= \prod_{i=1}^r \int_{-\infty}^{\infty} \exp\!\left(\frac{t \lambda_i z_i^2}{\sigma^2}\right) \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{z_i^2}{2\sigma^2}\right) dz_i \\
 &= \prod_{i=1}^r \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{z_i^2 (1 - 2\lambda_i t)}{2\sigma^2}\right) dz_i \\
 &= \prod_{i=1}^r \frac{1}{\sqrt{1 - 2\lambda_i t}} \int_{-\infty}^{\infty} \frac{\sqrt{1 - 2\lambda_i t}}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{z_i^2 (1 - 2\lambda_i t)}{2\sigma^2}\right) dz_i \\
 &= \prod_{i=1}^r \frac{1}{\sqrt{1 - 2\lambda_i t}}
  = \left[\prod_{i=1}^r (1 - 2\lambda_i t)\right]^{-1/2},
\end{aligned}
$$

since each remaining integrand is the density of a $N\!\left(0,\, \sigma^2/(1 - 2\lambda_i t)\right)$ random variable.

Also, since $Q$ is distributed as $\chi^2_r$, its moment generating function is $(1 - 2t)^{-r/2}$. Thus, for every $t$ in a neighborhood of 0,

$$
E\!\left[\exp(tQ)\right] = (1 - 2t)^{-r/2} = \left[\prod_{i=1}^r (1 - 2\lambda_i t)\right]^{-1/2},
$$

and hence

$$
(1 - 2t)^r = \prod_{i=1}^r (1 - 2\lambda_i t).
$$

By the uniqueness of polynomial roots, we must have $\lambda_i = 1$ for $i = 1, 2, \ldots, r$. Then $P^2 = P$ by the following result: a symmetric matrix $P$ is idempotent of rank $r$ if and only if it has $r$ eigenvalues equal to 1 and $n - r$ eigenvalues equal to 0. ◆

Important Result: Let $Y \sim N(0, I)$ and let $Q_1 = Y^t P_1 Y$ and $Q_2 = Y^t P_2 Y$ both be distributed as chi-square. Then $Q_1$ and $Q_2$ are independent if and only if $P_1 P_2 = 0$.

Useful Lemma: If $P_1^2 = P_1$, $P_2^2 = P_2$, and $P_1 - P_2$ is positive semidefinite, then $P_1 P_2 = P_2 P_1 = P_2$ and $P_1 - P_2$ is idempotent.

Theorem: If $Y \sim N(0, \sigma^2 I)$ and

$$
Q_1 = \frac{Y^t P_1 Y}{\sigma^2}, \qquad Q_2 = \frac{Y^t P_2 Y}{\sigma^2},
$$

with $Q_1 \sim \chi^2_{r_1}$, $Q_2 \sim \chi^2_{r_2}$, and $Q_1 - Q_2 \ge 0$, then $Q_1 - Q_2$ and $Q_2$ are independent and $Q_1 - Q_2 \sim \chi^2_{r_1 - r_2}$.

[proof:]

We first prove $Q_1 - Q_2 \sim \chi^2_{r_1 - r_2}$. Since

$$
Q_1 - Q_2 = \frac{Y^t (P_1 - P_2) Y}{\sigma^2} \ge 0
$$

and $Y \sim N(0, \sigma^2 I)$ can take any value in $\mathbb{R}^n$, we have $y^t (P_1 - P_2)\, y \ge 0$ for every $y \in \mathbb{R}^n$. Therefore, $P_1 - P_2$ is positive semidefinite. By the above useful lemma, $P_1 - P_2$ is idempotent. Further, by the previous theorem,

$$
Q_1 - Q_2 = \frac{Y^t (P_1 - P_2) Y}{\sigma^2} \sim \chi^2_{r_1 - r_2},
$$

since (note that $P_1$ and $P_2$ are themselves idempotent by the previous theorem, so $\mathrm{tr}(P_i) = \mathrm{rank}(P_i) = r_i$)

$$
\mathrm{rank}(P_1 - P_2) = \mathrm{tr}(P_1 - P_2) = \mathrm{tr}(P_1) - \mathrm{tr}(P_2) = \mathrm{rank}(P_1) - \mathrm{rank}(P_2) = r_1 - r_2 .
$$

We now prove that $Q_1 - Q_2$ and $Q_2$ are independent. Since $P_1 P_2 = P_2 P_1 = P_2$ by the useful lemma,

$$
(P_1 - P_2) P_2 = P_1 P_2 - P_2 P_2 = P_2 - P_2 = 0 .
$$

By the previous important result, the proof is complete. ◆
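As a numerical companion to this section (not from the original notes), the sketch below builds two symmetric idempotent matrices $P_1$ and $P_2$ as projections onto nested column spaces of hypothetical matrices X1 and X2, so that $Q_1 - Q_2 \ge 0$ holds as in the last theorem. Assuming NumPy and SciPy, it then checks by simulation that $Q_1 \sim \chi^2_{r_1}$, $Q_2 \sim \chi^2_{r_2}$, $Q_1 - Q_2 \sim \chi^2_{r_1 - r_2}$, and that $Q_1 - Q_2$ and $Q_2$ appear uncorrelated (consistent with, though not a proof of, independence).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical setup: P1 and P2 project onto nested column spaces, so both
# are symmetric idempotent and P1 - P2 is positive semidefinite.
n, r1, r2 = 10, 5, 2
sigma = 1.5
X1 = rng.standard_normal((n, r1))        # rank r1 (with probability 1)
X2 = X1[:, :r2]                          # nested subspace of rank r2
P1 = X1 @ np.linalg.solve(X1.T @ X1, X1.T)
P2 = X2 @ np.linalg.solve(X2.T @ X2, X2.T)

# Sanity checks: idempotency, and P1 P2 = P2 (hence (P1 - P2) P2 = 0).
assert np.allclose(P1 @ P1, P1) and np.allclose(P2 @ P2, P2)
assert np.allclose(P1 @ P2, P2)

N = 200_000
Y = sigma * rng.standard_normal((N, n))  # rows are draws from N(0, sigma^2 I)
Q1 = np.einsum("ij,jk,ik->i", Y, P1, Y) / sigma**2
Q2 = np.einsum("ij,jk,ik->i", Y, P2, Y) / sigma**2

# Q1 ~ chi^2_{r1}, Q2 ~ chi^2_{r2}, Q1 - Q2 ~ chi^2_{r1 - r2}.
print("KS vs chi^2_{r1}    :", stats.kstest(Q1, stats.chi2(r1).cdf).statistic)
print("KS vs chi^2_{r2}    :", stats.kstest(Q2, stats.chi2(r2).cdf).statistic)
print("KS vs chi^2_{r1-r2} :", stats.kstest(Q1 - Q2, stats.chi2(r1 - r2).cdf).statistic)

# Independence of Q1 - Q2 and Q2: their sample correlation should be near 0.
print("corr(Q1 - Q2, Q2)   :", np.corrcoef(Q1 - Q2, Q2)[0, 1])
```

Nested projections are used here precisely because they satisfy the hypotheses of the useful lemma: $P_1 P_2 = P_2$, so $P_1 - P_2$ is idempotent and $(P_1 - P_2) P_2 = 0$, which is what drives the chi-square and independence results being checked.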