Stat 511 Points of Clarification / January 29, 2003

Here are several points I have slurred over and need to clear up.

1. I still haven't gotten the argument for $\operatorname{Var}\bigl(c'\hat{\beta}_{\mathrm{OLS}}\bigr)$ right. To begin with, the e-mail correction I sent was missing a $\sigma^2$ multiplier, and I made the unnecessary assumption that the generalized inverse in use is symmetric. There is no reason to assume that $(X'X)^{-}$ is even symmetric, let alone a Moore-Penrose generalized inverse. (This whole thing can't possibly depend upon which generalized inverse one uses.) Here is the right argument (fashioned still after Norma's suggestion, but without the symmetry assumption), which depends on the uniqueness of the projection matrix. Since $c'\beta$ is estimable, write $c' = a'X$ for some vector $a$. Then
\[
\begin{aligned}
\operatorname{Var}\bigl(c'\hat{\beta}_{\mathrm{OLS}}\bigr)
  &= c'(X'X)^{-}X'\,(\sigma^2 I)\,X\bigl((X'X)^{-}\bigr)'c \\
  &= \sigma^2\, c'(X'X)^{-}X'X\bigl((X'X)^{-}\bigr)'c \\
  &= \sigma^2\, a'X(X'X)^{-}X'X\bigl((X'X)^{-}\bigr)'c \\
  &= \sigma^2\, a'P_X X\bigl((X'X)^{-}\bigr)'c \\
  &= \sigma^2\, a'X\bigl((X'X)^{-}\bigr)'X'a ,
\end{aligned}
\]
where the last two steps use $X(X'X)^{-}X' = P_X$, $P_X X = X$, and $c = X'a$. Now look at Koehler's notes, panel 169. His Result 3.3 part (ii) says that the transpose of a generalized inverse of $X'X$ is itself a generalized inverse of $X'X$. So, by the uniqueness of the projection matrix,
\[
X\bigl((X'X)^{-}\bigr)'X' = P_X = X(X'X)^{-}X' ,
\]
and then
\[
\operatorname{Var}\bigl(c'\hat{\beta}_{\mathrm{OLS}}\bigr)
  = \sigma^2\, a'X(X'X)^{-}X'a
  = \sigma^2\, c'(X'X)^{-}c .
\]

2. I waffled today about whether "positive definite" matrices are of necessity symmetric. In fact (consistent with Koehler's definition on panel 77) they need not be. On the other hand, any covariance matrix is symmetric, and Christensen only defines positive definiteness for symmetric matrices (see his page 404).

3. Today I wrote without explanation
\[
\operatorname{Var}\,U = V^{-\frac12}\,(\sigma^2 V)\,V^{-\frac12} = \sigma^2 I .
\]
But the question is how I know that $V^{-\frac12}\,V\,V^{-\frac12} = I$. Here is an argument. There is a symmetric positive definite matrix $N$ such that $NN = V^{-1}$, i.e. $V^{-\frac12} = N$. Then with $A = NVN$ we also have
\[
AN = NVNN = NVV^{-1} = N ,
\]
so that $AV^{-1} = ANN = NN = V^{-1}$, and (multiplying both sides by $V$) $A = I$.
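Both of these algebraic facts are easy to check numerically if you don't trust the symbols. Here is a quick sketch in Python/numpy; the design matrix X, the vector a, and the matrix V below are made up purely for illustration (not class data), and the second generalized inverse is built from the standard identity that, for any U and W, G + (I - G A)U + W(I - A G) is again a generalized inverse of A whenever G is.

```python
import numpy as np

rng = np.random.default_rng(1)

# ---- Point 1: c'(X'X)^- c is the same for different generalized inverses.
# Made-up rank-deficient design: intercept plus two group indicators, so X'X
# is singular and its generalized inverse is genuinely non-unique.
X = np.array([[1, 1, 0],
              [1, 1, 0],
              [1, 1, 0],
              [1, 0, 1],
              [1, 0, 1],
              [1, 0, 1]], dtype=float)
XtX = X.T @ X

G1 = np.linalg.pinv(XtX)            # Moore-Penrose choice (symmetric)

# A second, generally non-symmetric, generalized inverse of X'X.
U = rng.standard_normal((3, 3))
W = rng.standard_normal((3, 3))
G2 = G1 + (np.eye(3) - G1 @ XtX) @ U + W @ (np.eye(3) - XtX @ G1)
print("G2 is a generalized inverse:", np.allclose(XtX @ G2 @ XtX, XtX))
print("G2 is symmetric:", np.allclose(G2, G2.T))

a = rng.standard_normal(6)
c = X.T @ a                          # c' = a'X, so c'beta is estimable

print("projection matrices agree:", np.allclose(X @ G1 @ X.T, X @ G2 @ X.T))
print("c'G1c =", c @ G1 @ c, "  c'G2c =", c @ G2 @ c)

# ---- Point 3: the symmetric square root N of V^{-1} satisfies N V N = I.
B = rng.standard_normal((4, 4))
V = B @ B.T + np.eye(4)              # made-up symmetric positive definite V
w, Q = np.linalg.eigh(V)
N = Q @ np.diag(1.0 / np.sqrt(w)) @ Q.T   # symmetric p.d. with N N = V^{-1}
print("N N = V^{-1}:", np.allclose(N @ N, np.linalg.inv(V)))
print("N V N = I:", np.allclose(N @ V @ N, np.eye(4)))
```

The two quadratic forms c'G1c and c'G2c agree, as do the two versions of the projection matrix, which is exactly the invariance the argument in point 1 relies on; the last two lines check the fact needed in point 3.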