Ch4 Derivation of multiple regression coefficients

The three-variable model

The three-variable model relates Y to a constant and two explanatory variables X2 and X3:

$$Y_t = \beta_1 + \beta_2 X_{2t} + \beta_3 X_{3t} + u_t \quad (4.2)$$

As before, we need to minimize the residual sum of squares (RSS):

$$RSS = \sum_{t=1}^{n} \hat{u}_t^2 \quad (4.3)$$

$$\hat{u}_t = Y_t - \hat{Y}_t = Y_t - \hat{\beta}_1 - \hat{\beta}_2 X_{2t} - \hat{\beta}_3 X_{3t} \quad (4.4)$$

Substituting Equation (4.4) into Equation (4.3):

$$RSS = \sum_{t=1}^{n} \hat{u}_t^2 = \sum_{t=1}^{n} (Y_t - \hat{\beta}_1 - \hat{\beta}_2 X_{2t} - \hat{\beta}_3 X_{3t})^2 \quad (4.5)$$

Differentiating with respect to each of the three coefficients and setting the derivatives to zero gives the first-order conditions (FOCs):

$$\frac{\partial RSS}{\partial \hat{\beta}_1} = -2 \sum_{t=1}^{n} (Y_t - \hat{\beta}_1 - \hat{\beta}_2 X_{2t} - \hat{\beta}_3 X_{3t}) = 0 \quad (4.6)$$

$$\frac{\partial RSS}{\partial \hat{\beta}_2} = -2 \sum_{t=1}^{n} X_{2t} (Y_t - \hat{\beta}_1 - \hat{\beta}_2 X_{2t} - \hat{\beta}_3 X_{3t}) = 0 \quad (4.7)$$

$$\frac{\partial RSS}{\partial \hat{\beta}_3} = -2 \sum_{t=1}^{n} X_{3t} (Y_t - \hat{\beta}_1 - \hat{\beta}_2 X_{2t} - \hat{\beta}_3 X_{3t}) = 0 \quad (4.8)$$

Rearranging Equation (4.6):

$$\sum_{t=1}^{n} Y_t = \sum_{t=1}^{n} \hat{\beta}_1 + \hat{\beta}_2 \sum_{t=1}^{n} X_{2t} + \hat{\beta}_3 \sum_{t=1}^{n} X_{3t} \quad (4.9)$$

$$\sum_{t=1}^{n} Y_t = n\hat{\beta}_1 + \hat{\beta}_2 \sum_{t=1}^{n} X_{2t} + \hat{\beta}_3 \sum_{t=1}^{n} X_{3t} \quad (4.10)$$

Dividing throughout by n and defining $\bar{X}_i = \sum_{t=1}^{n} X_{it} / n$:

$$\bar{Y} = \hat{\beta}_1 + \hat{\beta}_2 \bar{X}_2 + \hat{\beta}_3 \bar{X}_3 \quad (4.11)$$

so that

$$\hat{\beta}_1 = \bar{Y} - \hat{\beta}_2 \bar{X}_2 - \hat{\beta}_3 \bar{X}_3 \quad (4.12)$$

Using Equation (4.12) and the second and third of the FOCs, after some manipulation we obtain a solution for $\hat{\beta}_2$:

$$\hat{\beta}_2 = \frac{Cov(X_2, Y)\,Var(X_3) - Cov(X_3, Y)\,Cov(X_2, X_3)}{Var(X_2)\,Var(X_3) - [Cov(X_2, X_3)]^2} \quad (4.13)$$

The solution for $\hat{\beta}_3$ is analogous to Equation (4.13), with the roles of $X_{2t}$ and $X_{3t}$ interchanged:

$$\hat{\beta}_3 = \frac{Cov(X_3, Y)\,Var(X_2) - Cov(X_2, Y)\,Cov(X_2, X_3)}{Var(X_2)\,Var(X_3) - [Cov(X_2, X_3)]^2} \quad (4.14)$$

The k-variable model

With k explanatory variables, the model is as presented initially in Equation (4.1), so we have:

$$Y_t = \beta_1 + \beta_2 X_{2t} + \beta_3 X_{3t} + \dots + \beta_k X_{kt} + u_t \quad (4.15)$$

Again we derive the fitted values as:

$$\hat{Y}_t = \hat{\beta}_1 + \hat{\beta}_2 X_{2t} + \hat{\beta}_3 X_{3t} + \dots + \hat{\beta}_k X_{kt} \quad (4.16)$$

and

$$\hat{u}_t = Y_t - \hat{Y}_t = Y_t - \hat{\beta}_1 - \hat{\beta}_2 X_{2t} - \hat{\beta}_3 X_{3t} - \dots - \hat{\beta}_k X_{kt} \quad (4.17)$$

We again want to minimize RSS, so:

$$RSS = \sum_{t=1}^{n} \hat{u}_t^2 = \sum_{t=1}^{n} (Y_t - \hat{\beta}_1 - \hat{\beta}_2 X_{2t} - \hat{\beta}_3 X_{3t} - \dots - \hat{\beta}_k X_{kt})^2 \quad (4.18)$$

Proceeding as in the three-variable case (differentiating with respect to each coefficient and dividing the first condition by n), the intercept is:

$$\hat{\beta}_1 = \bar{Y} - \hat{\beta}_2 \bar{X}_2 - \hat{\beta}_3 \bar{X}_3 - \dots - \hat{\beta}_k \bar{X}_k \quad (4.24)$$

Derivation of the coefficients with matrix algebra

Equation (4.1) can easily be written in matrix notation as:

$$Y = X\beta + u$$

where

$$Y = \begin{bmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_T \end{bmatrix}, \quad X = \begin{bmatrix} 1 & X_{21} & X_{31} & \cdots & X_{k1} \\ 1 & X_{22} & X_{32} & \cdots & X_{k2} \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & X_{2T} & X_{3T} & \cdots & X_{kT} \end{bmatrix}, \quad \beta = \begin{bmatrix} \beta_1 \\ \beta_2 \\ \vdots \\ \beta_k \end{bmatrix}, \quad u = \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_T \end{bmatrix}$$

Note that in matrix notation $RSS = \hat{u}'\hat{u}$.
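As a numerical sanity check on the covariance formulas (4.13)-(4.14) and the intercept formula (4.12), the sketch below computes the three coefficients directly from sample variances and covariances and compares them with a standard least-squares solver. The data and variable names are illustrative, not from the text:

```python
import numpy as np

# Illustrative data (any values work, provided X2 and X3 are not collinear)
rng = np.random.default_rng(0)
n = 200
X2 = rng.normal(size=n)
X3 = rng.normal(size=n)
Y = 1.5 + 2.0 * X2 - 0.7 * X3 + rng.normal(scale=0.5, size=n)

# Equations (4.13) and (4.14): slopes from variances and covariances.
# np.cov with ddof=0 scales by 1/n; any common scaling cancels in the ratio.
c = lambda a, b: np.cov(a, b, ddof=0)[0, 1]
den = c(X2, X2) * c(X3, X3) - c(X2, X3) ** 2
b2 = (c(X2, Y) * c(X3, X3) - c(X3, Y) * c(X2, X3)) / den
b3 = (c(X3, Y) * c(X2, X2) - c(X2, Y) * c(X2, X3)) / den

# Equation (4.12): intercept from the sample means
b1 = Y.mean() - b2 * X2.mean() - b3 * X3.mean()

# Compare against a general-purpose least-squares solver
X = np.column_stack([np.ones(n), X2, X3])
beta_ls, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(np.allclose([b1, b2, b3], beta_ls))  # True
```

Note that the denominator in (4.13)-(4.14) is zero exactly when X2 and X3 are perfectly collinear, which is why the solution then fails to exist.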
Thus we have:

$$\hat{u}'\hat{u} = (Y - X\hat{\beta})'(Y - X\hat{\beta}) \quad (4.26)$$

$$= (Y' - \hat{\beta}'X')(Y - X\hat{\beta}) \quad (4.27)$$

$$= Y'Y - Y'X\hat{\beta} - \hat{\beta}'X'Y + \hat{\beta}'X'X\hat{\beta} \quad (4.28)$$

$$= Y'Y - 2\hat{\beta}'X'Y + \hat{\beta}'X'X\hat{\beta} \quad (4.29)$$

where the last step uses the fact that $Y'X\hat{\beta}$ is a scalar and therefore equals its transpose $\hat{\beta}'X'Y$. Differentiating with respect to $\hat{\beta}$ and setting the result to zero gives $-2X'Y + 2X'X\hat{\beta} = 0$, so that

$$\hat{\beta} = (X'X)^{-1}X'Y \quad (4.32)$$
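The matrix solution (4.32) can be checked in a few lines; the data below are illustrative assumptions. Note that the first-order condition $X'X\hat{\beta} = X'Y$ is equivalent to the residuals being orthogonal to every column of X, which the sketch verifies:

```python
import numpy as np

# Illustrative data for Equation (4.32); names and values are assumptions
rng = np.random.default_rng(1)
T, k = 100, 3
X = np.column_stack([np.ones(T), rng.normal(size=(T, k - 1))])
beta_true = np.array([0.5, -1.0, 2.0])
Y = X @ beta_true + rng.normal(scale=0.3, size=T)

# Equation (4.32): beta_hat = (X'X)^{-1} X'Y.
# Numerically, solving the normal equations X'X beta = X'Y is preferred
# to forming the inverse explicitly.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# The FOC implies the residuals are orthogonal to X: X'(Y - X beta_hat) = 0
resid = Y - X @ beta_hat
print(np.allclose(X.T @ resid, 0))  # True
```

In practice, library routines use QR or SVD factorizations rather than this textbook formula, since $(X'X)^{-1}$ is poorly conditioned when the regressors are nearly collinear.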