STAT 511
Formula Sheet
Selected matrix formulas:
$(AB)^{-1} = B^{-1}A^{-1}$ when $A$ and $B$ are square and non-singular
$(A^T)^{-1} = (A^{-1})^T$ when $A$ is square and non-singular
$(cA)^{-1} = c^{-1}A^{-1}$ when $c$ is a scalar
$\mathrm{tr}(cA) = c\,\mathrm{tr}(A)$ when $c$ is a scalar
$\mathrm{tr}(A + B) = \mathrm{tr}(A) + \mathrm{tr}(B)$
$\mathrm{tr}(AB) = \mathrm{tr}(BA)$
$|A^T| = |A|$
$|AB| = |A|\,|B|$ for square matrices of the same order
$|AB| = |BA|$ for square matrices of the same order
$|cA| = c^k|A|$ when $c$ is a scalar and $A$ is a $k \times k$ matrix
$A = \sum_{i=1}^{k} \lambda_i u_i u_i^T$ spectral decomposition of a symmetric matrix
$A^{-1/2} = \sum_{i=1}^{k} \lambda_i^{-1/2} u_i u_i^T$
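The spectral decomposition and the inverse square root can be checked numerically from one eigendecomposition. A minimal numpy sketch, using a small hypothetical symmetric positive definite matrix (not from the course):

```python
import numpy as np

# A small symmetric positive definite matrix (hypothetical example).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# Spectral decomposition: A = sum_i lambda_i u_i u_i^T
lams, U = np.linalg.eigh(A)   # eigenvalues and orthonormal eigenvectors
A_rebuilt = sum(l * np.outer(u, u) for l, u in zip(lams, U.T))

# Inverse square root from the same eigenpairs:
# A^{-1/2} = sum_i lambda_i^{-1/2} u_i u_i^T
A_inv_sqrt = sum(l ** -0.5 * np.outer(u, u) for l, u in zip(lams, U.T))

print(np.allclose(A, A_rebuilt))                            # reconstruction holds
print(np.allclose(A_inv_sqrt @ A @ A_inv_sqrt, np.eye(2)))  # A^{-1/2} A A^{-1/2} = I
```

Both checks print `True`: summing $\lambda_i u_i u_i^T$ recovers $A$, and the same eigenpairs give any matrix power of $A$ by powering the eigenvalues.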
$E(AY + d) = A\mu + d$ when $E(Y) = \mu$
$\mathrm{Var}(AY + d) = A\Sigma A^T$ when $\mathrm{Var}(Y) = \Sigma$
$\mathrm{Cov}(AY + d,\ BY + L) = A\Sigma B^T$ when $\mathrm{Var}(Y) = \Sigma$
$(-1)^{i+j}\,|A_{ji}|\,/\,|A|$ is the $(i,j)$ element of $A^{-1}$, where $|A_{ji}|$ is the minor of the $(j,i)$ element of $A$
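The cofactor formula for the elements of $A^{-1}$ can be verified directly against a numerical inverse. A sketch with a hypothetical $3 \times 3$ matrix:

```python
import numpy as np

# Element-wise inverse via cofactors: the (i, j) element of A^{-1} is
# (-1)^{i+j} |A_{ji}| / |A|, where |A_{ji}| deletes row j and column i.
# (Hypothetical example matrix.)
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

def minor(M, j, i):
    """Determinant of M with row j and column i deleted."""
    return np.linalg.det(np.delete(np.delete(M, j, axis=0), i, axis=1))

n = A.shape[0]
detA = np.linalg.det(A)
inv_cof = np.array([[(-1) ** (i + j) * minor(A, j, i) / detA
                     for j in range(n)] for i in range(n)])

print(np.allclose(inv_cof, np.linalg.inv(A)))  # True
```

Note the transposed indices: entry $(i, j)$ of the inverse uses the minor of the $(j, i)$ element, which is the adjugate transpose at work.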
$AA^{-}A = A$ generalized inverse
$AMA = A$
$MAM = M$
$AM$ and $MA$ are symmetric
Moore-Penrose generalized inverse $A^{+}$: the matrix $M$ satisfying the conditions above
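The Moore-Penrose conditions can be checked numerically with `np.linalg.pinv`, even for a rank-deficient matrix. A sketch with a hypothetical design-like matrix:

```python
import numpy as np

# Check the Moore-Penrose conditions for M = pinv(A), with A deliberately
# not of full rank (rows 1 and 2 are identical; hypothetical example).
A = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])
M = np.linalg.pinv(A)

print(np.allclose(A @ M @ A, A))      # AMA = A
print(np.allclose(M @ A @ M, M))      # MAM = M
print(np.allclose((A @ M).T, A @ M))  # AM symmetric
print(np.allclose((M @ A).T, M @ A))  # MA symmetric
```

Any $M$ with $AMA = A$ is a generalized inverse of $A$; the symmetry conditions pin down the unique Moore-Penrose inverse.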
Projection matrix: $P_X = X(X^TX)^{-}X^T$
$P_X X = X$
$\mathrm{rank}(P_X) = \mathrm{rank}(X) = \mathrm{tr}(P_X)$
You may use the facts that projection matrices are idempotent and symmetric, and that $P_X$ is invariant to the choice of $(X^TX)^{-}$.
$P_X v = v$ for any vector $v$ that is a linear combination of the columns of $X$.
$P_X v = \mathbf{0}$ for any $v$ that is orthogonal to every column of $X$.
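These projection facts can all be verified at once for a concrete design matrix. A sketch using a hypothetical one-way ANOVA design (intercept plus two group indicators, so $X$ is not of full column rank and a generalized inverse is required):

```python
import numpy as np

# P_X = X (X^T X)^- X^T, built with pinv as one choice of generalized inverse.
# (Hypothetical over-parameterized one-way ANOVA design matrix.)
X = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [1.0, 0.0, 1.0]])
P = X @ np.linalg.pinv(X.T @ X) @ X.T

print(np.allclose(P @ P, P))    # idempotent
print(np.allclose(P.T, P))      # symmetric
print(np.allclose(P @ X, X))    # P_X X = X
print(np.isclose(np.trace(P), np.linalg.matrix_rank(X)))  # tr(P_X) = rank(X)
```

Here $\mathrm{rank}(X) = 2$ even though $X$ has three columns, and $\mathrm{tr}(P_X)$ matches it exactly.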
Linear models: $Y = X\beta + \epsilon$, with $E(Y) = X\beta$ and $\mathrm{Var}(Y) = \sigma^2 I$
$(X^TX)b = X^TY$ normal equations
$(X^TV^{-1}X)b = X^TV^{-1}Y$ generalized least squares estimating equations, when $\mathrm{Var}(Y) = \sigma^2 V$
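Both sets of estimating equations are small linear solves. A sketch with hypothetical data ($X$, $Y$, and the weight matrix $V$ are made up for illustration):

```python
import numpy as np

# Solve the normal equations (X^T X) b = X^T Y, and the GLS estimating
# equations (X^T V^{-1} X) b = X^T V^{-1} Y. (Hypothetical X, Y, V.)
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
Y = np.array([1.0, 2.0, 2.0, 4.0])
V = np.diag([1.0, 1.0, 2.0, 2.0])   # assumed known error covariance structure

b_ols = np.linalg.solve(X.T @ X, X.T @ Y)
Vinv = np.linalg.inv(V)
b_gls = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ Y)

print(b_ols)   # OLS solution: [0.9, 0.9] for this data
print(b_gls)   # GLS solution, downweighting the higher-variance observations
```

With a full-rank $X$, `solve` suffices; with a rank-deficient design, replace the inverse with a generalized inverse (e.g. `np.linalg.pinv`) as in the projection-matrix formulas above.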
Estimable functions of parameters: $C\beta$ is estimable if $E(AY) = C\beta$ for some $A$.
Conditions you can check (Result 3.9) are:
(i) $C = AX$ for some $A$
(ii) $Cd = 0$ for every $d$ for which $Xd = 0$
Reparameterization: $X = WG$ and $W = XF$
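Condition (ii) is easy to check numerically: compute a basis for the null space of $X$ and test whether $C$ annihilates it. A sketch with a hypothetical over-parameterized one-way design, where a treatment difference is estimable but a single treatment effect is not:

```python
import numpy as np

# Estimability via condition (ii): C beta is estimable iff C d = 0 for
# every d with X d = 0. (Hypothetical intercept-plus-indicators design.)
X = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [1.0, 0.0, 1.0]])

# Null-space basis of X from the SVD: rows of Vt past rank(X).
_, s, Vt = np.linalg.svd(X)
null_basis = Vt[np.sum(s > 1e-10):].T   # columns span {d : X d = 0}

C_est = np.array([[0.0, 1.0, -1.0]])    # treatment difference tau_1 - tau_2
C_not = np.array([[0.0, 1.0, 0.0]])     # a single treatment effect tau_1

print(np.allclose(C_est @ null_basis, 0))   # True: estimable
print(np.allclose(C_not @ null_basis, 0))   # False: not estimable
```

The null space here is spanned by $(1, -1, -1)^T$, which the difference contrast kills but the single-effect row does not.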
Quadratic forms: $Y^TAY$
$E(Y^TAY) = \mu^TA\mu + \mathrm{tr}(A\Sigma)$ when $E(Y) = \mu$ and $\mathrm{Var}(Y) = \Sigma$
$\mathrm{Var}(Y^TAY) = 4\mu^TA\Sigma A\mu + 2\,\mathrm{tr}(A\Sigma A\Sigma)$ when $A$ is symmetric and $Y \sim N(\mu, \Sigma)$
Result 4.7: If $Y \sim N(\mu, \Sigma)$ where $\Sigma$ is positive definite, and if $A$ is symmetric with $\mathrm{rank}(A) = k$ and $A\Sigma$ is idempotent, then
$Y^TAY \sim \chi^2_k\!\left(\tfrac{1}{2}\mu^TA\mu\right)$
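The mean formula for a quadratic form can be checked by seeded simulation against the exact value $\mu^TA\mu + \mathrm{tr}(A\Sigma)$. A sketch with hypothetical $\mu$, $\Sigma$, and $A$:

```python
import numpy as np

# Monte Carlo check of E(Y^T A Y) = mu^T A mu + tr(A Sigma).
# (mu, Sigma, A are hypothetical; the generator is seeded for reproducibility.)
rng = np.random.default_rng(0)
mu = np.array([1.0, -1.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
A = np.array([[1.0, 0.0],
              [0.0, 3.0]])

Y = rng.multivariate_normal(mu, Sigma, size=200_000)
mc = np.mean(np.einsum('ni,ij,nj->n', Y, A, Y))   # average of Y^T A Y
exact = mu @ A @ mu + np.trace(A @ Sigma)

print(exact)   # 9.0 here: mu^T A mu = 4 and tr(A Sigma) = 5
print(mc)      # Monte Carlo average, close to 9
```

The simulation error is on the order of $\sqrt{\mathrm{Var}(Y^TAY)/n}$, so with 200,000 draws the two numbers agree to roughly two decimals.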
Result 4.8: If $Y \sim N(\mu, \Sigma)$ and $A_1$ and $A_2$ are symmetric matrices such that $A_1\Sigma A_2 = 0$, then $Y^TA_1Y$ and $Y^TA_2Y$ are independent random variables.
Cochran's Theorem: If $Y \sim N(\mu, \sigma^2 I)$ and $A_1, A_2, \ldots, A_k$ are $n \times n$ symmetric matrices such that $I = A_1 + A_2 + \cdots + A_k$ and $n = \mathrm{rank}(A_1) + \mathrm{rank}(A_2) + \cdots + \mathrm{rank}(A_k)$, then $Y^TA_1Y, \ldots, Y^TA_kY$ are independent random variables, with $Y^TA_iY/\sigma^2 \sim \chi^2_{\mathrm{rank}(A_i)}\!\left(\tfrac{1}{2\sigma^2}\mu^TA_i\mu\right)$.
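The hypotheses of Cochran's Theorem can be verified numerically for the usual ANOVA decomposition into mean, model, and residual pieces. A sketch with a hypothetical one-way design:

```python
import numpy as np

# Cochran-style partition: I = P_1 + (P_X - P_1) + (I - P_X), with the
# ranks of the three symmetric idempotent pieces summing to n.
# (Hypothetical one-way design matrix.)
X = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [1.0, 0.0, 1.0]])
n = X.shape[0]
ones = np.ones((n, 1))

P1 = ones @ np.linalg.pinv(ones.T @ ones) @ ones.T   # projection onto the mean
PX = X @ np.linalg.pinv(X.T @ X) @ X.T               # projection onto C(X)
A1, A2, A3 = P1, PX - P1, np.eye(n) - PX

print(np.allclose(A1 + A2 + A3, np.eye(n)))          # I = A1 + A2 + A3
ranks = [np.linalg.matrix_rank(Ai) for Ai in (A1, A2, A3)]
print(ranks, sum(ranks) == n)                        # [1, 1, 2] and True
```

Since the identity splits and the ranks add to $n$, the theorem gives independent chi-square quadratic forms, which is exactly the partition behind the ANOVA table.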
Tests of hypotheses and confidence intervals:
$H_0: C\beta = d$ vs $H_A: C\beta \neq d$
$SS_{H_0} = (Cb - d)^T\left[C(X^TX)^{-}C^T\right]^{-1}(Cb - d)$
$F = MS_{H_0}/MS_{\mathrm{residuals}}$
$t = \dfrac{c^Tb - c^T\beta}{\sqrt{MS_{\mathrm{residuals}}\; c^T(X^TX)^{-}c}}$
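For testing $H_0: C\beta = d$, the quadratic-form expression for $SS_{H_0}$ agrees with the full-versus-reduced SSE difference. A sketch for the slope test in a simple regression (the data are hypothetical):

```python
import numpy as np

# SSH0 = (Cb - d)^T [C (X^T X)^- C^T]^{-1} (Cb - d), checked against the
# SSE(reduced) - SSE(full) computation for H0: slope = 0. (Hypothetical data.)
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
Y = np.array([1.0, 2.0, 2.0, 4.0])
C = np.array([[0.0, 1.0]])
d = np.array([0.0])

XtX_inv = np.linalg.inv(X.T @ X)          # full rank, so the inverse works
b = XtX_inv @ X.T @ Y
SSH0 = float((C @ b - d) @ np.linalg.inv(C @ XtX_inv @ C.T) @ (C @ b - d))

# Reduced model under H0 is intercept-only, so SSH0 = SSE(reduced) - SSE(full).
SSE_full = float(np.sum((Y - X @ b) ** 2))
SSE_red = float(np.sum((Y - Y.mean()) ** 2))
print(np.isclose(SSH0, SSE_red - SSE_full))   # True

MSres = SSE_full / (len(Y) - np.linalg.matrix_rank(X))
F = (SSH0 / C.shape[0]) / MSres
print(F)   # 4.05 / 0.35, about 11.57
```

Here $MS_{H_0} = SS_{H_0}/q$ with $q$ the number of rows of $C$, so with $q = 1$ the $F$ statistic is the squared $t$ statistic for the slope.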
Things you should know, but are not on the formula sheet:
1. What it means for a matrix to be orthogonal, non-singular, positive definite, non-negative definite, idempotent, symmetric.
2. Definition of eigenvalues and eigenvectors.
3. Concepts of rank of a matrix, linearly independent vectors, basis for a vector space, what it means for a set of vectors to span a vector space.
4. How to obtain a generalized inverse.
5. Properties of ordinary least squares estimators.
6. Properties of generalized least squares estimators.
7. Gauss-Markov Theorem.
8. Any linear function of a multivariate normal random variable has a multivariate normal distribution.
9. If $\begin{pmatrix} Y_1 \\ Y_2 \end{pmatrix}$ has a multivariate normal distribution, then $Y_1$ is stochastically independent of $Y_2$ if and only if $\mathrm{Cov}(Y_1, Y_2) = 0$.
10. Definitions of chi-square, $F$, and $t$ distributions.
11. The definition of a testable hypothesis.
12. Concepts of Type I error, Type II error, and power.
13. How to construct confidence intervals for estimable functions of $\beta$ for the normal-theory Gauss-Markov model.
14. How to construct a confidence interval for a variance.
15. How to partition sums of squares.
16. How to interpret S-PLUS or SAS/IML commands for performing computations involving matrices.