Matrix Algebra and Regression
A = [ 2  5  6 ]        matrix element a13 = 6
    [ 1  2  3 ]

• a matrix is a rectangular array of elements
• m = # rows, n = # columns (an m x n matrix)
• a single value is called a ‘scalar’
• a single row is called a ‘row vector’:  B = [ 12  25  91  30 ]
• a single column is called a ‘column vector’
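
A minimal numpy sketch of these definitions (illustrative code, not part of the original slides), using the 2 x 3 matrix A and the row vector B above:

import numpy as np

# 2 x 3 matrix: m = 2 rows, n = 3 columns
A = np.array([[2, 5, 6],
              [1, 2, 3]])
print(A.shape)       # (2, 3)
print(A[0, 2])       # element a13 = 6 (numpy indexes from 0)

s = 4.2                                # a scalar: a single value
B = np.array([[12, 25, 91, 30]])       # a row vector (1 x 4)
c = np.array([[12], [25], [91]])       # a column vector (3 x 1)
print(B.shape, c.shape)                # (1, 4) (3, 1)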
Matrix Algebra and Regression
• a square matrix has equal numbers of rows and columns
• in a symmetric matrix, aij = aji

A = [  2   0   4   0  11 ]
    [  0   6   0   0   0 ]
    [  4   0   3   8   9 ]
    [  0   0   8   1   0 ]
    [ 11   0   9   0   8 ]

• in a diagonal matrix, all off-diagonal elements = 0
• an identity matrix is a diagonal matrix with diagonals = 1

I = [ 1  0  0  0 ]
    [ 0  1  0  0 ]
    [ 0  0  1  0 ]
    [ 0  0  0  1 ]
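
A short numpy sketch (illustrative, not from the slides) of the symmetric matrix above and of diagonal and identity matrices:

import numpy as np

# the 5 x 5 symmetric matrix above: aij = aji, so A equals its transpose
A = np.array([[ 2, 0, 4, 0, 11],
              [ 0, 6, 0, 0,  0],
              [ 4, 0, 3, 8,  9],
              [ 0, 0, 8, 1,  0],
              [11, 0, 9, 0,  8]])
print(np.array_equal(A, A.T))    # True -> symmetric

D = np.diag([6, 3, 9])           # diagonal matrix: all off-diagonal elements are 0
I = np.eye(4, dtype=int)         # 4 x 4 identity matrix: diagonal elements are 1
print(D)
print(I)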
Trace
• The trace of a matrix is the sum of the elements on the main diagonal

A = [  2   0   4   0  11 ]
    [  0   6   0   0   0 ]
    [  4   0   3   8   9 ]
    [  0   0   8   1   0 ]
    [ 11   0   9   0   8 ]

tr(A) = 2 + 6 + 3 + 1 + 8 = 20
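
The same trace can be computed with numpy (illustrative sketch, not part of the original slides):

import numpy as np

A = np.array([[ 2, 0, 4, 0, 11],
              [ 0, 6, 0, 0,  0],
              [ 4, 0, 3, 8,  9],
              [ 0, 0, 8, 1,  0],
              [11, 0, 9, 0,  8]])

# sum of the main-diagonal elements: 2 + 6 + 3 + 1 + 8
print(np.trace(A))   # 20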
Matrix Addition and Subtraction
• Matrices are added and subtracted element by element
• The dimensions of the matrices must be the same
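
A minimal numpy sketch of element-wise addition and subtraction (the matrices here are illustrative values, not the slide's worked example):

import numpy as np

A = np.array([[4, 6, 2],
              [9, 2, 5]])
B = np.array([[8, 7, 1],
              [3, 0, 4]])

print(A + B)    # element-by-element sum
print(A - B)    # element-by-element difference

# matrices with different dimensions cannot be added
C = np.array([[1, 2],
              [3, 4]])
try:
    A + C
except ValueError as err:
    print("dimensions must match:", err)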
Matrix Multiplication
A (m x n = 3 x 4)   x   B (n x p = 4 x 3)   =   C (m x p = 3 x 3)

A = [ 2  5  1  8 ]    B = [ 2  3  7 ]    C = [  94   77   72 ]
    [ 3  6  9  4 ]        [ 5  6  3 ]        [  77  142   86 ]
    [ 7  3  3  5 ]        [ 1  9  3 ]        [  72   86   92 ]
                          [ 8  4  5 ]

C11 = 2*2 + 5*5 + 1*1 + 8*8 = 94

• The number of columns in A must equal the number of rows in B
• The resulting matrix C has the number of rows in A and the number of columns in B
• Note that the commutative rule of multiplication does not apply to matrices: A x B ≠ B x A
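
The product above can be reproduced with numpy (illustrative sketch using the same A and B):

import numpy as np

# A is 3 x 4 and B is 4 x 3, so C = A x B is 3 x 3
A = np.array([[2, 5, 1, 8],
              [3, 6, 9, 4],
              [7, 3, 3, 5]])
B = np.array([[2, 3, 7],
              [5, 6, 3],
              [1, 9, 3],
              [8, 4, 5]])

C = A @ B                 # C11 = 2*2 + 5*5 + 1*1 + 8*8 = 94
print(C)                  # [[94 77 72] [77 142 86] [72 86 92]]

# multiplication is not commutative: B @ A is 4 x 4, not equal to A @ B
print((B @ A).shape)      # (4, 4)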
Transpose a Matrix

A = [ 2  5  6 ]        A′ = [ 2  1 ]
    [ 1  2  3 ]              [ 5  2 ]
                             [ 6  3 ]

• Multiplying A x A′ above will give the uncorrected sums of squares for each row in A on the diagonal of a 2 x 2 matrix, with the sums of cross products on the off-diagonals

AA′ = [ 65  30 ]
      [ 30  14 ]
Invert a Matrix
• The inverse of a matrix is analogous to division in ordinary algebra
• An inverted matrix multiplied by the original matrix will give the identity matrix:  M⁻¹M = MM⁻¹ = I
• It is easy to invert a diagonal matrix:

A = [ 6  0  0 ]        A⁻¹ = [ 1/6   0    0  ]
    [ 0  3  0 ]               [  0   1/3   0  ]
    [ 0  0  9 ]               [  0    0   1/9 ]
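
A numpy sketch of inverting the diagonal matrix above and verifying the identity (illustrative, not from the slides):

import numpy as np

A = np.diag([6.0, 3.0, 9.0])       # diagonal matrix
A_inv = np.linalg.inv(A)           # diag(1/6, 1/3, 1/9)
print(A_inv)

# the inverse times the original matrix gives the identity matrix
print(np.allclose(A_inv @ A, np.eye(3)))   # True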
Inverting a 2x2 Matrix
• Calculate the determinant (D) of the matrix M:  |M| = D = ad - bc

M = [ a  b ]        M⁻¹ = [  d/D  -b/D ]
    [ c  d ]               [ -c/D   a/D ]

M = [ 2  5 ]        D = 2*9 - 5*3 = 3        M⁻¹ = [  9/3  -5/3 ]
    [ 3  9 ]                                        [ -3/3   2/3 ]

• Verify:  M⁻¹M = [ 1  0 ]
                  [ 0  1 ]
• The extension to larger matrices is not simple – use a computer!
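
The 2 x 2 example can be checked numerically (illustrative numpy sketch):

import numpy as np

M = np.array([[2.0, 5.0],
              [3.0, 9.0]])

D = np.linalg.det(M)        # determinant: 2*9 - 5*3 = 3
M_inv = np.linalg.inv(M)    # [[ 9/3 -5/3] [-3/3  2/3]]
print(D)
print(M_inv)

# verify that the inverse times the original gives the identity matrix
print(np.allclose(M_inv @ M, np.eye(2)))   # True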
Linear Dependence
M = [ a  b ]        M = [ 2  6 ]        D = ad - bc = 2*9 - 6*3 = 0
    [ c  d ]            [ 3  9 ]

The matrix M on the right is singular because one row (or column) can be obtained by multiplying another by a constant. A singular matrix will have D = 0.

The rank of a matrix = the number of linearly independent rows or columns (1 in this case).

A nonsingular matrix is full rank and has a unique inverse.

A generalized inverse (M⁻) can be obtained for any matrix, but the solution will not be unique if the matrix is singular:

M M⁻ M = M
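
A numpy sketch of singularity, rank, and a generalized inverse; np.linalg.pinv returns the Moore-Penrose generalized inverse, one possible choice of M⁻ (the slides do not say which generalized inverse is intended):

import numpy as np

# row 2 is 1.5 times row 1, so M is singular: D = 0 and rank = 1
M = np.array([[2.0, 6.0],
              [3.0, 9.0]])
print(np.linalg.det(M))             # 0.0 (up to rounding)
print(np.linalg.matrix_rank(M))     # 1

# a generalized inverse exists even though M has no unique inverse
M_g = np.linalg.pinv(M)
print(np.allclose(M @ M_g @ M, M))  # True: M M- M = M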
Regression in Matrix Notation
Linear model:          Y = Xβ + ε

Parameter estimates:   b = (X′X)⁻¹X′Y

Source        df      SS              MS
Regression    p       b′X′Y           MSR
Residual      n - p   Y′Y - b′X′Y     MSE
Total         n       Y′Y
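
A minimal numpy sketch of the matrix form of ordinary least squares; the data below are made-up illustrative values, not from the slides:

import numpy as np

# illustrative data: n = 5 observations of one predictor x and a response Y
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

X = np.column_stack([np.ones_like(x), x])   # design matrix with an intercept column
n, p = X.shape                              # n observations, p parameters

b = np.linalg.inv(X.T @ X) @ (X.T @ Y)      # parameter estimates b = (X'X)^-1 X'Y

SS_reg = b @ X.T @ Y                        # regression SS: b'X'Y
SS_tot = Y @ Y                              # total (uncorrected) SS: Y'Y
SS_res = SS_tot - SS_reg                    # residual SS: Y'Y - b'X'Y
MSR, MSE = SS_reg / p, SS_res / (n - p)
print(b)
print(SS_reg, SS_res, MSR, MSE)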