The Big Picture of Linear Algebra

Fundamental Theorem of Linear Algebra

Part 1. Dimensions of the four subspaces
    Row space and column space: equal dimension r = rank
    Nullspaces of A and A^T: dimensions n - r and m - r

Part 2. Orthogonality
    Row space of A ⊥ nullspace of A
    Column space of A ⊥ nullspace of A^T

Part 3. Orthogonal bases
    SVD factorization: A = U Σ V^T
    Columns of V and U: orthonormal bases for the row space and column space
    Av_1 = σ_1 u_1, ..., Av_r = σ_r u_r

Construction of the v's and u's in A = U Σ V^T
    v_1, ..., v_n = orthonormal eigenvectors of A^T A
    Then Av_1, ..., Av_n are also orthogonal:
        (Av_j, Av_k) = (v_j, A^T A v_k) = (v_j, σ_k^2 v_k) = 0 for j ≠ k
        (Av_j, Av_j) = σ_j^2 (v_j, v_j) = σ_j^2, so u_j = Av_j / σ_j has length 1

Two Important Sets of Matrices
    Symmetric: S^T = S                      Orthogonal: Q^T = Q^{-1}
    Every invertible matrix has a polar form A = SQ
    Every complex number has a polar form z = r e^{iθ}
    Eigenvalues of S are real               Eigenvalues of Q are e^{iθ}
    Which matrices are symmetric and also orthogonal? A = A^T = A^{-1}

Six Great Theorems of Linear Algebra
    Dimension Theorem: All bases for a vector space have the same number of vectors.
    Counting Theorem: Dimension of column space + dimension of nullspace = number of columns.
    Rank Theorem: Dimension of column space = dimension of row space. This is the rank r.
    Fundamental Theorem: The row space and the nullspace of A are orthogonal complements in R^n.
    SVD: There are orthonormal bases (v's and u's for the row and column spaces of A) so that Av_i = σ_i u_i.
    Spectral Theorem: If A^H = A (or just A^H A = A A^H), there are orthonormal eigenvectors q_1, ..., q_n so that Aq_i = λ_i q_i and Q^H A Q = Λ. Here A^H is the conjugate transpose Ā^T.
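The SVD construction above can be checked numerically. A minimal sketch (the 2×2 matrix A and all helper names here are my own illustration, not from the text): take orthonormal eigenvectors v_1, v_2 of A^T A, then verify that Av_1 and Av_2 are orthogonal and that σ_j = |Av_j|, so u_j = Av_j / σ_j is a unit vector.

```python
import math

A = [[3.0, 0.0],
     [4.0, 5.0]]  # example matrix (mine); A^T A = [[25, 20], [20, 25]]

def matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

# Orthonormal eigenvectors of A^T A: (1,1)/sqrt(2) and (1,-1)/sqrt(2),
# with eigenvalues 45 and 5.
s = 1.0 / math.sqrt(2)
v1, v2 = [s, s], [s, -s]

Av1, Av2 = matvec(A, v1), matvec(A, v2)
sigma1 = math.sqrt(dot(Av1, Av1))   # should equal sqrt(45)
sigma2 = math.sqrt(dot(Av2, Av2))   # should equal sqrt(5)
u1 = [x / sigma1 for x in Av1]      # u_j = A v_j / sigma_j is a unit vector

print(abs(dot(Av1, Av2)) < 1e-12)   # Av1 ⊥ Av2, as the construction promises
print(abs(dot(u1, u1) - 1.0) < 1e-12)
```

The key step is exactly the identity in the text: (Av_1, Av_2) = (v_1, A^T A v_2) = σ_2^2 (v_1, v_2) = 0 because v_1 ⊥ v_2.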
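The question "which matrices are symmetric and also orthogonal?" has reflections as a natural answer. A small sketch (the angle and matrix are my own example): a 2×2 reflection satisfies A = A^T and A^2 = I, so A = A^T = A^{-1}; being real symmetric its eigenvalues are real, and being orthogonal they have |λ| = 1, forcing λ = ±1.

```python
import math

theta = 0.3  # any angle gives a reflection matrix
c, s = math.cos(theta), math.sin(theta)
A = [[c, s],
     [s, -c]]  # reflection across the line at angle theta/2

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A2 = matmul(A, A)  # should be the identity, since A^{-1} = A^T = A
symmetric = (A[0][1] == A[1][0])
orthogonal = all(abs(A2[i][j] - (1.0 if i == j else 0.0)) < 1e-12
                 for i in range(2) for j in range(2))
print(symmetric, orthogonal)
```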
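The Counting Theorem can also be seen through elimination: r pivot columns give the rank, and the n - r free columns each produce an independent nullspace vector. A sketch (the `rank` helper and the example matrix are mine, for illustration only):

```python
def rank(M, tol=1e-12):
    """Row-reduce a copy of M and count the pivots (= rank r)."""
    M = [row[:] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0  # next pivot row
    for j in range(cols):
        # find a row at or below r with a usable pivot in column j
        pivot = next((i for i in range(r, rows) if abs(M[i][j]) > tol), None)
        if pivot is None:
            continue  # column j is a free column
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, rows):
            factor = M[i][j] / M[r][j]
            M[i] = [a - factor * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

A = [[1.0, 2.0, 3.0],
     [2.0, 4.0, 6.0]]  # rank 1: the second row is twice the first
n = 3                  # number of columns
r = rank(A)
print(r, n - r)        # rank + nullspace dimension = n, so this prints 1 2
```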