Section 6.1

Definitions

• A matrix is diagonalizable if it is similar to a diagonal matrix.
• Let A be an n × n matrix. A nonzero vector v ∈ R^n is an eigenvector for A if there is a scalar λ so that Av = λv. The scalar λ is called the eigenvalue associated to v.
• The characteristic polynomial of a square matrix A is given by p_A(t) = det(A − tI).

Main ideas

• A scalar λ is an eigenvalue if and only if A − λI is singular.
• The eigenvalues of a matrix are the roots of its characteristic polynomial.
• For any scalar λ, the set E(λ) = N(A − λI) is a subspace (because null spaces are always subspaces). If λ is an eigenvalue, then E(λ) is a nontrivial subspace that contains all of the eigenvectors associated to λ.
• Eigenvectors are found by computing N(A − λI), where λ is a root of p_A(t).
• The rational roots test says that if a polynomial p(t) = a_0 + a_1 t + · · · + a_n t^n with integer coefficients has a rational root t = r/s (in lowest terms), then r must be a factor of a_0 and s must be a factor of a_n. For our class, this test is used to guess roots of the characteristic polynomial of a matrix, which in turn helps us factor it (by long division with any newfound roots) and then find the remaining roots.

Section 6.2

Definitions

• Let λ be an eigenvalue of a square matrix. The algebraic multiplicity of λ is its multiplicity as a root of the characteristic polynomial p(t); that is, the highest power of (t − λ) that divides p(t). The geometric multiplicity of λ is the dimension of E(λ).

Main ideas

Let A be an n × n matrix.

• The matrix A is diagonalizable if and only if there is a basis for R^n consisting of eigenvectors for A.
• If λ_1, . . . , λ_k are distinct eigenvalues for A with corresponding eigenvectors v_1, . . . , v_k, then the set {v_1, . . . , v_k} is linearly independent.
• If A has n distinct eigenvalues, then it is diagonalizable.
• There exist matrices with fewer than n distinct eigenvalues that are diagonalizable.
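As a numerical illustration of the ideas above (a sketch using numpy, with an arbitrarily chosen matrix; this is not part of the course notes): the eigenvalues are the roots of p_A(t), and eigenvectors for λ lie in N(A − λI).

```python
import numpy as np

# Illustrative 2x2 example (the matrix is an arbitrary choice).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# p_A(t) = det(A - tI) = t^2 - 7t + 10 = (t - 2)(t - 5),
# so the eigenvalues should be 2 and 5.
eigenvalues = np.sort(np.linalg.eigvals(A))
print(eigenvalues)  # approximately [2. 5.]

# An eigenvector for lambda = 5 lies in N(A - 5I); v = (1, 1) works,
# since (A - 5I)v = 0.  Check Av = 5v directly:
v = np.array([1.0, 1.0])
print(np.allclose(A @ v, 5 * v))  # True
```

The same pattern (roots of the characteristic polynomial, then null spaces) is exactly the hand computation the notes describe.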
• Two similar matrices have the same characteristic polynomial.
• The geometric multiplicity of an eigenvalue is bounded above by its algebraic multiplicity.
• Let A be an n × n matrix with distinct real eigenvalues λ_1, . . . , λ_k. Then the following statements are equivalent:
  1. A is diagonalizable.
  2. The characteristic polynomial of A factors into linear factors only, p(t) = ±(t − λ_1)^{d_1} · · · (t − λ_k)^{d_k}, and d_i = dim E(λ_i) for each i = 1, . . . , k.
  3. dim E(λ_1) + · · · + dim E(λ_k) = n.

Section 6.3

Main ideas

• When A is diagonalizable, one can find a tractable formula for A^k for all k ≥ 1. Indeed, suppose A is diagonalizable. Then there exists a basis of eigenvectors v_1, . . . , v_n with corresponding eigenvalues λ_1, . . . , λ_n (with possible repetitions). Let P be the matrix whose columns are v_1, . . . , v_n and let D be the diagonal matrix with diagonal entries λ_1, . . . , λ_n. Then A = P D P^{-1} and A^k = P D^k P^{-1} for all k ≥ 1. Let x ∈ R^n and let c = (c_1, . . . , c_n)^T be given by c = P^{-1} x. Then

    A^k x = P D^k P^{-1} x = P D^k c
          = [v_1 · · · v_n] diag(λ_1^k, . . . , λ_n^k) (c_1, . . . , c_n)^T
          = [v_1 · · · v_n] (c_1 λ_1^k, . . . , c_n λ_n^k)^T
          = c_1 λ_1^k v_1 + · · · + c_n λ_n^k v_n.

Section 6.4

Definitions

• An n × n matrix A is symmetric if A^T = A.
• An n × n matrix Q is orthogonal if Q^T Q = I. Equivalently, Q is orthogonal if its columns form an orthonormal basis for R^n.
• An n × n matrix A is orthogonally diagonalizable if A = Q D Q^T, where D is a diagonal matrix and Q is an orthogonal matrix.

Main ideas

• (Spectral Theorem) Let A be a symmetric matrix. Then
  1. The eigenvalues of A are all real.
  2. A is orthogonally diagonalizable.
• Let A be a symmetric matrix and let q_1, . . . , q_n be an orthonormal basis of R^n consisting of eigenvectors of A with corresponding eigenvalues λ_1, . . . , λ_n (with possible repetitions). Then

    Ax = λ_1 proj_{q_1}(x) + · · · + λ_n proj_{q_n}(x).

Comments

The final exam will be roughly 8-10 questions.
Roughly half to two thirds of the questions will be on material in Chapter 6, and the remaining questions will be on older material. Questions on older material will be similar to those on past exams. Overall the questions will be similar to those on the sample exam below, but not necessarily the same.

Sample Exam

1. Let A = [2 1; 5 −2]. Calculate A^k for all k ≥ 1.

2. Let A and B be similar matrices.
   (a) Prove that A and B have the same characteristic polynomial.
   (b) Prove that A and B have the same eigenvalues.

3. Let A be a symmetric n × n matrix. Prove that det A is the product of its eigenvalues. [In fact, this is true for any square matrix, but the proof requires the fundamental theorem of algebra.]

4. Suppose A is symmetric, A(1, 2)^T = (1, 2)^T, and det A = 6. Find A.

5. Let A = [5 −4 −2; −4 5 −2; −2 −2 8].
   (a) Compute the characteristic polynomial of A.
   (b) Find all eigenvalues of A.
   (c) Find a basis for each eigenspace.
   (d) Find an orthonormal basis for each eigenspace.
   (e) Find an orthogonal matrix Q and a diagonal matrix D such that A = Q D Q^T.
   (f) Find a matrix B such that B^3 = A.

6. Prove that if λ_1, . . . , λ_k are distinct eigenvalues for A with corresponding eigenvectors v_1, . . . , v_k, then the set {v_1, . . . , v_k} is linearly independent.

7. Let V = {x ∈ R^3 : −x_1 + x_2 + x_3 = 0} and let T : R^3 → R^3 be the linear transformation that projects vectors onto V.
   (a) Find an orthonormal basis B = {q_1, q_2, q_3} for R^3 so that q_1, q_2 span V and q_3 is orthogonal to V.
   (b) Let [T]_B be the matrix of T with respect to B. Find [T]_B.
   (c) Let A be the standard matrix of T. Find A.
   (d) Find the eigenvalues of [T]_B.
   (e) Find the eigenvalues of A.

8. Let V = {x ∈ R^3 : −x_1 + x_2 + x_3 = 0} and let S : R^3 → R^3 be the linear transformation that rotates vectors about the axis orthogonal to V. Find the real eigenvalues and corresponding eigenspaces of S.

9. True or False:
   (a) If A is invertible then it's diagonalizable.
   (b) If A is diagonalizable then it's invertible.
   (c) If A is invertible and diagonalizable then so is A^{-1}.

10. Give definitions of the following terms:
    (a) diagonalizable matrix
    (b) eigenvector
    (c) eigenvalue
    (d) algebraic multiplicity of an eigenvalue
    (e) geometric multiplicity of an eigenvalue

11. Suppose A is diagonalizable and let p_A(t) denote the characteristic polynomial of A. Show that p_A(A) = O. [Here's what is meant by plugging a matrix into a polynomial: if, say, q(t) = t^3 + t^2 + 3, then q(A) = A^3 + A^2 + 3I.]

12. Write the vector x = (8, −4, −3)^T as a linear combination of u_1 = (1, 0, 1)^T, u_2 = (2, 4, −2)^T, and u_3 = (−1, 1, 1)^T.

13. Prove the following two statements.
    (a) If {u_1, . . . , u_k} is an orthogonal set of nonzero vectors then it's linearly independent.
    (b) If B = {u_1, . . . , u_k} is an orthonormal basis for a subspace W and v is a vector orthogonal to each vector in B then v ∈ W^⊥.

14. Let A = [1 0 0 0; 0 2 −2 1; 0 2 1 −2; 0 1 2 2]. What is the rank of A?

15. (a) Show that if V and W are subspaces with V ∩ W = {0} then dim(V + W) = dim V + dim W.
    (b) If we take away the assumption that V ∩ W = {0}, then the statement above is false. Give a counterexample to show that this is the case.
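When checking hand computations for problems like 1 and 5, the Section 6.3 identity A^k = P D^k P^{-1} can be verified numerically. A sketch using numpy (not part of the course notes), with the matrix from sample problem 1, whose eigenvalues are 3 and −3 (p_A(t) = t^2 − 9) with eigenvectors (1, 1) and (1, −5):

```python
import numpy as np

# Matrix from sample exam problem 1.
A = np.array([[2.0, 1.0],
              [5.0, -2.0]])

# P has the eigenvectors as columns; D holds the eigenvalues.
P = np.array([[1.0, 1.0],
              [1.0, -5.0]])
D = np.diag([3.0, -3.0])

# A should equal P D P^{-1}.
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True

# Check A^k = P D^k P^{-1} for k = 5; D^k is diagonal with
# entries 3^k and (-3)^k.
k = 5
lhs = np.linalg.matrix_power(A, k)
rhs = P @ np.diag([3.0**k, (-3.0)**k]) @ np.linalg.inv(P)
print(np.allclose(lhs, rhs))  # True
```

The closed form makes A^k easy to state for all k, which is the point of the exam problem.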
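For problem 12, the vectors u_1, u_2, u_3 are mutually orthogonal, so the coefficients in x = c_1 u_1 + c_2 u_2 + c_3 u_3 can be read off with projections, c_i = (x · u_i)/(u_i · u_i), rather than by row reduction. A numerical sketch of that shortcut (not part of the course notes):

```python
import numpy as np

# Data from sample exam problem 12.
x = np.array([8.0, -4.0, -3.0])
u = [np.array([1.0, 0.0, 1.0]),
     np.array([2.0, 4.0, -2.0]),
     np.array([-1.0, 1.0, 1.0])]

# The u_i are pairwise orthogonal, so c_i = (x . u_i) / (u_i . u_i).
c = [float(x @ ui) / float(ui @ ui) for ui in u]
print(c)  # [2.5, 0.25, -5.0]

# Recombining the projections should recover x exactly.
recombined = sum(ci * ui for ci, ui in zip(c, u))
print(np.allclose(recombined, x))  # True
```

This is the same projection idea as the spectral decomposition Ax = λ_1 proj_{q_1}(x) + · · · + λ_n proj_{q_n}(x) in Section 6.4, just applied to coordinates instead of eigenvalues.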