Section 9.3: Eigenvalues and Eigenvectors

Definition: Let $A$ be an $n \times n$ matrix. A scalar $\lambda$ is called an eigenvalue of $A$ if there exists a nonzero vector $\vec{x}$ such that $A\vec{x} = \lambda\vec{x}$. The vector $\vec{x}$ is called an eigenvector of $A$ corresponding to $\lambda$.

The equation $A\vec{x} = \lambda\vec{x}$ can be written in the form $(A - \lambda I)\vec{x} = \vec{0}$. Thus $\lambda$ is an eigenvalue of $A$ if and only if this equation has a nontrivial solution, that is, if and only if $A - \lambda I$ is singular, or equivalently, $\det(A - \lambda I) = 0$. Expanding this determinant gives an $n$th-degree polynomial in $\lambda$,
\[
p(\lambda) = \det(A - \lambda I),
\]
called the characteristic polynomial of the matrix $A$.

Example: Find the eigenvalues and eigenvectors of the matrix
\[
A = \begin{pmatrix} 1 & 2 \\ 3 & 2 \end{pmatrix}.
\]

Example: Find the eigenvalues and eigenvectors of the matrix
\[
A = \begin{pmatrix} 3 & 6 \\ -1 & -4 \end{pmatrix}.
\]
Determine the equations of the lines through the origin in the direction of the eigenvectors $\vec{v}_1$ and $\vec{v}_2$, and graph the lines together with the eigenvectors and the vectors $A\vec{v}_1$ and $A\vec{v}_2$.

Example: Find the eigenvalues and eigenvectors of the matrix
\[
A = \begin{pmatrix} 1 & 4 \\ 1 & -2 \end{pmatrix}.
\]

Example: Find the eigenvalues and eigenvectors of the matrix
\[
A = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}.
\]

Example: Find the eigenvalues and eigenvectors of the matrix
\[
A = \begin{pmatrix} 2 & 3 \\ 0 & -1 \end{pmatrix}.
\]

Example: Find the eigenvalues and eigenvectors of the matrix
\[
A = \begin{pmatrix} 1 & 0 \\ 4 & 3 \end{pmatrix}.
\]

Note: The eigenvalues of an upper- or lower-triangular matrix are its diagonal entries!

Example: Find the eigenvalues of the matrix
\[
A = \begin{pmatrix} 1 & 2 \\ -2 & 1 \end{pmatrix}.
\]

Definition: The trace of a $2 \times 2$ matrix
\[
A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}
\]
is the scalar $\operatorname{tr}(A) = a + d$.

The trace and determinant of a $2 \times 2$ matrix are related to its eigenvalues.

Theorem: If $A$ is a $2 \times 2$ matrix with eigenvalues $\lambda_1$ and $\lambda_2$, then $\operatorname{tr} A = \lambda_1 + \lambda_2$ and $\det A = \lambda_1 \lambda_2$.

Proof: The eigenvalues of $A$ satisfy
\[
\det(A - \lambda I) = \begin{vmatrix} a - \lambda & b \\ c & d - \lambda \end{vmatrix} = 0.
\]
That is,
\[
\lambda^2 - (a + d)\lambda + (ad - bc) = 0,
\]
\[
\lambda^2 - (\operatorname{tr} A)\lambda + \det A = 0.
\]
On the other hand, if $\lambda_1$ and $\lambda_2$ are the eigenvalues of $A$, then
\[
(\lambda - \lambda_1)(\lambda - \lambda_2) = 0,
\]
\[
\lambda^2 - (\lambda_1 + \lambda_2)\lambda + \lambda_1 \lambda_2 = 0.
\]
Comparing the coefficients of the two forms of the characteristic equation gives $\operatorname{tr} A = \lambda_1 + \lambda_2$ and $\det A = \lambda_1 \lambda_2$.

Theorem: (Routh-Hurwitz Criterion) The real parts of the eigenvalues of a $2 \times 2$ matrix $A$ are negative if and only if $\operatorname{tr} A < 0$ and $\det A > 0$.

Example: Consider the matrix
\[
A = \begin{pmatrix} -2 & 3 \\ -1 & 1 \end{pmatrix}.
\]
Without explicitly computing the eigenvalues of $A$, determine whether the real parts of both eigenvalues are negative.

Example: Consider the matrix
\[
A = \begin{pmatrix} 0 & 1 \\ 2 & -1 \end{pmatrix}.
\]
Without explicitly computing the eigenvalues of $A$, determine whether the real parts of both eigenvalues are negative.

Note: The Routh-Hurwitz Criterion will be very useful in Chapter 11.

Definition: Two nonzero vectors $\vec{x}_1$ and $\vec{x}_2$ are said to be linearly independent if there is no scalar $c$ such that $\vec{x}_1 = c\vec{x}_2$; that is, $\vec{x}_1$ and $\vec{x}_2$ are not scalar multiples of each other. Vectors that are not linearly independent are said to be linearly dependent.

Theorem: (Criterion for Linear Independence) Let $A$ be a $2 \times 2$ matrix with eigenvalues $\lambda_1$ and $\lambda_2$ and respective eigenvectors $\vec{v}_1$ and $\vec{v}_2$. If $\lambda_1 \neq \lambda_2$, then $\vec{v}_1$ and $\vec{v}_2$ are linearly independent.

Note: A consequence of linear independence is that any vector can be written uniquely as a linear combination of the two eigenvectors. Suppose that $\vec{v}_1$ and $\vec{v}_2$ are linearly independent eigenvectors of a $2 \times 2$ matrix $A$. Then any $2 \times 1$ vector $\vec{x}$ can be written as
\[
\vec{x} = c_1 \vec{v}_1 + c_2 \vec{v}_2,
\]
where $c_1$ and $c_2$ are uniquely determined constants. Applying $A$ to the vector $\vec{x}$ yields
\[
A\vec{x} = A(c_1 \vec{v}_1 + c_2 \vec{v}_2) = c_1 A\vec{v}_1 + c_2 A\vec{v}_2 = c_1 \lambda_1 \vec{v}_1 + c_2 \lambda_2 \vec{v}_2.
\]
This representation is particularly useful when $A$ is applied repeatedly to $\vec{x}$. In particular,
\[
A^2\vec{x} = A(c_1 \lambda_1 \vec{v}_1 + c_2 \lambda_2 \vec{v}_2) = c_1 \lambda_1 A\vec{v}_1 + c_2 \lambda_2 A\vec{v}_2 = c_1 \lambda_1^2 \vec{v}_1 + c_2 \lambda_2^2 \vec{v}_2.
\]
Continuing in this fashion, we have
\[
A^n\vec{x} = c_1 \lambda_1^n \vec{v}_1 + c_2 \lambda_2^n \vec{v}_2.
\]
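This last formula is easy to check numerically. The sketch below uses numpy with an illustrative matrix and vector (chosen for this sketch, not taken from the examples in this section); it writes $\vec{x}$ in the eigenvector basis and compares $c_1 \lambda_1^n \vec{v}_1 + c_2 \lambda_2^n \vec{v}_2$ with a direct computation of $A^n\vec{x}$.

```python
import numpy as np

# Illustrative 2x2 matrix and vector (assumed for this sketch only).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
x = np.array([1.0, -3.0])
n = 10

# np.linalg.eig returns the eigenvalues and a matrix V whose columns are eigenvectors.
eigvals, V = np.linalg.eig(A)

# Solve V c = x to write x = c1*v1 + c2*v2.
c = np.linalg.solve(V, x)

# A^n x = c1*lambda1^n*v1 + c2*lambda2^n*v2, assembled as V @ (c * eigvals**n).
via_eigenvectors = V @ (c * eigvals**n)

# Direct computation of A^n x for comparison.
direct = np.linalg.matrix_power(A, n) @ x

print(via_eigenvectors)
print(direct)  # agrees with the eigenvector computation up to rounding
```

The same idea is what makes the hand computations in the next two examples manageable: once $\vec{x}$ is expressed in terms of the eigenvectors, computing $A^{10}\vec{x}$ or $A^{30}\vec{x}$ only requires powers of the eigenvalues.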
Example: Consider the matrix
\[
A = \begin{pmatrix} -2 & 1 \\ -4 & 3 \end{pmatrix}.
\]
(a) Show that $\vec{v}_1 = \langle 1, 1 \rangle$ and $\vec{v}_2 = \langle 1, 4 \rangle$ are linearly independent eigenvectors of $A$.
(b) Represent $\vec{x} = \langle -1, 2 \rangle$ as a linear combination of $\vec{v}_1$ and $\vec{v}_2$.
(c) Use your results to compute $A^{10}\vec{x}$.

Example: Consider the matrix
\[
A = \begin{pmatrix} 4 & -3 \\ 2 & -1 \end{pmatrix}.
\]
If $\vec{x} = \langle -4, -2 \rangle$, then compute $A^{30}\vec{x}$ without using a calculator.

The Leslie Matrix

Example: Suppose that a population is divided into two age classes with Leslie matrix
\[
L = \begin{pmatrix} 1.5 & 2 \\ 0.08 & 0 \end{pmatrix}.
\]
Find the eigenvalues and eigenvectors of the Leslie matrix.

Theorem: (Eigenvalues and Eigenvectors of the Leslie Matrix) Suppose that $L$ is a $2 \times 2$ Leslie matrix with eigenvalues $\lambda_1$ and $\lambda_2$.
1. The larger eigenvalue determines the growth parameter of the population.
2. The eigenvector corresponding to the larger eigenvalue is a stable age distribution.

Example: Suppose that a population is divided into two age classes with Leslie matrix
\[
L = \begin{pmatrix} 1 & 3 \\ 0.7 & 0 \end{pmatrix}.
\]
(a) Find both eigenvalues.
(b) Give a biological interpretation of the larger eigenvalue.
(c) Find the stable age distribution.

Example: Suppose that a population is divided into two age classes with Leslie matrix
\[
L = \begin{pmatrix} 0 & 5 \\ 0.9 & 0 \end{pmatrix}.
\]
(a) Find both eigenvalues.
(b) Give a biological interpretation of the larger eigenvalue.
(c) Find the stable age distribution.
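After working the Leslie-matrix examples by hand, the answers can be checked numerically. The following is a minimal numpy sketch using the first Leslie matrix above; rescaling the dominant eigenvector so that its entries sum to 1 is one common convention for reporting the stable age distribution, and is an assumption of this sketch rather than something fixed by the worksheet.

```python
import numpy as np

# Leslie matrix from the first Leslie-matrix example above.
L = np.array([[1.5, 2.0],
              [0.08, 0.0]])

# Eigenvalues and eigenvectors (columns of V).
eigvals, V = np.linalg.eig(L)

# Index of the larger (dominant) eigenvalue, which determines the growth of the population.
k = np.argmax(eigvals.real)
dominant = eigvals[k].real

# The corresponding eigenvector, rescaled so its entries sum to 1, gives the
# proportion of the population in each age class at the stable age distribution.
v = V[:, k].real
stable = v / v.sum()

print(dominant)  # dominant eigenvalue (growth parameter)
print(stable)    # stable age distribution as proportions
```

The same check applies to the other two Leslie matrices by replacing L.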