EIGENVALUES AND EIGENVECTORS

Definition

Let A = [a_ij] be a square matrix of order n. If there exist a non-zero column vector X and a scalar λ such that

    AX = λX                                                    … (1)

then λ is called an eigenvalue of the matrix A and X is called the eigenvector corresponding to the eigenvalue λ.

To find the eigenvalues and the corresponding eigenvectors of the square matrix A, we proceed as follows. Let λ be an eigenvalue of A and X the corresponding eigenvector. Then, by definition, AX = λX = λIX, where I is the unit matrix of order n,

i.e. (A − λI)X = 0

i.e.

    \left\{ \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix} - \lambda \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix} \right\} \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}

i.e.

    (a_11 − λ)x_1 + a_12 x_2 + ⋯ + a_1n x_n = 0
    a_21 x_1 + (a_22 − λ)x_2 + ⋯ + a_2n x_n = 0
    ………………………………………………………
    a_n1 x_1 + a_n2 x_2 + ⋯ + (a_nn − λ)x_n = 0                … (2)

Equations (2) are a system of homogeneous linear equations in the unknowns x_1, x_2, …, x_n. Since X = [x_1, x_2, …, x_n]^T is to be a non-zero vector, x_1, x_2, …, x_n should not all be zero. In other words, the solution of the system (2) should be a non-trivial solution. The condition for the system (2) to have a non-trivial solution is

    \begin{vmatrix} a_{11} - \lambda & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} - \lambda & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} - \lambda \end{vmatrix} = 0

i.e. |A − λI| = 0                                              … (3)

The determinant |A − λI| is a polynomial of degree n in λ and is called the characteristic polynomial of A. The equation |A − λI| = 0, i.e. equation (3), is called the characteristic equation of A. When we solve the characteristic equation, we get n values for λ. These n roots of the characteristic equation are called the characteristic roots or latent roots or eigenvalues of A. Corresponding to each value of λ, the equations (2) possess a non-zero (non-trivial) solution X. X is called the invariant vector or latent vector or eigenvector of A corresponding to the eigenvalue λ.

Notes
1. Corresponding to an eigenvalue, the non-trivial solution of the system (2) will be a one-parameter family of solutions. Hence the eigenvector corresponding to an eigenvalue is not unique.
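The procedure above, and the one-parameter family of eigenvectors mentioned in Note 1, can be sketched numerically. The following Python fragment uses a hypothetical 2×2 matrix (not from these notes), for which the characteristic equation |A − λI| = 0 reduces to the quadratic λ² − (trace)λ + |A| = 0:

```python
import math

# Hypothetical example matrix (chosen for illustration, not from the notes).
A = [[4.0, 1.0],
     [2.0, 3.0]]

# For a 2x2 matrix the characteristic equation |A - lambda*I| = 0 is
# lambda^2 - (a11 + a22)*lambda + (a11*a22 - a12*a21) = 0.
trace = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]

# Solve the quadratic for the two eigenvalues (real and distinct here).
disc = math.sqrt(trace**2 - 4 * det)
lam1 = (trace + disc) / 2
lam2 = (trace - disc) / 2

# For each eigenvalue, the first equation of the system (A - lam*I)X = 0,
# (a11 - lam)*x1 + a12*x2 = 0, gives the eigenvector up to a scalar:
# any multiple t*[a12, lam - a11] works (assumes a12 != 0), which is
# exactly the one-parameter family of Note 1.
def eigenvector(lam):
    return [A[0][1], lam - A[0][0]]

X1 = eigenvector(lam1)
X2 = eigenvector(lam2)

# Verify the defining relation AX = lam*X for the first pair.
AX1 = [A[0][0] * X1[0] + A[0][1] * X1[1],
       A[1][0] * X1[0] + A[1][1] * X1[1]]
print(lam1, lam2, X1, X2, AX1)  # 5.0 2.0 [1.0, 1.0] [1.0, -2.0] [5.0, 5.0]
```

Here the eigenvalues come out to 5 and 2, and AX₁ equals 5·X₁, confirming AX = λX for the computed pair.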
2. If all the eigenvalues λ_1, λ_2, …, λ_n of a matrix A are distinct, then the corresponding eigenvectors are linearly independent.
3. If two or more eigenvalues are equal, then the eigenvectors may be linearly independent or linearly dependent.

Properties of Eigenvalues

1. A square matrix A and its transpose A^T have the same eigenvalues.
2. The sum of the eigenvalues of a matrix A is equal to the sum of the principal diagonal elements of A. (The sum of the principal diagonal elements is called the trace of the matrix.)

The characteristic equation of an nth-order matrix A may be written as

    λ^n − D_1 λ^{n−1} + D_2 λ^{n−2} − ⋯ + (−1)^n D_n = 0,

where D_r is the sum of all the rth-order minors of A whose principal diagonals lie along the principal diagonal of A. Let λ_1, λ_2, …, λ_n be the eigenvalues of A. They are the roots of this equation, so by the relation between roots and coefficients,

    ∴ λ_1 + λ_2 + ⋯ + λ_n = −(−D_1)/1 = D_1 = a_11 + a_22 + ⋯ + a_nn = trace of the matrix A.

3. The product of the eigenvalues of a matrix A is equal to |A|.
Note: If |A| = 0, i.e. A is a singular matrix, at least one of the eigenvalues of A is zero, and conversely.
4. If λ_1, λ_2, …, λ_n are the eigenvalues of the matrix A, then
(i) kλ_1, kλ_2, …, kλ_n are the eigenvalues of the matrix kA, where k is a non-zero scalar.
(ii) λ_1^p, λ_2^p, …, λ_n^p are the eigenvalues of the matrix A^p, where p is a positive integer.
(iii) 1/λ_1, 1/λ_2, …, 1/λ_n are the eigenvalues of the inverse matrix A^{−1}, provided every λ_r ≠ 0, i.e. A is non-singular.
5. The eigenvalues of a real symmetric matrix (i.e., a symmetric matrix with real elements) are real.
6. The eigenvectors corresponding to distinct eigenvalues of a real symmetric matrix are orthogonal.
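Properties 2, 3, and 4 can be checked numerically. The sketch below reuses a hypothetical 2×2 matrix (chosen for illustration, not taken from the notes); for a 2×2 matrix the eigenvalues follow directly from λ² − (trace)λ + |A| = 0, and for kA and A⁻¹ the trace and determinant are k·trace, k²·|A| and trace/|A|, 1/|A| respectively:

```python
import math

# Hypothetical 2x2 example used to check the listed properties.
A = [[4.0, 1.0],
     [2.0, 3.0]]
trace = A[0][0] + A[1][1]                    # sum of diagonal elements
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # |A|

# Eigenvalues from lambda^2 - trace*lambda + det = 0.
d = math.sqrt(trace**2 - 4 * det)
lam1, lam2 = (trace + d) / 2, (trace - d) / 2

# Property 2: sum of eigenvalues = trace of A.
assert lam1 + lam2 == trace

# Property 3: product of eigenvalues = |A|.
assert lam1 * lam2 == det

# Property 4(i): eigenvalues of kA are k*lambda (illustrated with k = 3).
k = 3.0
kA_trace, kA_det = k * trace, k * k * det    # trace(kA), det(kA) for 2x2
dk = math.sqrt(kA_trace**2 - 4 * kA_det)
assert (kA_trace + dk) / 2 == k * lam1

# Property 4(iii): eigenvalues of A^{-1} are 1/lambda (det != 0 here).
Ainv_trace = trace / det                     # trace(A^{-1}) = trace/|A| for 2x2
Ainv_det = 1.0 / det
di = math.sqrt(Ainv_trace**2 - 4 * Ainv_det)
assert abs((Ainv_trace + di) / 2 - 1 / lam2) < 1e-12
print("all properties verified")
```

Note that the largest eigenvalue of A⁻¹ pairs with the *smallest* eigenvalue of A, since taking reciprocals reverses the ordering.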