Matrix Representation

Copyright – Michael D. Fayer, 2007

Same basics as introduced already; a convenient method of working with vectors.
Superposition: a complete set of vectors can be used to express any other vector, and a complete set of $N$ vectors can form other complete sets of $N$ vectors.
For a Hermitian operator, a set of vectors can be found satisfying $A|u\rangle = \lambda|u\rangle$: eigenvectors and eigenvalues.
Matrix method: find the superposition of basis states that are eigenstates of a particular operator, and get the eigenvalues.

Orthonormal basis set in an N-dimensional vector space

$|e_j\rangle$: basis vectors. Any vector can be written as
\[ |x\rangle = \sum_{j=1}^{N} x_j |e_j\rangle, \qquad x_j = \langle e_j | x \rangle . \]
To get this, project out $|e_j\rangle\langle e_j|$ from $|x\rangle$ and sum over all $|e_j\rangle$:
\[ x_j |e_j\rangle = |e_j\rangle\langle e_j | x \rangle . \]

Operator equation

\[ |y\rangle = A|x\rangle \]
Substituting the series in terms of basis vectors,
\[ \sum_{j=1}^{N} y_j |e_j\rangle = A \sum_{j=1}^{N} x_j |e_j\rangle = \sum_{j=1}^{N} x_j A |e_j\rangle . \]
Left multiplying by $\langle e_i|$ gives
\[ y_i = \sum_{j=1}^{N} \langle e_i | A | e_j \rangle\, x_j . \]
The $N^2$ scalar products $\langle e_i|A|e_j\rangle$ ($N$ values of $j$ for each of the $N$ values of $i$) are completely determined by $A$ and the basis set $\{|e_j\rangle\}$.

Writing
\[ a_{ij} = \langle e_i | A | e_j \rangle \]
(the matrix elements of $A$ in the basis $\{|e_j\rangle\}$) gives for the linear transformation
\[ y_i = \sum_{j=1}^{N} a_{ij}\, x_j, \qquad i = 1, 2, \ldots, N . \]
We know the $a_{ij}$ because we know $A$ and the $|e_j\rangle$.
In terms of the vector representatives,
\[ x = (x_1, x_2, \ldots, x_N), \qquad y = (y_1, y_2, \ldots, y_N) . \]
Example: the vector $Q = 7\hat{x} + 5\hat{y} + 4\hat{z}$ has the vector representative $[7, 5, 4]$; the basis must be known. (A set of numbers gives you the vector when the basis is known.)
The set of $N$ linear algebraic equations can be written as
\[ y = \underline{\underline{A}}\, x \]
(the double underline means matrix).

$A$ is the array of coefficients, a matrix:
\[ A = (a_{ij}) = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1N} \\ a_{21} & a_{22} & \cdots & a_{2N} \\ \vdots & \vdots & & \vdots \\ a_{N1} & a_{N2} & \cdots & a_{NN} \end{pmatrix} . \]
The $a_{ij}$ are the elements of the matrix $A$. The product of the matrix $A$ and the vector representative $x$ is a new vector representative $y$ with components
\[ y_i = \sum_{j=1}^{N} a_{ij}\, x_j . \]

Matrix Properties, Definitions, and Rules

Two matrices $A$ and $B$ are equal, $A = B$, if $a_{ij} = b_{ij}$.
The unit matrix (ones down the principal diagonal),
\[ 1 = (\delta_{ij}) = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{pmatrix}, \]
gives the identity transformation
\[ y_i = \sum_{j=1}^{N} \delta_{ij}\, x_j = x_i, \]
corresponding to $y = 1\,x = x$.
The zero matrix $0$ has every element equal to zero, and $0\,x = 0$.

Matrix multiplication

Consider the operator equations $|y\rangle = A|x\rangle$ and $|z\rangle = B|y\rangle$, so that $|z\rangle = BA|x\rangle$.
Using the same basis for both transformations,
\[ z_k = \sum_{i=1}^{N} b_{ki}\, y_i, \]
with $B$ the matrix $(b_{ki})$. Then $z = B\,y = B\,A\,x = C\,x$, and $C = BA$ has elements
\[ c_{kj} = \sum_{i=1}^{N} b_{ki}\, a_{ij} \]
(the law of matrix multiplication).
Example:
\[ \begin{pmatrix} 2 & 3 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 7 & 5 \\ 5 & 6 \end{pmatrix} = \begin{pmatrix} 29 & 28 \\ 41 & 39 \end{pmatrix} . \]

Multiplication is associative: $(AB)C = A(BC)$.
Multiplication is NOT commutative: in general $AB \neq BA$.
Matrix addition and multiplication by a complex number: $A + B = C$ with $c_{ij} = a_{ij} + b_{ij}$.
Inverse of a matrix: $A^{-1}$, the inverse of $A$, satisfies
\[ A A^{-1} = A^{-1} A = 1 \quad \text{(identity matrix)}, \qquad A^{-1} = \frac{C^{T}}{|A|}, \]
where $C^{T}$ is the transpose of the cofactor matrix (the matrix of signed minors) and $|A|$ is the determinant. If $|A| = 0$, $A$ is singular.

Reciprocal of a product: $(AB)^{-1} = B^{-1} A^{-1}$.
For a matrix defined as $A = (a_{ij})$:
Transpose: $\tilde{A} = (a_{ji})$ (interchange rows and columns).
Complex conjugate: $A^{*} = (a_{ij}^{*})$ (complex conjugate of each element).
Hermitian conjugate: $A^{\dagger} = (a_{ji}^{*})$ (complex conjugate transpose).

Rules

$\widetilde{AB} = \tilde{B}\tilde{A}$: the transpose of a product is the product of the transposes in reverse order.
$|\tilde{A}| = |A|$: the determinant of the transpose is the determinant.
$(AB)^{*} = A^{*}B^{*}$: the complex conjugate of a product is the product of the complex conjugates.
$|A^{*}| = |A|^{*}$: the determinant of the complex conjugate is the complex conjugate of the determinant.
$(AB)^{\dagger} = B^{\dagger}A^{\dagger}$: the Hermitian conjugate of a product is the product of the Hermitian conjugates in reverse order.
$|A^{\dagger}| = |A|^{*}$: the determinant of the Hermitian conjugate is the complex conjugate of the determinant.
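The multiplication example and the conjugation rules above are easy to check numerically. The following is a minimal sketch, assuming Python with numpy is available (none of this code is part of the original notes); the random complex matrices and the seed are arbitrary choices for illustration.

```python
import numpy as np

# The 2x2 example from the notes: C = BA with c_kj = sum_i b_ki a_ij.
B = np.array([[2, 3],
              [3, 4]])
A = np.array([[7, 5],
              [5, 6]])
print(B @ A)                 # [[29 28]
                             #  [41 39]]  -- matches the worked example

def dagger(M):
    """Hermitian conjugate: complex conjugate transpose."""
    return M.conj().T

# Check the product rules with an arbitrary pair of complex matrices.
rng = np.random.default_rng(0)
P = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
Q = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

print(np.allclose((P @ Q).T, Q.T @ P.T))                  # (AB)~ = B~ A~
print(np.allclose(dagger(P @ Q), dagger(Q) @ dagger(P)))  # (AB)+ = B+ A+
print(np.allclose((P @ Q).conj(), P.conj() @ Q.conj()))   # (AB)* = A* B*
print(np.allclose(np.linalg.det(P.T), np.linalg.det(P)))  # |A~|  = |A|
print(np.allclose(np.linalg.det(P.conj()),
                  np.linalg.det(P).conj()))               # |A*|  = |A|*
```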
Definitions

$\tilde{A} = A$: symmetric. $A^{\dagger} = A$: Hermitian. $A^{*} = A$: real. $A^{*} = -A$: imaginary. $A^{\dagger} = A^{-1}$: unitary. $a_{ij} = a_{ij}\delta_{ij}$: diagonal.
Powers of a matrix: $A^{0} = 1$, $A^{2} = AA$, and so on, and
\[ e^{A} = 1 + A + \frac{A^{2}}{2!} + \cdots . \]

Column vector representative

\[ x = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_N \end{pmatrix} \]
is the vector representative in a particular basis (a one-column matrix). Then $y = A\,x$ becomes
\[ \begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_N \end{pmatrix} = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1N} \\ a_{21} & a_{22} & \cdots & a_{2N} \\ \vdots & \vdots & & \vdots \\ a_{N1} & a_{N2} & \cdots & a_{NN} \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_N \end{pmatrix} . \]
Row vector: $\tilde{x} = (x_1, x_2, \ldots, x_N)$, the transpose of the column vector.
If $y = A\,x$, then $\tilde{y} = \tilde{x}\tilde{A}$ (transpose) and $y^{\dagger} = x^{\dagger}A^{\dagger}$ (Hermitian conjugate).

Change of Basis

Orthonormal basis $|e_i\rangle$:
\[ \langle e_i | e_j \rangle = \delta_{ij}, \qquad i, j = 1, 2, \ldots, N . \]
Superpositions of the $N$ $|e_i\rangle$ can form $N$ new, linearly independent vectors, a new basis $|e_i'\rangle$:
\[ |e_i'\rangle = \sum_{k=1}^{N} u_{ik} |e_k\rangle, \qquad i = 1, 2, \ldots, N, \]
where the $u_{ik}$ are complex numbers.

The new basis is orthonormal,
\[ \langle e_i' | e_j' \rangle = \delta_{ij}, \]
if the matrix $U = (u_{ik})$, the coefficients in the superposition, meets the condition
\[ U^{\dagger}U = U U^{\dagger} = 1, \qquad U^{\dagger} = U^{-1}, \]
that is, if $U$ is unitary.
Important result: the new basis $\{|e_i'\rangle\}$ will be orthonormal if $U$, the transformation matrix, is unitary (see book, and Errata and Addenda).

A unitary transformation substitutes the orthonormal basis $\{|e'\rangle\}$ for the orthonormal basis $\{|e\rangle\}$.
Vector:
\[ |x\rangle = \sum_i x_i |e_i\rangle = \sum_i x_i' |e_i'\rangle . \]
The vector is a line in space (possibly a high-dimensionality abstract space) written in terms of two basis sets: same vector, different basis.
The unitary transformation $U$ can be used to change a vector representative of $|x\rangle$ in one orthonormal basis set to its vector representative in another orthonormal basis set:
\[ x' = U\,x \quad \text{(change from unprimed to primed basis)}, \qquad x = U^{\dagger}x' \quad \text{(change from primed to unprimed basis)} . \]

Example

Consider the basis $\hat{x}, \hat{y}, \hat{z}$. [Figure: $x$, $y$, $z$ axes with the vector $|s\rangle$.]
The vector $s$ is a line in real space. In terms of the basis,
\[ s = 7\hat{x} + 7\hat{y} + 1\hat{z}, \]
and the vector representative in the basis $\hat{x}, \hat{y}, \hat{z}$ is
\[ s = \begin{pmatrix} 7 \\ 7 \\ 1 \end{pmatrix} . \]

Change the basis by rotating the axis system 45° around $\hat{z}$. The new representative of $s$, namely $s'$, is found from
\[ s' = U\,s, \]
where $U$ is the rotation matrix
\[ U = \begin{pmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix} . \]
For a 45° rotation around $z$,
\[ U = \begin{pmatrix} \sqrt{2}/2 & \sqrt{2}/2 & 0 \\ -\sqrt{2}/2 & \sqrt{2}/2 & 0 \\ 0 & 0 & 1 \end{pmatrix} . \]

Then
\[ s' = \begin{pmatrix} \sqrt{2}/2 & \sqrt{2}/2 & 0 \\ -\sqrt{2}/2 & \sqrt{2}/2 & 0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} 7 \\ 7 \\ 1 \end{pmatrix} = \begin{pmatrix} 7\sqrt{2} \\ 0 \\ 1 \end{pmatrix}, \]
the vector representative of $s$ in the basis $\{|e'\rangle\}$. It is the same vector, but in the new basis, and its properties are unchanged.
Example: the length of the vector,
\[ \langle s | s \rangle^{1/2} = (s^{\dagger}s)^{1/2} = (49 + 49 + 1)^{1/2} = (99)^{1/2}, \]
\[ \langle s' | s' \rangle^{1/2} = (s'^{\dagger}s')^{1/2} = (2 \cdot 49 + 0 + 1)^{1/2} = (99)^{1/2} . \]

One can go back and forth between the representatives of a vector $x$ by
\[ x' = U\,x \quad \text{(change from unprimed to primed basis)}, \qquad x = U^{\dagger}x' \quad \text{(change from primed to unprimed basis)}, \]
the components of $x$ in the different bases.

Consider the linear transformation (operator equation)
\[ y = A\,x . \]
In the basis $\{|e\rangle\}$ we can write $y = A\,x$, or $y_i = \sum_j a_{ij}x_j$.
Change to the new orthonormal basis $\{|e'\rangle\}$ using $U$:
\[ y' = U\,y = U A\,x = U A U^{\dagger}\,x', \]
or
\[ y' = A'\,x' \qquad \text{with} \qquad A' = U A U^{\dagger} . \]
Because $U$ is unitary,
\[ A' = U A U^{-1} . \]

Extremely Important

\[ A' = U A U^{-1} \]
This changes the matrix representing an operator in one orthonormal basis into the equivalent matrix in a different orthonormal basis. It is called a Similarity Transformation.

If $y = A\,x$ and $AB = C$ in the basis $\{|e\rangle\}$, then going into the basis $\{|e'\rangle\}$,
\[ y' = A'\,x', \qquad A'B' = C' . \]
Relations are unchanged by a change of basis.
Example:
\[ AB = C \quad \Rightarrow \quad U A B\,U^{-1} = U C\,U^{-1} . \]
One can insert $U^{-1}U$ between $A$ and $B$ because $U^{-1}U = U U^{-1} = 1$:
\[ (U A U^{-1})(U B U^{-1}) = U C U^{-1}, \]
and therefore
\[ A'B' = C' . \]
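As a numerical companion to the rotation example and the similarity transformation, here is a small sketch assuming Python with numpy; the sample operator matrix A is an arbitrary choice, not from the notes.

```python
import numpy as np

# 45-degree rotation of the axis system about z: the U given on the slide.
theta = np.pi / 4
U = np.array([[ np.cos(theta), np.sin(theta), 0.0],
              [-np.sin(theta), np.cos(theta), 0.0],
              [ 0.0,           0.0,           1.0]])

s = np.array([7.0, 7.0, 1.0])     # representative of |s> in the x, y, z basis
s_prime = U @ s                   # representative in the rotated basis
print(s_prime)                    # [9.899..., 0, 1], i.e. (7*sqrt(2), 0, 1)

# Basis-independent property: the squared length is unchanged (99 both times).
print(s @ s, s_prime @ s_prime)

# Similarity transformation A' = U A U^{-1} = U A U^+ for a sample operator matrix.
A = np.array([[1.0, 0.5, 0.0],
              [0.5, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
A_prime = U @ A @ U.conj().T
# Transforming back recovers the matrix in the original basis.
print(np.allclose(U.conj().T @ A_prime @ U, A))
```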
There is an isomorphism between operators in the abstract vector space and their matrix representatives. Because of this isomorphism, it is not necessary to distinguish abstract vectors and operators from their matrix representatives. The matrices (for operators) and the representatives (for vectors) can be used in place of the real things.

Hermitian Operators and Matrices

Hermitian operator:
\[ \langle x | A | y \rangle = \langle y | A | x \rangle^{*} . \]
Hermitian matrix:
\[ A^{\dagger} = A, \]
where $\dagger$ denotes the complex conjugate transpose, the Hermitian conjugate.

Theorem (proof: Powell and Crasemann, pp. 303–307, or a linear algebra book)

For a Hermitian operator $A$ in a linear vector space of $N$ dimensions, there exists an orthonormal basis
\[ |U_1\rangle, |U_2\rangle, \ldots, |U_N\rangle \]
relative to which $A$ is represented by a diagonal matrix
\[ A' = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_N \end{pmatrix} . \]
The vectors $|U_i\rangle$ and the corresponding real numbers $\lambda_i$ are the solutions of the eigenvalue equation
\[ A |U_i\rangle = \lambda_i |U_i\rangle \]
and there are no others.

Application of the Theorem

The operator $A$ is represented by the matrix $A$ in some basis $\{|e_i\rangle\}$. The basis is any convenient basis; in general, the matrix will not be diagonal.
There exists some new basis, the eigenvectors $\{|U_i\rangle\}$, in which $A'$ represents the operator and is diagonal, with the eigenvalues $\lambda_i$ on the diagonal.
To get from $\{|e_i\rangle\}$ to $\{|U_i\rangle\}$: a unitary transformation, $|U_i\rangle = U|e_i\rangle$, with
\[ A' = U A U^{-1} . \]
The similarity transformation takes the matrix in an arbitrary basis into the diagonal matrix with the eigenvalues on the diagonal.

Matrices and Q.M.

Previously the state of a system was represented by a vector in an abstract vector space.
Dynamical variables are represented by linear operators. Operators produce linear transformations:
\[ y = A\,x . \]
Real dynamical variables (observables) are represented by Hermitian operators.
Observables are eigenvalues of Hermitian operators:
\[ A |S\rangle = \alpha |S\rangle . \]
Solution of the eigenvalue problem gives the eigenvalues and eigenvectors.

Matrix Representation

Hermitian operators are replaced by their Hermitian matrix representations.
In the proper basis, $A'$ is the diagonalized Hermitian matrix and the diagonal matrix elements are the eigenvalues (observables).
A suitable transformation $U$ takes $A$ (arbitrary basis) into $A'$ (diagonal, the eigenvector basis):
\[ A' = U A U^{-1} . \]
Diagonalization of the matrix gives the eigenvalues and eigenvectors.
The matrix formulation is another way of dealing with operators and solving eigenvalue problems.

All rules about kets, operators, etc. still apply.
Example: two Hermitian matrices $A$ and $B$ can be simultaneously diagonalized by the same unitary transformation if and only if they commute.
All ideas about matrices are also true for infinite-dimensional matrices.

Example – Harmonic Oscillator

Already solved: use the occupation number representation kets and bras (already diagonal). In reduced units,
\[ H = \frac{1}{2}\left(P^{2} + x^{2}\right) = \frac{1}{2}\left(a a^{\dagger} + a^{\dagger} a\right), \]
\[ a^{\dagger}|n\rangle = \sqrt{n+1}\,|n+1\rangle, \qquad a|n\rangle = \sqrt{n}\,|n-1\rangle . \]
Matrix elements of $a$:
\[ \langle 0|a|0\rangle = 0, \quad \langle 0|a|1\rangle = \sqrt{1}, \quad \langle 0|a|2\rangle = 0, \quad \langle 1|a|0\rangle = 0, \quad \langle 1|a|1\rangle = 0, \quad \langle 1|a|2\rangle = \sqrt{2}, \quad \langle 1|a|3\rangle = 0, \ \ldots \]
\[ a = \begin{pmatrix} 0 & \sqrt{1} & 0 & 0 & \cdots \\ 0 & 0 & \sqrt{2} & 0 & \cdots \\ 0 & 0 & 0 & \sqrt{3} & \cdots \\ 0 & 0 & 0 & 0 & \ddots \\ \vdots & \vdots & \vdots & \vdots & \end{pmatrix}, \qquad a^{\dagger} = \begin{pmatrix} 0 & 0 & 0 & \cdots \\ \sqrt{1} & 0 & 0 & \cdots \\ 0 & \sqrt{2} & 0 & \cdots \\ 0 & 0 & \sqrt{3} & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix} . \]
Then
\[ a a^{\dagger} = \begin{pmatrix} 1 & 0 & 0 & \cdots \\ 0 & 2 & 0 & \cdots \\ 0 & 0 & 3 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}, \qquad a^{\dagger} a = \begin{pmatrix} 0 & 0 & 0 & \cdots \\ 0 & 1 & 0 & \cdots \\ 0 & 0 & 2 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix} . \]
Adding the matrices $a a^{\dagger}$ and $a^{\dagger} a$ and multiplying by $\tfrac{1}{2}$ gives
\[ H = \frac{1}{2}\left(a a^{\dagger} + a^{\dagger} a\right) = \begin{pmatrix} 1/2 & 0 & 0 & \cdots \\ 0 & 3/2 & 0 & \cdots \\ 0 & 0 & 5/2 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix} . \]
The matrix is diagonal with the eigenvalues on the diagonal. In normal units the matrix would be multiplied by $\hbar\omega$.
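The ladder-operator matrices can be generated for a finite number of oscillator states and combined into H. This is a minimal numpy sketch; the truncation size N = 6 is an arbitrary choice, and the notes work with the full infinite matrices.

```python
import numpy as np

N = 6                                    # truncate the infinite matrices at N states

# Lowering operator a: <n-1|a|n> = sqrt(n) on the superdiagonal.
a = np.diag(np.sqrt(np.arange(1, N)), k=1)
a_dag = a.conj().T                       # raising operator a^+

# H = (1/2)(a a^+ + a^+ a) in reduced units (multiply by hbar*omega for normal units).
H = 0.5 * (a @ a_dag + a_dag @ a)
print(np.diag(H))
# [0.5 1.5 2.5 3.5 4.5 2.5]
# The last entry is a truncation artifact: a a^+ loses its (N-1, N-1) element
# when the matrices are cut off, so only the first N-1 eigenvalues are correct.
```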
This example shows the idea, but not how to diagonalize a matrix when you don't already know the eigenvectors.

Diagonalization

Eigenvalue equation:
\[ A\,u = \lambda\,u \]
($A$ the matrix representing the operator, $u$ the representative of the eigenvector, $\lambda$ the eigenvalue), or
\[ (A - \lambda 1)\,u = 0 . \]
In terms of the components,
\[ \sum_{j=1}^{N} (a_{ij} - \lambda\,\delta_{ij})\,u_j = 0, \qquad i = 1, 2, \ldots, N . \]
This represents a system of equations
\[ (a_{11} - \lambda)u_1 + a_{12}u_2 + a_{13}u_3 + \cdots = 0 \]
\[ a_{21}u_1 + (a_{22} - \lambda)u_2 + a_{23}u_3 + \cdots = 0 \]
\[ a_{31}u_1 + a_{32}u_2 + (a_{33} - \lambda)u_3 + \cdots = 0 \]
\[ \vdots \]
We know the $a_{ij}$. We don't know $\lambda$ (the eigenvalues) or the $u_i$ (the vector representatives, one for each eigenvalue).

Besides the trivial solution
\[ u_1 = u_2 = \cdots = u_N = 0, \]
a solution only exists if the determinant of the coefficients of the $u_i$ vanishes:
\[ \begin{vmatrix} a_{11} - \lambda & a_{12} & a_{13} & \cdots \\ a_{21} & a_{22} - \lambda & a_{23} & \cdots \\ a_{31} & a_{32} & a_{33} - \lambda & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{vmatrix} = 0 \]
(we know the $a_{ij}$; we don't know the $\lambda$'s).
Expanding the determinant gives an $N$th degree equation for the unknown $\lambda$'s (the eigenvalues). Then, substituting one eigenvalue at a time into the system of equations, the $u_i$ (eigenvector representatives) are found.
Because the determinant vanishes, the $N$ equations for the $u$'s are not independent and give only $N - 1$ conditions. Use normalization for the additional condition:
\[ u_1^{*}u_1 + u_2^{*}u_2 + \cdots + u_N^{*}u_N = 1 . \]

Example – Degenerate Two-State Problem

Basis: time-independent kets $|1\rangle$ and $|2\rangle$, orthonormal but not eigenkets, with coupling $\gamma$:
\[ H|1\rangle = E_0|1\rangle + \gamma|2\rangle, \qquad H|2\rangle = \gamma|1\rangle + E_0|2\rangle . \]
These equations define $H$. The matrix elements are
\[ \langle 1|H|1\rangle = E_0, \quad \langle 1|H|2\rangle = \gamma, \quad \langle 2|H|1\rangle = \gamma, \quad \langle 2|H|2\rangle = E_0, \]
and the Hamiltonian matrix is
\[ H = \begin{pmatrix} E_0 & \gamma \\ \gamma & E_0 \end{pmatrix} . \]

The corresponding system of equations is
\[ (E_0 - \lambda)u_1 + \gamma u_2 = 0 \]
\[ \gamma u_1 + (E_0 - \lambda)u_2 = 0 . \]
These only have a solution if the determinant of the coefficients vanishes:
\[ \begin{vmatrix} E_0 - \lambda & \gamma \\ \gamma & E_0 - \lambda \end{vmatrix} = 0 . \]
Expanding,
\[ (E_0 - \lambda)^2 - \gamma^2 = 0, \qquad \lambda^2 - 2E_0\lambda + E_0^2 - \gamma^2 = 0, \]
giving the energy eigenvalues
\[ \lambda_{\pm} = E_0 \pm \gamma . \]
[Figure: dimer splitting. The degenerate excited state at $E_0$ splits into two levels at $E_0 + \gamma$ and $E_0 - \gamma$, separated by $2\gamma$; the ground state is at $E = 0$.]

To obtain the eigenvectors, use the system of equations for each eigenvalue:
\[ |+\rangle = a_{+}|1\rangle + b_{+}|2\rangle, \qquad |-\rangle = a_{-}|1\rangle + b_{-}|2\rangle . \]
$(a_{+}, b_{+})$ and $(a_{-}, b_{-})$ are the eigenvectors associated with $\lambda_{+}$ and $\lambda_{-}$; they are the vector representatives of $|+\rangle$ and $|-\rangle$ in the $\{|1\rangle, |2\rangle\}$ basis set. We want to find these.

First, for the $\lambda_{+} = E_0 + \gamma$ eigenvalue, write the system of equations
\[ (H_{11} - \lambda)a_{+} + H_{12}b_{+} = 0 \]
\[ H_{21}a_{+} + (H_{22} - \lambda)b_{+} = 0 . \]
The matrix elements of $H$ are
\[ H_{11} = \langle 1|H|1\rangle = E_0, \quad H_{12} = \langle 1|H|2\rangle = \gamma, \quad H_{21} = \gamma, \quad H_{22} = E_0 . \]
Substituting,
\[ (E_0 - E_0 - \gamma)a_{+} + \gamma b_{+} = 0 \]
\[ \gamma a_{+} + (E_0 - E_0 - \gamma)b_{+} = 0 . \]
The result is
\[ -\gamma a_{+} + \gamma b_{+} = 0, \qquad \gamma a_{+} - \gamma b_{+} = 0 . \]

An equivalent way to get the equations is to use the matrix form
\[ \begin{pmatrix} E_0 - \lambda & \gamma \\ \gamma & E_0 - \lambda \end{pmatrix} \begin{pmatrix} a \\ b \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} . \]
Substitute $\lambda_{+} = E_0 + \gamma$:
\[ \begin{pmatrix} -\gamma & \gamma \\ \gamma & -\gamma \end{pmatrix} \begin{pmatrix} a_{+} \\ b_{+} \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} . \]
Multiplying the matrix by the column vector representative gives the equations
\[ -\gamma a_{+} + \gamma b_{+} = 0, \qquad \gamma a_{+} - \gamma b_{+} = 0 . \]

The two equations are identical:
\[ a_{+} = b_{+} . \]
One always gets $N - 1$ conditions for the $N$ unknown components; the normalization condition gives the necessary additional equation,
\[ a_{+}^{2} + b_{+}^{2} = 1 . \]
Then
\[ a_{+} = b_{+} = \frac{1}{\sqrt{2}} \qquad \text{and} \qquad |+\rangle = \frac{1}{\sqrt{2}}|1\rangle + \frac{1}{\sqrt{2}}|2\rangle, \]
the eigenvector in terms of the basis set.

For the eigenvalue $\lambda_{-} = E_0 - \gamma$, using the matrix form to write out the equations,
\[ \begin{pmatrix} E_0 - \lambda & \gamma \\ \gamma & E_0 - \lambda \end{pmatrix} \begin{pmatrix} a_{-} \\ b_{-} \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} . \]
Substituting $\lambda_{-} = E_0 - \gamma$,
\[ \begin{pmatrix} \gamma & \gamma \\ \gamma & \gamma \end{pmatrix} \begin{pmatrix} a_{-} \\ b_{-} \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \qquad \gamma a_{-} + \gamma b_{-} = 0 . \]
These equations give
\[ a_{-} = -b_{-} . \]
Using normalization,
\[ a_{-} = \frac{1}{\sqrt{2}}, \qquad b_{-} = -\frac{1}{\sqrt{2}} . \]
Therefore
\[ |-\rangle = \frac{1}{\sqrt{2}}|1\rangle - \frac{1}{\sqrt{2}}|2\rangle . \]

Can diagonalize by the transformation
\[ H' = U H U^{-1} \]
($H'$ diagonal, $H$ not diagonal). The transformation matrix consists of the representatives of the eigenvectors in the original basis:
\[ U = \begin{pmatrix} a_{+} & b_{+} \\ a_{-} & b_{-} \end{pmatrix} = \begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & -1/\sqrt{2} \end{pmatrix}, \qquad U^{-1} = U^{\dagger} = \begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & -1/\sqrt{2} \end{pmatrix} \]
(the complex conjugate transpose; here the matrix is real and symmetric, so $U^{-1} = U$).

Then
\[ H' = \begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & -1/\sqrt{2} \end{pmatrix} \begin{pmatrix} E_0 & \gamma \\ \gamma & E_0 \end{pmatrix} \begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & -1/\sqrt{2} \end{pmatrix} . \]
Factoring out $1/\sqrt{2}$ from each transformation matrix,
\[ H' = \frac{1}{2} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} E_0 & \gamma \\ \gamma & E_0 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} . \]
After matrix multiplication,
\[ H' = \frac{1}{2} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} E_0 + \gamma & E_0 - \gamma \\ \gamma + E_0 & \gamma - E_0 \end{pmatrix}, \]
and more matrix multiplication gives
\[ H' = \begin{pmatrix} E_0 + \gamma & 0 \\ 0 & E_0 - \gamma \end{pmatrix}, \]
diagonal with the eigenvalues on the diagonal.
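To connect the hand calculation to a numerical diagonalization, here is a short sketch assuming Python with numpy. The values E0 = 2.0 and gamma = 0.3 are placeholders, with gamma standing for the coupling introduced above; numpy.linalg.eigh (for Hermitian matrices) returns the eigenvalues in ascending order with the eigenvectors as columns.

```python
import numpy as np

E0, gamma = 2.0, 0.3                  # placeholder numbers for E0 and the coupling

H = np.array([[E0,    gamma],
              [gamma, E0   ]])

vals, vecs = np.linalg.eigh(H)        # Hermitian eigensolver
print(vals)                           # [E0 - gamma, E0 + gamma]
print(vecs)                           # columns: (1, -1)/sqrt(2) and (1, 1)/sqrt(2), up to sign

# Build U with the eigenvector representatives as rows; then U H U^+ is diagonal.
U = vecs.conj().T
print(np.round(U @ H @ U.conj().T, 12))   # diag(E0 - gamma, E0 + gamma)
```

The same call handles the general N-dimensional case described in the Diagonalization section, where expanding the determinant by hand becomes impractical.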