Matrices Formulas

1. Definition of a matrix
A matrix (plural: matrices) is a rectangular array of items of the same kind arranged in rows and columns. It is usually denoted by a capital letter. Each entry in a matrix is called an element of the matrix.

2. Order of a matrix
The order of a matrix is written as "m × n", where m is the number of rows and n is the number of columns of the matrix.
Example : The matrix
[  5   2 ]
[ -1   3 ]
[  6   1 ]
has order 3 × 2.
Each element a_ij in a matrix A is indexed using the iterator variables i and j, where i ∈ {1, 2, ..., m} and j ∈ {1, 2, ..., n}.

3. Types of matrices
(i) Rectangular Matrix – Any matrix of order m × n, where m ≠ n, i.e. a matrix having different numbers of rows and columns, is called a rectangular matrix.
Example :
[  1  0  6 ]
[ -4  2  3 ]
(ii) Row Matrix – A subtype of the rectangular matrix, defined as a matrix having only one row.
Example : [ 3  2  -1 ]
(iii) Column Matrix – A subtype of the rectangular matrix, defined as a matrix having only one column.
Example :
[ 3 ]
[ 7 ]
[ 1 ]
(iv) Square Matrix – A matrix with an equal number of rows and columns is called a square matrix. Its order is represented by a single number.
Example :
[ 1  3 ]
[ 6  7 ]
is a square matrix of order 2.
Note : The principal/main/leading diagonal of a square matrix of order n consists of the elements running from the first (a_11) to the last (a_nn), i.e. the ordered set of entries a_ij with i = j, extending from the upper left-hand corner to the lower right-hand corner of the matrix.
(v) Diagonal Matrix – A diagonal matrix is a square matrix in which only the principal diagonal elements may be non-zero, while all other elements are equal to 0. A diagonal matrix of order n × n having d_1, d_2, ..., d_n as principal diagonal elements may be denoted by diag[d_1, d_2, ..., d_n].
Example :
[ 3  0   0 ]
[ 0  4   0 ]  = diag[3, 4, -7]
[ 0  0  -7 ]
(vi) Scalar Matrix – A diagonal matrix in which all the principal diagonal elements are equal to the same scalar value is called a scalar matrix. A scalar matrix can be defined as follows:
A = [a_ij]_(n×n), where a_ij = k if i = j, and a_ij = 0 if i ≠ j.
Example :
[ 14   0   0 ]
[  0  14   0 ]
[  0   0  14 ]
(vii) Unit/Identity Matrix – A scalar matrix whose principal diagonal elements are all equal to 1 is called a unit/identity matrix. It is denoted by I_n. An identity matrix can be defined as follows:
I_n = [a_ij]_(n×n), where a_ij = 1 if i = j, and a_ij = 0 if i ≠ j.
Example :
I_3 = [ 1  0  0 ]
      [ 0  1  0 ]
      [ 0  0  1 ]
(viii) Zero/Null Matrix – A matrix of order m × n in which all the elements are equal to 0 is called a zero matrix. It is denoted by O_(m×n).
Example :
O_(2×3) = [ 0  0  0 ]
          [ 0  0  0 ]

4. Equality of matrices
Two matrices A and B are equal if and only if both matrices are of the same order and each element of one is equal to the corresponding element of the other, i.e. A = [a_ij]_(m×n) and B = [b_ij]_(m×n) are said to be equal if a_ij = b_ij ∀ i, j.
Example :
[ 2  1 ]   [ 4/2      2 - 1 ]
[ 3  0 ] = [ 15 - 12  0     ]

5. Addition of matrices
The sum of two matrices of the same order, A_(m×n) and B_(m×n), is the matrix (A + B)_(m×n) in which the entry in the i-th row and j-th column is a_ij + b_ij, for i = 1, 2, 3, ..., m and j = 1, 2, 3, ..., n.
If A = [a_ij]_(m×n) and B = [b_ij]_(m×n), then A + B = [a_ij + b_ij]_(m×n).
Example :
[ 3  1  2 ]   [  1  2  4 ]   [ 4  3  6 ]
[ 2  0  1 ] + [ -1  3  0 ] = [ 1  3  1 ]
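Note : The formula sheet itself contains no code. The short sketch below is not part of the original notes; it only illustrates order, equality and element-wise addition using Python with the NumPy library (an assumption made here for the examples), reusing the matrices of the addition example in section 5.

# Illustrative sketch (not from the original notes): order, equality and
# addition of matrices with NumPy.
import numpy as np

A = np.array([[3, 1, 2],
              [2, 0, 1]])
B = np.array([[1, 2, 4],
              [-1, 3, 0]])

print(A.shape)                      # (2, 3) -> order 2 x 3
print(np.array_equal(A, A.copy()))  # True: same order, equal elements
print(A + B)                        # [[4 3 6]
                                    #  [1 3 1]]  (element-wise sum)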
6. Zero/Null Matrix
The zero/null matrix of order m × n, denoted by O_(m×n), is the matrix containing m × n elements, all of which are equal to 0.
Example :
O_(3×2) = [ 0  0 ]
          [ 0  0 ]
          [ 0  0 ]
Note : The zero/null matrix O_(m×n) also serves as the additive identity for a matrix of order m × n, i.e. A + O = A and O + A = A, where A is a matrix of order m × n.

7. Negative of a matrix
The negative of a matrix A_(m×n), denoted by -A_(m×n), is the matrix formed by replacing each entry of A_(m×n) with its additive inverse.
Example : If
A_(3×2) = [  3  -1 ]
          [  2  -2 ]
          [ -4   5 ]
then
-A_(3×2) = [ -3   1 ]
           [ -2   2 ]
           [  4  -5 ]
Note : The negative of a matrix also serves as the additive inverse of the matrix, i.e. A + (-A) = O and (-A) + A = O, where A is a matrix of order m × n and O is the zero/null matrix of order m × n.

8. Properties of sums of matrices
If A, B and C are members of the set M_(m×n) of all m × n matrices with real number entries, where m, n ∈ N, then:
I. A + B ∈ M_(m×n) (Closure Law for Addition)
II. A + B = B + A (Commutative Law for Addition)
III. (A + B) + C = A + (B + C) (Associative Law for Addition)
IV. If A, B and C are three matrices of the same order, then:
A + B = A + C ⟹ B = C (Left Cancellation Law)
B + A = C + A ⟹ B = C (Right Cancellation Law)

9. Subtraction of matrices
The subtraction or difference of two matrices of the same order, A_(m×n) and B_(m×n), is the matrix (A - B)_(m×n) in which the entry in the i-th row and j-th column is a_ij + (-b_ij), for i = 1, 2, 3, ..., m and j = 1, 2, 3, ..., n.
If A = [a_ij]_(m×n) and B = [b_ij]_(m×n), then A - B = A + (-B) = [a_ij + (-b_ij)]_(m×n).
Example : If
A = [  2  0 ]        B = [ 1  -2 ]
    [ -3  6 ]  and       [ 0   4 ]
then
A - B = A + (-B)
      = [  2 + (-1)    0 + (-(-2)) ]
        [ -3 + (-0)    6 + (-4)    ]
      = [  1  2 ]
        [ -3  2 ]

10. Multiplication of a matrix by a scalar
The product of a real number (scalar) k and a matrix A, denoted by kA, is the matrix whose entries are each multiplied by k.
If A = [a_ij]_(m×n), then kA = [k·a_ij]_(m×n).
Example : If
A = [ 1  -3 ]
    [ 0   2 ]
then
3A = 3 [ 1  -3 ]   [ 3×1  3×(-3) ]   [ 3  -9 ]
       [ 0   2 ] = [ 3×0  3×2    ] = [ 0   6 ]

11. Properties of products of matrices and real numbers
If A and B are members of the set M_(m×n) of all m × n matrices with real number entries, where m, n ∈ N, and k, l ∈ R, then:
I. kA ∈ M_(m×n)
II. k(A + B) = kA + kB
III. (k + l)A = kA + lA
IV. (kl)A = k(lA) = l(kA)
V. 1A = A
VI. (-1)A = -A
VII. 0A = O
VIII. kO = O

12. Multiplication of matrices
Matrix multiplication is a "multiply row by column" process. This means that we multiply the entries of a row by the corresponding entries of a column and then add the products. Two matrices A and B are compatible for multiplication only if the number of columns of A is equal to the number of rows of B, i.e. A_(m×n) × B_(n×p) = (AB)_(m×p), where AB denotes the product of the matrices A and B.
Example : If
A = [ 13  18  20 ]        B = [ 12   6 ]
    [  2   3   4 ]  and       [ 24  12 ]
                              [ 12   9 ]
then they are compatible for matrix multiplication, and
AB = [ 13×12 + 18×24 + 20×12    13×6 + 18×12 + 20×9 ]
     [  2×12 +  3×24 +  4×12     2×6 +  3×12 +  4×9 ]
   = [ 828  474 ]
     [ 144   84 ]
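Note : The sketch below is not part of the original notes; it verifies the worked product of section 12 using Python with NumPy. A has order 2 × 3 and B has order 3 × 2, so AB has order 2 × 2 while BA has order 3 × 3, which also illustrates why AB ≠ BA in general.

# Illustrative sketch (not from the original notes): the row-by-column
# product of a 2 x 3 matrix and a 3 x 2 matrix.
import numpy as np

A = np.array([[13, 18, 20],
              [ 2,  3,  4]])
B = np.array([[12,  6],
              [24, 12],
              [12,  9]])

print(A @ B)           # [[828 474]
                       #  [144  84]]  -> matches the worked example
print((B @ A).shape)   # (3, 3): BA has a different order, so AB != BA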
13. Properties of matrix multiplication
I. Matrix multiplication is not commutative, i.e. in general AB ≠ BA. In fact, sometimes when AB is defined, BA is not defined at all, owing to incompatibility for matrix multiplication.
II. The product of two matrices can be the zero matrix without either of them being a zero matrix.
III. The cancellation law for the multiplication of real numbers is not valid for the multiplication of matrices, i.e. AB = AC does not imply B = C.
IV. Matrix multiplication is associative if conformability is assured, i.e. A(BC) = (AB)C.
V. Matrix multiplication is distributive with respect to matrix addition, i.e. A(B + C) = AB + AC.
Note : The unit/identity matrix I_n is the multiplicative identity for a square matrix of order n, i.e. AI = A and IA = A.

14. Positive integral powers of matrices
If A is a square matrix, then all its positive integral powers such as A^2, A^3, etc. are defined and can be multiplied with each other to obtain higher powers, e.g. A^2·A = (AA)A = A(AA) = A·A^2 = A^3 = AAA, etc.
Further, if I is a unit matrix of any order, then I = I^2 = I^3 = ... = I^n.
Note : We already know that the product of non-zero matrices can be the zero matrix. It is also possible that a positive integral power of a non-zero matrix is the zero matrix. Such a matrix is called a nilpotent matrix.
Example : If
N = [ 0  2 ]
    [ 0  0 ]
then
N^2 = N × N = [ 0  2 ] [ 0  2 ]   [ 0  0 ]
              [ 0  0 ] [ 0  0 ] = [ 0  0 ] = O,
and N^3 = N^2·N = O, and so on.

15. Transpose of a matrix
The matrix obtained from any matrix A by interchanging its rows and columns is called the transpose of the given matrix and is denoted by A' or A^T. This means that if the order of matrix A is m × n, the order of its transpose is n × m.
Example : If
A = [ 3  2  1 ]          A' = [ 3  6 ]
    [ 6  5  4 ] , then        [ 2  5 ]
                              [ 1  4 ]

16. Properties of transpose of a matrix
I. If A is any matrix, then (A')' = A and (-A)' = -A'.
II. If A and B are two matrices of the same order, then (A + B)' = A' + B' and (A - B)' = A' - B'.
III. If matrix A has order m × n and matrix B has order n × p, then (AB)' = B'A'.
IV. If A is a matrix and k is a scalar, then (kA)' = kA'.

17. Symmetric and skew-symmetric matrices
Symmetric Matrix
1. A symmetric matrix is a square matrix which is symmetric about its principal diagonal. Thus the elements on one side of the principal diagonal are the reflected images of the elements on the other side of the principal diagonal.
2. The principal diagonal elements themselves can be any value.
3. Definition of a symmetric matrix : A square matrix A = [a_ij] of order n is said to be symmetric if a_ij = a_ji, ∀ i, j.
4. A necessary and sufficient condition for a matrix A to be symmetric is : A = A'.

Skew-symmetric Matrix
1. A skew-symmetric matrix is a square matrix which is symmetric about its principal diagonal, but with reversed signs. Thus the elements on one side of the principal diagonal are the reflected images of the elements on the other side of the principal diagonal with changed signs.
2. The principal diagonal elements are equal to 0.
3. Definition of a skew-symmetric matrix : A square matrix A = [a_ij] of order n is said to be skew-symmetric if a_ij = -a_ji, ∀ i, j. Using this definition, we get a_ii = -a_ii ⟹ a_ii = 0.
4. A necessary and sufficient condition for a matrix A to be skew-symmetric is : A = -A'.

Note :
• Diagonal matrices are always symmetric.
• A matrix which is both symmetric and skew-symmetric is a square null matrix.

18. Theorems related to symmetric and skew-symmetric matrices
Theorem 1. If A is any square matrix, then A + A' is symmetric and A - A' is skew-symmetric.
Proof : Using the properties of the transpose of a matrix and the necessary and sufficient condition for a symmetric matrix, we get:
(A + A')' = A' + (A')' = A' + A = A + A',
so A + A' is symmetric.
Likewise, using the properties of the transpose of a matrix and the necessary and sufficient condition for a skew-symmetric matrix, we get:
(A - A')' = A' - (A')' = A' - A = -(A - A'),
so A - A' is skew-symmetric.
Theorem 2. Every square matrix can be uniquely expressed as the sum of a symmetric matrix and a skew-symmetric matrix, i.e.
A = (1/2)(A + A') + (1/2)(A - A'),
where (1/2)(A + A') is a symmetric matrix and (1/2)(A - A') is a skew-symmetric matrix.
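Note : The sketch below is not part of the original notes; it checks Theorem 2 numerically with Python and NumPy, splitting a square matrix (chosen here only for illustration) into its symmetric and skew-symmetric parts.

# Illustrative sketch (not from the original notes): A = P + Q, where
# P = (A + A')/2 is symmetric and Q = (A - A')/2 is skew-symmetric.
import numpy as np

A = np.array([[1., 4., 7.],
              [2., 5., 8.],
              [3., 6., 9.]])

P = (A + A.T) / 2   # symmetric part:      P' = P
Q = (A - A.T) / 2   # skew-symmetric part: Q' = -Q

print(np.allclose(P, P.T))     # True
print(np.allclose(Q, -Q.T))    # True
print(np.allclose(P + Q, A))   # True: the decomposition recovers A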
19. Determinant of a matrix
Determinant of a 2 × 2 matrix
Let
A = [ a1  b1 ]
    [ a2  b2 ]
be a square matrix of order 2. Its determinant is given by:
|A| or det(A) = | a1  b1 |
                | a2  b2 |
              = a1·b2 - a2·b1

Determinant of a 3 × 3 matrix
Let
A = [ a1  b1  c1 ]
    [ a2  b2  c2 ]
    [ a3  b3  c3 ]
be a square matrix of order 3. Its determinant (expanding along R1) is given by:
|A| or det(A) = a1·| b2  c2 | - b1·| a2  c2 | + c1·| a2  b2 |
                   | b3  c3 |      | a3  c3 |      | a3  b3 |
              = a1(b2·c3 - b3·c2) - b1(a2·c3 - a3·c2) + c1(a2·b3 - a3·b2)
Note :
• If A and B are square matrices of the same order, then det(AB) = det(A)·det(B).

20. Singular and non-singular matrices
A square matrix A is said to be singular if det(A) = 0; otherwise it is said to be non-singular.

21. Adjoint of a matrix
The adjoint of a square matrix A is the transpose of the matrix obtained by replacing each element a_ij by its cofactor A_ij. It is denoted by adj A.
If
A = [ a11  a12  a13 ]
    [ a21  a22  a23 ]
    [ a31  a32  a33 ]
then
adj A = [ A11  A21  A31 ]
        [ A12  A22  A32 ]
        [ A13  A23  A33 ]
where A_ij denotes the cofactor of the element a_ij.

22. Properties of adjoint and determinant of a matrix
I. If A is any n-rowed square matrix, then (adj A)A = A(adj A) = |A|·I_n, where I_n is the n-rowed identity matrix.
II. If A is a square matrix of order n, then |kA| = k^n |A|.
III. If A is a square matrix of order n and k is a scalar, then adj(kA) = k^(n-1) (adj A).
IV. If A is an n × n non-singular matrix, then |adj A| = |A|^(n-1).
V. If A and B are two non-singular matrices of the same order, then adj(AB) = (adj B)(adj A).

23. Inverse of a matrix
A non-zero square matrix A of order n is said to be invertible if there exists a square matrix B of order n such that AB = BA = I_n. The square matrix B is then called the inverse of A and is denoted by A^(-1). Every invertible matrix has a unique inverse.
The necessary and sufficient condition for a square matrix A to possess an inverse is that it must be non-singular, i.e. det(A) ≠ 0.
For an invertible square matrix A of order n, the inverse is given as follows:
A^(-1) = (1/|A|) (adj A)

24. Properties of inverse of a matrix
I. If A is any n-rowed non-singular square matrix, then AA^(-1) = A^(-1)A = I_n, where I_n is the n-rowed identity matrix.
II. If A is any n-rowed non-singular square matrix, then (A')^(-1) = (A^(-1))'.
III. If A and B are two n-rowed non-singular square matrices, then AB is also non-singular and (AB)^(-1) = B^(-1)A^(-1) (Reversal law for the inverse of a product).

25. Matrix representation of a system of linear equations
Consider a system of two linear equations in two unknowns:
a1·x + b1·y = c1
a2·x + b2·y = c2
The system of linear equations can be written in matrix form as follows:
AX = B
[ a1  b1 ] [ x ]   [ c1 ]
[ a2  b2 ] [ y ] = [ c2 ]
where A is the coefficient matrix, X is the variable (unknown) matrix and B is the constant matrix.
Likewise, consider a system of three linear equations in three unknowns:
a1·x + b1·y + c1·z = d1
a2·x + b2·y + c2·z = d2
a3·x + b3·y + c3·z = d3
The system of linear equations can be written in matrix form as follows:
AX = B
[ a1  b1  c1 ] [ x ]   [ d1 ]
[ a2  b2  c2 ] [ y ] = [ d2 ]
[ a3  b3  c3 ] [ z ]   [ d3 ]
where A is the coefficient matrix, X is the variable (unknown) matrix and B is the constant matrix.
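Note : The sketch below is not part of the original notes; it uses Python with NumPy to check the determinant, adjoint and inverse formulas of sections 19-23 on a matrix chosen only for illustration. NumPy has no direct adjoint routine, so adj A is obtained here from adj A = det(A)·A^(-1), which is valid only when A is non-singular.

# Illustrative sketch (not from the original notes): determinant, adjoint
# and inverse of a non-singular 3 x 3 matrix.
import numpy as np

A = np.array([[1., 2., 3.],
              [0., 1., 4.],
              [5., 6., 0.]])

det_A = np.linalg.det(A)
print(round(det_A))        # 1 -> det(A) != 0, so A is non-singular

A_inv = np.linalg.inv(A)
adj_A = det_A * A_inv      # from A^(-1) = (1/|A|) adj A

# Property I of section 22: (adj A) A = A (adj A) = |A| I_n
print(np.allclose(adj_A @ A, det_A * np.eye(3)))   # True
print(np.allclose(A @ A_inv, np.eye(3)))           # True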
26. Consistency and inconsistency of a system of linear equations
The following checks (summarising the flowchart) decide whether a system of linear equations AX = B is consistent or inconsistent:
• If |A| ≠ 0, the system is consistent and has a unique solution.
• If |A| = 0 and (adj A)B ≠ O, the system is inconsistent (it has no solution).
• If |A| = 0 and (adj A)B = O, the system has either infinitely many solutions or no solution.

27. Solving a system of linear equations using the matrix method (Martin's rule)
Let AX = B be a system of linear equations. The solution of the system of linear equations is given as follows:
X = A^(-1) B
⟹ X = (1/|A|) (adj A) B
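Note : The sketch below is not part of the original notes; it applies the matrix method of section 27 with Python and NumPy to a small system made up only for illustration (x + y = 3, 2x - y = 0), and cross-checks the answer against NumPy's own solver.

# Illustrative sketch (not from the original notes): solving AX = B by
# X = A^(-1) B when det(A) != 0.
import numpy as np

A = np.array([[1.,  1.],
              [2., -1.]])   # coefficient matrix
B = np.array([[3.],
              [0.]])        # constant matrix

if np.linalg.det(A) != 0:                    # non-singular -> unique solution
    X = np.linalg.inv(A) @ B                 # Martin's rule: X = A^(-1) B
    print(X.ravel())                         # [1. 2.] -> x = 1, y = 2
    print(np.allclose(X, np.linalg.solve(A, B)))   # True: same answer
else:
    print("det(A) = 0: examine (adj A) B to decide consistency")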