MATH 2010 Introduction to Matrices

A matrix is simply a rectangular array of numbers. A matrix has rows and columns. If M is a matrix having m rows and n columns, then M is an $m \times n$ matrix. The size or dimension of an $m \times n$ matrix is $m \times n$.

Example 1 $A = \begin{bmatrix} 1 & 0 \\ 2 & 3 \\ 2 & \pi \end{bmatrix}$ is a $3 \times 2$ matrix.

A $1 \times n$ matrix is a row matrix. An $m \times 1$ matrix is a column matrix.

Example 2 $\begin{bmatrix} 1 & 5 \end{bmatrix}$ is a row matrix and $\begin{bmatrix} 3 \\ 8 \\ 3 \\ 6 \\ e \end{bmatrix}$ is a column matrix.

The numbers within a matrix are the elements or entries of the matrix. We can specify any entry by its position within the matrix, that is, by its row number and its column number. It is common practice to label a matrix with an upper-case letter and its entries with the corresponding lower-case letter together with the position.

Example 3 Suppose that A is a matrix. Then we may refer to A as $[a_{ij}]$.

Example 4 Suppose that
$$A = \begin{bmatrix} 1 & 7 & 0 \\ 3 & \pi & 11 \end{bmatrix}.$$
Then we have $a_{11} = 1$, $a_{12} = 7$, $a_{13} = 0$, $a_{21} = 3$, $a_{22} = \pi$, and $a_{23} = 11$.

Definition 5 Suppose that A and B are matrices. We say that $A = B$ if and only if A and B have the same dimension and $a_{ij} = b_{ij}$ for all $i, j$. In other words, two matrices are equal if and only if they have the same size and corresponding entries are equal.

Definition 6 Suppose that A and B are matrices of the same size. We define $A + B$, the sum of A and B, by
$$A + B = [a_{ij} + b_{ij}].$$
In other words, we add corresponding entries.

Example 7 Let $M = \begin{bmatrix} 1 & 8 \\ 3 & 5 \end{bmatrix}$ and $N = \begin{bmatrix} 4 & 5 \\ 0 & 8 \end{bmatrix}$. Then
$$M + N = \begin{bmatrix} 5 & 13 \\ 3 & 13 \end{bmatrix}.$$

Definition 8 Suppose that $A = [a_{ij}]$ and $t \in \mathbb{R}$. Then we define the matrix $tA$ by
$$tA = [t\,a_{ij}].$$
In other words, when we multiply a matrix by a scalar we multiply every entry by that scalar.

Definition 9 Suppose that A and B are matrices of the same size. We define $A - B$, the difference of A and B, by
$$A - B = A + (-1)B.$$
In other words, we subtract corresponding entries.

Example 10 Let $M = \begin{bmatrix} 1 & 8 \\ 3 & 5 \end{bmatrix}$ and $N = \begin{bmatrix} 4 & 5 \\ 0 & 8 \end{bmatrix}$. Then
$$5M - 3N = \begin{bmatrix} 5 & 40 \\ 15 & 25 \end{bmatrix} - \begin{bmatrix} 12 & 15 \\ 0 & 24 \end{bmatrix} = \begin{bmatrix} -7 & 25 \\ 15 & 1 \end{bmatrix}.$$

So we now know how to add and subtract matrices and to multiply a matrix by a scalar. Matrix addition and scalar multiplication of matrices have all the same nice properties as addition and scalar multiplication of vectors.

The next operation is multiplying matrices. This operation is a bit more complex. For one thing, we can multiply matrices only if they are conformable. This means that the number of columns of the first matrix equals the number of rows of the second matrix.

Example 11 Suppose that A is a $2 \times 3$ matrix and that B is a $3 \times 4$ matrix. Then we can multiply to obtain the matrix product AB, but we cannot find the product BA.

Definition 12 Suppose that A is an $m \times n$ matrix and that B is an $n \times p$ matrix. Then the matrix product $AB = C = [c_{ij}]$ is defined by
$$c_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj} = a_{i1}b_{1j} + a_{i2}b_{2j} + a_{i3}b_{3j} + \dots + a_{in}b_{nj}.$$
In other words, the entry in the $ij$ position of the product is the dot product of the $i$th row of A and the $j$th column of B. This matrix product has dimension $m \times p$. That is,
$$A_{m \times n}\, B_{n \times p} = C_{m \times p}.$$

Example 13
$$\begin{bmatrix} 1 & 2 & 0 \\ 3 & 2 & 9 \end{bmatrix} \begin{bmatrix} 5 & 3 & 0 & 1 \\ 2 & 6 & 5 & 3 \\ 1 & 0 & 2 & 0 \end{bmatrix} = \begin{bmatrix} 9 & 15 & 10 & 7 \\ 28 & 21 & 28 & 9 \end{bmatrix}$$
It is easy to see that each entry of the product is the sum of three products. For instance, the entry in the 2,3 position of the matrix product is the dot product of the 2nd row of $\begin{bmatrix} 1 & 2 & 0 \\ 3 & 2 & 9 \end{bmatrix}$ and the 3rd column of $\begin{bmatrix} 5 & 3 & 0 & 1 \\ 2 & 6 & 5 & 3 \\ 1 & 0 & 2 & 0 \end{bmatrix}$:
$$3 \cdot 0 + 2 \cdot 5 + 9 \cdot 2 = 28.$$

Example 14 Suppose that $M = \begin{bmatrix} 1 & 4 & 1 \\ 3 & 7 & 0 \end{bmatrix}$ and that $N = \begin{bmatrix} 0 & 3 \\ 10 & 1 \\ 5 & 6 \end{bmatrix}$. Then
$$MN = \begin{bmatrix} 1 & 4 & 1 \\ 3 & 7 & 0 \end{bmatrix} \begin{bmatrix} 0 & 3 \\ 10 & 1 \\ 5 & 6 \end{bmatrix} = \begin{bmatrix} 45 & 13 \\ 70 & 16 \end{bmatrix}$$
and
$$NM = \begin{bmatrix} 0 & 3 \\ 10 & 1 \\ 5 & 6 \end{bmatrix} \begin{bmatrix} 1 & 4 & 1 \\ 3 & 7 & 0 \end{bmatrix} = \begin{bmatrix} 9 & 21 & 0 \\ 13 & 47 & 10 \\ 23 & 62 & 5 \end{bmatrix}.$$

Exercise 15 Find the matrix products
$$\begin{bmatrix} 2 & 3 & 4 \\ 6 & 5 & 6 \end{bmatrix} \begin{bmatrix} 1 & 4 \\ 0 & 2 \\ 5 & 0 \end{bmatrix} \quad\text{and}\quad \begin{bmatrix} 1 & 4 \\ 0 & 2 \\ 5 & 0 \end{bmatrix} \begin{bmatrix} 2 & 3 & 4 \\ 6 & 5 & 6 \end{bmatrix}.$$
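The computations above can be checked numerically. The following Python/NumPy sketch is not part of the original notes; it simply redoes Examples 7, 10, and 13 using NumPy arrays, the entrywise operators, and the `@` matrix-product operator.

```python
import numpy as np

# Entrywise addition and scalar multiplication (Examples 7 and 10)
M = np.array([[1, 8],
              [3, 5]])
N = np.array([[4, 5],
              [0, 8]])
print(M + N)          # [[ 5 13]
                      #  [ 3 13]]
print(5 * M - 3 * N)  # [[-7 25]
                      #  [15  1]]

# The 2x3 by 3x4 product from Example 13
A = np.array([[1, 2, 0],
              [3, 2, 9]])
B = np.array([[5, 3, 0, 1],
              [2, 6, 5, 3],
              [1, 0, 2, 0]])
print(A @ B)          # [[ 9 15 10  7]
                      #  [28 21 28  9]]

# The (2,3) entry is the dot product of row 2 of A and column 3 of B
print(A[1, :] @ B[:, 2])   # 3*0 + 2*5 + 9*2 = 28
```

Note that `*` in NumPy is entrywise multiplication; the matrix product of Definition 12 is the `@` operator (equivalently, `np.matmul`).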
Matrix arithmetic satisfies several familiar properties.

Theorem 16 Matrix multiplication is associative. That is,
$$A(BC) = (AB)C.$$

Theorem 17 Matrix addition and multiplication satisfy two distributive laws:
$$A(B + C) = AB + AC \quad\text{and}\quad (A + B)C = AC + BC.$$

Theorem 18
$$c(AB) = (cA)B.$$

One familiar property which matrices do not have is commutativity. In general it is not true that $AB = BA$. Indeed, it can easily be the case that one of those products does not even exist.

There are two special matrices. The Zero Matrix is the matrix for which all entries are zero. And the Identity Matrix is the matrix I whose entries along the main diagonal are all ones and whose entries off the main diagonal are all zeros. That is,
$$a_{ij} = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \neq j. \end{cases}$$

Example 19
$$I_{2 \times 2} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \quad\text{and}\quad I_{4 \times 4} = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}.$$

The following observation justifies the terminology.

Theorem 20 Suppose that A is an $m \times m$ matrix and that I is the $m \times m$ identity matrix. Then
$$AI = IA = A.$$

Remark 21 In the system of matrix arithmetic which we are developing, I is the multiplicative identity. The identity matrix I plays the same role in matrix multiplication as the real number 1 plays in real number multiplication.

Definition 22 Suppose that A and B are square matrices of the same dimension. If $AB = I$, then we say that A is the multiplicative inverse of B and that B is the multiplicative inverse of A. The inverse of a matrix M is denoted by $M^{-1}$.

This definition says that if the product of two square matrices is I, then each is the multiplicative inverse of the other.

Theorem 23 If A is invertible, then
$$\left(A^{-1}\right)^{-1} = A,$$
since $AA^{-1} = I$.

In matrix arithmetic, multiplying by an inverse is akin to division of real numbers. We can solve for unknown matrices just as we can solve for unknown real numbers. For example, we solve $ax + b = c$ (with $a \neq 0$) and obtain
$$x = \frac{1}{a}(c - b).$$
Similarly, we solve the matrix equation
$$AX + B = C$$
and obtain
$$X = A^{-1}(C - B).$$
Of course this makes sense only when A has an inverse. So three questions about inverse matrices come to mind. Does every matrix have an inverse? If a matrix has an inverse, is it unique? And finally, if a matrix has an inverse, how do we find it?

The first question is easily answered. No. Some matrices do not have an inverse.

Example 24 The matrix $A = \begin{bmatrix} 2 & 3 \\ 4 & 6 \end{bmatrix}$ does not have an inverse. To see this we first assume that A does have an inverse
$$A^{-1} = \begin{bmatrix} a & b \\ c & d \end{bmatrix}.$$
This means that $AA^{-1} = I$. More specifically,
$$\begin{bmatrix} 2 & 3 \\ 4 & 6 \end{bmatrix} \begin{bmatrix} a & b \\ c & d \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}.$$
Evaluating the left-hand side and then equating corresponding entries, we obtain a system of four equations in four unknowns:
$$\begin{aligned} 2a + 3c &= 1 \\ 2b + 3d &= 0 \\ 4a + 6c &= 0 \\ 4b + 6d &= 1 \end{aligned}$$
It is easy to see that this system is inconsistent; it has no solution. (Doubling the first equation gives $4a + 6c = 2$, which contradicts the third equation.) Thus $A^{-1}$ does not exist.

If a matrix has an inverse, we say that it is invertible, or that it is nonsingular. If a matrix does not have an inverse, we say that it is not invertible, or that it is singular.

The second question is also easy to answer.

Theorem 25 Suppose that A, B, and C are all $n \times n$ matrices and that A is invertible. If
$$AB = AC \tag{1}$$
then
$$B = C.$$

Since A is invertible, there exists an inverse $A^{-1}$. We left multiply both sides of (1) by $A^{-1}$ and obtain
$$A^{-1}(AB) = A^{-1}(AC).$$
Now, simplifying both sides using the associativity of matrix multiplication and the definition of the inverse matrix, we obtain
$$B = C.$$

Corollary 26 If A and B are $n \times n$ matrices such that A is invertible and
$$AB = I \tag{2}$$
then $B = A^{-1}$.

In other words, any matrix that acts like $A^{-1}$ is $A^{-1}$, and the inverse of a nonsingular matrix is unique. We know by (2) that $AB = I = AA^{-1}$, and we know by Theorem 25 that this implies $B = A^{-1}$.
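To connect these ideas with computation, here is a short Python/NumPy sketch, not part of the original notes. It checks that the identity matrix behaves as in Theorem 20, that the matrix of Example 24 has determinant zero (and hence no inverse), and that a matrix equation $AX + B = C$ can be solved as $X = A^{-1}(C - B)$. The particular A, B, C in the last part are made-up illustrative values.

```python
import numpy as np

# The 2x2 identity matrix acts like the number 1 (Theorem 20)
I = np.eye(2)
A_sing = np.array([[2., 3.],
                   [4., 6.]])
print(np.allclose(I @ A_sing, A_sing))   # True

# Example 24: this matrix has no inverse (its determinant is 0)
print(np.linalg.det(A_sing))             # 0.0

# Solving AX + B = C as X = A^{-1}(C - B) when A is invertible.
# A, B, C below are made-up illustrative values, not from the notes.
A = np.array([[2., 1.],
              [1., 1.]])
B = np.array([[0., 1.],
              [1., 0.]])
C = np.array([[3., 2.],
              [2., 3.]])
X = np.linalg.inv(A) @ (C - B)
print(np.allclose(A @ X + B, C))         # True
```

In practice `np.linalg.solve(A, C - B)` is preferred over forming the inverse explicitly, but the inverse form mirrors the algebra above.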
Theorem 27 Suppose that $A_{n \times n}$ and $B_{n \times n}$ are nonsingular matrices. Then AB is nonsingular and
$$(AB)^{-1} = B^{-1}A^{-1}.$$
In other words, the inverse of a product is the product of the inverses in the opposite order.

Proof.
$$(AB)\left(B^{-1}A^{-1}\right) = A\left(BB^{-1}\right)A^{-1} = A(I)A^{-1} = (AI)A^{-1} = AA^{-1} = I.$$
(Remember that if the product of two matrices is I, then each is the inverse of the other.)
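As a quick numerical sanity check of this theorem, the following Python/NumPy sketch (using arbitrary invertible test matrices, not taken from the notes) verifies that $(AB)^{-1}$ equals $B^{-1}A^{-1}$, and that the same-order product $A^{-1}B^{-1}$ generally does not.

```python
import numpy as np

# Check (AB)^{-1} = B^{-1} A^{-1} on a pair of invertible matrices
# (the particular entries are arbitrary test values, not from the notes)
A = np.array([[2., 1.],
              [1., 1.]])
B = np.array([[1., 2.],
              [0., 1.]])

lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
print(np.allclose(lhs, rhs))                                  # True

# The order matters: inv(A) @ inv(B) is generally NOT (AB)^{-1}
print(np.allclose(lhs, np.linalg.inv(A) @ np.linalg.inv(B)))  # False for these A, B
```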