Math 2250-1 Summer 2010 Test 2

You will have one hour to complete this test. Please show all your work, since your score on a problem is determined from that and not necessarily from your answer. All solutions should be real-valued unless specifically instructed otherwise. No calculators or other electronic devices are allowed. Good Luck!

1. Find the inverse of the matrix
\[
A = \begin{pmatrix} 1 & 2 & 0 \\ 0 & 3 & 1 \\ 2 & 5 & 0 \end{pmatrix}.
\]
We augment A with the identity matrix and row reduce to find that
\[
\left(\begin{array}{ccc|ccc}
1 & 2 & 0 & 1 & 0 & 0 \\
0 & 3 & 1 & 0 & 1 & 0 \\
2 & 5 & 0 & 0 & 0 & 1
\end{array}\right)
\sim \cdots \sim
\left(\begin{array}{ccc|ccc}
1 & 0 & 0 & 5 & 0 & -2 \\
0 & 1 & 0 & -2 & 0 & 1 \\
0 & 0 & 1 & 6 & 1 & -3
\end{array}\right),
\]
so
\[
A^{-1} = \begin{pmatrix} 5 & 0 & -2 \\ -2 & 0 & 1 \\ 6 & 1 & -3 \end{pmatrix}.
\]

2. Calculate the determinant of the matrix
\[
A = \begin{pmatrix} 2 & 4 & -1 \\ -2 & 3 & 4 \\ -3 & 2 & 6 \end{pmatrix}.
\]
We have that
\[
|A| = \begin{vmatrix} 2 & 4 & -1 \\ -2 & 3 & 4 \\ -3 & 2 & 6 \end{vmatrix}
= \begin{vmatrix} 2 & 4 & -1 \\ 6 & 19 & 0 \\ 9 & 26 & 0 \end{vmatrix},
\]
where the second matrix is obtained from the first by adding four times the first row to the second and six times the first row to the third (remember that the elementary row operation of adding a scalar multiple of one row to another does not change the determinant). Expanding along the third column, we find that
\[
|A| = -1 \cdot \begin{vmatrix} 6 & 19 \\ 9 & 26 \end{vmatrix}
= -(6 \cdot 26 - 9 \cdot 19) = -(156 - 171) = 15.
\]

3. Below are two subsets of \(\mathbb{R}^3\). Either show that the subset is actually a subspace of \(\mathbb{R}^3\), or provide an example that shows it is not.

(a) \(V = \{(x, y, z) : xyz = 0\}\)
(b) \(W = \{(x, y, z) : 3x + y = -z,\ -2x + 3z = y,\ \text{and } 4z = 0\}\)

(a) This is not a subspace since
\[
\begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix},\ \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} \in V,
\]
but their sum \((1\ 1\ 1)^T\) is not.

(b) If a vector \((x, y, z) \in W\), then its components satisfy the system of equations
\[
\begin{aligned}
3x + y + z &= 0 \\
-2x - y + 3z &= 0 \\
z &= 0,
\end{aligned}
\]
which is equivalent to the matrix system
\[
\begin{pmatrix} 3 & 1 & 1 \\ -2 & -1 & 3 \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} x \\ y \\ z \end{pmatrix}
= \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}.
\]
Therefore, \(W = \mathrm{Null}(A)\), and a null space is always a subspace.

4. Determine if the set
\[
\left\{ \begin{pmatrix} 3 \\ -2 \\ 1 \end{pmatrix},\ \begin{pmatrix} 0 \\ -1 \\ 1 \end{pmatrix},\ \begin{pmatrix} 9 \\ -5 \\ 2 \end{pmatrix} \right\}
\]
is linearly dependent or linearly independent. Expanding the determinant of the matrix with these vectors as its columns along the second column, we have
\[
\begin{vmatrix} 3 & 0 & 9 \\ -2 & -1 & -5 \\ 1 & 1 & 2 \end{vmatrix}
= (-1) \begin{vmatrix} 3 & 9 \\ 1 & 2 \end{vmatrix} - \begin{vmatrix} 3 & 9 \\ -2 & -5 \end{vmatrix}
= -(6 - 9) - (-15 + 18) = 3 - 3 = 0.
\]
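The matrix answers above can be sanity-checked numerically. The following short Python sketch (not part of the original solutions; it uses only the standard library) multiplies A by the claimed inverse from Problem 1 and recomputes the determinants from Problems 2 and 4 by cofactor expansion:

```python
from fractions import Fraction

def matmul(X, Y):
    """Multiply two square matrices using exact rational arithmetic."""
    n = len(X)
    return [[sum(Fraction(X[i][k]) * Y[k][j] for k in range(n))
             for j in range(n)] for i in range(n)]

def det3(M):
    """Cofactor expansion of a 3x3 determinant along the first row."""
    return (M[0][0] * (M[1][1]*M[2][2] - M[1][2]*M[2][1])
            - M[0][1] * (M[1][0]*M[2][2] - M[1][2]*M[2][0])
            + M[0][2] * (M[1][0]*M[2][1] - M[1][1]*M[2][0]))

# Problem 1: A times the claimed inverse should be the identity.
A = [[1, 2, 0], [0, 3, 1], [2, 5, 0]]
A_inv = [[5, 0, -2], [-2, 0, 1], [6, 1, -3]]
I = [[Fraction(int(i == j)) for j in range(3)] for i in range(3)]
print(matmul(A, A_inv) == I)  # True

# Problem 2: the determinant should be 15.
B = [[2, 4, -1], [-2, 3, 4], [-3, 2, 6]]
print(det3(B))  # 15

# Problem 4: the matrix with the three given vectors as columns
# should have determinant 0.
C = [[3, 0, 9], [-2, -1, -5], [1, 1, 2]]
print(det3(C))  # 0
```

The last check prints 0, which is the determinant computed in Problem 4 and confirms the dependence of the set.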
Since the determinant of the matrix is zero, its columns are linearly dependent.

5. Find a basis for the row and column spaces of the matrix
\[
A = \begin{pmatrix} 1 & 2 & -1 & 0 \\ 2 & 0 & 2 & 1 \\ -3 & 1 & 6 & 3 \end{pmatrix}.
\]
To do this, we reduce A to row-echelon form (but not necessarily reduced row-echelon form). We have that
\[
A \sim \begin{pmatrix} 1 & 2 & -1 & 0 \\ 0 & -4 & 4 & 1 \\ 0 & 7 & 3 & 3 \end{pmatrix}
\sim \begin{pmatrix} 1 & 2 & -1 & 0 \\ 0 & -4 & 4 & 1 \\ 0 & 28 & 12 & 12 \end{pmatrix}
\sim \begin{pmatrix} 1 & 2 & -1 & 0 \\ 0 & -4 & 4 & 1 \\ 0 & 0 & 40 & 19 \end{pmatrix}.
\]
A basis for Row(A) consists of the nonzero rows of the row-echelon matrix:
\[
\{ (1\ \ 2\ \ {-1}\ \ 0),\ (0\ \ {-4}\ \ 4\ \ 1),\ (0\ \ 0\ \ 40\ \ 19) \}.
\]
To find a basis for Col(A), we take columns 1, 2, and 3 of A, since those are the columns of the row-echelon matrix that contain leading entries. Therefore, a basis for Col(A) is
\[
\left\{ \begin{pmatrix} 1 \\ 2 \\ -3 \end{pmatrix},\ \begin{pmatrix} 2 \\ 0 \\ 1 \end{pmatrix},\ \begin{pmatrix} -1 \\ 2 \\ 6 \end{pmatrix} \right\}.
\]

6. Find the general solution to the problem
\[
y'' - 8y' + 16y = 0.
\]
The characteristic polynomial for this equation is \(r^2 - 8r + 16\), and its roots are
\[
r = \frac{8 \pm \sqrt{64 - 4(1)(16)}}{2} = 4,\ \text{a repeated root}.
\]
Therefore, the general solution is
\[
y(x) = c_1 e^{4x} + c_2 x e^{4x}.
\]

7. Solve the initial value problem
\[
y'' - 2y' - 3y = 0, \qquad y(0) = 2,\quad y'(0) = 0.
\]
The characteristic equation is \(r^2 - 2r - 3 = 0\), so the roots are
\[
r = \frac{2 \pm \sqrt{4 + 4(3)}}{2} = 3 \text{ or } -1.
\]
Therefore,
\[
y(x) = c_1 e^{3x} + c_2 e^{-x} \quad\Rightarrow\quad y'(x) = 3c_1 e^{3x} - c_2 e^{-x}.
\]
Applying the initial conditions, we find that
\[
\begin{aligned}
c_1 + c_2 &= 2 \\
3c_1 - c_2 &= 0,
\end{aligned}
\]
which is equivalent to the augmented matrix system
\[
\left(\begin{array}{cc|c} 1 & 1 & 2 \\ 3 & -1 & 0 \end{array}\right)
\sim \left(\begin{array}{cc|c} 1 & 1 & 2 \\ 0 & -4 & -6 \end{array}\right)
\sim \left(\begin{array}{cc|c} 1 & 0 & \tfrac{1}{2} \\ 0 & 1 & \tfrac{3}{2} \end{array}\right).
\]
The solution is
\[
y(x) = \frac{1}{2} e^{3x} + \frac{3}{2} e^{-x}.
\]

8. Use the Wronskian to determine whether the set \(\{\cos x, \sin x\}\) is linearly independent or linearly dependent. The Wronskian of \(\cos x\) and \(\sin x\) is
\[
W(\cos x, \sin x) = \begin{vmatrix} \cos x & \sin x \\ -\sin x & \cos x \end{vmatrix} = \cos^2 x + \sin^2 x = 1 \neq 0.
\]
Since the Wronskian is not zero, the functions form a linearly independent set.

9. Mark the following statements true if they are always true. Mark them false if they are always false or only sometimes true.

For any matrix A, \(\dim(\mathrm{Row}(A)) = \dim(\mathrm{Col}(A))\).
If A is an invertible matrix, then there is a unique solution to the problem \(A\vec{x} = \vec{b}\) for any vector \(\vec{b}\).

If W is a nonzero subspace of \(\mathbb{R}^n\), then it contains vectors of any length.

If V and W are two subspaces of \(\mathbb{R}^n\), then the set \(V \cup W = \{x : x \in V \text{ or } x \in W\}\) is a subspace of \(\mathbb{R}^n\).

If \(A\vec{x} = k\vec{x}\) for some scalar k and a fixed vector \(\vec{x}\), then \(A = kI\).

The first statement is true; it was stated as a theorem in the section about row and column spaces. The second statement is true; it was proved in Chapter 3 in the section about invertible matrices. The third statement is true because any subspace is closed under scalar multiplication: scaling a nonzero vector of W produces a vector of any desired length. The fourth statement is false (think of two distinct lines through the origin: their union contains both lines, but the sum of a nonzero vector from each lies on neither, so the union is not closed under addition). The last statement is also false. For example, consider a diagonal matrix A with all of its diagonal entries different. If \(\vec{x}\) is a column of the identity, then \(A\vec{x} = k\vec{x}\) with k the corresponding diagonal entry, but \(A \neq kI\).
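As a closing sanity check (again a sketch, not part of the original test), the solution to the initial value problem in Problem 7 and the counterexample for the last statement in Problem 9 can both be verified with a few lines of Python:

```python
import math

# Problem 7: y(x) = (1/2)e^{3x} + (3/2)e^{-x} should satisfy
# y'' - 2y' - 3y = 0 with y(0) = 2 and y'(0) = 0.
def y(x):
    return 0.5 * math.exp(3 * x) + 1.5 * math.exp(-x)

def yp(x):   # first derivative
    return 1.5 * math.exp(3 * x) - 1.5 * math.exp(-x)

def ypp(x):  # second derivative
    return 4.5 * math.exp(3 * x) + 1.5 * math.exp(-x)

print(abs(y(0) - 2) < 1e-12 and abs(yp(0)) < 1e-12)  # True
# The residual of the ODE should vanish at every sample point.
print(all(abs(ypp(t) - 2 * yp(t) - 3 * y(t)) < 1e-9
          for t in (0.0, 0.5, 1.0)))  # True

# Problem 9, last statement: a diagonal matrix with distinct diagonal
# entries satisfies A x = k x for x a column of the identity (k = 1
# here), yet A is not a scalar multiple of I.
A = [[1, 0, 0], [0, 2, 0], [0, 0, 3]]
x = [1, 0, 0]  # first column of the identity
Ax = [sum(A[i][j] * x[j] for j in range(3)) for i in range(3)]
print(Ax == [1 * xi for xi in x])  # True: A x = 1 * x
kI = [[1 if i == j else 0 for j in range(3)] for i in range(3)]
print(A == kI)  # False: but A != 1 * I
```

The residual check uses a few sample points only because the solution was derived exactly; floating-point evaluation introduces nothing beyond rounding error.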