Math 2270 Spring 2004 Homework 14 Solutions

Section 5.3) 2, 3, 12, 13, 14, 15, 16, 17, 20, 22, 36, 38

(2, 3) Let A be the matrix of the orthogonal transformation L from R^n to R^n. Then, by Fact 5.3.3, A is an orthogonal matrix, and by Fact 5.3.7, A^T A = I_n. Therefore,

    L(v) · L(w) = (Av) · (Aw) = (Av)^T (Aw) = v^T A^T A w = v^T (A^T A) w = v^T w = v · w

so orthogonal transformations preserve the dot product. The angle θ between two nonzero vectors v and w is defined by the formula

    cos θ = (v · w) / (||v|| ||w||)

Since L is orthogonal, ||v|| = ||L(v)|| and ||w|| = ||L(w)||, so the angle between L(v) and L(w) satisfies

    (L(v) · L(w)) / (||L(v)|| ||L(w)||) = (v · w) / (||v|| ||w||) = cos θ

Therefore, orthogonal transformations preserve angles.

(12) We must find a unit vector that is perpendicular to both (2/3, 2/3, 1/3) and (1/√2, -1/√2, 0). One way to find such a vector is to take the cross product of the given two. Other methods exist; for example, one could form a matrix with the given vectors as its rows and then find the kernel of that matrix. Here I will use the cross product:

    [ 2/3 ]   [  1/√2 ]   [ 0 + 1/(3√2)        ]   [  1/(3√2) ]
    [ 2/3 ] × [ -1/√2 ] = [ 1/(3√2) - 0        ] = [  1/(3√2) ]
    [ 1/3 ]   [   0   ]   [ -2/(3√2) - 2/(3√2) ]   [ -4/(3√2) ]

Now that we have a vector orthogonal to the other two, we must normalize it:

    || (1/(3√2), 1/(3√2), -4/(3√2)) || = √(1/18 + 1/18 + 16/18) = 1

The magnitude of the vector is already one. The desired matrix is:

    [ 2/3    1/√2    1/(3√2) ]
    [ 2/3   -1/√2    1/(3√2) ]
    [ 1/3     0     -4/(3√2) ]

(13) From problems 2 and 3, we know that orthogonal transformations preserve angles between vectors. Here we notice that (2, 3, 0) and (-3, 2, 0) are perpendicular, whereas T(2, 3, 0) = (3, 0, 2) and T(-3, 2, 0) = (2, -3, 0) are not: their dot product is 6, not 0. Therefore, there is no orthogonal transformation satisfying the given criteria.

(14) By definition, a matrix is symmetric if it is equal to its transpose.
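The computations in problem 12 can be sanity-checked numerically. The sketch below is not part of the original solutions; it assumes NumPy is available, and verifies that the cross product of the two given vectors is already a unit vector and that the resulting 3 × 3 matrix is orthogonal:

```python
import numpy as np

# The two given unit vectors from problem 12.
u = np.array([2/3, 2/3, 1/3])
v = np.array([1/np.sqrt(2), -1/np.sqrt(2), 0.0])

# Their cross product is perpendicular to both; here it equals
# (1, 1, -4) / (3*sqrt(2)) and already has length 1.
w = np.cross(u, v)
print(np.isclose(np.linalg.norm(w), 1.0))   # True

# Stacking the three vectors as columns gives an orthogonal matrix:
# Q^T Q should be the 3x3 identity.
Q = np.column_stack([u, v, w])
print(np.allclose(Q.T @ Q, np.eye(3)))      # True
```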
Computing the relevant transposes,

    (A^T A)^T = A^T (A^T)^T = A^T A
    (A A^T)^T = (A^T)^T A^T = A A^T

so both A A^T and A^T A are symmetric.

(15) We are given that A^T = A and B^T = B. Therefore,

    (AB)^T = B^T A^T = BA

This is equal to AB only if the matrices A and B commute with one another. In general, the product of symmetric matrices is not symmetric.

(16) We are given that A^T = A. Therefore,

    (A^2)^T = (AA)^T = A^T A^T = AA = A^2

The square of a symmetric matrix is always symmetric.

(17) We are given that A^T = A and that A is invertible. Therefore,

    (A^-1)^T = (A^T)^-1 = A^-1

The inverse of an invertible symmetric matrix is always symmetric.

(20) First, we must find an orthonormal basis for the given subspace; then we will use Fact 5.3.10. The subspace is spanned by v1 = (1, 1, 1, 1) and v2 = (1, 9, -5, 3). Normalizing v1 gives

    w1 = (1/2)(1, 1, 1, 1)

Next,

    v2 · w1 = 1/2 + 9/2 - 5/2 + 3/2 = 4

    v2 - 4 w1 = (1, 9, -5, 3) - (2, 2, 2, 2) = (-1, 7, -7, 1)

and normalizing (the magnitude is √(1 + 49 + 49 + 1) = 10) gives

    w2 = (1/10)(-1, 7, -7, 1)

Let B be the matrix with columns w1 and w2. By Fact 5.3.10, the matrix of the orthogonal projection is

            [ 1/2  -1/10 ]                                 [ 26/100   18/100   32/100   24/100 ]
    A = B B^T =
            [ 1/2   7/10 ]  [  1/2    1/2    1/2   1/2 ] = [ 18/100   74/100  -24/100   32/100 ]
            [ 1/2  -7/10 ]  [ -1/10   7/10  -7/10  1/10 ]  [ 32/100  -24/100   74/100   18/100 ]
            [ 1/2   1/10 ]                                 [ 24/100   32/100   18/100   26/100 ]

which simplifies to

    A = (1/50) [ 13    9   16   12 ]
               [  9   37  -12   16 ]
               [ 16  -12   37    9 ]
               [ 12   16    9   13 ]

(22) We can think of A^2 as AA. Then A(Ax) first projects the vector x orthogonally onto the subspace V spanned by the columns of A, and then performs the orthogonal projection again on the new vector proj_V x. But proj_V x is already in the subspace V, so the second projection returns the same vector. Therefore, A^2 = A if A is the matrix of an orthogonal projection.

More formally, consider an orthonormal basis {v1, ..., vm} for the subspace V and define the matrix B = [v1 ··· vm]. Then, by Fact 5.3.10, the matrix of the orthogonal projection onto V is A = B B^T. Therefore,

    A^2 = (B B^T)(B B^T) = B (B^T B) B^T

since matrix multiplication is associative. Since the columns of B are orthonormal, B^T B = I_m. Therefore,

    A^2 = B (B^T B) B^T = B B^T = A

(36) We are given L(A) = A^T. Let A and B be 2 × 3 matrices, and let k be a scalar.
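A numerical check covers problems 20 and 22 as well. This sketch (again an addition, assuming NumPy) runs the Gram–Schmidt step, forms A = B B^T, and confirms both the entries of A and the projection identity A^2 = A:

```python
import numpy as np

# The two spanning vectors from problem 20.
v1 = np.array([1.0, 1.0, 1.0, 1.0])
v2 = np.array([1.0, 9.0, -5.0, 3.0])

# Gram-Schmidt: normalize v1, subtract the projection of v2 onto w1,
# then normalize the remainder.
w1 = v1 / np.linalg.norm(v1)            # (1/2)(1, 1, 1, 1)
u2 = v2 - (v2 @ w1) * w1                # (-1, 7, -7, 1)
w2 = u2 / np.linalg.norm(u2)            # (1/10)(-1, 7, -7, 1)

# Fact 5.3.10: the projection matrix is A = B B^T.
B = np.column_stack([w1, w2])
A = B @ B.T

expected = np.array([[13,  9, 16, 12],
                     [ 9, 37, -12, 16],
                     [16, -12, 37,  9],
                     [12, 16,  9, 13]]) / 50
print(np.allclose(A, expected))   # True
print(np.allclose(A @ A, A))      # True: projecting twice changes nothing (problem 22)
```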
Then,

    L(A + B) = (A + B)^T = A^T + B^T = L(A) + L(B)
    L(kA) = (kA)^T = k A^T = k L(A)

so L is a linear transformation. We see that L(L(A)) = L(A^T) = (A^T)^T = A, so the transformation is its own inverse. Since L^-1 exists, L is an isomorphism.

(38) We will use a general n × n matrix A = [a_{i,j}]. Then

    T(A) = A + A^T

         = [ a_{1,1}  a_{1,2}  ···  a_{1,n} ]   [ a_{1,1}  a_{2,1}  ···  a_{n,1} ]
           [ a_{2,1}  a_{2,2}  ···  a_{2,n} ] + [ a_{1,2}  a_{2,2}  ···  a_{n,2} ]
           [   ···      ···    ···    ···   ]   [   ···      ···    ···    ···   ]
           [ a_{n,1}  a_{n,2}  ···  a_{n,n} ]   [ a_{1,n}  a_{2,n}  ···  a_{n,n} ]

         = [ 2a_{1,1}            a_{1,2} + a_{2,1}  ···  a_{1,n} + a_{n,1} ]
           [ a_{1,2} + a_{2,1}   2a_{2,2}           ···  a_{2,n} + a_{n,2} ]
           [       ···                 ···          ···        ···         ]
           [ a_{1,n} + a_{n,1}   a_{2,n} + a_{n,2}  ···  2a_{n,n}          ]

Looking at the calculation above, we see that every matrix in the image of T is symmetric. Conversely, any symmetric matrix S is in the image, since T(S/2) = S/2 + (S/2)^T = S. Therefore, the image of T is the set of all n × n symmetric matrices.

The kernel is the set of all matrices for which the sum above equals the zero matrix. The diagonal entries of A must satisfy 2a_{i,i} = 0, so they are all zero, and the off-diagonal entries must satisfy a_{i,j} = -a_{j,i} for all i ≠ j; equivalently, A^T = -A. Matrices satisfying this criterion are called skew-symmetric, so the kernel of T is the set of all n × n skew-symmetric matrices.
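Problem 38 can also be illustrated concretely. The following sketch (an addition, assuming NumPy) checks, for a random 3 × 3 matrix, that T(A) = A + A^T is symmetric and that skew-symmetric matrices lie in the kernel of T:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))

# T(A) = A + A^T is always symmetric, so the image of T consists of
# symmetric matrices.
TA = A + A.T
print(np.allclose(TA, TA.T))                   # True

# S = A - A^T is skew-symmetric (S^T = -S), so T(S) = S + S^T = 0
# and S lies in the kernel of T.
S = A - A.T
print(np.allclose(S + S.T, np.zeros((3, 3))))  # True
```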