Math 317: Linear Algebra
Homework 5 Solutions
Turn in the following problems.
1. Section 2.5, Problem 12
Proof: Suppose that A is a symmetric n × n matrix and that there are x, y ∈ R^n
such that Ax = 2x and Ay = 3y. We wish to show that x is orthogonal to
y, i.e. x · y = 0. From class, we know that Ax · y = x · A^T y. Since A is
symmetric, A = A^T, so Ax · y = x · Ay. Now using the facts that Ax = 2x
and Ay = 3y, we have Ax · y = x · Ay ⟹ 2x · y = x · 3y ⟹ 2(x · y) =
3(x · y) ⟹ x · y = 0. Thus x is orthogonal to y.
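The argument can be sanity-checked numerically. The sketch below uses a concrete symmetric matrix of my own choosing (not from the assignment) whose eigenvalues are 2 and 3, and verifies that the corresponding eigenvectors are orthogonal:

```python
# Sanity check of Problem 1 with a made-up symmetric matrix:
# A has Ax = 2x and Ay = 3y, so x and y should be orthogonal.
A = [[2.5, 0.5],
     [0.5, 2.5]]

def matvec(M, v):
    """Matrix-vector product for list-of-lists matrices."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

x = [1.0, -1.0]   # eigenvector for eigenvalue 2
y = [1.0, 1.0]    # eigenvector for eigenvalue 3

assert matvec(A, x) == [2 * c for c in x]   # Ax = 2x
assert matvec(A, y) == [3 * c for c in y]   # Ay = 3y
assert dot(x, y) == 0.0                     # x is orthogonal to y
```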
2. Section 2.5, Problem 16
Proof: Suppose that A is an n × n symmetric matrix and that A^2 = 0. We
want to show that A = 0. Since A is symmetric (i.e. A = A^T), we have
A^2 = AA^T = 0. Now by the definition of matrix multiplication,
0 = [AA^T]_{ii} = (ith row of A) · (ith column of A^T). This is equivalent to saying
0 = (a_{i1}, a_{i2}, ..., a_{in}) · ((A^T)_{1i}, (A^T)_{2i}, ..., (A^T)_{ni}). By the definition of A^T, (A^T)_{1i} = a_{i1}
and so forth. So we really have 0 = (a_{i1}, a_{i2}, ..., a_{in}) · (a_{i1}, a_{i2}, ..., a_{in}) =
a_{i1}^2 + a_{i2}^2 + ... + a_{in}^2. Since each term is nonnegative, this forces a_{i1} =
0, a_{i2} = 0, ..., a_{in} = 0. Since this holds for every i = 1, ..., n, every entry
of A is 0. Thus A = 0.

To show that symmetry is required, consider the matrix

A = [ 0  1 ]
    [ 0  0 ]

Then A^2 = 0 but A ≠ 0, and A is not symmetric; that is, A ≠ A^T.
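A quick check of the counterexample (my own verification, not part of the assignment):

```python
# Counterexample check for Problem 2: A = [[0, 1], [0, 0]] has
# A^2 = 0 even though A != 0, because A is not symmetric.
def matmul(X, Y):
    """Product of list-of-lists matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(M):
    return [list(row) for row in zip(*M)]

A = [[0, 1],
     [0, 0]]

assert matmul(A, A) == [[0, 0], [0, 0]]  # A^2 = 0 ...
assert A != [[0, 0], [0, 0]]             # ... yet A is nonzero,
assert A != transpose(A)                 # which requires A to be non-symmetric
```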
3. Section 2.5, Problem 22
(a) Proof: We show that 0 is the only matrix that is both symmetric and
skew-symmetric as follows. Suppose that A is symmetric and skew-symmetric.
Then A = A^T and A^T = −A. So A = −A ⟹ 2A = 0 ⟹ A = 0.
So 0 is the only matrix which is both symmetric and skew-symmetric.
(b) Proof: Let S = (1/2)(A + A^T) and K = (1/2)(A − A^T). Then
S^T = ((1/2)(A + A^T))^T = (1/2)(A^T + (A^T)^T) = (1/2)(A^T + A) = (1/2)(A + A^T) = S. So
S is symmetric. Similarly, K^T = ((1/2)(A − A^T))^T = (1/2)(A^T − (A^T)^T) =
(1/2)(A^T − A) = −(1/2)(A − A^T) = −K. So K is skew-symmetric.
(c) Proof: Let A be a square matrix. From (b), we observe that A =
S + K = (1/2)(A + A^T) + (1/2)(A − A^T). So every square matrix A may be
expressed as the sum of a symmetric matrix and a skew-symmetric matrix.
(d) Proof: To show that this sum is unique, suppose that a square
matrix A can be expressed as the sum of a symmetric and a skew-symmetric
matrix in two different ways. That is, A = S + K and A = S' + K', where
S, S' are symmetric matrices and K, K' are skew-symmetric matrices. We
will show that S = S' and K = K'. Since A = S + K = S' + K',
we have S − S' = K' − K. Now we observe that
S − S' is symmetric, since (S − S')^T = S^T − (S')^T = S − S'; note that we
have used the fact that S and S' are symmetric. Similarly, K' − K is skew-symmetric,
since (K' − K)^T = (K')^T − K^T = −K' + K = −(K' − K);
here we have used the fact that K and K' are skew-symmetric. Thus we
have a symmetric matrix equal to a skew-symmetric matrix. From (a), we
know that the only matrix that is both symmetric and skew-symmetric
is the 0 matrix. Hence S − S' = K' − K = 0 ⟹ S = S' and K = K'.
Thus the sum is unique.
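Parts (b)–(d) can be illustrated in code. The sketch below decomposes an arbitrary (made-up) square matrix into its symmetric and skew-symmetric parts and checks the claimed properties:

```python
# Illustration of Problem 3(b)-(c): split a square matrix A into its
# symmetric part S = (A + A^T)/2 and skew-symmetric part K = (A - A^T)/2.
def transpose(M):
    return [list(row) for row in zip(*M)]

A = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0],
     [7.0, 8.0, 9.0]]   # arbitrary example matrix
AT = transpose(A)
n = len(A)

S = [[(A[i][j] + AT[i][j]) / 2 for j in range(n)] for i in range(n)]
K = [[(A[i][j] - AT[i][j]) / 2 for j in range(n)] for i in range(n)]

assert S == transpose(S)                                 # S is symmetric
assert K == [[-x for x in row] for row in transpose(K)]  # K is skew-symmetric
assert all(S[i][j] + K[i][j] == A[i][j]                  # A = S + K
           for i in range(n) for j in range(n))
```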
4. Section 2.5, Problem 23
(a) Proof: Suppose that A is an m × n matrix such that Ax · y = 0 for
all x ∈ R^n and y ∈ R^m. We show that A = 0. Now, we know that
Ax · y = (Ax)^T y = x^T A^T y = 0 for all x, y. Let x_1 be the first column
of the n × n identity matrix I_n and let y_1 be the first column of the
m × m identity matrix I_m. Then x_1^T A^T = [1 0 ... 0] A^T is the first
row of A^T, and multiplying that row by y_1 = (1, 0, ..., 0)^T picks off its
first entry, so 0 = x_1^T A^T y_1 = (A^T)_{11}. So the first entry of A^T is 0.
To show that the (i, j) entry of A^T is 0, let x_i denote the ith column of
the identity matrix I_n and let y_j denote the jth column of the identity
matrix I_m. Then 0 = x_i^T A^T y_j = (A^T)_{ij}. So every entry of A^T is 0.
Since 0^T = 0, this proves that A = 0.
(b) Proof: Suppose that Ax · x = 0 for all x ∈ R^n and that A is a square
symmetric matrix. We show that A = 0 as follows. Using the distributive
properties of the dot product, we have 0 = A(x + y) · (x + y) = Ax ·
x + Ax · y + Ay · x + Ay · y = Ax · y + Ay · x, where we have used the
facts that Ax · x = 0 and Ay · y = 0. From a proposition from class,
we know that Ay · x = y · A^T x = y · Ax = Ax · y, where the middle
equality uses the fact that A = A^T (A is symmetric). So we have
0 = Ax · y + Ay · x = 2(Ax · y), and hence Ax · y = 0 for all x, y ∈ R^n.
From (a), we know that this implies that A = 0.
(c) To show that symmetry is important, consider the matrix

A = [ 0 −1 ]
    [ 1  0 ]

Then for any x = (x_1, x_2), we have Ax = (−x_2, x_1), and so Ax · x =
−x_2 x_1 + x_1 x_2 = 0, but A ≠ 0. Of course, A ≠ A^T, so this does not
contradict the hypothesis in (b).
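A quick check of this counterexample (my own verification, not part of the assignment):

```python
# Counterexample check for Problem 4(c): the non-symmetric matrix
# A = [[0, -1], [1, 0]] has Ax . x = 0 for every x, yet A != 0.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(M, v):
    return [dot(row, v) for row in M]

A = [[0, -1],
     [1, 0]]

# A few arbitrary sample vectors; by the algebra in the text, any x works.
for x in ([1, 0], [0, 1], [3, -2], [7, 11]):
    assert dot(matvec(A, x), x) == 0

assert A != [[0, 0], [0, 0]]   # A is nonzero
```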
5. Section 2.5, Problem 24
Suppose that A is an n × n matrix satisfying Ax · y = x · Ay for all x, y ∈ R^n.
We wish to show that A is symmetric. Now, from class, we know that Ax · y =
x · A^T y. We also have that Ax · y = x · Ay. Thus x · A^T y = x · Ay ⟹
x · (A^T − A)y = 0 ⟹ (A^T − A)^T x · y = 0 ⟹ (A − A^T)x · y = 0. Since this
holds for all x, y ∈ R^n, we have A − A^T = 0 by problem 23(a).
Thus A = A^T, and hence A is symmetric.
6. Suppose that I is the n × n identity matrix and u, v ∈ R^n. Recall that elementary
matrices are matrices of the form I − uv^T. Prove that any elementary matrix is
invertible and the inverse is once again an elementary matrix. Hint: Show directly
that

(I − uv^T)^{−1} = I − uv^T/(v^T u − 1).

We show that (I − uv^T)(I − uv^T/(v^T u − 1)) = I. Via the distributive property
of matrix multiplication, we obtain:

(I − uv^T)(I − uv^T/(v^T u − 1))
  = I − uv^T/(v^T u − 1) − uv^T + uv^T uv^T/(v^T u − 1)
  = I − [uv^T + uv^T(v^T u − 1) − uv^T uv^T]/(v^T u − 1)
  = I − [uv^T + uv^T v^T u − uv^T − uv^T uv^T]/(v^T u − 1)
  = I − [uv^T v^T u − uv^T uv^T]/(v^T u − 1)
  = I − [(v^T u)uv^T − (v^T u)uv^T]/(v^T u − 1)   (since v^T u is a scalar)
  = I − 0
  = I.

(Note that the formula requires v^T u ≠ 1; if v^T u = 1, then (I − uv^T)u =
u − u(v^T u) = 0 with u ≠ 0, so I − uv^T is singular in that case.)

Thus, every elementary matrix with v^T u ≠ 1 is invertible, and its inverse is
given by (I − uv^T)^{−1} = I − uv^T/(v^T u − 1). We also see that the inverse of
an elementary matrix is once again an elementary matrix, since
I − uv^T/(v^T u − 1) = I − ûv^T, where û = u/(v^T u − 1).
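The inverse formula can be verified numerically. The sketch below uses made-up vectors u, v (chosen so that v^T u ≠ 1) and checks that (I − uv^T)(I − uv^T/(v^T u − 1)) is the identity:

```python
# Numerical check of Problem 6: (I - u v^T)(I - u v^T / (v^T u - 1)) = I,
# with arbitrary example vectors u, v satisfying v^T u != 1.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

n = 3
u = [1.0, 2.0, 3.0]
v = [0.5, 1.0, -1.0]
vTu = sum(a * b for a, b in zip(v, u))   # v^T u = -0.5, not equal to 1

I = [[float(i == j) for j in range(n)] for i in range(n)]
E = [[I[i][j] - u[i] * v[j] for j in range(n)] for i in range(n)]            # I - u v^T
Einv = [[I[i][j] - u[i] * v[j] / (vTu - 1) for j in range(n)] for i in range(n)]

P = matmul(E, Einv)
assert all(abs(P[i][j] - I[i][j]) < 1e-12 for i in range(n) for j in range(n))
```

Note that Einv is itself of the form I − ûv^T with û = u/(v^T u − 1), matching the claim that the inverse is again an elementary matrix.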
7. Let

A = [ 1  4  5 ]
    [ 4 18 26 ]
    [ 3 16 30 ]

(a) Determine the LU factors of A.
We use elementary row operations (of Type III only) to transform A into
an upper triangular matrix. To that extent, we have:

R2 − 4R1 → R2, R3 − 3R1 → R3:

    [ 1  4  5 ]
    [ 0  2  6 ]
    [ 0  4 15 ]

R3 − 2R2 → R3:

    [ 1  4  5 ]
    [ 0  2  6 ] = U
    [ 0  0  3 ]

So U is as above. L is a lower triangular matrix with 1's on the diagonal.
Below the diagonal, the l_{ij} entry is the multiple of row j that is
subtracted from row i to eliminate the (i, j) position during Gaussian
elimination. Thus,

L = [ 1  0  0 ]
    [ 4  1  0 ]
    [ 3  2  1 ]
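As a quick check, multiplying the LU factors back together should recover A:

```python
# Check of Problem 7(a): L times U recovers the original matrix A.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 4, 5], [4, 18, 26], [3, 16, 30]]
L = [[1, 0, 0], [4, 1, 0], [3, 2, 1]]
U = [[1, 4, 5], [0, 2, 6], [0, 0, 3]]

assert matmul(L, U) == A   # LU = A
```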
(b) Use the LU factors to solve Ax_1 = b_1 as well as Ax_2 = b_2, where

b_1 = (6, 0, −6)^T and b_2 = (6, 6, 12)^T.

To solve Ax_1 = b_1 using our LU factors, we write Ax_1 = (LU)x_1 =
L(Ux_1) = b_1. Letting y = Ux_1, we first solve Ly = b_1 as follows:

y_1 = 6
4y_1 + y_2 = 0 ⟹ 4(6) + y_2 = 0 ⟹ y_2 = −24
3y_1 + 2y_2 + y_3 = −6 ⟹ 3(6) + 2(−24) + y_3 = −6 ⟹ y_3 = 24

Now we may solve Ux_1 = y to obtain:

3x_3 = 24 ⟹ x_3 = 8
2x_2 + 6x_3 = −24 ⟹ 2x_2 + 6(8) = −24 ⟹ x_2 = −36
x_1 + 4x_2 + 5x_3 = 6 ⟹ x_1 + 4(−36) + 5(8) = 6 ⟹ x_1 = 110

so x_1 = (110, −36, 8)^T. Repeating this same procedure for Ax_2 = b_2,
one can show that x_2 = (112, −39, 10)^T.
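The two triangular solves can be sketched as forward and back substitution; the code below reproduces x_1 and also carries out the x_2 computation the text leaves to the reader:

```python
# Forward/back substitution for Problem 7(b), assuming L has a unit diagonal.
def solve_lu(L, U, b):
    n = len(b)
    y = [0.0] * n
    for i in range(n):                 # forward substitution: solve Ly = b
        y[i] = b[i] - sum(L[i][j] * y[j] for j in range(i))
    x = [0.0] * n
    for i in reversed(range(n)):       # back substitution: solve Ux = y
        x[i] = (y[i] - sum(U[i][j] * x[j] for j in range(i + 1, n))) / U[i][i]
    return x

L = [[1, 0, 0], [4, 1, 0], [3, 2, 1]]
U = [[1, 4, 5], [0, 2, 6], [0, 0, 3]]

assert solve_lu(L, U, [6, 0, -6]) == [110.0, -36.0, 8.0]   # x_1
assert solve_lu(L, U, [6, 6, 12]) == [112.0, -39.0, 10.0]  # x_2
```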
8. Prove or give a counterexample (you must show that your counterexample works!):
Every n × n nonsingular matrix has an LU factorization. Hint: See Problem 14,
part a in Section 2.4.

This claim is false. To see why, consider the matrix

A = [ 0  1 ]
    [ 1  0 ]

A is full rank and hence nonsingular. Let us suppose for the sake of contradiction
that A has an LU factorization. Then

[ 0  1 ]   [ 1    0 ] [ u_1  u_2 ]   [ u_1      u_2           ]
[ 1  0 ] = [ l_1  1 ] [ 0    u_3 ] = [ l_1 u_1  l_1 u_2 + u_3 ]

Note that this implies that u_1 = 0 and l_1 u_1 = 1 simultaneously, which is
impossible. Thus, A has no LU factorization.
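The same obstruction shows up computationally: Gaussian elimination without row exchanges stalls on a zero pivot.

```python
# Problem 8 in code: on A = [[0, 1], [1, 0]], elimination without row
# exchanges hits a zero pivot, so no LU factorization exists, even
# though A is nonsingular (det A = -1).
A = [[0.0, 1.0],
     [1.0, 0.0]]

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
assert det != 0        # A is nonsingular

pivot = A[0][0]        # the first pivot u_1 would have to equal A[0][0] = 0,
assert pivot == 0      # but then l_1 * u_1 = 1 is unsatisfiable
```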