Math 317: Linear Algebra Homework 8 Solutions

The following problems are for additional practice and are not to be turned in. (All problems come from Linear Algebra: A Geometric Approach, 2nd Edition, by Shifrin and Adams.)
Exercises: Section 3.6: 1–6
Section 4.1: 6, 7, 8
Turn in the following problems.
1. Section 3.6, Problem 8
(a) Proof: Let V = {A ∈ R^{n×n} | A = A^T}. We claim that V is a subspace of R^{n×n}. We begin by observing that 0_{n×n} ∈ V, since 0^T = 0; that is, the zero matrix is symmetric. Let A, B ∈ V. Then A = A^T and B = B^T (we want to show A + B is symmetric). Then (A + B)^T = A^T + B^T = A + B, so A + B is symmetric and hence is an element of V. Let c ∈ R and suppose that A ∈ V. Then (cA)^T = cA^T = cA, and so cA is symmetric and hence an element of V. Since all three conditions of a subspace are met, we may conclude that V is a subspace of R^{n×n}. The dimension is n + (n^2 − n)/2 = n(n + 1)/2.
(b) Proof: Let V = {A ∈ R^{n×n} | A = −A^T}. We claim that V is a subspace of R^{n×n}. We begin by observing that 0_{n×n} ∈ V, since 0^T = −0 = 0; that is, the zero matrix is skew-symmetric. Let A, B ∈ V. Then A = −A^T and B = −B^T (we want to show A + B is skew-symmetric). Then (A + B)^T = A^T + B^T = −A − B = −(A + B), so A + B is skew-symmetric and hence is an element of V. Let c ∈ R and suppose that A ∈ V. Then (cA)^T = cA^T = −cA, and so cA is skew-symmetric and hence an element of V. Since all three conditions of a subspace are met, we may conclude that V is a subspace of R^{n×n}. The dimension is (n^2 − n)/2 = n(n − 1)/2.
(c) Proof: From a previous homework, we recall that every square matrix A can be written uniquely as the sum of a symmetric matrix and a skew-symmetric matrix. This is equivalent to saying that M_{n×n} = S ⊕ K (a direct sum), where S is the subspace of symmetric matrices, K is the subspace of skew-symmetric matrices, and M_{n×n} is the space of all n × n matrices. Consistently, dim S + dim K = n(n + 1)/2 + n(n − 1)/2 = n^2 = dim M_{n×n}.
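As an optional numerical sanity check (not part of the required proof), the following NumPy sketch splits an arbitrary matrix into its symmetric and skew-symmetric parts and confirms that the dimension counts from parts (a) and (b) add up to n^2; all names here are illustrative.

    import numpy as np

    n = 4
    A = np.random.rand(n, n)

    # Decomposition from part (c): A = S + K with S symmetric and K skew-symmetric.
    S = (A + A.T) / 2
    K = (A - A.T) / 2

    assert np.allclose(S, S.T)      # S is symmetric
    assert np.allclose(K, -K.T)     # K is skew-symmetric
    assert np.allclose(S + K, A)    # the parts recover A

    # Dimensions from parts (a) and (b): n(n+1)/2 + n(n-1)/2 = n^2.
    assert n * (n + 1) // 2 + n * (n - 1) // 2 == n * n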
2. Section 3.6, Problem 9 (Hint: The dimension of R^{n×n} is n^2. Why?)
(a) Proof: By definition of the transpose of A, we know that if A = [a_ij], then A^T = [a_ji], and so the diagonal entries of A and A^T are the same. Thus trace(A) = \sum_{i=1}^n a_ii = \sum_{i=1}^n (A^T)_ii = trace(A^T).
(b) Proof: For any two matrices A and B, we have that (A + B)_ii = a_ii + b_ii. Thus trace(A + B) = \sum_{i=1}^n (A + B)_ii = \sum_{i=1}^n (a_ii + b_ii) = \sum_{i=1}^n a_ii + \sum_{i=1}^n b_ii = trace(A) + trace(B). For any real number c, we have that (cA)_ii = c a_ii, and so trace(cA) = \sum_{i=1}^n (cA)_ii = \sum_{i=1}^n c a_ii = c \sum_{i=1}^n a_ii = c trace(A).
(c) Proof: Using the definition of matrix multiplication, we recall that (AB)_ii = (i-th row of A) · (i-th column of B) = [a_i1 a_i2 . . . a_in] · [b_1i b_2i . . . b_ni] = \sum_{j=1}^n a_ij b_ji. Thus trace(AB) = \sum_{i=1}^n (AB)_ii = \sum_{i=1}^n \sum_{j=1}^n a_ij b_ji = \sum_{j=1}^n \sum_{i=1}^n b_ji a_ij = \sum_{j=1}^n (BA)_jj = trace(BA).
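The three identities in parts (a)–(c) can be spot-checked numerically; the short NumPy sketch below uses random test matrices and is only an illustration, not a proof.

    import numpy as np

    n = 4
    A, B = np.random.rand(n, n), np.random.rand(n, n)
    c = 2.5

    assert np.isclose(np.trace(A), np.trace(A.T))                  # part (a)
    assert np.isclose(np.trace(A + B), np.trace(A) + np.trace(B))  # part (b), additivity
    assert np.isclose(np.trace(c * A), c * np.trace(A))            # part (b), homogeneity
    assert np.isclose(np.trace(A @ B), np.trace(B @ A))            # part (c)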
(d) Proof: Define <A, B> = trace(A^T B). We verify that <A, B> is an inner product. We begin by observing (from part (a)) that <A, B> = trace(A^T B) = trace((A^T B)^T) = trace(B^T A) = <B, A>. Next we find that <A + B, C> = trace((A + B)^T C) = trace((A^T + B^T)C) = trace(A^T C + B^T C) = trace(A^T C) + trace(B^T C) = <A, C> + <B, C>. For any real number c, we have that <cA, B> = trace((cA)^T B) = trace(cA^T B) = c trace(A^T B) = c <A, B>. Finally, we have that <A, A> = trace(A^T A) = \sum_{i=1}^n \sum_{j=1}^n (A^T)_ij a_ji = \sum_{i=1}^n \sum_{j=1}^n a_ji^2 ≥ 0, and <A, A> = 0 if and only if \sum_{i=1}^n \sum_{j=1}^n a_ji^2 = 0, which occurs if and only if a_ji = 0 for all i, j, which is true if and only if A = 0. Thus <A, B> is an inner product.
(e) Proof: Suppose that A is symmetric and B is skew-symmetric. Then A = A^T and B = −B^T. Thus we have that <A, B> = trace(A^T B) = trace(BA^T) (by part (c)) = trace(−B^T A^T) = trace(−B^T A) = −trace(B^T A) = −<B, A>. But <A, B> = <B, A>, so <B, A> = −<B, A>, which implies <B, A> = 0 and hence <A, B> = 0.
(f) Proof: From the proof above, we have that A and B are orthogonal whenever A is symmetric and B is skew-symmetric (since <A, B> = 0). Since this holds for all symmetric and all skew-symmetric matrices, and since every n × n matrix can be written as the sum of a symmetric matrix and a skew-symmetric matrix (Problem 1(c)), the set of all symmetric matrices and the set of all skew-symmetric matrices are orthogonal complements of each other.
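Parts (d)–(f) can likewise be illustrated numerically. The sketch below defines the inner product <A, B> = trace(A^T B) from part (d) and checks that the symmetric and skew-symmetric parts of a matrix (from Problem 1) are orthogonal; the helper name "inner" is just for this example.

    import numpy as np

    def inner(A, B):
        # <A, B> = trace(A^T B), the inner product from part (d)
        return np.trace(A.T @ B)

    n = 4
    A = np.random.rand(n, n)
    S = (A + A.T) / 2    # symmetric part
    K = (A - A.T) / 2    # skew-symmetric part

    assert np.isclose(inner(S, K), 0.0)   # symmetric and skew-symmetric parts are orthogonal
    assert inner(A, A) >= 0               # positivity from part (d)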
3. Section 3.6, Problem 10a
Proof: We show that V = {A ∈ R^{n×n} | trace(A) = 0} is a subspace of the space of all n × n matrices. We begin by noting that 0_{n×n} ∈ V since trace(0_{n×n}) = \sum_{i=1}^n 0 = 0. Now, suppose that A, B ∈ V and let c be any real number. Then trace(A) = 0 and trace(B) = 0. Using the properties of the trace established in the previous problem, we have that trace(A + B) = trace(A) + trace(B) = 0 + 0 = 0 and trace(cA) = c trace(A) = c(0) = 0. Thus A + B ∈ V and cA ∈ V for any real number c. Therefore, V is a subspace of the space of all n × n matrices.
4. Let Pn denote the vector space of polynomials of degree n or less. Define V = {p'(0)t^n + t | p ∈ Pn} ⊂ Pn, where p'(0) denotes the first derivative of p ∈ Pn evaluated at 0. Prove or disprove that V is a subspace of Pn.
We claim that V is not a subspace of Pn because it is not closed under addition. To see why, let s, q ∈ V. Then s(t) = p_1'(0)t^n + t and q(t) = p_2'(0)t^n + t, where p_1, p_2 ∈ Pn. Then (s + q)(t) = (p_1'(0) + p_2'(0))t^n + 2t ≠ p'(0)t^n + t for any polynomial p ∈ Pn (since we now have 2t rather than t, and for n ≥ 2 the t^n term cannot absorb the difference). Thus s + q ∉ V, and hence V cannot be a subspace of Pn.
5. Section 4.1, Problem 7
To find the least squares solution associated with the given system of equations, we find x̂ = (A^T A)^{-1} A^T b, where A = [1 1; 1 −3; 2 1] and b = [1; 4; 3] (matrix rows separated by semicolons). Thus we have that

(A^T A)^{-1} = ([1 1 2; 1 −3 1] [1 1; 1 −3; 2 1])^{-1} = [6 0; 0 11]^{-1} = (1/66) [11 0; 0 6],

and so

x̂ = (1/66) [11 0; 0 6] [1 1 2; 1 −3 1] [1; 4; 3] = (1/66) [121; −48].

The point closest to b is given by proj_V(b) = A x̂ = (1/66) [73; 265; 194].
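The arithmetic above can be confirmed with NumPy; this optional check solves the normal equations A^T A x̂ = A^T b directly rather than forming the inverse.

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [1.0, -3.0],
                  [2.0, 1.0]])
    b = np.array([1.0, 4.0, 3.0])

    # Least squares solution from the normal equations
    x_hat = np.linalg.solve(A.T @ A, A.T @ b)
    print(66 * x_hat)         # approximately [121, -48], i.e. x_hat = (1/66)[121; -48]

    # Closest point to b in the column space of A
    print(66 * (A @ x_hat))   # approximately [73, 265, 194]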
6. Section 4.1, Problem 15
Proof: We show that A is a projection matrix onto its own column space, C(A). To show this, we must show that for all x ∈ R^n, we have Ax ∈ C(A) and x − Ax ∈ C(A)^⊥ = N(A^T). Note that this comes from the definition of a projection with p = Ax and b = x. We observe that, by definition of C(A), Ax ∈ C(A). To show that x − Ax ∈ N(A^T), we must show that A^T(x − Ax) = 0. Using the fact that A^T = A and A^2 = A, we see that A^T(x − Ax) = A^T x − A^T Ax = Ax − A^2 x = Ax − Ax = 0. Thus, A is a projection matrix onto C(A).
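A small numerical illustration of this fact: any symmetric, idempotent matrix acts as orthogonal projection onto its column space. The sketch below builds such a matrix as Q Q^T for a matrix Q with orthonormal columns (one convenient way to construct an example, assumed here) and checks the two conditions used in the proof.

    import numpy as np

    # Construct a symmetric, idempotent A = Q Q^T with Q having orthonormal columns.
    Q, _ = np.linalg.qr(np.random.rand(4, 2))
    A = Q @ Q.T

    assert np.allclose(A, A.T)      # A^T = A
    assert np.allclose(A @ A, A)    # A^2 = A

    x = np.random.rand(4)
    p = A @ x
    assert np.allclose(A.T @ (x - p), 0)   # x - Ax lies in N(A^T) = C(A)^⊥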
7. A small company has been in business for three years and has recorded annual
profits (in thousands of dollars) as follows.
Year     1   2   3
Profit   7   4   3
Assuming that there is a linear trend in the declining profits, predict the year and
the month in which the company begins to lose money.
Since we are told that there is a linear trend, we will use a least squares approach to find the line that best fits the data. To this end, using y = mx + b with the data points (1, 7), (2, 4), (3, 3), we obtain the following system of equations:
m+b=7
2m + b = 4
3m + b = 3,
which in matrix form Ax = b becomes

[1 1; 2 1; 3 1] [m; b] = [7; 4; 3].
Noting that this system is inconsistent (while A is full rank), we proceed by finding the least squares solution x̂ = (A^T A)^{-1} A^T b. We begin by calculating (A^T A)^{-1} to obtain:

(A^T A)^{-1} = ([1 2 3; 1 1 1] [1 1; 2 1; 3 1])^{-1} = [14 6; 6 3]^{-1} = (1/6) [3 −6; −6 14].
Thus, the least squares solution is

x̂ = (A^T A)^{-1} A^T b = (1/6) [3 −6; −6 14] [1 2 3; 1 1 1] [7; 4; 3] = (1/6) [−12; 52] = [−2; 26/3].
Thus, the best-fit line is given by y = −2x + 26/3. To find out when the company begins to lose money, we find the x such that y = 0. This is given by x = 13/3 ≈ 4.33 years, which translates into 4 years and 4 months; thus, in the 5th month after the year-4 mark, the company will begin to lose money.
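The fit and the break-even point can be reproduced with NumPy's least-squares routine; this is only an optional check of the arithmetic above.

    import numpy as np

    years = np.array([1.0, 2.0, 3.0])
    profit = np.array([7.0, 4.0, 3.0])

    # Fit y = m*x + b by least squares
    A = np.column_stack([years, np.ones_like(years)])
    (m, b), *_ = np.linalg.lstsq(A, profit, rcond=None)
    print(m, b)       # approximately -2 and 26/3

    # Profit reaches zero when m*x + b = 0
    print(-b / m)     # approximately 13/3 ≈ 4.33 years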