Notes on Homework 10/Practice Exam 2
Due: Never
1.
(a) A subspace S of a vector space V is a nonempty subset of V that is closed under addition and scalar multiplication.
(b) A linear transformation T : V → W is a function from vector space V to vector space
W such that, for any v1 , v2 ∈ V, T (v1 + v2 ) = T (v1 ) + T (v2 ) and, for any α in the field,
T (α v1 ) = α T (v1 ).
(c) An eigenvalue λ is a constant associated to a matrix A such that there is a vector v with
Av = λv.
(d) The image of a linear transformation T : V → W is {w ∈ W | T(v) = w for some v ∈ V}.
(e) A linear transformation is injective if T(v1) = T(v2) necessarily means that v1 = v2.
Alternatively, T is injective if for all w ∈ W, there is no more than one v ∈ V such that
T (v) = w.
(f) A set of vectors B is a spanning set for vector space V if v is in the span of B for any
v ∈ V.
(g) A basis of a vector space is just a linearly independent spanning set.
2. Nonempty, closed under addition, and closed under scalar multiplication.
3. Simply put, T (v1 + v2 ) = T (v1 ) + T (v2 ) and T (α v) = α T (v).
4. Here are eight (plus a couple extras):
• A is not invertible.
• A has 0 as an eigenvalue.
• RREF ( A) is not an identity matrix.
• The nullspace of A contains more than just the zero vector.
• The columns of A are linearly dependent.
• The columns of A do not form a basis for the domain.
• The rank of A is less than the number of columns of A.
• The nullity of A is positive.
• det( A) = 0.
• Ax = b does not have a unique solution for every choice of b.
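As a sanity check (not part of the original solution), several of these equivalent conditions can be observed at once on any singular matrix; a minimal NumPy sketch, using one arbitrarily chosen singular matrix:

```python
import numpy as np

# A singular 2x2 matrix: the second column is twice the first,
# so the columns are linearly dependent.
A = np.array([[1.0, 2.0],
              [3.0, 6.0]])

print(np.linalg.det(A))                 # ~0: determinant vanishes
print(np.linalg.matrix_rank(A))         # 1 < 2: rank less than number of columns
print(np.linalg.eigvals(A))             # one eigenvalue is 0
# Nullity = number of columns minus rank (rank-nullity), so nullity > 0:
print(A.shape[1] - np.linalg.matrix_rank(A))
```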
5. We need to show that three properties hold (as described in #2):
Nonempty: The polynomial p(z) = 0 is certainly in S since p(0) = 0. ✓
Closed under addition: Let g(z), h(z) ∈ S. Then (g + h)(z) is certainly a polynomial, and (g + h)(0) = g(0) + h(0) = 0 + 0 = 0, by definition of function addition. So (g + h)(z) ∈ S. ✓
Professor Dan Bates
Colorado State University
M369 Linear Algebra
Fall 2008
Closed under scalar multiplication: Let g(z) ∈ S, α ∈ F. (αg)(z) is certainly a polynomial, and (αg)(0) = αg(0) = α · 0 = 0. So (αg)(z) ∈ S. ✓
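The two closure computations above can be spot-checked numerically (an illustration, not a proof); a small Python sketch, assuming polynomials are represented by coefficient lists [a0, a1, ...], so that membership in S means a0 = p(0) = 0:

```python
# A polynomial a0 + a1*z + a2*z**2 + ... is stored as [a0, a1, a2, ...].

def add(p, q):
    """Coefficient-wise sum of two polynomials."""
    n = max(len(p), len(q))
    p = p + [0] * (n - len(p))
    q = q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def scale(alpha, p):
    """Scalar multiple of a polynomial."""
    return [alpha * a for a in p]

g = [0, 2, 5]   # 2z + 5z^2, in S since g(0) = 0
h = [0, -1]     # -z, in S since h(0) = 0

print(add(g, h))    # [0, 1, 5]: constant term still 0
print(scale(3, g))  # [0, 6, 15]: constant term still 0
```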
6.
(a) We need to show that two properties hold (as described in #3):
Let x = (x1, x2, x3)^T, y = (y1, y2, y3)^T ∈ R3. Then
T(x + y) = T((x1 + y1, x2 + y2, x3 + y3)^T) = ((x1 + y1) − (x2 + y2), 2(x3 + y3) + (x2 + y2))^T
= (x1 − x2, 2x3 + x2)^T + (y1 − y2, 2y3 + y2)^T = T(x) + T(y). ✓
Also, let α ∈ F. Then T(αx) = T((αx1, αx2, αx3)^T) = (αx1 − αx2, 2αx3 + αx2)^T = α(x1 − x2, 2x3 + x2)^T = αT(x). ✓
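The two linearity checks above can also be verified numerically on sample inputs; a NumPy sketch (the test vectors and scalar are chosen arbitrarily):

```python
import numpy as np

def T(v):
    """T(x1, x2, x3) = (x1 - x2, 2*x3 + x2), the map from this problem."""
    x1, x2, x3 = v
    return np.array([x1 - x2, 2 * x3 + x2])

x = np.array([1.0, 2.0, 3.0])
y = np.array([-4.0, 0.5, 2.0])
alpha = 7.0

print(np.allclose(T(x + y), T(x) + T(y)))       # True: additivity
print(np.allclose(T(alpha * x), alpha * T(x)))  # True: homogeneity
```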
(b) By inspection, [T]ε←ε = [ 1 −1 0 ; 0 1 2 ].
(c) A linear transformation is injective if and only if the kernel (a.k.a. nullspace) is trivial, i.e., just the zero vector. Row-reducing our matrix, we get to [ 1 0 2 ; 0 1 2 ]. The kernel is nontrivial, so the linear transformation is not injective. Alternatively, you could notice that both (0, 0, 0)^T and (−2, −2, 1)^T map to (0, 0)^T.
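A quick NumPy sketch supporting the kernel claim (the matrix is the one found in part (b)):

```python
import numpy as np

# Matrix of T with respect to the standard bases.
M = np.array([[1.0, -1.0, 0.0],
              [0.0,  1.0, 2.0]])

# Rank 2 < 3 columns, so the nullspace is nontrivial and T is not injective.
print(np.linalg.matrix_rank(M))  # 2

# (-2, -2, 1) lies in the kernel, so it maps to the same output as 0.
k = np.array([-2.0, -2.0, 1.0])
print(M @ k)  # the zero vector
```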
7.
(a) pA(x) = det [ 1 − x 5 ; 5 1 − x ] = (1 − x)^2 − 25 = x^2 − 2x − 24 = (x − 6)(x + 4).
(b) The eigenvalues of A are the roots of pA(x): x = −4, 6.
(c) For λ = −4, we have εA(−4) = null(A − λI) = null(A + 4I) = null [ 5 5 ; 5 5 ].
Row-reducing, we get to [ 1 1 ; 0 0 ], so εA(−4) = span{(−1, 1)^T}.
For λ = 6, we have εA(6) = null(A − λI) = null(A − 6I) = null [ −5 5 ; 5 −5 ].
Row-reducing, we get to [ 1 −1 ; 0 0 ], so εA(6) = span{(1, 1)^T}.
(d) Yes, A is diagonalizable. We know this because, for each eigenvalue, the algebraic
multiplicity and the geometric multiplicity are the same.
(e) Yes, A is invertible, since 0 is not an eigenvalue.
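Parts (b), (c), and (e) can be cross-checked with NumPy; a sketch:

```python
import numpy as np

A = np.array([[1.0, 5.0],
              [5.0, 1.0]])

# (b) The roots of x^2 - 2x - 24 are -4 and 6.
evals = np.linalg.eigvalsh(A)            # eigvalsh: A is symmetric
print(np.allclose(evals, [-4.0, 6.0]))   # True (eigvalsh sorts ascending)

# (c) The eigenvectors found above.
print(np.allclose(A @ [-1.0, 1.0], -4 * np.array([-1.0, 1.0])))  # True
print(np.allclose(A @ [1.0, 1.0], 6 * np.array([1.0, 1.0])))     # True

# (e) Invertible since no eigenvalue is 0; det = product of eigenvalues.
print(np.linalg.det(A))  # -24, up to rounding
```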
8.
(a)
(b) Let v1 = (1, 2)^T, v2 = (3, 0)^T. Then
[id]E←B = ([id(v1)]E [id(v2)]E) = [ 1 3 ; 2 0 ],
so that
[id]B←E = ([id]E←B)^−1 = [ 0 1/2 ; 1/3 −1/6 ].
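The inverse can be double-checked exactly with Python's fractions module; a sketch using the standard 2×2 inverse formula:

```python
from fractions import Fraction as F

# [id]_{E<-B} has the B-basis vectors as its columns.
P = [[F(1), F(3)],
     [F(2), F(0)]]

# 2x2 inverse: (1/det) * [[d, -b], [-c, a]].
a, b = P[0]
c, d = P[1]
det = a * d - b * c          # -6
P_inv = [[ d / det, -b / det],
         [-c / det,  a / det]]
print(P_inv)                 # entries: 0, 1/2, 1/3, -1/6
```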
(c) Then
[f]B←B = [id]B←E [f]E←E [id]E←B = [ 0 1/2 ; 1/3 −1/6 ] [ 1 5 ; 5 1 ] [ 1 3 ; 2 0 ] = [ 7/2 15/2 ; 5/2 −3/2 ].
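The triple product can be verified exactly in Python; a sketch where matmul is a small helper written here (not a library call):

```python
from fractions import Fraction as F

def matmul(X, Y):
    """Exact 2x2 matrix product over the rationals."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A     = [[F(1), F(5)], [F(5), F(1)]]             # [f]_{E<-E}
P     = [[F(1), F(3)], [F(2), F(0)]]             # [id]_{E<-B}
P_inv = [[F(0), F(1, 2)], [F(1, 3), F(-1, 6)]]   # [id]_{B<-E}

result = matmul(matmul(P_inv, A), P)
print(result)   # entries: 7/2, 15/2, 5/2, -3/2
```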
9.
(a) True. f is injective if and only if the nullity is 0. The rank-nullity theorem tells us that rank(f) + nullity(f) = dim(W) (the domain). Since image(f) ⊆ V, rank(f) = dim(im(f)) ≤ dim V < dim W, so nullity(f) > 0.
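The dimension-count argument can be illustrated numerically; a NumPy sketch using an arbitrary random 2×3 matrix to play the role of f : W → V with dim W = 3 > dim V = 2:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((2, 3))   # some linear map from R^3 to R^2

rank = np.linalg.matrix_rank(M)
nullity = 3 - rank                # rank-nullity: rank + nullity = dim(domain)
print(rank, nullity)              # rank <= 2, so nullity >= 1: not injective
```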
(b) False. As a simple counterexample, take (1, 0)^T, (0, 1)^T, (1, 1)^T ∈ R2. The first two are linearly independent, but the three together certainly are not!
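The counterexample is easy to confirm with a rank computation; a NumPy sketch:

```python
import numpy as np

# Stack the three vectors as rows; rank < 3 means they are dependent.
V = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
print(np.linalg.matrix_rank(V[:2]))  # 2: the first two are independent
print(np.linalg.matrix_rank(V))      # 2 < 3: all three are dependent
```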
(c) True. Since the first n vectors span V, any v ∈ V can be written as a linear combination
of these n vectors. This linear combination can be extended to all n + 1 vectors simply
by using the coefficient of 0 for vn+1 . Thus, any vector v ∈ V can be written as a linear
combination of all n + 1 vectors, i.e., they span V.
(d) False. We had a homework problem about this. Recall the example of two lines in R3. That example is closed under scalar multiplication but definitely not under addition. In particular, take the spans of (1, 0, 0)^T and (0, 1, 0)^T in R3 as U and W, respectively. Then (1, 0, 0)^T, (0, 1, 0)^T ∈ U ∪ W, but (1, 0, 0)^T + (0, 1, 0)^T = (1, 1, 0)^T ∉ U ∪ W.
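This counterexample can be checked mechanically; a Python sketch where in_union is a hypothetical membership test written for these two particular lines:

```python
import numpy as np

def in_union(v):
    """True iff v lies on the x-axis (U) or the y-axis (W) in R^3."""
    on_U = np.allclose(v[1:], 0)           # multiples of (1, 0, 0)
    on_W = np.allclose([v[0], v[2]], 0)    # multiples of (0, 1, 0)
    return on_U or on_W

u = np.array([1.0, 0.0, 0.0])
w = np.array([0.0, 1.0, 0.0])
print(in_union(u), in_union(w))   # True True
print(in_union(u + w))            # False: the union is not closed under +
```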