Linear Algebra I,
Solution of the Practice final exam


1. (14 points) All parts of this problem refer to the following matrix:
$$A = \begin{bmatrix} 1 & -1 & 1 \\ -1 & 1 & 1 \\ 1 & -1 & -1 \\ -1 & 1 & -1 \end{bmatrix}.$$
(a) (2 pts) Find rref(A), the reduced row echelon form of A.
$$A \to \begin{bmatrix} 1 & -1 & 1 \\ 0 & 0 & 2 \\ 0 & 0 & -2 \\ 0 & 0 & 0 \end{bmatrix} \to \begin{bmatrix} 1 & -1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} = \operatorname{rref}(A).$$
(b) (1 pt) Find the rank of A.
rank A = 2.
(c) (3 pts) Find dim N(A), the null space of A, and find a basis of N(A).
dim N(A) = 3 − rank A = 1. Solving Ax = 0 we get x1 = x2, x3 = 0 and x2 is free. Therefore, a basis of N(A) is $\left\{ \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} \right\}$.
(d) (3 pts) Find dim C(A), the column space of A, and find an orthonormal basis of C(A).
dim C(A) = rank A = 2. Therefore, a basis of C(A) is given by the 1st and 3rd columns of A, i.e., $\left\{ x^1 = \begin{bmatrix} 1 \\ -1 \\ 1 \\ -1 \end{bmatrix},\; x^2 = \begin{bmatrix} 1 \\ 1 \\ -1 \\ -1 \end{bmatrix} \right\}$. Now this is already an orthogonal basis. An orthonormal basis is $\left\{ q^1 = x^1/\|x^1\| = \frac{1}{2}\begin{bmatrix} 1 \\ -1 \\ 1 \\ -1 \end{bmatrix},\; q^2 = x^2/\|x^2\| = \frac{1}{2}\begin{bmatrix} 1 \\ 1 \\ -1 \\ -1 \end{bmatrix} \right\}$.

(e) (2 pts) Check whether the vector $u = \begin{bmatrix} 1 \\ 1 \\ 1 \\ 1 \end{bmatrix}$ is in C(A).
u is in C(A) iff Ax = u has a solution.
$$[A \,|\, u] \to \begin{bmatrix} 1 & -1 & 1 & 1 \\ 0 & 0 & 2 & 2 \\ 0 & 0 & -2 & 0 \\ 0 & 0 & 0 & 2 \end{bmatrix}$$
At this point it is obvious that Ax = u has no solution (the last row reads 0 = 2). Thus u is not in C(A).


(f) (3 pts) Solve the system Ax = b where $b = \begin{bmatrix} 3 \\ -1 \\ 1 \\ -3 \end{bmatrix}$.
$$[A \,|\, b] \to \begin{bmatrix} 1 & -1 & 1 & 3 \\ 0 & 0 & 2 & 2 \\ 0 & 0 & -2 & -2 \\ 0 & 0 & 0 & 0 \end{bmatrix} \to \begin{bmatrix} 1 & -1 & 1 & 3 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix} \to \begin{bmatrix} 1 & -1 & 0 & 2 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix} = \operatorname{rref}([A \,|\, b]).$$
Thus x1 = 2 + x2, x3 = 1 and x2 is free.
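As a quick cross-check of problem 1 (not part of the original solution), the rref, the null space, and the two augmented systems can be re-derived with SymPy, assuming it is available:

```python
from sympy import Matrix

A = Matrix([[ 1, -1,  1],
            [-1,  1,  1],
            [ 1, -1, -1],
            [-1,  1, -1]])

print(A.rref())              # pivots in columns 0 and 2, so rank A = 2
print(A.nullspace())         # one basis vector, proportional to (1, 1, 0)

u = Matrix([1, 1, 1, 1])
b = Matrix([3, -1, 1, -3])
print(A.row_join(u).rref())  # the augmented column gets a pivot, so Ax = u has no solution
print(A.row_join(b).rref())  # matches rref([A|b]) above, so Ax = b is solvable
```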


2. (15 points) All parts of this problem refer to the following matrix:
$$A = \begin{bmatrix} 1 & -1 & 0 \\ -1 & 2 & -1 \\ 2 & -1 & 1 \end{bmatrix}.$$
(a) (2 pts) Find det A using the cofactor method.
You can expand along any column or row. Here we expand along the first row: |A| = a11 C11 + a12 C12 + a13 C13 = 1 + 1 + 0 = 2.
(b) (1 pt) Find det A using Gaussian Elimination.
$$A \to \begin{bmatrix} 1 & -1 & 0 \\ 0 & 1 & -1 \\ 0 & 1 & 1 \end{bmatrix} \to \begin{bmatrix} 1 & -1 & 0 \\ 0 & 1 & -1 \\ 0 & 0 & 2 \end{bmatrix}.$$
Thus det A = det ref(A) = (1)(1)(2) = 2 since there were no row exchanges.
(c) (4 pts) Find $A^{-1}$ using the cofactor method.
The cofactor matrix is $C = \begin{bmatrix} 1 & -1 & -3 \\ 1 & 1 & -1 \\ 1 & 1 & 1 \end{bmatrix}$. Therefore,
$$A^{-1} = \frac{C^T}{|A|} = \frac{1}{2}\begin{bmatrix} 1 & 1 & 1 \\ -1 & 1 & 1 \\ -3 & -1 & 1 \end{bmatrix}.$$
(d) (3 pts) Find $A^{-1}$ using Gauss-Jordan Elimination.
$$[A \,|\, I] = \begin{bmatrix} 1 & -1 & 0 & 1 & 0 & 0 \\ -1 & 2 & -1 & 0 & 1 & 0 \\ 2 & -1 & 1 & 0 & 0 & 1 \end{bmatrix}, \qquad \operatorname{rref}([A \,|\, I]) = \begin{bmatrix} 1 & 0 & 0 & 1/2 & 1/2 & 1/2 \\ 0 & 1 & 0 & -1/2 & 1/2 & 1/2 \\ 0 & 0 & 1 & -3/2 & -1/2 & 1/2 \end{bmatrix}.$$
Thus we obtain the same answer as in part (c).

(e) (3 pts) Consider the system of equations Ax = b where $b = \begin{bmatrix} 0 \\ -2 \\ 4 \end{bmatrix}$. Use Cramer's rule to find only x1, the first variable.
$$x_1 = \frac{|B_1|}{|A|} = \frac{1}{2}\begin{vmatrix} 0 & -1 & 0 \\ -2 & 2 & -1 \\ 4 & -1 & 1 \end{vmatrix} = \frac{1}{2}\begin{vmatrix} -2 & -1 \\ 4 & 1 \end{vmatrix} = \frac{2}{2} = 1.$$
(f) (2 pts) Solve the system Ax = b in part (e) (you have to find x1, x2, x3) using any method you like.
Since A is invertible, $x = A^{-1}b = \begin{bmatrix} 1 \\ 1 \\ 3 \end{bmatrix}$. Thus x1 = x2 = 1 and x3 = 3.
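A minimal NumPy sanity check of problem 2 (illustrative only, assuming NumPy is installed): it recomputes det A, the inverse, the Cramer's-rule quotient for x1, and the full solution of Ax = b.

```python
import numpy as np

A = np.array([[ 1., -1.,  0.],
              [-1.,  2., -1.],
              [ 2., -1.,  1.]])
b = np.array([0., -2., 4.])

print(np.linalg.det(A))        # ~2.0, as in parts (a) and (b)
print(np.linalg.inv(A))        # matches C^T / |A| from parts (c) and (d)

B1 = A.copy()
B1[:, 0] = b                   # Cramer's rule: replace the first column of A by b
print(np.linalg.det(B1) / np.linalg.det(A))   # ~1.0, so x1 = 1

print(np.linalg.solve(A, b))   # [1., 1., 3.], as in part (f)
```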


3. (8 points) All parts of this problem refer to the following matrix:
$$A = \begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 1 \end{bmatrix}.$$
(a) (3 pts) Find χA(λ), the characteristic polynomial of A.
The characteristic polynomial of A is
$$\chi_A(\lambda) = |A - \lambda I| = \begin{vmatrix} 1-\lambda & 0 & 1 \\ 0 & 1-\lambda & 0 \\ 1 & 0 & 1-\lambda \end{vmatrix} = (1-\lambda)(\lambda^2 - 2\lambda) = -\lambda^3 + 3\lambda^2 - 2\lambda = -\lambda(\lambda-1)(\lambda-2).$$
(b) (5 pts) Find the eigenvalues of A and their corresponding eigenvectors.
The eigenvalues of A are λ1 = 0, λ2 = 1 and λ3 = 2. The corresponding eigenvectors are $x_1 = \begin{bmatrix} 1 \\ 0 \\ -1 \end{bmatrix}$, $x_2 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}$ and $x_3 = \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}$.
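For problem 3, the characteristic polynomial and the eigenpairs can be cross-checked numerically; this sketch assumes NumPy and is not part of the graded solution.

```python
import numpy as np

A = np.array([[1., 0., 1.],
              [0., 1., 0.],
              [1., 0., 1.]])

# np.poly(A) returns the coefficients of det(lambda*I - A); chi_A(lambda) is (-1)^3 times this.
print(np.poly(A))              # ~[1., -3., 2., 0.]  ->  lambda^3 - 3*lambda^2 + 2*lambda

vals, vecs = np.linalg.eig(A)
print(vals)                    # 0, 1, 2 in some order
print(vecs)                    # columns proportional to (1,0,-1), (0,1,0), (1,0,1), up to ordering
```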


4. (12 points) All parts of this problem refer to the following matrix:
$$A = \begin{bmatrix} -1 & 2 & 1 \\ -1 & 2 & 1 \\ -2 & 2 & 2 \end{bmatrix}.$$
(a) (1 pt) Is A invertible? Justify your answer. (You do not need to do any calculations to answer this question)
A is singular (not invertible) since the rows (and columns) are dependent; for instance, the first two rows are equal.


(b) (3 pts) Prove that $x_2 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}$ is an eigenvector of A and find its corresponding eigenvalue λ2.
$Ax_2 = \begin{bmatrix} 2 \\ 2 \\ 2 \end{bmatrix} = 2x_2$. Thus x2 is an eigenvector of A with corresponding eigenvalue λ2 = 2.
(c) (2 pts) Use parts (a) and (b) to find the three eigenvalues of A.
Since A is singular (part (a)), it follows that λ1 = 0. From part (b) we have λ2 = 2. Thus trace A = 3 = λ1 + λ2 + λ3 = 0 + 2 + λ3. Therefore, λ3 = 1.
(d) (6 pts) Is A diagonalizable? (Justify your answer.) If so, find a nonsingular matrix S and a diagonal matrix Λ such that $A = S\Lambda S^{-1}$.
A is diagonalizable since it has 3 distinct eigenvalues. To find S, we first have to find the other 2 eigenvectors. The eigenvector x1 corresponding to λ1 is obtained by solving (A − λ1 I)x1 = Ax1 = 0. Thus $x_1 = \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}$. The eigenvector x3 corresponding to λ3 is obtained by solving (A − λ3 I)x3 = (A − I)x3 = 0. Thus $x_3 = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}$. Thus
$$S = [x_1 \; x_2 \; x_3] = \begin{bmatrix} 1 & 1 & 1 \\ 0 & 1 & 1 \\ 1 & 1 & 0 \end{bmatrix} \quad \text{and} \quad \Lambda = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 1 \end{bmatrix}.$$
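A short numerical check of the diagonalization in part (d) (an illustration, under the assumption that NumPy is available): S Λ S⁻¹ should reproduce A.

```python
import numpy as np

A = np.array([[-1., 2., 1.],
              [-1., 2., 1.],
              [-2., 2., 2.]])
S = np.array([[1., 1., 1.],
              [0., 1., 1.],
              [1., 1., 0.]])
Lam = np.diag([0., 2., 1.])

print(np.allclose(S @ Lam @ np.linalg.inv(S), A))   # True: S Lambda S^{-1} reproduces A
```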
5. (7 points) Let $A = \begin{bmatrix} 0 & 2 \\ 2 & 0 \end{bmatrix}$.
(a) (4 pts) Orthogonally diagonalize A, i.e., find an orthogonal matrix Q and a diagonal matrix Λ such that $A = Q\Lambda Q^T$.
The characteristic polynomial is $\chi(\lambda) = \lambda^2 - 4 = (\lambda - 2)(\lambda + 2)$. Thus the eigenvalues are λ1 = 2 and λ2 = −2. The corresponding unit eigenvectors are $x_1 = \frac{1}{\sqrt{2}}\begin{bmatrix} 1 \\ 1 \end{bmatrix}$ and $x_2 = \frac{1}{\sqrt{2}}\begin{bmatrix} 1 \\ -1 \end{bmatrix}$. Thus
$$\Lambda = \begin{bmatrix} 2 & 0 \\ 0 & -2 \end{bmatrix}, \qquad Q = \frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}; \qquad \text{and } A = Q\Lambda Q^T.$$
(b) (3 pts) Use part (a) to find $A^k$, where k is a positive integer. (Your answer should be a single matrix.)
$$A^k = Q\Lambda^k Q^T = \frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}\begin{bmatrix} 2^k & 0 \\ 0 & (-2)^k \end{bmatrix}\frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} = \frac{1}{2}\begin{bmatrix} 2^k + (-2)^k & 2^k - (-2)^k \\ 2^k - (-2)^k & 2^k + (-2)^k \end{bmatrix}.$$
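The closed form for Aᵏ can be checked against repeated matrix multiplication for a few values of k; this is only an illustrative check, assuming NumPy.

```python
import numpy as np

A = np.array([[0., 2.],
              [2., 0.]])

def A_power_formula(k):
    s = 2.0**k + (-2.0)**k
    d = 2.0**k - (-2.0)**k
    return 0.5 * np.array([[s, d], [d, s]])

for k in range(1, 6):
    assert np.allclose(np.linalg.matrix_power(A, k), A_power_formula(k))
print("closed form matches A^k for k = 1..5")
```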
6. (7 points) Let B be the 2 × 2 matrix with eigenvalues λ1 = 2, λ2 = 0, and with corresponding eigenvectors $x_1 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$ and $x_2 = \begin{bmatrix} 1 \\ -1 \end{bmatrix}$.
(a) (1 pt) Is B diagonalizable? i.e., is there a nonsingular matrix S and a diagonal matrix Λ such that $B = S\Lambda S^{-1}$? Justify your answer.
Yes, since B has 2 independent eigenvectors, or since B has 2 distinct eigenvalues.
(b) (4 pts) Find B.
$B = S\Lambda S^{-1}$ where $\Lambda = \begin{bmatrix} 2 & 0 \\ 0 & 0 \end{bmatrix}$ and $S = \begin{bmatrix} 1 & 1 \\ 0 & -1 \end{bmatrix}$ (note that $S^{-1} = S$ for this particular S). Thus
$$B = \begin{bmatrix} 1 & 1 \\ 0 & -1 \end{bmatrix}\begin{bmatrix} 2 & 0 \\ 0 & 0 \end{bmatrix}\begin{bmatrix} 1 & 1 \\ 0 & -1 \end{bmatrix} = \begin{bmatrix} 2 & 2 \\ 0 & 0 \end{bmatrix}.$$
(c) (2 pts) Is B orthogonally diagonalizable? i.e., is there an orthogonal matrix Q and a diagonal matrix Λ such that $B = Q\Lambda Q^T$? Justify your answer.
No, since B is not symmetric, or since x1 is not orthogonal to x2, or since {x1, x2} is not an orthonormal basis.
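As an illustration of part (b) (assuming NumPy; not part of the original solution), B can be rebuilt from its eigenpairs and its eigenvalues re-verified:

```python
import numpy as np

S = np.array([[1.,  1.],
              [0., -1.]])        # columns are the eigenvectors x1 and x2
Lam = np.diag([2., 0.])          # the matching eigenvalues

B = S @ Lam @ np.linalg.inv(S)
print(B)                         # [[2., 2.], [0., 0.]]
print(np.linalg.eig(B)[0])       # eigenvalues 2 and 0, as required
```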
7. (a) Find the equation of the straight line y = a0 + a1 t which best fits the following
points:
t 1 2 3 4
y 1 2 2 3

$$A = \begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 3 \\ 1 & 4 \end{bmatrix} \quad \text{and} \quad b = \begin{bmatrix} 1 \\ 2 \\ 2 \\ 3 \end{bmatrix}.$$
Thus
$$A^TA = \begin{bmatrix} 4 & 10 \\ 10 & 30 \end{bmatrix}. \qquad \text{Hence, } (A^TA)^{-1} = \frac{1}{20}\begin{bmatrix} 30 & -10 \\ -10 & 4 \end{bmatrix} = \begin{bmatrix} 3/2 & -1/2 \\ -1/2 & 1/5 \end{bmatrix} \quad \text{and} \quad A^Tb = \begin{bmatrix} 8 \\ 23 \end{bmatrix}.$$
Thus $\hat{x} = \begin{bmatrix} a_0 \\ a_1 \end{bmatrix} = (A^TA)^{-1}A^Tb = \begin{bmatrix} 1/2 \\ 3/5 \end{bmatrix}$. Thus the equation of the line which best fits these points is y = 1/2 + (3/5)t = 0.5 + 0.6t.

(b) Find the projection matrix onto C(A), where $A = \begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 3 \\ 1 & 4 \end{bmatrix}$.
The projection matrix onto C(A) is
$$P = A(A^TA)^{-1}A^T = \frac{1}{20}\begin{bmatrix} 14 & 8 & 2 & -4 \\ 8 & 6 & 4 & 2 \\ 2 & 4 & 6 & 8 \\ -4 & 2 & 8 & 14 \end{bmatrix}.$$
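A quick numerical confirmation of problem 7 (illustrative, assuming NumPy): a least-squares solver recovers the same line, and the projection matrix P is idempotent and symmetric as expected.

```python
import numpy as np

A = np.array([[1., 1.], [1., 2.], [1., 3.], [1., 4.]])
b = np.array([1., 2., 2., 3.])

x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_hat)                                   # [0.5, 0.6]  ->  y = 0.5 + 0.6 t

P = A @ np.linalg.inv(A.T @ A) @ A.T
print(np.round(20 * P))                        # [[14, 8, 2, -4], [8, 6, 4, 2], [2, 4, 6, 8], [-4, 2, 8, 14]]
print(np.allclose(P @ P, P), np.allclose(P, P.T))   # True True: P is idempotent and symmetric
```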


 
 
8. (5 points) Let $x^1 = \begin{bmatrix} 1 \\ 1 \\ 0 \\ 0 \end{bmatrix}$, $x^2 = \begin{bmatrix} 2 \\ 2 \\ 1 \\ 1 \end{bmatrix}$, $x^3 = \begin{bmatrix} 1 \\ 1 \\ 2 \\ 0 \end{bmatrix}$.
(a) (1 pt) Are $x^1, x^2, x^3$ linearly independent? Justify your answer.
$$[x^1 \; x^2 \; x^3] = \begin{bmatrix} 1 & 2 & 1 \\ 1 & 2 & 1 \\ 0 & 1 & 2 \\ 0 & 1 & 0 \end{bmatrix} \to \begin{bmatrix} 1 & 2 & 1 \\ 0 & 0 & 0 \\ 0 & 1 & 2 \\ 0 & 1 & 0 \end{bmatrix} \to \begin{bmatrix} 1 & 2 & 1 \\ 0 & 1 & 2 \\ 0 & 0 & -2 \\ 0 & 0 & 0 \end{bmatrix}.$$
Therefore, $x^1, x^2, x^3$ are linearly independent since every column has a pivot.
(b) (4 pts) Use the Gram-Schmidt orthogonalization process to find an orthogonal basis for Span({x^1, x^2, x^3}).
First we find an orthogonal basis: $u^1 = x^1$,
$$u^2 = x^2 - \frac{(x^2)^T u^1}{(u^1)^T u^1}\, u^1 = \begin{bmatrix} 2 \\ 2 \\ 1 \\ 1 \end{bmatrix} - \frac{4}{2}\begin{bmatrix} 1 \\ 1 \\ 0 \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 1 \\ 1 \end{bmatrix},$$
$$u^3 = x^3 - \frac{(x^3)^T u^1}{(u^1)^T u^1}\, u^1 - \frac{(x^3)^T u^2}{(u^2)^T u^2}\, u^2 = \begin{bmatrix} 1 \\ 1 \\ 2 \\ 0 \end{bmatrix} - \frac{2}{2}\begin{bmatrix} 1 \\ 1 \\ 0 \\ 0 \end{bmatrix} - \frac{2}{2}\begin{bmatrix} 0 \\ 0 \\ 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 1 \\ -1 \end{bmatrix}.$$

(c) (2 pts) Find the coordinates of the vector $v = \begin{bmatrix} 1 \\ 1 \\ 1 \\ 1 \end{bmatrix}$ w.r.t. the basis $\{u^1, u^2, u^3\}$ obtained in part (b).
The coordinates of v w.r.t. the basis $U = \{u^1, u^2, u^3\}$ are α1, α2, α3 where $v = \alpha_1 u^1 + \alpha_2 u^2 + \alpha_3 u^3$. Now $(u^1)^T v = \alpha_1 (u^1)^T u^1$, thus α1 = 1. $(u^2)^T v = \alpha_2 (u^2)^T u^2$, thus α2 = 1. Finally, $(u^3)^T v = \alpha_3 (u^3)^T u^3$, thus α3 = 0. Hence, $[v]_U = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}$.
(d) (1 pt) Find an orthonormal basis for Span({x^1, x^2, x^3}).
An orthonormal basis is
$$\left\{ q^1 = u^1/\|u^1\| = \frac{1}{\sqrt{2}}\begin{bmatrix} 1 \\ 1 \\ 0 \\ 0 \end{bmatrix},\; q^2 = u^2/\|u^2\| = \frac{1}{\sqrt{2}}\begin{bmatrix} 0 \\ 0 \\ 1 \\ 1 \end{bmatrix},\; q^3 = u^3/\|u^3\| = \frac{1}{\sqrt{2}}\begin{bmatrix} 0 \\ 0 \\ 1 \\ -1 \end{bmatrix} \right\}.$$
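The Gram-Schmidt computation in parts (b) and (d) can be mirrored by a small helper; this is a sketch assuming NumPy, with the function name gram_schmidt chosen here only for illustration.

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt without normalization: returns an orthogonal basis."""
    basis = []
    for x in vectors:
        u = x.astype(float)
        for v in basis:
            u = u - (x @ v) / (v @ v) * v    # subtract the projection of x onto v
        basis.append(u)
    return basis

x1 = np.array([1, 1, 0, 0])
x2 = np.array([2, 2, 1, 1])
x3 = np.array([1, 1, 2, 0])

u1, u2, u3 = gram_schmidt([x1, x2, x3])
print(u1, u2, u3)                                       # (1,1,0,0), (0,0,1,1), (0,0,1,-1)
print([u / np.linalg.norm(u) for u in (u1, u2, u3)])    # the orthonormal basis of part (d)
```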


 
 
9. (5 points) Let $x^1 = \begin{bmatrix} 1 \\ 1 \\ 0 \\ 0 \end{bmatrix}$, $x^2 = \begin{bmatrix} 2 \\ 2 \\ 1 \\ 1 \end{bmatrix}$, $x^3 = \begin{bmatrix} 3 \\ 3 \\ 1 \\ 1 \end{bmatrix}$.
(a) (1 pt) Are $x^1, x^2, x^3$ linearly independent? Justify your answer.
$$[x^1 \; x^2 \; x^3] = \begin{bmatrix} 1 & 2 & 3 \\ 1 & 2 & 3 \\ 0 & 1 & 1 \\ 0 & 1 & 1 \end{bmatrix} \to \begin{bmatrix} 1 & 2 & 3 \\ 0 & 0 & 0 \\ 0 & 1 & 1 \\ 0 & 1 & 1 \end{bmatrix} \to \begin{bmatrix} 1 & 2 & 3 \\ 0 & 1 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{bmatrix} \to \begin{bmatrix} 1 & 2 & 3 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}.$$
Therefore, these vectors are linearly dependent since the third column has no pivot.
(b) Let S = span of $\left\{ A_1 = \begin{bmatrix} 1 & 0 \\ 1 & 0 \end{bmatrix},\; A_2 = \begin{bmatrix} 2 & 1 \\ 2 & 1 \end{bmatrix},\; A_3 = \begin{bmatrix} 3 & 1 \\ 3 & 1 \end{bmatrix} \right\}$. Find a basis of S.
Note that these matrices are related to the vectors in part (a). In particular, to check whether these matrices are linearly independent we need to solve $\alpha_1 A_1 + \alpha_2 A_2 + \alpha_3 A_3 = 0$. This is the same as $\alpha_1 x^1 + \alpha_2 x^2 + \alpha_3 x^3 = 0$. But we saw in part (a) that $x^1$ and $x^2$ are independent since their corresponding columns have pivots. Hence, $A_1$ and $A_2$ are linearly independent and thus $\{A_1, A_2\}$ is a basis of S.
10. (5 points) Let the characteristic polynomial of a matrix A be $\chi_A(\lambda) = (-1-\lambda)^2(3-\lambda)$.
(a) (1 pt) What is the order (size or dimension) of A?
A is 3 × 3.
(b) (2 pts) What is the trace of A?
trace A is −1 − 1 + 3 = 1.
(c) (2 pts) Is A nonsingular (invertible)? Justify your answer.
A is nonsingular since det A = (−1)(−1)(3) = 3 ≠ 0.
11. (6 points)


(a) (3 pts) Let S be the set of vectors $x = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}$ in R³ such that x2 = 5x1 and x3 = 0. Is S a subspace of R³? Justify your answer. If S indeed is a subspace of R³, then find a basis of S and the dimension of S.
True, S is a subspace of R³.
Proof: let x and y be any two vectors in S. Then $x = \begin{bmatrix} a \\ 5a \\ 0 \end{bmatrix}$ and $y = \begin{bmatrix} b \\ 5b \\ 0 \end{bmatrix}$ for some a and b. Thus $x + y = \begin{bmatrix} a+b \\ 5a+5b \\ 0 \end{bmatrix}$ is also in S. Moreover, for any scalar α, $\alpha x = \begin{bmatrix} \alpha a \\ 5\alpha a \\ 0 \end{bmatrix}$ is also in S. Hence, S is a subspace of R³.
Another proof: any vector in S is of the form $\begin{bmatrix} a \\ 5a \\ 0 \end{bmatrix}$. Thus S = span of $\left\{ \begin{bmatrix} 1 \\ 5 \\ 0 \end{bmatrix} \right\}$. Therefore S is a subspace of R³.
Moreover, a basis of S is $\left\{ \begin{bmatrix} 1 \\ 5 \\ 0 \end{bmatrix} \right\}$ and dim S = 1.
(b) (3 pts) Let S2 be the set of vectors $x = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}$ in R² such that x1 ≥ 0 and x2 ≤ 0. Is S2 a subspace of R²? Justify your answer. If S2 indeed is a subspace of R², then find a basis of S2 and the dimension of S2.
False, S2 is NOT a subspace of R². Let $x = \begin{bmatrix} 1 \\ -1 \end{bmatrix}$ and let α = −1. Then x is in S2 but $\alpha x = \begin{bmatrix} -1 \\ 1 \end{bmatrix}$ is not in S2.
12. (6 points) Let $A = \begin{bmatrix} 1 & \alpha \\ 0 & 2 \end{bmatrix}$, where α is a scalar.
(a) (1 pt) For what values of α is A nonsingular (invertible)? Justify your answer.
A is nonsingular iff det A ≠ 0. But det A = 2, independent of α. Therefore, A is nonsingular for all α.
(b) (2 pt) For what values of α is A diagonalizable? Justify your answer.
For all values of α, the eigenvalues of A are 1 and 2 since A is upper triangular. Thus A is diagonalizable for all α since it has 2 distinct eigenvalues.
(c) (3 pts) For what values of α is A orthogonally diagonalizable? i.e., $A = Q\Lambda Q^T$ where Q is an orthogonal matrix and Λ is a diagonal matrix. Justify your answer.
A is orthogonally diagonalizable iff A is symmetric, i.e., iff α = 0.
13. (6 pts) Let λ1 , . . . , λn be the eigenvalues of matrix A. Assume that A is diagonalizable.
(a) Prove that $(\lambda_1)^2, \ldots, (\lambda_n)^2$ are the eigenvalues of the matrix $A^2$.
Since A is diagonalizable, $A = S\Lambda S^{-1}$ where Λ is a diagonal matrix consisting of the eigenvalues of A. Therefore, $A^2 = S\Lambda S^{-1} S\Lambda S^{-1} = S\Lambda^2 S^{-1}$. Thus the result follows.
(b) Prove that det A = λ1 · · · λn.
Since A is diagonalizable, $A = S\Lambda S^{-1}$ where Λ is a diagonal matrix consisting of the eigenvalues of A. Therefore, $\det A = \det(S\Lambda S^{-1}) = \det S \det \Lambda \det S^{-1} = \det \Lambda$ since $\det S \det S^{-1} = 1$. The result follows since $\det \Lambda = \lambda_1 \cdots \lambda_n$, Λ being a diagonal matrix.
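Problem 13 asks for proofs, but both facts are easy to illustrate numerically on a small diagonalizable matrix; the matrix below is an arbitrary example chosen only for this check (assuming NumPy):

```python
import numpy as np

A = np.array([[2., 1.],
              [0., 3.]])                  # diagonalizable: distinct eigenvalues 2 and 3

vals = np.linalg.eigvals(A)
print(sorted(vals))                       # [2.0, 3.0]
print(sorted(np.linalg.eigvals(A @ A)))   # [4.0, 9.0] = the squares, as in part (a)
print(np.linalg.det(A), np.prod(vals))    # both 6.0, as in part (b)
```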
14. (10 points) Every part of this problem is independent.
(a) (3 points) Let $q^1, q^2, q^3$ be an orthonormal basis of R³ and let $x = \alpha_1 q^1 + \alpha_2 q^2 + \alpha_3 q^3$, where α1, α2, α3 are scalars. Find α1 in terms of $x, q^1, q^2, q^3$.
Multiplying by $(q^1)^T$ we get $(q^1)^T x = \alpha_1 (q^1)^T q^1 + \alpha_2 (q^1)^T q^2 + \alpha_3 (q^1)^T q^3$. But $(q^1)^T q^1 = 1$ and $(q^1)^T q^2 = (q^1)^T q^3 = 0$ since $q^1, q^2, q^3$ form an orthonormal basis. Therefore, $\alpha_1 = (q^1)^T x$.
(b) (3 pts) Let A and B be two n × n matrices. Is it true or false that det(AB) = det(BA)? Justify your answer.
True. |AB| = |A| |B|. On the other hand, |BA| = |B| |A|. But |A| |B| = |B| |A| since determinants are scalars and scalars commute. Therefore, |AB| = |BA|.
(c) (4 pts) Let A and B be two 3 × 3 matrices such that det(A) = 2 and det(B) = 3.
Let $C = A^T(2B)^{-1}$. Find det(C).
$|(2B)^{-1}| = 1/|2B| = 1/(2^3|B|) = 1/24$. Hence $|C| = |A^T|\,|(2B)^{-1}| = 2/24 = 1/12$.
(d) (6 pts) Prove that $N(A) = N(A^TA)$, where N(A) is the null space of A.
To prove that two sets, say S1 and S2, are equal, we need to prove two things: first, S1 ⊆ S2 (i.e., S1 is a subset of S2, or every element of S1 is an element of S2); second, S2 ⊆ S1.
Therefore, first we prove that $N(A) \subseteq N(A^TA)$. Let $x \in N(A)$; then Ax = 0. Hence, $A^TAx = 0$. Thus $x \in N(A^TA)$. Consequently, $N(A) \subseteq N(A^TA)$.
Second, we prove that $N(A^TA) \subseteq N(A)$. Let $x \in N(A^TA)$; then $A^TAx = 0$. Hence, $x^TA^TAx = 0$, or $(Ax)^TAx = 0$, or $\|Ax\|^2 = 0$. Thus Ax = 0 since the only vector whose norm is 0 is the zero vector. Therefore, $x \in N(A)$. Consequently, $N(A^TA) \subseteq N(A)$. As a result, $N(A) = N(A^TA)$.
15. Consider the transformation T : R² → R² defined by
$$T\!\left( \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \right) = \begin{bmatrix} x_1 - x_2 \\ -x_1 + x_2 \end{bmatrix}.$$
(a) Determine whether or not T is linear. Justify your answer.
T is linear since
$$T\!\left( c\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \right) = \begin{bmatrix} cx_1 - cx_2 \\ -cx_1 + cx_2 \end{bmatrix} = c\, T\!\left( \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \right)$$
and
$$T\!\left( \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} + \begin{bmatrix} y_1 \\ y_2 \end{bmatrix} \right) = \begin{bmatrix} x_1 + y_1 - x_2 - y_2 \\ -x_1 - y_1 + x_2 + y_2 \end{bmatrix} = \begin{bmatrix} x_1 - x_2 \\ -x_1 + x_2 \end{bmatrix} + \begin{bmatrix} y_1 - y_2 \\ -y_1 + y_2 \end{bmatrix} = T\!\left( \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \right) + T\!\left( \begin{bmatrix} y_1 \\ y_2 \end{bmatrix} \right).$$
(b) Find the matrix representation of T w.r.t. the standard basis in R².
$$T\!\left( \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \right) = \begin{bmatrix} 1 & -1 \\ -1 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}.$$
Therefore, the matrix representation of T w.r.t. the standard basis in R² is $A = \begin{bmatrix} 1 & -1 \\ -1 & 1 \end{bmatrix}$.
(c) Find the matrix representation of T w.r.t. the basis $B = \left\{ \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \begin{bmatrix} 1 \\ -1 \end{bmatrix} \right\}$ of R².
$B = \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}$. Thus $[A]_B = B^{-1}AB = \begin{bmatrix} 0 & 0 \\ 0 & 2 \end{bmatrix}$.
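Finally, the matrices found in problem 15(b) and (c) can be checked by a change-of-basis computation (an illustrative sketch assuming NumPy):

```python
import numpy as np

A = np.array([[ 1., -1.],
              [-1.,  1.]])       # standard matrix of T from part (b)

B = np.array([[1.,  1.],
              [1., -1.]])        # columns are the basis vectors in B

print(np.linalg.inv(B) @ A @ B)  # diag(0, 2): the matrix of T with respect to B, as in part (c)
```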