December 9, 2015
• Final exam: Friday, 18 December, 1-3 PM in ST 208.
• Review sessions:
– Monday, 14 December, 1-3 PM in LCB 222
– Tuesday, 15 December, 1-3 PM in LCB 222
– Thursday, 17 December, 3:30-5:30 PM in LCB 222
Please bring questions and think of topics you would like me to review.
• For additional help, please make an appointment with me by email. I would also be happy to go through your past tests with you and give you suggestions for how to improve on the final exam.
• Comprehensive.
• Two pages of inventions and eight pages of multiple-part questions: think of the final exam as twice a regular test. 10 points per page, 100 points total, and each point counts twice as much as a regular test point. Thus the final exam counts four times as much as a regular test, so it is worth 40% of the final grade. That’s a lot! The reason the final exam is worth so much is that the concepts in linear algebra take time to absorb and I want students who may have struggled early in the course to be amply rewarded for figuring things out by the end of the course.
• You will have two full hours to complete the test, which is 20 minutes more than twice a regular test.
• The final exam will emphasize concepts slightly more than the regular tests. There may be more questions where you are given information and have to know how to use it, as in the SVD problem on Test #5.
• Study the five tests and the posted solutions thoroughly!
• Make use of the review sessions!
• When working on problems, try to understand what you are doing and why you are doing it instead of just memorizing the steps you have to follow. This will help prevent you from getting confused by what a problem is asking. It will also prevent you from having to memorize lots of algorithms. And it will certainly help you answer conceptual questions.
• Look over the five tests and the final exam from last year (available on my webpage).
Many but not all of the concepts and problems are similar. Last year’s tests are a good source of practice inventions.
• When reviewing concepts we saw early in the course, think about all the ways those concepts have been used throughout the course. For example Nul A, which is the subspace of all solutions of A~x = ~0, can be used to check whether vectors are independent, is used to compute eigenspaces, is the orthogonal complement of Row A, coincides with the kernel of the linear transformation R^n --A·--> R^m, and so on.
• When taking the test, try not to leave anything blank! If you get stuck on an invention, invent something that satisfies as much of the problem as you can. If you’re having trouble with an early part of a multiple-part question, guess a reasonable answer and use that answer on later parts. You can get full credit on later parts if your guess didn’t oversimplify the problem!
• Study the quizzes and their solutions, the recommended problems, and extra problems in the textbook.
• For a detailed list of topics to study, look at the study guides for Tests #2, #3, #4, and #5. The material on Test #1 is also relevant, even though I didn’t make a study guide for that test.
• If you are struggling with time management, come talk to me about how you can work more efficiently! Think about writing less extraneous information on the test, and work on understanding the concepts and definitions so you don't have to spend a long time trying to recall them.
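One concrete way to review is to double-check facts like the ones about Nul A listed above with a quick computation. The NumPy sketch below is purely illustrative (the course did not use code, and the matrix is made up): a nonzero vector in Nul A certifies that the columns are dependent and is orthogonal to every row.

```python
import numpy as np

# Made-up 2x2 matrix whose columns are dependent, so Nul A is nonzero.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# A vector in Nul A, found by row reduction: x1 + 2*x2 = 0.
v = np.array([2.0, -1.0])

# v solves A~x = ~0, so v is in Nul A.
assert np.allclose(A @ v, 0)

# A nonzero vector in Nul A means the columns are dependent:
# 2*(column 1) - 1*(column 2) = 0, and rank A < number of columns.
assert np.linalg.matrix_rank(A) == 1

# Nul A is the orthogonal complement of Row A:
# v is perpendicular to every row of A.
for row in A:
    assert np.isclose(row @ v, 0.0)
```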
The material covered in the course can be roughly grouped into the following four categories.
Think about definitions and intuition for the following concepts without relying on column vectors or matrices.
• Examples: R^n; P_n; M_{m×n}.
• Basic concepts: vector space; vector; vector addition; scalar multiplication; linear combination; span; independence; dependence; subspace; basis; dimension.
• Linear transformations: domain; codomain; image of a vector; kernel; range; one-to-one; onto. If the domain and codomain are the same vector space, then eigenvectors and eigenvalues make sense.
R^n
• Uses of row reduction:
– Solve A~x = ~b for ~x.
– Determine if vectors are independent, if they span R^n, and if they are a basis of R^n.
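Numerically, library solvers play the role of row reduction. As an illustrative sketch only (the system below is invented, and NumPy was not part of the course), here is both a solve and an independence check via rank:

```python
import numpy as np

# Invented square system A~x = ~b; np.linalg.solve stands in for
# row reduction when A is invertible.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)   # x really solves the system

# The columns of A are independent, span R^2, and form a basis of R^2
# exactly when row reduction gives a pivot in every row and column,
# i.e. when A has full rank.
assert np.linalg.matrix_rank(A) == 2
```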
• Linear transformations R^n --T--> R^m:
– Every such T is multiplication by an m × n matrix A. (If we care mostly about the matrix and don't want to give T a name, we can write R^n --A·--> R^m.)
– The kernel of T equals Nul A ; the range of T equals Col A .
– Row reduction can be used to compute Nul A , Col A , Row A , inverse matrices, and eigenvectors.
– The action of T can be visualized geometrically if n and m are at most 3.
• Dot product: gives a notion of orthogonality, which is used for orthogonal projection and least squares approximations.
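The projection idea behind least squares can be sketched in a few lines (the vectors below are made up; NumPy is assumed only for illustration): the residual after projecting is orthogonal to the line, which is what makes the projection the closest point.

```python
import numpy as np

# Invented vectors: orthogonally project b onto the line through a.
a = np.array([1.0, 2.0])
b = np.array([3.0, 1.0])

# Projection formula built from the dot product.
proj = (a @ b) / (a @ a) * a   # proj == [1., 2.]
resid = b - proj

# The residual is orthogonal to a, so proj is the closest point to b
# on the line -- the key idea behind least squares approximations.
assert np.isclose(resid @ a, 0.0)
```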
• Advanced algorithms for A :
– Diagonalization. A must be square, and it only works when A has enough independent eigenvectors (n of them for an n × n matrix). Good for computing powers of A and studying discrete dynamical systems.
– Orthogonal diagonalization. Works whenever A is symmetric. Used in the SVD.
– Singular value decomposition (SVD). Works for any m × n matrix A. Good for approximating A by lower rank matrices and studying trends in the rows and columns of A.
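To see how the SVD gives low-rank approximations, here is a hedged NumPy sketch with an invented 3 × 2 matrix: keeping only the largest singular value gives the best rank-1 approximation, and the error is exactly the next singular value.

```python
import numpy as np

# Invented 3x2 matrix; the SVD works for any m x n matrix.
A = np.array([[3.0, 0.0],
              [0.0, 2.0],
              [0.0, 0.0]])

U, s, Vt = np.linalg.svd(A)
# Reassemble A = U Sigma V^T (only the first two columns of U are needed).
assert np.allclose((U[:, :2] * s) @ Vt, A)

# Best rank-1 approximation: keep only the largest singular value.
A1 = s[0] * np.outer(U[:, 0], Vt[0])
assert np.linalg.matrix_rank(A1) == 1

# The approximation error (in the 2-norm) is the dropped singular value.
assert np.isclose(np.linalg.norm(A - A1, 2), s[1])
```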
• Coordinates allow you to use all of the special techniques for R n to study any abstract vector space or linear transformation.
Use the simplest basis you can think of for your abstract vector space to make the coordinates easy to compute.
• Changing coordinates is useful even on R^n, to make linear transformations R^n --A·--> R^m look simpler:
– Diagonalization: if A is square and diagonalizable, then changing coordinates using an eigenvector basis makes A look diagonal.
– Orthogonal diagonalization: like diagonalization except that the change of coordinates is orthogonal (it preserves lengths and angles).
– SVD: making orthogonal changes of coordinates in the domain and codomain makes any A look diagonal with non-negative entries.
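These factorizations can be checked numerically. This sketch (invented symmetric matrix; NumPy assumed purely for illustration) shows orthogonal diagonalization, the length-and-angle-preserving change of coordinates, and why a diagonal form makes powers easy:

```python
import numpy as np

# Invented symmetric matrix: orthogonal diagonalization A = Q D Q^T.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is for symmetric matrices; the eigenvector matrix Q is orthogonal.
evals, Q = np.linalg.eigh(A)
D = np.diag(evals)

# The change of coordinates is orthogonal: Q^T Q = I,
# so it preserves lengths and angles.
assert np.allclose(Q.T @ Q, np.eye(2))

# In the eigenvector basis, A looks diagonal.
assert np.allclose(Q @ D @ Q.T, A)

# Diagonalization makes powers cheap: A^5 = Q D^5 Q^T.
assert np.allclose(np.linalg.matrix_power(A, 5),
                   Q @ np.diag(evals**5) @ Q.T)
```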
• Solving systems of linear equations by row reduction.
• Studying discrete dynamical systems (including Markov chains) by diagonalization.
• Computing lines of best fit using least squares techniques.
• Other applications we discussed that will not be tested: eigenvectors in Google's PageRank algorithm; using the SVD to study Congress voting data and to compress images.
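The least-squares line of best fit mentioned above can be sketched numerically (invented data points; NumPy assumed, since the course itself worked these by hand via the normal equations):

```python
import numpy as np

# Invented data points (t_i, y_i), chosen to lie exactly on y = 1 + 2t.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])

# Design matrix for the model y = c0 + c1 t.
X = np.column_stack([np.ones_like(t), t])

# lstsq finds the least-squares solution, i.e. it solves
# the normal equations X^T X c = X^T y.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

assert np.allclose(coef, [1.0, 2.0])   # intercept 1, slope 2
```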