Lecture 11
Fundamental Theorems of Linear
Algebra
Orthogonality and Projection
Shang-Hua Teng
The Whole Picture
• Rank(A) = m = n
Ax = b has a unique solution
R = I
• Rank(A) = m < n
Ax = b has an (n − m)-dimensional set of solutions
R = [ I F ]
• Rank(A) < m, Rank(A) < n
Ax = b has 0 solutions or an (n − rank(A))-dimensional set of solutions
R = [ I F ; 0 0 ]
• Rank(A) = n < m
Ax = b has 0 or 1 solution
R = [ I ; 0 ]
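The rank cases above can be checked numerically; here is a minimal sketch assuming Python with NumPy (not part of the lecture):

```python
import numpy as np

# Case rank(A) = m = n: square, full rank, unique solution.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 4.0])
x = np.linalg.solve(A, b)                      # the unique solution
assert np.allclose(A @ x, b)

# Case rank(A) = m < n: a 1-by-2 system, solutions form a line
# (an (n - m) = 1-dimensional set): particular solution + nullspace.
A2 = np.array([[1.0, 1.0]])
b2 = np.array([2.0])
xp = np.linalg.lstsq(A2, b2, rcond=None)[0]    # one particular solution
n_dir = np.array([1.0, -1.0])                  # spans N(A2)
for t in (0.0, 1.0, -3.0):
    assert np.allclose(A2 @ (xp + t * n_dir), b2)  # xp + t*n_dir also solves
```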
Basis and Dimension of a Vector Space
• A basis for a vector space is a sequence of
vectors such that
– The vectors are linearly independent
– The vectors span the space: every vector in the
space can be expressed as a linear combination
of these vectors
Basis for 2D and n-D
• (1,0), (0,1)
• (1,1), (−1,−2)
• The vectors v1, v2, …, vn are a basis for Rn if and
only if they are the columns of an n by n
invertible matrix
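The basis-vs-invertibility criterion can be illustrated with a short NumPy sketch (NumPy is an assumption, not part of the slides), using the second example basis above:

```python
import numpy as np

# The vectors (1,1) and (-1,-2) form a basis for R^2 exactly because
# the matrix with these columns is invertible (rank 2).
v1, v2 = np.array([1.0, 1.0]), np.array([-1.0, -2.0])
M = np.column_stack([v1, v2])
assert np.linalg.matrix_rank(M) == 2       # independent -> basis for R^2

# Any vector is then a unique combination of the basis vectors:
v = np.array([3.0, 5.0])
coeffs = np.linalg.solve(M, v)             # unique coefficients
assert np.allclose(coeffs[0] * v1 + coeffs[1] * v2, v)
```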
Column and Row Subspace
• C(A): the space spanned by the columns of A
– A subspace of Rm
– The pivot columns of A are a basis for its column space
• Row space: the space spanned by the rows of A
– A subspace of Rn
– The row space of A is the same as the column space of AT, C(AT)
– The pivot rows of A are a basis for its row space
– The pivot rows of its echelon matrix R are a basis for its row
space
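A small numerical illustration of the pivot-column fact, as a sketch assuming NumPy (the example matrix is mine, not from the lecture):

```python
import numpy as np

# In this A, column 2 is twice column 1, so the pivot columns are 1 and 3.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 7.0]])
r = np.linalg.matrix_rank(A)                  # rank = 2
pivots = A[:, [0, 2]]                         # the pivot columns
assert np.linalg.matrix_rank(pivots) == r     # independent, so a basis of C(A)

# The non-pivot column is a combination of the pivot columns:
c = np.linalg.solve(pivots, A[:, 1])
assert np.allclose(pivots @ c, A[:, 1])       # column 2 = 2*(column 1)
```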
Important Property I: Uniqueness
of Combination
• If the vectors v1, v2, …, vn are a basis for a vector
space V, then for every vector v in V, there is a
unique way to write v as a combination of
v1, v2, …, vn.
• Proof: suppose
v = a1 v1 + a2 v2 + … + an vn
v = b1 v1 + b2 v2 + … + bn vn
• Subtracting: 0 = (a1 − b1) v1 + (a2 − b2) v2 + … + (an − bn) vn
• Since the vi are linearly independent, ai = bi for every i
Important Property II: Dimension
and Size of Basis
• If a vector space V has two bases
– v1, v2, …, vm; write V = [v1, v2, …, vm]
– w1, w2, …, wn; write W = [w1, w2, …, wn]
• then m = n
– Proof: assume n > m; each wj is a combination of the vi, so W = VA
– A is m by n with m < n, so Ax = 0 has a non-zero solution x
– Then Wx = VAx = 0, contradicting the independence of w1, …, wn
• The dimension of a vector space is the number of
vectors in any basis
– By the argument above, every basis has the same size, so the
dimension of a vector space is well defined
Dimensions of the Four Subspaces
Fundamental Theorem of Linear
Algebra, Part I
• Row space: C(AT) – dimension = rank(A)
• Column space: C(A) – dimension = rank(A)
• Nullspace: N(A) – dimension = n − rank(A)
• Left nullspace: N(AT) – dimension = m − rank(A)
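The four dimensions can be verified numerically; a minimal sketch assuming NumPy (the example matrix is mine), counting nonzero singular values to get the rank:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])             # m = 2, n = 3, rank 1
m, n = A.shape
r = np.linalg.matrix_rank(A)

assert r == np.linalg.matrix_rank(A.T)      # row and column space: both dim r

_, s, Vt = np.linalg.svd(A)
r_svd = int(np.sum(s > 1e-10))              # rank = number of nonzero sigmas
nullity = n - r_svd                          # dim N(A)  = n - rank(A)
left_nullity = m - r_svd                     # dim N(AT) = m - rank(A)
assert nullity == n - r
assert left_nullity == m - r
```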
Orthogonality and Orthogonal
Subspaces
• Two vectors v and w
are orthogonal if
v · w = vT w = wT v = 0
• Two vector subspaces V and W are
orthogonal if
for all v ∈ V and w ∈ W, vT w = 0
Example: Orthogonal Subspace in 5
Dimensions
1 1 0 
 0   0  
      
    
0 1 1 
 0   0  




C 0, 0, 1   C 0, 0 
 0  0   0  
0 1 
      
    
0 0 0 
1 1 
The union of these two subspaces is R5
Orthogonal Complement
• Suppose V is a vector subspace of a vector space W
• The orthogonal complement of V is
V⊥ = { w ∈ W such that w ⊥ v for all v ∈ V }
• The orthogonal complement is itself a vector
subspace, and
dim(V⊥) + dim(V) = dim(W)
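For V = C(AT) inside Rn, the complement is N(A); a short sketch assuming NumPy (example matrix mine), where the trailing rows of the SVD factor Vt give an orthonormal basis for the nullspace:

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])            # rank 2, n = 3
n = A.shape[1]
r = np.linalg.matrix_rank(A)
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[r:]                        # basis for N(A) = (C(A^T)) perp
assert null_basis.shape[0] == n - r        # dim V + dim V_perp = dim R^n
assert np.allclose(A @ null_basis.T, 0)    # orthogonal to every row of A
```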
Dimensions of the Four Subspaces
Fundamental Theorem of Linear
Algebra, Part I
• Row space: C(AT) – dimension = rank(A)
• Column space: C(A) – dimension = rank(A)
• Nullspace: N(A) – dimension = n − rank(A)
• Left nullspace: N(AT) – dimension = m − rank(A)
Orthogonality of the Four Subspaces
Fundamental Theorem of Linear
Algebra, Part II
• The nullspace is the
orthogonal complement
of the row space in Rn
N(A) = (C(AT))⊥
• The left nullspace is the
orthogonal complement
of the column space in
Rm
N(AT) = (C(A))⊥
Proof
• Claim: the nullspace is the
orthogonal complement of the
row space in Rn
N(A) = (C(AT))⊥
• By definition,
N(A) = { x : Ax = 0 }
C(AT) = { AT y : y ∈ Rm }
• For any x in N(A) and any y in Rm,
xT (AT y) = (AT y)T x = yT A x = yT (Ax) = 0
so every nullspace vector is orthogonal to every row-space vector
• Counting dimensions: dim N(A) = n − r = n − dim C(AT), so N(A) is
not just orthogonal to C(AT) but is its entire orthogonal complement
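The companion statement, N(AT) = (C(A))⊥, can be checked the same way; a minimal NumPy sketch (example matrix and vectors mine):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0]])                 # m = 3, n = 2, rank 2
y = np.array([1.0, -1.0, 0.0])             # A^T y = 0: y is in N(A^T)
assert np.allclose(A.T @ y, 0)

# y is orthogonal to every vector A x in the column space C(A):
for x in (np.array([1.0, 2.0]), np.array([-3.0, 0.5])):
    assert abs(y @ (A @ x)) < 1e-12
```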
The Whole Picture
[Diagram] Rn splits into the row space C(AT) (dim r) and the nullspace
N(A) (dim n − r); Rm splits into the column space C(A) (dim r) and the
left nullspace N(AT) (dim m − r). Every solution decomposes as
x = xr + xn with A xr = b and A xn = 0, so A x = b.
Uniqueness of The Typical Solution
• Every vector in the column space comes from one
and only one vector xr from the row space
• Proof: suppose there are two xr , yr from the row
space such that Axr =A yr =b, then
Axr − Ayr = A(xr − yr) = 0
(xr − yr) is in both the row space and the nullspace, hence must be 0
• This one-to-one matching explains why the row space and the column
space have the same dimension
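This uniqueness can be seen numerically by projecting two different solutions onto the row space; a sketch assuming NumPy (example system mine):

```python
import numpy as np

# A x = b with A = [1 1]: the row space is spanned by (1, 1).
A = np.array([[1.0, 1.0]])
b = np.array([2.0])
x1 = np.array([2.0, 0.0])                  # two different solutions
x2 = np.array([0.0, 2.0])
assert np.allclose(A @ x1, b) and np.allclose(A @ x2, b)

row = np.array([1.0, 1.0]) / np.sqrt(2.0)  # unit vector spanning C(A^T)
xr1 = (x1 @ row) * row                     # row-space part of each solution
xr2 = (x2 @ row) * row
assert np.allclose(xr1, xr2)               # the row-space part is the same
```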
Deep Secret of Linear Algebra
Pseudo-inverse
• Throw away the two nullspaces, and there is an
r by r invertible matrix hiding inside A.
• In this sense, from the row space to the
column space, A is invertible
• It maps an r-dimensional subspace of Rn to an
r-dimensional subspace of Rm
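This is what the pseudo-inverse computes; a minimal sketch assuming NumPy's `np.linalg.pinv` (the rank-1 example is mine):

```python
import numpy as np

# A has rank 1: row space and column space are both spanned by (1, 1).
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])
A_pinv = np.linalg.pinv(A)

xr = np.array([1.0, 1.0])                  # in the row space
assert np.allclose(A_pinv @ (A @ xr), xr)  # A+ undoes A on the row space

xn = np.array([1.0, -1.0])                 # in the nullspace: A xn = 0
assert np.allclose(A_pinv @ (A @ xn), 0)   # the nullspaces are "thrown away"
```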
Invertible Matrices
• Any n linearly independent vectors in Rn
must span Rn, so they form a basis
• So Ax = b is always uniquely solvable
• A is invertible
Projection
• Projection onto an axis: the point (a,b) projects to (a,0) on the x axis
• The x axis is a vector subspace
Projection onto an Arbitrary Line
Passing through 0
Projection onto a Plane
Projection onto a Subspace
• Input:
1. A vector subspace V in Rm
2. A vector b in Rm
• Desired output:
– A vector x in V that is closest to b
– The projection x of b onto V
– A vector x in V such that (b − x) is orthogonal
to V
How to Describe a Vector Subspace
V in Rm
• If dim(V) = n, then V has n basis vectors
– a1, a2, …, an
– They are independent
• V = C(A) where A = [a1, a2, …, an]
Projection onto a Subspace
• Input:
1. n independent vectors a1, a2, …, an in Rm
2. A vector b in Rm
• Desired output:
– A vector x in C([a1, a2, …, an]) that is closest to b
– The projection x of b onto C([a1, a2, …, an])
– A vector x in C([a1, a2, …, an]) such that (b − x) is
orthogonal to C([a1, a2, …, an])
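The three outputs coincide and are computed by the normal equations ATA x̂ = AT b with projection p = A x̂; a sketch assuming NumPy (example data mine):

```python
import numpy as np

# Project b onto C(A) for a 3x2 matrix A with independent columns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

xhat = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations A^T A xhat = A^T b
p = A @ xhat                               # the projection of b onto C(A)

assert np.allclose(A.T @ (b - p), 0)       # error b - p is orthogonal to C(A)
```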
Think about this Picture
[Diagram] Rn splits into the row space C(AT) (dim r) and the nullspace
N(A) (dim n − r); Rm splits into the column space C(A) (dim r) and the
left nullspace N(AT) (dim m − r). Every solution decomposes as
x = xr + xn with A xr = b and A xn = 0, so A x = b.