Mathematics 369
Homework (due Mar 29)
A. Hulpke
46) For $x = (x_1, x_2, x_3)^T$, $y = (y_1, y_2, y_3)^T$, determine which of the following functions are inner products for $\mathbb{R}^3$:
a) $\langle x \mid y \rangle = x_1 y_1 + x_3 y_3$
b) $\langle x \mid y \rangle = x_1 y_1 - x_2 y_2 + x_3 y_3$
c) $\langle x \mid y \rangle = 2 x_1 y_1 + x_2 y_2 + 4 x_3 y_3$
d) $\langle x \mid y \rangle = x_1^2 y_1^2 + x_2^2 y_2^2 + x_3^2 y_3^2$
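Forms a) to c) are bilinear, so each is given by a diagonal Gram matrix, and such a form is an inner product exactly when that matrix is positive definite. The following sketch (my own illustration, not part of the assignment) checks this numerically with NumPy; form d) is excluded because it is not even bilinear.

```python
import numpy as np

# Gram matrices of the bilinear forms a)-c): <x|y> = x^T G y.
candidates = {
    "a": np.diag([1.0, 0.0, 1.0]),   # x1*y1 + x3*y3
    "b": np.diag([1.0, -1.0, 1.0]),  # x1*y1 - x2*y2 + x3*y3
    "c": np.diag([2.0, 1.0, 4.0]),   # 2*x1*y1 + x2*y2 + 4*x3*y3
}

for name, G in candidates.items():
    # A symmetric bilinear form is an inner product iff its Gram
    # matrix is positive definite (all eigenvalues > 0).
    evals = np.linalg.eigvalsh(G)
    verdict = "inner product" if np.all(evals > 0) else "not an inner product"
    print(name, evals, verdict)
```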
47) Let $V$ be a vector space with inner product $S = \langle \cdot, \cdot \rangle$. We define the $S$-norm on $V$ by $\|v\|_S = \sqrt{\langle v, v \rangle}$.
a) Show that for $u, v \in V$ we have that $\|u + v\|_S^2 = \|u\|_S^2 + \|v\|_S^2 + 2\langle u, v \rangle$.
b) Show the Theorem of Pythagoras: If $u \perp v$ then $\|u + v\|_S^2 = \|u\|_S^2 + \|v\|_S^2$.
c) Show that for $u, v \in V$ we have that $\langle u, v \rangle \le \frac{\|u\|_S^2 + \|v\|_S^2}{2}$. (Hint: Consider $u - v$.)
48) Let $V = \mathbb{R}^3$ with an inner product given by the Gram matrix
\[ G_S = \begin{pmatrix} 2 & 2 & -6 \\ 2 & 3 & -5 \\ -6 & -5 & 22 \end{pmatrix} \]
with respect to the standard basis $S$.
a) Calculate the inner product of $(1, 2, 3)^T$ with $(4, 5, 6)^T$.
b) Let $U = \operatorname{Span}((1, 0, 0)^T, (0, 1, 0)^T)$. Calculate a basis of $U^\perp$.
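As a way to check hand computations here, note that the inner product is $\langle x, y \rangle = x^T G_S y$, and a vector $z$ lies in $U^\perp$ exactly when $(G_S u)^T z = 0$ for both spanning vectors $u$ of $U$. A small NumPy/SciPy sketch (my own check, not part of the handout):

```python
import numpy as np
from scipy.linalg import null_space

G = np.array([[2, 2, -6],
              [2, 3, -5],
              [-6, -5, 22]], dtype=float)

# a) inner product <x, y> = x^T G y
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
print(x @ G @ y)

# b) U = Span(e1, e2); z lies in U-perp iff <e1, z> = <e2, z> = 0,
#    i.e. the first two rows of G annihilate z.
A = G[:2, :]          # rows (G e1)^T and (G e2)^T
print(null_space(A))  # columns span U-perp
```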
49∗) Let $V = \operatorname{Span}(1, x, x^2, x^4)$ with inner product $\langle f, g \rangle = \int_{-1}^{1} f(x)g(x)\,dx$. Let $B = (1, x, x^2, x^4)$.
a) Calculate a Gram matrix for the scalar product with respect to $B$.
b) Let $U = \operatorname{Span}(x + x^2, x^4)$. Calculate $U^\perp \le V$.
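For checking part a), the Gram matrix entries are just the pairwise integrals $\int_{-1}^{1} b_i(x) b_j(x)\,dx$; a short SymPy sketch (an illustration I am adding, not part of the assignment):

```python
import sympy as sp

x = sp.symbols('x')
B = [sp.Integer(1), x, x**2, x**4]

# Gram matrix: G[i, j] = integral over [-1, 1] of b_i * b_j
G = sp.Matrix(4, 4,
              lambda i, j: sp.integrate(B[i] * B[j], (x, -1, 1)))
sp.pprint(G)
```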
50) Let $V$ be a vector space with inner product $S = \langle \cdot, \cdot \rangle$. A basis $B = (b_1, \ldots, b_n)$ is said to be orthonormal (with respect to $S$) if
\[ \langle b_i, b_j \rangle = \delta_{i,j} = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{otherwise.} \end{cases} \]
Show that in this case we have for every $v \in V$ that $v = \sum_{i=1}^{n} \langle v, b_i \rangle\, b_i$, i.e. the coefficients for $v$ with respect to $B$ can be obtained simply as scalar products.
(Hint: Consider the vector $w = \sum_{i=1}^{n} \langle v, b_i \rangle\, b_i$ as given and show that $w - v$ is orthogonal to all basis vectors, then conclude that $w = v$.)
Note: The coefficients $\langle v, b_i \rangle$ are sometimes called Fourier coefficients, as the process of Fourier transformation calculates exactly these coefficients for the vector space $C[-\pi, \pi]$ with the inner product $\frac{1}{\pi} \int_{-\pi}^{\pi} f(x)g(x)\,dx$ and the (infinite) orthonormal basis
\[ B = (\sin(x), \sin(2x), \ldots, \sin(mx), \ldots, \cos(x), \cos(2x), \ldots, \cos(mx), \ldots). \]
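To illustrate this note (the example below is my own, not part of the handout), the coefficient of $\sin(2x)$ for a function $f$ is simply $\langle f, \sin(2x) \rangle = \frac{1}{\pi}\int_{-\pi}^{\pi} f(x)\sin(2x)\,dx$, which can be evaluated numerically:

```python
import numpy as np
from scipy.integrate import quad

def fourier_coefficient(f, basis_fn):
    # <f, b> = (1/pi) * integral over [-pi, pi] of f(x) * b(x)
    value, _ = quad(lambda x: f(x) * basis_fn(x), -np.pi, np.pi)
    return value / np.pi

# Example: f(x) = x; its coefficient for sin(2x) is -1, matching the
# known sine series x = 2 sin(x) - sin(2x) + (2/3) sin(3x) - ...
print(fourier_coefficient(lambda x: x, lambda x: np.sin(2 * x)))
```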
Problems marked with a ∗ are bonus problems for extra credit.
Please note that Dr. Painter will not conduct his review sessions the week of March 27-31.
Subspace Projection
An important type of problem is to compute the projection of a vector onto a subspace. This note describes how to proceed in general. If you know this method, you can solve all projection problems and don't need to learn the special cases which some books teach separately. However, I would recommend that you don't just memorize the formulae but learn how we obtain the projection equations; then you can deduce them easily again if you need them.
Suppose that $V$ is a vector space and $B = (b_1, \ldots, b_n)$ is a set of vectors in $V$. Let $W = \operatorname{Span}(B)$, so $W$ is a subspace of $V$ (in general $W \ne V$).
1. Problem We are given a vector $v \in V$ and want to compute the orthogonal projection $p = P_W(v)$ of $v$ onto $W$. This vector is defined by the property
\[ v = p + z \quad \text{with } p \in W \text{ and } z \perp W. \tag{1} \]
If we consider $\|z\| = \|p - v\|$ to be the distance between $p$ and $v$, one can characterize the projection $p$ also by the property that it is the vector in $W$ that is closest to $v$. In other words: $p$ is the best approximation of $v$ by an element of $W$.
Since $p \in W$, we can write it as a linear combination of elements of $B$:
\[ p = \sum_{i=1}^{n} x_i b_i = x_1 b_1 + x_2 b_2 + \cdots + x_n b_n. \tag{2} \]
To compute $p$ and $z$ for a given $v$, we substitute the expression for $p$ from equation (2) into equation (1) and get:
\[ v = \left( \sum_{i=1}^{n} x_i b_i \right) + z. \tag{3} \]
The main step now is that we form the scalar product of $v$ with the vectors in $B$. The product with $b_j$ yields
\[ \langle b_j, v \rangle = \left\langle b_j, \left( \sum_{i=1}^{n} x_i b_i \right) + z \right\rangle = \left( \sum_{i=1}^{n} x_i \langle b_j, b_i \rangle \right) + \langle b_j, z \rangle \tag{4} \]
by the properties of the scalar product. Since we want that $z \perp W$, we have that $\langle b_j, z \rangle = 0$ for every $j$. Thus we get for every $j$ an equation:
\[ \langle b_j, v \rangle = \sum_{i=1}^{n} x_i \langle b_j, b_i \rangle = x_1 \langle b_j, b_1 \rangle + x_2 \langle b_j, b_2 \rangle + \cdots + x_n \langle b_j, b_n \rangle. \tag{5} \]
We can write the system of these equations in matrix form and get the general system:
\[ \begin{pmatrix} \langle b_1, b_1 \rangle & \langle b_1, b_2 \rangle & \cdots & \langle b_1, b_n \rangle \\ \langle b_2, b_1 \rangle & \langle b_2, b_2 \rangle & \cdots & \langle b_2, b_n \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle b_n, b_1 \rangle & \langle b_n, b_2 \rangle & \cdots & \langle b_n, b_n \rangle \end{pmatrix} \cdot \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} = \begin{pmatrix} \langle b_1, v \rangle \\ \langle b_2, v \rangle \\ \vdots \\ \langle b_n, v \rangle \end{pmatrix} \tag{6} \]
Since the $b_i$ and $v$ are given, we can evaluate all scalar products and solve for the $x_i$. Equation (2) then gives us the projection $p$ (in other words: $x$ is the coordinate vector $(p)_B$ for $p$ with respect to $B$). Equation (1) gives the difference $z = v - p$. The distance between $p$ and $v$ is $\|z\|$.
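The recipe in equations (2) to (6) translates directly into a few lines of code. Here is a minimal NumPy sketch (my own illustration, assuming the inner product is supplied as a Python function):

```python
import numpy as np

def project(v, B, inner):
    """Orthogonal projection of v onto Span(B).

    B     -- list of spanning vectors b_1, ..., b_n
    inner -- function implementing the scalar product <.,.>
    Returns the coefficient vector x solving equation (6).
    """
    # Gram matrix <b_j, b_i> and right-hand side <b_j, v> from (6)
    G = np.array([[inner(bj, bi) for bi in B] for bj in B])
    rhs = np.array([inner(bj, v) for bj in B])
    return np.linalg.solve(G, rhs)  # p = sum x_i b_i as in (2)

# Usage with the standard inner product on R^3:
B = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
v = np.array([1.0, 2.0, 3.0])
x = project(v, B, inner=np.dot)
p = sum(xi * bi for xi, bi in zip(x, B))
print(p)  # (1, 2, 0): the closest vector to v in the x-y plane
```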
2. Example: Let $V = C[0, 1]$ with the scalar product $\langle f, g \rangle = \int_0^1 f(t)g(t)\,dt$, and let $W = \operatorname{Span}(t, t^2)$. Compute the orthogonal projection of $e^t$ onto $W$.
Solution:
Equation (6) becomes:
\[ \begin{pmatrix} \int_0^1 t \cdot t \,dt & \int_0^1 t \cdot t^2 \,dt \\ \int_0^1 t^2 \cdot t \,dt & \int_0^1 t^2 \cdot t^2 \,dt \end{pmatrix} \cdot \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} \int_0^1 t \cdot e^t \,dt \\ \int_0^1 t^2 \cdot e^t \,dt \end{pmatrix} \]
We evaluate the integrals (partial integration) to get
\[ \begin{pmatrix} 1/3 & 1/4 \\ 1/4 & 1/5 \end{pmatrix} \cdot \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 1 \\ e - 2 \end{pmatrix} \]
which has the numerical solution $x_1 = 4.903$, $x_2 = -2.537$.
Thus the projection is $p = 4.903\,t - 2.537\,t^2$.
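One can check this example symbolically; the following SymPy sketch (added here as an illustration) sets up and solves exactly the system above:

```python
import sympy as sp

t = sp.symbols('t')
B = [t, t**2]
f = sp.exp(t)

def inner(g, h):
    # scalar product <g, h> = integral over [0, 1] of g(t) h(t)
    return sp.integrate(g * h, (t, 0, 1))

G = sp.Matrix(2, 2, lambda i, j: inner(B[i], B[j]))
rhs = sp.Matrix([inner(b, f) for b in B])
x = G.LUsolve(rhs)
print(x.evalf(4))  # approximately (4.903, -2.537)
```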
3. Column Spaces In the special situation that $V$ is a column space $\mathbb{R}^m$ with the standard inner product, the scalar product of two vectors is $\langle v, w \rangle = v^T w$. Thus equation (6) becomes:
\[ \begin{pmatrix} b_1^T b_1 & b_1^T b_2 & \cdots & b_1^T b_n \\ b_2^T b_1 & b_2^T b_2 & \cdots & b_2^T b_n \\ \vdots & \vdots & \ddots & \vdots \\ b_n^T b_1 & b_n^T b_2 & \cdots & b_n^T b_n \end{pmatrix} \cdot \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} = \begin{pmatrix} b_1^T v \\ b_2^T v \\ \vdots \\ b_n^T v \end{pmatrix} \tag{7} \]
If we write the elements of $B$ as columns in a matrix $B$, we can write this in the form
\[ (B^T \cdot B) \cdot x = B^T v \quad \text{with } x = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}, \tag{8} \]
(which is sometimes called the normal equation). Also equation (2) can be written as $p = B \cdot x$.
If $B$ is linearly independent (and thus a basis of its span), the matrix $B^T \cdot B$ is invertible, thus we can solve for $x$ and obtain:
\[ p = B(B^T B)^{-1} B^T \cdot v. \tag{9} \]
We call $P = B(B^T B)^{-1} B^T$ the projection matrix for orthogonal projection onto $W$. The orthogonal projection of $v$ onto the column space of $B$ is $P \cdot v$.
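As a quick numerical illustration (my own sketch, not part of the handout), the projection matrix of equation (9) can be formed directly in NumPy; in practice one would rather solve the normal equation (8) with a least-squares routine than invert $B^T B$:

```python
import numpy as np

# Columns of B span W; project v onto W = Col(B).
B = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
v = np.array([6.0, 0.0, 0.0])

# Equation (9): P = B (B^T B)^{-1} B^T
P = B @ np.linalg.inv(B.T @ B) @ B.T
p = P @ v

# Numerically preferable: solve the normal equation (8) instead.
x, *_ = np.linalg.lstsq(B, v, rcond=None)
assert np.allclose(p, B @ x)
print(p)  # p = (5, 2, -1)
```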