
Solutions
Homework 2
6.7 Prove that if V is a complex inner-product space, then
\[
\langle u, v \rangle = \frac{\|u+v\|^2 - \|u-v\|^2 + \|u+iv\|^2\, i - \|u-iv\|^2\, i}{4}
\]
for all $u, v \in V$.
Proof. Expanding each term using linearity in the first slot and conjugate linearity
in the second yields
\begin{align*}
\|u+v\|^2 &= \langle u+v,\, u+v \rangle \\
          &= \|u\|^2 + \langle u, v\rangle + \langle v, u\rangle + \|v\|^2 \\
          &= \|u\|^2 + \langle u, v\rangle + \overline{\langle u, v\rangle} + \|v\|^2, \\
\|u-v\|^2 &= \langle u-v,\, u-v \rangle \\
          &= \|u\|^2 - \langle u, v\rangle - \overline{\langle u, v\rangle} + \|v\|^2, \\
\|u+iv\|^2 &= \langle u+iv,\, u+iv \rangle \\
           &= \|u\|^2 + \langle u, iv\rangle + \langle iv, u\rangle + \|iv\|^2 \\
           &= \|u\|^2 - i\langle u, v\rangle + i\langle v, u\rangle + |i|^2 \|v\|^2 \\
           &= \|u\|^2 - i\langle u, v\rangle + i\overline{\langle u, v\rangle} + \|v\|^2, \\
\|u-iv\|^2 &= \langle u-iv,\, u-iv \rangle \\
           &= \|u\|^2 - \langle u, iv\rangle - \langle iv, u\rangle + \|iv\|^2 \\
           &= \|u\|^2 + i\langle u, v\rangle - i\langle v, u\rangle + |i|^2 \|v\|^2 \\
           &= \|u\|^2 + i\langle u, v\rangle - i\overline{\langle u, v\rangle} + \|v\|^2.
\end{align*}
Now
\[
\|u+v\|^2 - \|u-v\|^2 = 2\bigl(\langle u, v\rangle + \overline{\langle u, v\rangle}\bigr),
\]
and
\[
i\bigl(\|u+iv\|^2 - \|u-iv\|^2\bigr) = 2i\bigl(-i\langle u, v\rangle + i\overline{\langle u, v\rangle}\bigr)
= 2\bigl(\langle u, v\rangle - \overline{\langle u, v\rangle}\bigr).
\]
Together these show that
\[
\|u+v\|^2 - \|u-v\|^2 + \|u+iv\|^2\, i - \|u-iv\|^2\, i = 4\langle u, v\rangle.
\]
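As a quick numerical sanity check (a sketch, not part of the proof), the identity can be tested on random complex vectors. Here the standard inner product on $\mathbf{C}^5$, taken linear in the first slot, plays the role of $\langle\cdot,\cdot\rangle$; the dimension and random seed are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(5) + 1j * rng.standard_normal(5)
v = rng.standard_normal(5) + 1j * rng.standard_normal(5)

# <u, v> = sum_j u_j * conj(v_j); np.vdot conjugates its first argument,
# so np.vdot(v, u) gives the inner product that is linear in u.
inner = np.vdot(v, u)
norm2 = lambda w: np.vdot(w, w).real   # ||w||^2

rhs = (norm2(u + v) - norm2(u - v)
       + norm2(u + 1j * v) * 1j - norm2(u - 1j * v) * 1j) / 4
print(abs(inner - rhs))   # should be near machine precision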
6.8 A norm on a vector space U is a function $\|\cdot\| : U \to [0, \infty)$ such that $\|u\| = 0$ if and
only if $u = 0$, $\|\alpha u\| = |\alpha|\,\|u\|$ for all $\alpha \in \mathbf{F}$ and all $u \in U$, and $\|u + v\| \le \|u\| + \|v\|$
for all $u, v \in U$. Prove that a norm satisfying the parallelogram equality comes
from an inner product (in other words, show that if $\|\cdot\|$ is a norm on U satisfying
the parallelogram equality, then there is an inner product $\langle\cdot,\cdot\rangle$ on U such that
$\|u\| = \langle u, u\rangle^{1/2}$ for all $u \in U$).
Proof. Let
\[
\langle u, v\rangle = \frac{\|u+v\|^2 - \|u-v\|^2 + \|u+iv\|^2\, i - \|u-iv\|^2\, i}{4}.
\]
This is positive definite, since
\begin{align*}
\langle u, u\rangle &= \frac{\|2u\|^2 + \|(1+i)u\|^2\, i - \|(1-i)u\|^2\, i}{4} \\
&= \frac{4\|u\|^2 + |1+i|^2\|u\|^2\, i - |1-i|^2\|u\|^2\, i}{4} \\
&= \|u\|^2,
\end{align*}
which is nonnegative, and zero only when $u = 0$, by the properties of the norm $\|\cdot\|$.
It is conjugate symmetric, since
\begin{align*}
\langle v, u\rangle &= \frac{\|v+u\|^2 - \|v-u\|^2 + \|v+iu\|^2\, i - \|v-iu\|^2\, i}{4} \\
&= \frac{\|u+v\|^2 - |-1|^2\|u-v\|^2 + |i|^2\|u-iv\|^2\, i - |-i|^2\|u+iv\|^2\, i}{4} \\
&= \frac{\|u+v\|^2 - \|u-v\|^2 + \|u-iv\|^2\, i - \|u+iv\|^2\, i}{4} \\
&= \overline{\langle u, v\rangle}.
\end{align*}
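As a numerical illustration (again only a sketch), one can start from a norm that is known to satisfy the parallelogram equality, for instance a weighted $\ell^2$ norm on $\mathbf{C}^3$ with arbitrarily chosen positive weights, and check that the formula above is also additive and homogeneous in its first slot (properties not verified in the written proof) as well as conjugate symmetric.

import numpy as np

w = np.array([1.0, 2.0, 5.0])                       # positive weights (arbitrary choice)
norm = lambda x: np.sqrt(np.sum(w * np.abs(x) ** 2))

def form(u, v):
    # the polarization formula built from the norm
    return (norm(u + v) ** 2 - norm(u - v) ** 2
            + norm(u + 1j * v) ** 2 * 1j - norm(u - 1j * v) ** 2 * 1j) / 4

rng = np.random.default_rng(1)
u, u2, v = (rng.standard_normal(3) + 1j * rng.standard_normal(3) for _ in range(3))
a = 2.0 - 3.0j

print(abs(form(u + u2, v) - form(u, v) - form(u2, v)))   # additivity, ~0
print(abs(form(a * u, v) - a * form(u, v)))              # homogeneity, ~0
print(abs(form(v, u) - np.conj(form(u, v))))             # conjugate symmetry, ~0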
6.9 Suppose n is a positive integer. Prove that
\[
\frac{1}{\sqrt{2\pi}},\ \frac{\sin(x)}{\sqrt{\pi}},\ \frac{\sin(2x)}{\sqrt{\pi}},\ \ldots,\ \frac{\sin(nx)}{\sqrt{\pi}},\ \frac{\cos(x)}{\sqrt{\pi}},\ \frac{\cos(2x)}{\sqrt{\pi}},\ \ldots,\ \frac{\cos(nx)}{\sqrt{\pi}}
\]
is an orthonormal list of vectors in $C[-\pi, \pi]$, the vector space of continuous real-valued
functions on $[-\pi, \pi]$ with inner product
\[
\langle f, g\rangle = \int_{-\pi}^{\pi} f(x)\,g(x)\,dx.
\]
Proof. The first vector is orthogonal to the others, since for all $k = 1, \ldots, n$,
\[
\Bigl\langle \frac{1}{\sqrt{2\pi}}, \frac{\sin(kx)}{\sqrt{\pi}} \Bigr\rangle
= \frac{1}{\pi\sqrt{2}} \int_{-\pi}^{\pi} \sin(kx)\,dx
= \frac{-1}{k\pi\sqrt{2}} \cos(kx) \Big|_{-\pi}^{\pi} = 0,
\]
and similarly, for all $k = 1, \ldots, n$,
\[
\Bigl\langle \frac{1}{\sqrt{2\pi}}, \frac{\cos(kx)}{\sqrt{\pi}} \Bigr\rangle
= \frac{1}{\pi\sqrt{2}} \int_{-\pi}^{\pi} \cos(kx)\,dx
= \frac{1}{k\pi\sqrt{2}} \sin(kx) \Big|_{-\pi}^{\pi} = 0.
\]
Now for all $k = 1, \ldots, n$ and $l = 1, \ldots, n$,
\[
\Bigl\langle \frac{\sin(kx)}{\sqrt{\pi}}, \frac{\cos(lx)}{\sqrt{\pi}} \Bigr\rangle
= \frac{1}{\pi} \int_{-\pi}^{\pi} \sin(kx)\cos(lx)\,dx
= \frac{1}{\pi} \biggl( \int_{-\pi}^{0} \sin(kx)\cos(lx)\,dx + \int_{0}^{\pi} \sin(kx)\cos(lx)\,dx \biggr),
\]
and substituting $u = -x$ in the first integral shows
\begin{align*}
\Bigl\langle \frac{\sin(kx)}{\sqrt{\pi}}, \frac{\cos(lx)}{\sqrt{\pi}} \Bigr\rangle
&= \frac{1}{\pi} \biggl( \int_{0}^{\pi} \sin(-ku)\cos(-lu)\,du + \int_{0}^{\pi} \sin(kx)\cos(lx)\,dx \biggr) \\
&= \frac{1}{\pi} \biggl( -\int_{0}^{\pi} \sin(ku)\cos(lu)\,du + \int_{0}^{\pi} \sin(kx)\cos(lx)\,dx \biggr) \\
&= 0.
\end{align*}
Finally, for all $k, l = 1, \ldots, n$ with $k \neq l$, integrating by parts shows
\begin{align*}
\Bigl\langle \frac{\sin(kx)}{\sqrt{\pi}}, \frac{\sin(lx)}{\sqrt{\pi}} \Bigr\rangle
&= \frac{1}{\pi} \int_{-\pi}^{\pi} \sin(kx)\sin(lx)\,dx \\
&= \frac{1}{\pi} \biggl( -\frac{\sin(kx)\cos(lx)}{l} \bigg|_{-\pi}^{\pi} + \frac{k}{l} \int_{-\pi}^{\pi} \cos(kx)\cos(lx)\,dx \biggr) \\
&= \frac{k}{l} \Bigl\langle \frac{\cos(kx)}{\sqrt{\pi}}, \frac{\cos(lx)}{\sqrt{\pi}} \Bigr\rangle,
\end{align*}
and another integration by parts shows
\begin{align*}
\Bigl\langle \frac{\cos(kx)}{\sqrt{\pi}}, \frac{\cos(lx)}{\sqrt{\pi}} \Bigr\rangle
&= \frac{1}{\pi} \int_{-\pi}^{\pi} \cos(kx)\cos(lx)\,dx \\
&= \frac{1}{\pi} \biggl( \frac{\cos(kx)\sin(lx)}{l} \bigg|_{-\pi}^{\pi} + \frac{k}{l} \int_{-\pi}^{\pi} \sin(kx)\sin(lx)\,dx \biggr) \\
&= \frac{k}{l} \Bigl\langle \frac{\sin(kx)}{\sqrt{\pi}}, \frac{\sin(lx)}{\sqrt{\pi}} \Bigr\rangle.
\end{align*}
In both computations the boundary terms vanish because $\sin(\pm k\pi) = \sin(\pm l\pi) = 0$. Therefore
\[
\Bigl\langle \frac{\sin(kx)}{\sqrt{\pi}}, \frac{\sin(lx)}{\sqrt{\pi}} \Bigr\rangle
= \frac{k^2}{l^2} \Bigl\langle \frac{\sin(kx)}{\sqrt{\pi}}, \frac{\sin(lx)}{\sqrt{\pi}} \Bigr\rangle .
\]
Since $k \neq l$, this forces $\bigl\langle \tfrac{\sin(kx)}{\sqrt{\pi}}, \tfrac{\sin(lx)}{\sqrt{\pi}} \bigr\rangle = 0$, which in turn implies
$\bigl\langle \tfrac{\cos(kx)}{\sqrt{\pi}}, \tfrac{\cos(lx)}{\sqrt{\pi}} \bigr\rangle = 0$.
Each vector also has norm one, since
\[
\Bigl\| \frac{1}{\sqrt{2\pi}} \Bigr\|^2 = \int_{-\pi}^{\pi} \frac{1}{2\pi}\,dx = 1,
\]
and, by the half-angle formulas, for all $k = 1, \ldots, n$,
\[
\Bigl\| \frac{\sin(kx)}{\sqrt{\pi}} \Bigr\|^2 = \frac{1}{\pi} \int_{-\pi}^{\pi} \sin^2(kx)\,dx
= \frac{1}{\pi} \int_{-\pi}^{\pi} \frac{1 - \cos(2kx)}{2}\,dx = 1,
\]
and
\[
\Bigl\| \frac{\cos(kx)}{\sqrt{\pi}} \Bigr\|^2 = \frac{1}{\pi} \int_{-\pi}^{\pi} \cos^2(kx)\,dx
= \frac{1}{\pi} \int_{-\pi}^{\pi} \frac{1 + \cos(2kx)}{2}\,dx = 1.
\]
Hence the list is orthonormal.
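The same inner products can be checked symbolically with sympy (a sketch; the choice $n = 3$ is an arbitrary illustration): the Gram matrix of the list should be the identity.

import sympy as sp

x = sp.symbols('x')
ip = lambda f, g: sp.integrate(f * g, (x, -sp.pi, sp.pi))

# the list 1/sqrt(2*pi), sin(kx)/sqrt(pi), cos(kx)/sqrt(pi) for k = 1, 2, 3
basis = [1 / sp.sqrt(2 * sp.pi)]
for k in range(1, 4):
    basis += [sp.sin(k * x) / sp.sqrt(sp.pi), sp.cos(k * x) / sp.sqrt(sp.pi)]

gram = sp.Matrix(len(basis), len(basis), lambda i, j: ip(basis[i], basis[j]))
print(gram == sp.eye(len(basis)))   # expect True: all entries reduce to exact 0s and 1s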
6.10 On $P_2(\mathbf{R})$, consider the inner product given by
\[
\langle p, q\rangle = \int_0^1 p(x)\,q(x)\,dx.
\]
Apply the Gram-Schmidt procedure to the basis $(1, x, x^2)$ to produce an orthonormal
basis of $P_2(\mathbf{R})$.
Solution: The first polynomial, 1, is already a unit vector, since
\[
\|1\|^2 = \int_0^1 1\,dx = 1,
\]
thus $e_1(x) = 1$. The second polynomial is
\[
e_2(x) = \frac{x - \int_0^1 x\,dx \cdot 1}{\bigl\| x - \tfrac12 \bigr\|}
= \frac{x - \tfrac12}{\sqrt{\int_0^1 \bigl(x - \tfrac12\bigr)^2 dx}}
= 2\sqrt{3}\,\Bigl(x - \frac{1}{2}\Bigr).
\]
The third polynomial is given by
\[
e_3(x) = \frac{x^2 - \int_0^1 x^2\,dx \cdot 1 - 12\int_0^1 x^2\bigl(x - \tfrac12\bigr)dx \cdot \bigl(x - \tfrac12\bigr)}{\bigl\| x^2 - x + \tfrac16 \bigr\|}
= \frac{x^2 - x + \tfrac16}{\sqrt{\int_0^1 \bigl(x^2 - x + \tfrac16\bigr)^2 dx}}
= 6\sqrt{5}\,\Bigl(x^2 - x + \frac{1}{6}\Bigr).
\]
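The computation can be reproduced symbolically (a sketch using sympy), running Gram-Schmidt on $(1, x, x^2)$ with the inner product $\langle p, q\rangle = \int_0^1 p\,q\,dx$:

import sympy as sp

x = sp.symbols('x')
ip = lambda p, q: sp.integrate(p * q, (x, 0, 1))

ortho = []
for v in [sp.Integer(1), x, x**2]:
    w = v - sum(ip(v, e) * e for e in ortho)        # subtract projections onto earlier e's
    ortho.append(sp.expand(w / sp.sqrt(ip(w, w))))  # normalize

print(ortho)   # should match e1, e2, e3 above, printed in expanded form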
6.11 What happens if the Gram-Schmidt procedure is applied to a list of vectors that is
not linearly independent?
Solution: The Gram-Schmidt procedure can be described inductively using orthogonal
projections as follows. For a list of vectors $(v_1, \ldots, v_k)$, set
\[
e_1 = \frac{v_1}{\|v_1\|}, \qquad U_1 = \operatorname{span}(e_1),
\]
and
\[
e_{j+1} = \frac{v_{j+1} - P_{U_j} v_{j+1}}{\|v_{j+1} - P_{U_j} v_{j+1}\|}, \qquad
U_{j+1} = \operatorname{span}(e_1, \ldots, e_{j+1}).
\]
When the list is linearly dependent, let $l$ be the smallest integer such that $(v_1, \ldots, v_l)$
is linearly independent and $v_{l+1} \in \operatorname{span}(v_1, \ldots, v_l)$. Then the procedure produces
an orthonormal basis $(e_1, \ldots, e_l)$ for $U_l = \operatorname{span}(e_1, \ldots, e_l) = \operatorname{span}(v_1, \ldots, v_l)$. Since
$v_{l+1} \in U_l$, at the next step of the procedure
\[
e_{l+1} = \frac{v_{l+1} - P_{U_l} v_{l+1}}{\|v_{l+1} - P_{U_l} v_{l+1}\|} = \frac{0}{\|0\|}.
\]
This is not defined; however, the procedure can be continued if we simply throw out
$v_{l+1}$. Proceeding in this manner produces an orthonormal basis for $\operatorname{span}(v_1, \ldots, v_k)$,
but there will be fewer than $k$ vectors in this basis.
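A minimal numerical sketch of this modified procedure: the residual $v_j - P_{U} v_j$ is computed at each step, and the vector is discarded whenever the residual is (numerically) zero. The function name and the tolerance are illustrative choices.

import numpy as np

def gram_schmidt_skip(vectors, tol=1e-10):
    ortho = []
    for v in vectors:
        w = v - sum(np.dot(v, e) * e for e in ortho)   # residual after projecting onto span(ortho)
        if np.linalg.norm(w) > tol:                    # skip (near-)dependent vectors
            ortho.append(w / np.linalg.norm(w))
    return ortho

vs = [np.array([1.0, 0.0, 1.0]),
      np.array([2.0, 0.0, 2.0]),      # dependent on the first vector: gets thrown out
      np.array([0.0, 1.0, 1.0])]
print(len(gram_schmidt_skip(vs)))     # 2 orthonormal vectors, a basis of span(v1, v2, v3)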
6.13 Suppose $(e_1, \ldots, e_m)$ is an orthonormal list of vectors in V . Let $v \in V$. Prove that
\[
\|v\|^2 = |\langle v, e_1\rangle|^2 + \cdots + |\langle v, e_m\rangle|^2
\]
if and only if $v \in \operatorname{span}(e_1, \ldots, e_m)$.
Proof. First, extend $(e_1, \ldots, e_m)$ to an orthonormal basis $(e_1, \ldots, e_n)$ of V . Then, since
$v \in V$,
\[
v = \sum_{i=1}^{n} a_i e_i.
\]
Note that
\[
\langle v, e_j\rangle = \Bigl\langle \sum_{i=1}^{n} a_i e_i,\, e_j \Bigr\rangle
= \sum_{i=1}^{n} a_i \langle e_i, e_j\rangle = a_j,
\]
and
\[
\|v\|^2 = \Bigl\langle \sum_{i=1}^{n} a_i e_i,\, \sum_{j=1}^{n} a_j e_j \Bigr\rangle
= \sum_{i=1}^{n}\sum_{j=1}^{n} a_i \bar{a}_j \langle e_i, e_j\rangle
= \sum_{i=1}^{n} |a_i|^2
= \sum_{i=1}^{n} |\langle v, e_i\rangle|^2.
\]
Now assume that $\|v\|^2 = |\langle v, e_1\rangle|^2 + \cdots + |\langle v, e_m\rangle|^2$. Then
\[
\sum_{i=m+1}^{n} |\langle v, e_i\rangle|^2 = 0,
\]
and since all of the terms in the sum are non-negative, each must be zero.
That is, for $i = m+1, \ldots, n$, $|\langle v, e_i\rangle|^2 = 0$, which implies $a_i = \langle v, e_i\rangle = 0$. Thus
$v = a_1 e_1 + \cdots + a_m e_m$ and $v \in \operatorname{span}(e_1, \ldots, e_m)$.
For the converse, assume that $v \in \operatorname{span}(e_1, \ldots, e_m)$. This implies that $a_i = 0$ for
$i = m+1, \ldots, n$, and thus
\[
\|v\|^2 = \sum_{i=1}^{m} |\langle v, e_i\rangle|^2.
\]
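A small numerical illustration (a sketch) of the equivalence: for an orthonormal list $(e_1, e_2)$ in $\mathbf{R}^3$, equality holds for a vector in the span and fails, with strict inequality, for a vector outside it. The specific vectors are arbitrary choices.

import numpy as np

e = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]   # orthonormal list, m = 2

def coeff_sum(v):
    return sum(abs(np.dot(v, ei)) ** 2 for ei in e)

v_in  = 3 * e[0] - 2 * e[1]              # lies in span(e1, e2)
v_out = np.array([1.0, 1.0, 1.0])        # has a component outside the span

print(np.dot(v_in, v_in),  coeff_sum(v_in))    # 13.0 13.0 -> equality
print(np.dot(v_out, v_out), coeff_sum(v_out))  # 3.0  2.0  -> strict inequality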
6.16 Suppose U is a subspace of V . Prove that $U^\perp = \{0\}$ if and only if $U = V$.
Proof. Assume that $U^\perp = \{0\}$, and let $v \in V$. Since $\langle v, 0\rangle = 0$, and 0 is the only
vector in $U^\perp$, we have $v \in (U^\perp)^\perp$. Since $(U^\perp)^\perp = U$, this implies that $v \in U$, so $V \subset U$.
Therefore, as U is assumed to be a subspace of V , $U = V$.
Now assume, conversely, that $U = V$, and let $u \in U^\perp$. Let $(e_1, \ldots, e_n)$ be an
orthonormal basis for V ; then
\[
u = \langle u, e_1\rangle e_1 + \cdots + \langle u, e_n\rangle e_n.
\]
Since $U = V$, $e_i \in U$ for all $i = 1, \ldots, n$, which implies that $\langle u, e_i\rangle = 0$. Therefore
$u = 0$, and $U^\perp = \{0\}$.
2.1.1 For $V = \mathbf{R}^3$, with inner product given by the dot product, $\langle x, y\rangle = x \cdot y$, and $v_1 =
(1, 0, 1)$, $v_2 = (0, 1, 1)$, $v_3 = (1, 3, 3)$, apply the Gram-Schmidt procedure to obtain
an orthonormal basis $\{e_1, e_2, e_3\}$. Then write $v = (1, 1, 2)$ in terms of $\{e_1, e_2, e_3\}$.
Solution: First set
\[
e_1 = \frac{(1, 0, 1)}{\sqrt{2}},
\]
then
\[
e_2 = \frac{(0, 1, 1) - (v_2 \cdot e_1)\, e_1}{\|(0, 1, 1) - (v_2 \cdot e_1)\, e_1\|}
= \frac{(0, 1, 1) - \frac{1}{\sqrt{2}} \frac{(1, 0, 1)}{\sqrt{2}}}{\|(0, 1, 1) - (1/2, 0, 1/2)\|}
= \frac{(-1, 2, 1)}{\sqrt{6}},
\]
and finally
\[
e_3 = \frac{(1, 3, 3) - (v_3 \cdot e_1)\, e_1 - (v_3 \cdot e_2)\, e_2}{\|(1, 3, 3) - (v_3 \cdot e_1)\, e_1 - (v_3 \cdot e_2)\, e_2\|}
= \frac{(1, 3, 3) - \frac{4}{\sqrt{2}} \frac{(1, 0, 1)}{\sqrt{2}} - \frac{8}{\sqrt{6}} \frac{(-1, 2, 1)}{\sqrt{6}}}{\|(1, 3, 3) - (2, 0, 2) - (-4/3, 8/3, 4/3)\|}
= \frac{(1, 1, -1)}{\sqrt{3}}.
\]
Now $v \cdot e_1 = \frac{3}{\sqrt{2}}$, $v \cdot e_2 = \frac{3}{\sqrt{6}}$, and $v \cdot e_3 = 0$. Therefore
$v = \frac{3}{\sqrt{2}}\, e_1 + \sqrt{\frac{3}{2}}\, e_2$; to check, compute
\[
\frac{3}{\sqrt{2}}\, e_1 + \frac{3}{\sqrt{6}}\, e_2 = (3/2, 0, 3/2) + (-1/2, 1, 1/2) = (1, 1, 2) = v.
\]
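A quick numerical check (a sketch) of the vectors and coefficients computed above:

import numpy as np

v1, v2, v3 = map(np.array, ([1., 0., 1.], [0., 1., 1.], [1., 3., 3.]))
ortho = []
for u in (v1, v2, v3):
    w = u - sum(np.dot(u, e) * e for e in ortho)   # Gram-Schmidt step
    ortho.append(w / np.linalg.norm(w))
e1, e2, e3 = ortho

v = np.array([1., 1., 2.])
c = [np.dot(v, e) for e in ortho]
print(np.round(c, 6))                                      # [3/sqrt(2), 3/sqrt(6), 0]
print(np.allclose(v, c[0] * e1 + c[1] * e2 + c[2] * e3))   # True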
2.1.2 For $V = \mathbf{R}^2$, $\langle x, y\rangle = 2x_1 y_1 - x_1 y_2 - x_2 y_1 + 2x_2 y_2$, apply the Gram-Schmidt procedure
to $v_1 = (1, 0)$, $v_2 = (0, 1)$ to obtain an orthonormal basis $\{e_1, e_2\}$.
Solution: First, since $\|v_1\|^2 = 2$,
\[
e_1 = \frac{v_1}{\|v_1\|} = \frac{(1, 0)}{\sqrt{2}}.
\]
Next, $\langle v_2, e_1\rangle = \frac{1}{\sqrt{2}}\langle (0, 1), (1, 0)\rangle = -\frac{1}{\sqrt{2}}$, so
\[
e_2 = \frac{v_2 - \langle v_2, e_1\rangle e_1}{\|v_2 - \langle v_2, e_1\rangle e_1\|}
= \frac{(0, 1) + \tfrac12 (1, 0)}{\bigl\|\bigl(\tfrac12, 1\bigr)\bigr\|}
= \frac{\bigl(\tfrac12, 1\bigr)}{\sqrt{3/2}}
= \frac{(1, 2)}{\sqrt{6}},
\]
since $\bigl\|\bigl(\tfrac12, 1\bigr)\bigr\|^2 = \bigl\langle \bigl(\tfrac12, 1\bigr), \bigl(\tfrac12, 1\bigr)\bigr\rangle = \tfrac12 - \tfrac12 - \tfrac12 + 2 = \tfrac32$.
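This can be checked numerically (a sketch) by writing the inner product through its Gram matrix, $\langle x, y\rangle = x^{T} A y$ with $A = \begin{pmatrix} 2 & -1 \\ -1 & 2 \end{pmatrix}$; Gram-Schmidt should reproduce $e_1 = (1,0)/\sqrt{2}$ and $e_2 = (1,2)/\sqrt{6}$.

import numpy as np

A = np.array([[2.0, -1.0], [-1.0, 2.0]])     # Gram matrix of <x, y> = 2x1y1 - x1y2 - x2y1 + 2x2y2
ip = lambda x, y: x @ A @ y

ortho = []
for v in (np.array([1.0, 0.0]), np.array([0.0, 1.0])):
    w = v - sum(ip(v, e) * e for e in ortho)
    ortho.append(w / np.sqrt(ip(w, w)))

print(ortho[0], ortho[1])        # approx [0.7071, 0] and [0.4082, 0.8165], i.e. (1,0)/sqrt(2) and (1,2)/sqrt(6)
print(ip(ortho[0], ortho[1]))    # ~0: orthogonal in this inner product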
2.2.1 Prove that $U^\perp$ is always a subspace of V .
Proof. Let $v \in U^\perp$; then $\langle v, u\rangle = 0$ for all $u \in U$. This implies that $\langle cv, u\rangle =
c\langle v, u\rangle = 0$ for all $u \in U$, and thus $cv \in U^\perp$. Therefore $U^\perp$ is closed under scalar
multiplication.
Let $v_1, v_2 \in U^\perp$; then $\langle v_1, u\rangle = \langle v_2, u\rangle = 0$ for all $u \in U$. This implies that
$\langle v_1 + v_2, u\rangle = \langle v_1, u\rangle + \langle v_2, u\rangle = 0$ for all $u \in U$, and thus $v_1 + v_2 \in U^\perp$. Therefore
$U^\perp$ is closed under vector addition. Since also $\langle 0, u\rangle = 0$ for all $u \in U$, so that $0 \in U^\perp$,
it follows that $U^\perp$ is a subspace of V .
2.2.2 For $V = \mathbf{R}^3$, $\langle x, y\rangle = x_1 y_1 + x_2 y_2 - x_1 y_3 - x_3 y_1 + 4x_3 y_3$, extend the orthonormal list
$e_1 = (1, 0, 0)$, $e_2 = (0, 1, 0)$ to an orthonormal basis.
Solution: The vector $v_3 = (0, 0, 1)$ completes the list to a basis for V . Then
\[
e_3 = \frac{v_3 - \langle v_3, e_1\rangle e_1 - \langle v_3, e_2\rangle e_2}{\|(0, 0, 1) - (-1)(1, 0, 0) - (0)(0, 1, 0)\|}
= \frac{(1, 0, 1)}{\sqrt{3}}.
\]
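A check (a sketch) using the Gram matrix $A = \begin{pmatrix} 1 & 0 & -1 \\ 0 & 1 & 0 \\ -1 & 0 & 4 \end{pmatrix}$ of this inner product:

import numpy as np

A = np.array([[1.0, 0.0, -1.0], [0.0, 1.0, 0.0], [-1.0, 0.0, 4.0]])
ip = lambda x, y: x @ A @ y

e1, e2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
v3 = np.array([0.0, 0.0, 1.0])
w = v3 - ip(v3, e1) * e1 - ip(v3, e2) * e2       # subtract projections onto e1, e2
e3 = w / np.sqrt(ip(w, w))

print(e3)                                        # (1, 0, 1)/sqrt(3), approx [0.577, 0, 0.577]
print(ip(e3, e1), ip(e3, e2), ip(e3, e3))        # ~0, 0, 1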