MATH 323 Linear Algebra Lecture 20: Inner product spaces. Orthogonal sets.

Norm
The notion of norm generalizes the notion of length
of a vector in Rn .
Definition. Let V be a vector space. A function
α : V → R, usually denoted α(x) = kxk, is called
a norm on V if it has the following properties:
(i) kxk ≥ 0, kxk = 0 only for x = 0 (positivity)
(ii) kr xk = |r | kxk for all r ∈ R
(homogeneity)
(iii) kx + yk ≤ kxk + kyk
(triangle inequality)
A normed vector space is a vector space endowed
with a norm. The norm defines a distance function
on the normed vector space: dist(x, y) = kx − yk.
Examples. V = Rn , x = (x1, x2, . . . , xn ) ∈ Rn .
• kxk∞ = max(|x1|, |x2|, . . . , |xn |).
• kxkp = (|x1|^p + |x2|^p + · · · + |xn|^p)^(1/p), p ≥ 1.
Examples. V = C [a, b], f : [a, b] → R.
• kf k∞ = max |f (x)| over a ≤ x ≤ b.
• kf kp = ( ∫_a^b |f (x)|^p dx )^(1/p), p ≥ 1.
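The norms above are easy to check numerically; a small sketch using NumPy (the vectors chosen here are illustrative, not from the lecture):

```python
import numpy as np

x = np.array([3.0, -4.0, 12.0])

norm_inf = np.max(np.abs(x))      # kxk∞: largest absolute entry -> 12.0
norm_1 = np.sum(np.abs(x))        # kxk1 -> 19.0
norm_2 = np.sqrt(np.sum(x**2))    # kxk2 -> 13.0, since 9 + 16 + 144 = 169

# spot-check the three norm axioms for the 2-norm on one pair of vectors
y = np.array([1.0, 2.0, 2.0])
assert norm_2 >= 0                                                    # positivity
assert np.isclose(np.sqrt(np.sum((5 * x)**2)), 5 * norm_2)            # homogeneity
assert np.sqrt(np.sum((x + y)**2)) <= norm_2 + np.sqrt(np.sum(y**2))  # triangle
print(norm_inf, norm_1, norm_2)
```

A passing spot-check is of course not a proof of the axioms, only a sanity check on one instance.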
Inner product
The notion of inner product generalizes the notion
of dot product of vectors in Rn .
Definition. Let V be a vector space. A function
β : V × V → R, usually denoted β(x, y) = hx, yi,
is called an inner product on V if it is positive,
symmetric, and bilinear. That is, if
(i) hx, xi ≥ 0, hx, xi = 0 only for x = 0 (positivity)
(ii) hx, yi = hy, xi
(symmetry)
(iii) hr x, yi = r hx, yi
(homogeneity)
(iv) hx + y, zi = hx, zi + hy, zi (distributive law)
An inner product space is a vector space endowed
with an inner product.
Examples. V = Rn .
• hx, yi = x · y = x1y1 + x2y2 + · · · + xn yn .
• hx, yi = d1 x1y1 + d2x2 y2 + · · · + dn xn yn ,
where d1 , d2, . . . , dn > 0.
• hx, yi = (Dx) · (Dy),
where D is an invertible n×n matrix.
Problem. Find an inner product on R2 such that
he1 , e1 i = 2, he2 , e2 i = 3, and he1 , e2i = −1,
where e1 = (1, 0), e2 = (0, 1).
Let x = (x1, x2), y = (y1, y2) ∈ R2 .
Then x = x1 e1 + x2e2 , y = y1e1 + y2 e2.
Using bilinearity, we obtain
hx, yi = hx1 e1 + x2e2 , y1e1 + y2 e2i
= x1 he1 , y1e1 + y2 e2i + x2he2 , y1e1 + y2e2 i
= x1y1he1 , e1i + x1 y2he1 , e2i + x2y1he2 , e1i + x2y2he2 , e2i
= 2x1y1 − x1y2 − x2y1 + 3x2y2.
It remains to check that hx, xi > 0 for x ≠ 0:
hx, xi = 2x1² − 2x1x2 + 3x2² = (x1 − x2)² + x1² + 2x2² > 0 whenever x ≠ 0.
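In matrix form the inner product just constructed is hx, yi = xᵀGy with Gram matrix G = [[2, −1], [−1, 3]]; a quick NumPy check (the helper name ip is mine):

```python
import numpy as np

# Gram matrix of the basis: G[i][j] = <e_{i+1}, e_{j+1}>
G = np.array([[2.0, -1.0],
              [-1.0, 3.0]])

def ip(x, y):
    """<x, y> = 2*x1*y1 - x1*y2 - x2*y1 + 3*x2*y2, written as x^T G y."""
    return x @ G @ y

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
assert ip(e1, e1) == 2 and ip(e2, e2) == 3 and ip(e1, e2) == -1

# positivity: G is symmetric with positive eigenvalues ((5 ± √5)/2),
# so <x, x> > 0 whenever x != 0
assert np.all(np.linalg.eigvalsh(G) > 0)
```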
Example. V = Mm,n (R), space of m×n matrices.
• hA, Bi = trace (AB^T ).
If A = (aij ) and B = (bij ), then hA, Bi = Σ_{i=1}^m Σ_{j=1}^n aij bij.
Examples. V = C [a, b].
• hf , g i = ∫_a^b f (x)g (x) dx.
• hf , g i = ∫_a^b f (x)g (x)w (x) dx,
where w is bounded, piecewise continuous, and w > 0 everywhere on [a, b];
w is called the weight function.
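The identity trace(AB^T) = Σij aij bij can be spot-checked directly (the two matrices below are my choice):

```python
import numpy as np

# two example matrices in the space of 2x3 real matrices
A = np.array([[1.0, 2.0, 0.0],
              [3.0, -1.0, 4.0]])
B = np.array([[0.0, 1.0, 2.0],
              [1.0, 1.0, -1.0]])

ip_trace = np.trace(A @ B.T)   # <A, B> = trace(A B^T)
ip_entrywise = np.sum(A * B)   # sum over i, j of a_ij * b_ij

assert np.isclose(ip_trace, ip_entrywise)
print(ip_trace)   # 0.0 for this pair: A and B happen to be orthogonal
```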
Theorem Suppose hx, yi is an inner product on a
vector space V . Then
hx, yi² ≤ hx, xihy, yi for all x, y ∈ V .
Proof: For any t ∈ R let vt = x + ty. Then
hvt , vt i = hx + ty, x + tyi = hx, x + tyi + thy, x + tyi
= hx, xi + thx, yi + thy, xi + t²hy, yi.
Assume that y ≠ 0 and let t = −hx, yi/hy, yi. Then
hvt , vt i = hx, xi + thy, xi = hx, xi − hx, yi²/hy, yi.
Since hvt , vt i ≥ 0, the desired inequality follows.
In the case y = 0, we have hx, yi = hy, yi = 0.
Cauchy-Schwarz Inequality:
|hx, yi| ≤ √hx, xi · √hy, yi.
Corollary 1 |x · y| ≤ kxk kyk for all x, y ∈ Rn .
Equivalently, for all xi , yi ∈ R,
(x1y1 + · · · + xnyn)² ≤ (x1² + · · · + xn²)(y1² + · · · + yn²).
Corollary 2 For any f , g ∈ C [a, b],
( ∫_a^b f (x)g (x) dx )² ≤ ∫_a^b |f (x)|² dx · ∫_a^b |g (x)|² dx.
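Both corollaries can be verified numerically; a sketch checking the vector form on random data and the integral form by a Riemann sum (the test functions f(x) = x, g(x) = x² on [0, 1] are my choice):

```python
import numpy as np

# Corollary 1 on a random pair of vectors
rng = np.random.default_rng(0)
x, y = rng.standard_normal(5), rng.standard_normal(5)
assert (x @ y)**2 <= (x @ x) * (y @ y)

# Corollary 2 for f(x) = x, g(x) = x^2 on [0, 1], integrals by Riemann sums
t = np.linspace(0.0, 1.0, 200001)
dt = t[1] - t[0]
f, g = t, t**2
lhs = (np.sum(f * g) * dt)**2                     # (∫ fg dx)^2 ≈ (1/4)^2
rhs = (np.sum(f**2) * dt) * (np.sum(g**2) * dt)   # ≈ (1/3)(1/5)
assert lhs <= rhs
```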
Norms induced by inner products
Theorem Suppose hx, yi is an inner product on a
vector space V . Then kxk = √hx, xi is a norm.
Proof: Positivity is obvious. Homogeneity:
kr xk = √hr x, r xi = √(r²hx, xi) = |r| √hx, xi = |r| kxk.
Triangle inequality (follows from Cauchy-Schwarz’s):
kx + yk² = hx + y, x + yi
= hx, xi + hx, yi + hy, xi + hy, yi
≤ hx, xi + |hx, yi| + |hy, xi| + hy, yi
≤ kxk² + 2kxk kyk + kyk² = (kxk + kyk)².
Examples. • The length of a vector in Rn ,
|x| = √(x1² + x2² + · · · + xn²),
is the norm induced by the dot product
x · y = x1y1 + x2y2 + · · · + xnyn.
• The norm kf k2 = ( ∫_a^b |f (x)|² dx )^(1/2) on the
vector space C [a, b] is induced by the inner product
hf , g i = ∫_a^b f (x)g (x) dx.
Angle
Since |hx, yi| ≤ kxk kyk, we can define the angle
between nonzero vectors in any vector space with
an inner product (and induced norm):
∠(x, y) = arccos( hx, yi / (kxk kyk) ).
Then hx, yi = kxk kyk cos ∠(x, y).
In particular, vectors x and y are orthogonal
(denoted x ⊥ y) if hx, yi = 0.
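The angle formula translates directly into code; a sketch for the dot product on R² (the helper name angle is mine):

```python
import numpy as np

def angle(x, y):
    """Angle between nonzero vectors under the dot product and its norm."""
    c = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return np.arccos(np.clip(c, -1.0, 1.0))  # clip guards against round-off

assert np.isclose(angle(np.array([1.0, 0.0]), np.array([1.0, 1.0])), np.pi / 4)
assert np.isclose(angle(np.array([1.0, 0.0]), np.array([0.0, 2.0])), np.pi / 2)
```

The clip is needed because floating-point error can push the cosine marginally outside [−1, 1], where arccos is undefined.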
Pythagorean Law:
x ⊥ y =⇒ kx + yk² = kxk² + kyk²
Proof:
kx + yk² = hx + y, x + yi
= hx, xi + hx, yi + hy, xi + hy, yi
= hx, xi + hy, yi = kxk² + kyk².
Parallelogram Identity:
kx + yk² + kx − yk² = 2kxk² + 2kyk²
Proof: kx+yk² = hx+y, x+yi = hx, xi + hx, yi + hy, xi + hy, yi.
Similarly, kx−yk² = hx, xi − hx, yi − hy, xi + hy, yi.
Adding the two, kx+yk² + kx−yk² = 2hx, xi + 2hy, yi = 2kxk² + 2kyk².
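The identity also serves as a test: a norm induced by an inner product must satisfy it, so a norm that fails it (such as the sup-norm) is not induced by any inner product. A sketch (vectors and helper names are mine):

```python
import numpy as np

def parallelogram_defect(norm, x, y):
    """||x+y||^2 + ||x-y||^2 - 2||x||^2 - 2||y||^2; zero for induced norms."""
    return norm(x + y)**2 + norm(x - y)**2 - 2*norm(x)**2 - 2*norm(y)**2

two_norm = lambda v: np.sqrt(np.sum(v**2))   # induced by the dot product
sup_norm = lambda v: np.max(np.abs(v))       # not induced by any inner product

x, y = np.array([1.0, 0.0]), np.array([0.0, 1.0])
assert np.isclose(parallelogram_defect(two_norm, x, y), 0.0)
print(parallelogram_defect(sup_norm, x, y))   # -2.0: the sup-norm fails
```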
Orthogonal sets
Let V be an inner product space with an inner
product h·, ·i and the induced norm k · k.
Definition. A nonempty set S ⊂ V of nonzero
vectors is called an orthogonal set if all vectors in
S are mutually orthogonal. That is, 0 ∉ S and
hx, yi = 0 for any x, y ∈ S, x ≠ y.
An orthogonal set S ⊂ V is called orthonormal if
kxk = 1 for any x ∈ S.
Remark. Vectors v1, v2, . . . , vk ∈ V form an
orthonormal set if and only if
hvi , vj i = 1 if i = j, and hvi , vj i = 0 if i ≠ j.
Examples. • V = Rn , hx, yi = x · y.
The standard basis e1 = (1, 0, 0, . . . , 0),
e2 = (0, 1, 0, . . . , 0), . . . , en = (0, 0, 0, . . . , 1).
It is an orthonormal set.
• V = R3 , hx, yi = x · y.
v1 = (3, 5, 4), v2 = (3, −5, 4), v3 = (4, 0, −3).
v1 · v2 = 0, v1 · v3 = 0, v2 · v3 = 0,
v1 · v1 = 50, v2 · v2 = 50, v3 · v3 = 25.
Thus the set {v1, v2, v3} is orthogonal but not
orthonormal. An orthonormal set is formed by the
normalized vectors w1 = v1/kv1k, w2 = v2/kv2k, w3 = v3/kv3k.
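The dot products claimed above, and the normalization step, can be checked in a few lines:

```python
import numpy as np

v1 = np.array([3.0, 5.0, 4.0])
v2 = np.array([3.0, -5.0, 4.0])
v3 = np.array([4.0, 0.0, -3.0])

# pairwise dot products vanish; squared lengths match the lecture's values
assert v1 @ v2 == 0 and v1 @ v3 == 0 and v2 @ v3 == 0
assert v1 @ v1 == 50 and v2 @ v2 == 50 and v3 @ v3 == 25

# normalizing each vector yields an orthonormal set (the rows of W)
W = np.array([v / np.linalg.norm(v) for v in (v1, v2, v3)])
assert np.allclose(W @ W.T, np.eye(3))
```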
• V = C [−π, π], hf , g i = ∫_{−π}^{π} f (x)g (x) dx.
f1 (x) = sin x, f2(x) = sin 2x, . . . , fn (x) = sin nx, . . .
hfm , fn i = ∫_{−π}^{π} sin(mx) sin(nx) dx = π if m = n, and 0 if m ≠ n.
Thus the set {f1, f2, f3, . . . } is orthogonal but not
orthonormal.
It is orthonormal with respect to a scaled inner product
hhf , g ii = (1/π) ∫_{−π}^{π} f (x)g (x) dx.
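The orthogonality of the sine functions can be confirmed by numerical integration; a sketch using a Riemann sum (the helper name ip is mine):

```python
import numpy as np

def ip(m, n, pts=200001):
    """Approximate the integral of sin(mx)sin(nx) over [-pi, pi] by a Riemann sum."""
    x = np.linspace(-np.pi, np.pi, pts)
    return np.sum(np.sin(m * x) * np.sin(n * x)) * (x[1] - x[0])

assert abs(ip(2, 3)) < 1e-6           # distinct frequencies: orthogonal
assert abs(ip(2, 2) - np.pi) < 1e-3   # <f_n, f_n> = pi, so ||f_n|| = sqrt(pi) != 1
```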
Orthogonality =⇒ linear independence
Theorem Suppose v1, v2, . . . , vk are nonzero
vectors that form an orthogonal set. Then
v1 , v2, . . . , vk are linearly independent.
Proof: Assume t1 v1 + t2 v2 + · · · + tk vk = 0
for some scalars t1, t2 , . . . , tk ∈ R. We have to
show that all these scalars are zero.
For any index 1 ≤ i ≤ k we have
ht1 v1 + t2 v2 + · · · + tk vk , vi i = h0, vi i = 0.
=⇒ t1 hv1 , vi i + t2 hv2 , vi i + · · · + tk hvk , vi i = 0
By orthogonality, this reduces to ti hvi , vi i = 0;
since vi ≠ 0, we have hvi , vi i > 0, hence ti = 0.
Orthonormal bases
Let v1 , v2, . . . , vn be an orthonormal basis for an
inner product space V .
Theorem Let x = x1v1 + x2 v2 + · · · + xn vn and
y = y1v1 + y2v2 + · · · + yn vn , where xi , yj ∈ R. Then
(i) hx, yi = x1 y1 + x2y2 + · · · + xn yn ,
(ii) kxk = √(x1² + x2² + · · · + xn²).
Proof: (ii) follows from (i) when y = x.
hx, yi = h Σ_{i=1}^n xi vi , Σ_{j=1}^n yj vj i
= Σ_{i=1}^n Σ_{j=1}^n xi yj hvi , vj i = Σ_{i=1}^n xi yi ,
since hvi , vj i = 1 for i = j and 0 for i ≠ j.
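Both parts of the theorem can be checked against a concrete orthonormal basis of R³, obtained here from a QR factorization of a random matrix (the setup is my choice):

```python
import numpy as np

# rows of Q^T form an orthonormal basis, with Q from a QR factorization
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
basis = Q.T
assert np.allclose(basis @ basis.T, np.eye(3))

x_coords = np.array([1.0, -2.0, 3.0])   # coordinates x1, x2, x3
y_coords = np.array([0.5, 4.0, -1.0])   # coordinates y1, y2, y3
x = x_coords @ basis                    # x = x1 v1 + x2 v2 + x3 v3
y = y_coords @ basis

assert np.isclose(x @ y, x_coords @ y_coords)                   # part (i)
assert np.isclose(np.linalg.norm(x), np.linalg.norm(x_coords))  # part (ii)
```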
Orthogonal projection
Theorem Let V be an inner product space and V0 be a
finite-dimensional subspace of V . Then any vector x ∈ V is
uniquely represented as x = p + o, where p ∈ V0 and
o ⊥ V0 .
The component p is called the orthogonal projection of the
vector x onto the subspace V0 .
The projection p is closer to x than any other vector in V0 .
Hence the distance from x to V0 is kx − pk = kok.
Let V be an inner product space. Let p be the
orthogonal projection of a vector x ∈ V onto a
finite-dimensional subspace V0 .
If V0 is a one-dimensional subspace spanned by a
vector v, then p = (hx, vi/hv, vi) v.
If v1 , v2, . . . , vn is an orthogonal basis for V0 , then
p = (hx, v1i/hv1 , v1i) v1 + (hx, v2i/hv2 , v2i) v2 + · · · + (hx, vni/hvn , vni) vn .
Indeed, hp, vi i = Σ_{j=1}^n (hx, vj i/hvj , vj i) hvj , vi i = (hx, vi i/hvi , vi i) hvi , vi i = hx, vi i
=⇒ hx−p, vi i = 0 =⇒ x−p ⊥ vi =⇒ x−p ⊥ V0 .
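The projection formula and the orthogonality of the residual x − p can be verified on a concrete example (the subspace and basis below are my choice):

```python
import numpy as np

def project(x, basis):
    """Orthogonal projection onto span(basis); the basis vectors must be
    mutually orthogonal (not necessarily normalized)."""
    p = np.zeros_like(x)
    for v in basis:
        p = p + ((x @ v) / (v @ v)) * v
    return p

# V0 = the xy-plane in R^3, with an orthogonal (non-standard) basis
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
x = np.array([3.0, 1.0, 5.0])

p = project(x, [v1, v2])
o = x - p
assert np.allclose(p, [3.0, 1.0, 0.0])                   # p lies in V0
assert np.isclose(o @ v1, 0) and np.isclose(o @ v2, 0)   # o is orthogonal to V0
```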
Coordinates relative to an orthogonal basis
Theorem If v1 , v2, . . . , vn is an orthogonal basis
for an inner product space V , then
x = (hx, v1i/hv1 , v1i) v1 + (hx, v2i/hv2 , v2i) v2 + · · · + (hx, vni/hvn , vni) vn
for any vector x ∈ V .
Corollary If v1 , v2, . . . , vn is an orthonormal basis
for an inner product space V , then
x = hx, v1 iv1 + hx, v2iv2 + · · · + hx, vn ivn
for any vector x ∈ V .
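The theorem gives coordinates without solving a linear system; a sketch in R² (the basis below is my choice):

```python
import numpy as np

# an orthogonal (not orthonormal) basis of R^2
v1 = np.array([2.0, 1.0])
v2 = np.array([-1.0, 2.0])
assert v1 @ v2 == 0

x = np.array([3.0, 4.0])
c1 = (x @ v1) / (v1 @ v1)   # <x, v1>/<v1, v1> = 10/5 = 2
c2 = (x @ v2) / (v2 @ v2)   # <x, v2>/<v2, v2> = 5/5 = 1
assert np.allclose(c1 * v1 + c2 * v2, x)   # x recovered from the formula
```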