Chapter 6: Inner Product Spaces

6.1 Inner Products

General Inner Products

Definition: An inner product on a real vector space V is a function that associates a real number <u,v> with each pair of vectors u and v in V in such a way that the following axioms are satisfied for all vectors u, v, and w in V and all scalars k.

1. <u,v> = <v,u>  [symmetry axiom]
2. <u+v,w> = <u,w> + <v,w>  [additivity axiom]
3. <ku,v> = k<u,v>  [homogeneity axiom]
4. <v,v> ≥ 0, and <v,v> = 0 if and only if v = 0  [positivity axiom]

A real vector space with an inner product is called a real inner product space.

Weighted Euclidean Inner Product

If w1, w2, ..., wn are positive real numbers (weights), then the formula

  <u,v> = w1u1v1 + w2u2v2 + ... + wnunvn

defines an inner product on Rn, called the weighted Euclidean inner product with weights w1, w2, ..., wn.

Length and Distance in Inner Product Spaces

Definition: If V is an inner product space, then the norm (or length) of a vector u in V is denoted by ║u║ and is defined by

  ║u║ = <u,u>^(1/2)

The distance between two points (vectors) u and v is denoted by d(u,v) and is defined by

  d(u,v) = ║u − v║

Example [Norm and Distance in Rn]: If u = (u1, u2, ..., un) and v = (v1, v2, ..., vn) are vectors in Rn with the Euclidean inner product, then

  ║u║ = <u,u>^(1/2) = (u · u)^(1/2) = (u1² + u2² + ... + un²)^(1/2)

  d(u,v) = ║u − v║ = <u−v, u−v>^(1/2) = [(u−v) · (u−v)]^(1/2) = ((u1−v1)² + (u2−v2)² + ... + (un−vn)²)^(1/2)

Unit Circles and Spheres in Inner Product Spaces

If V is an inner product space, then the set of points in V that satisfy ║u║ = 1 is called the unit sphere (or sometimes the unit circle) in V.

Inner Products Generated by Matrices

Let u = (u1, u2, ..., un) and v = (v1, v2, ..., vn) be vectors in Rn, written as n×1 column matrices, and let A be an invertible n×n matrix.
The inner product on Rn generated by A is defined by the formula

  <u,v> = Au · Av

In matrix notation this is

  <u,v> = (Av)^T Au = v^T A^T A u

Example [Inner Product Generated by the Identity Matrix]: The inner product on Rn generated by the n×n identity matrix is the Euclidean inner product, since substituting A = I gives

  <u,v> = Iu · Iv = u · v

Example: The weighted Euclidean inner product <u,v> = 3u1v1 + 2u2v2 is the inner product on R2 generated by

  A = [ √3   0 ]
      [  0  √2 ]

since

  <u,v> = v^T A^T A u = [v1 v2] diag(√3,√2) diag(√3,√2) [u1; u2] = [v1 v2] diag(3,2) [u1; u2] = 3u1v1 + 2u2v2

In general, the weighted Euclidean inner product

  <u,v> = w1u1v1 + w2u2v2 + ... + wnunvn

is the inner product on Rn generated by the diagonal matrix

  A = diag(√w1, √w2, ..., √wn)

Example [An Inner Product on M22]: If

  U = [ u1 u2 ]   and   V = [ v1 v2 ]
      [ u3 u4 ]             [ v3 v4 ]

then

  <U,V> = tr(U^T V) = tr(V^T U) = u1v1 + u2v2 + u3v3 + u4v4

defines an inner product on M22.

Theorem [Properties of Inner Products]: If u, v, and w are vectors in a real inner product space, and k is any scalar, then:

(a) <0,v> = <v,0> = 0
(b) <u, v+w> = <u,v> + <u,w>
(c) <u, kv> = k<u,v>
(d) <u−v, w> = <u,w> − <v,w>
(e) <u, v−w> = <u,v> − <u,w>

Example [Calculating with Inner Products]:

  <u − 2v, 3u + 4v> = <u, 3u + 4v> − 2<v, 3u + 4v>
                    = <u,3u> + <u,4v> − 2<v,3u> − 2<v,4v>
                    = 3<u,u> + 4<u,v> − 6<v,u> − 8<v,v>
                    = 3║u║² + 4<u,v> − 6<u,v> − 8║v║²
                    = 3║u║² − 2<u,v> − 8║v║²

Exercise: Show that <u,v> = 5u1v1 − u1v2 − u2v1 + 10u2v2 is the inner product on R2 generated by

  A = [  2  1 ]
      [ −1  3 ]

Use that inner product to compute <u,v> if u = (0,−3) and v = (1,7).

6.2 Angle and Orthogonality in Inner Product Spaces

Cauchy-Schwarz Inequality

If u and v are nonzero vectors in R2 or R3 and θ is the angle between them, then

  u · v = ║u║║v║ cos θ,  so  cos θ = (u · v)/(║u║║v║)

Since |cos θ| ≤ 1, this suggests that in a general inner product space we should have

  |<u,v>| / (║u║║v║) ≤ 1

That is the content of the following theorem.

Theorem: [Cauchy-Schwarz
Inequality] If u and v are vectors in a real inner product space, then

  |<u,v>| ≤ ║u║║v║

Alternative forms:

  <u,v>² ≤ <u,u><v,v>
  <u,v>² ≤ ║u║²║v║²

Theorem [Properties of Length]: If u and v are vectors in an inner product space V, and k is any scalar, then:

(a) ║u║ ≥ 0
(b) ║u║ = 0 if and only if u = 0
(c) ║ku║ = |k| ║u║
(d) ║u + v║ ≤ ║u║ + ║v║  (triangle inequality)

Theorem [Properties of Distance]: If u, v, and w are vectors in an inner product space V, then:

(a) d(u,v) ≥ 0
(b) d(u,v) = 0 if and only if u = v
(c) d(u,v) = d(v,u)
(d) d(u,v) ≤ d(u,w) + d(w,v)  (triangle inequality)

Angle Between Vectors

The angle θ between u and v is defined by

  cos θ = <u,v> / (║u║║v║),  0 ≤ θ ≤ π

Example [Cosine of an Angle Between Two Vectors in R4]: Let R4 have the Euclidean inner product. Find the cosine of the angle θ between the vectors u = (4,3,1,−2) and v = (−2,1,2,3).

Solution: ║u║ = √30, ║v║ = √18, and <u,v> = −9, so

  cos θ = <u,v> / (║u║║v║) = −9/(√30 √18) = −3/(2√15)

Orthogonality

Definition: Two vectors u and v in an inner product space are called orthogonal if <u,v> = 0.

Theorem [Generalized Theorem of Pythagoras]: If u and v are orthogonal vectors in an inner product space, then

  ║u + v║² = ║u║² + ║v║²

Orthogonal Complements

Definition: Let W be a subspace of an inner product space V. A vector u in V is said to be orthogonal to W if it is orthogonal to every vector in W, and the set of all vectors in V that are orthogonal to W is called the orthogonal complement of W, denoted W⊥.
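The angle and orthogonality computations above are easy to verify numerically. A minimal sketch with numpy (assuming numpy is available; the vectors in the second half are our own illustrative choice, not from the slides):

```python
import math
import numpy as np

# Angle between u = (4,3,1,-2) and v = (-2,1,2,3) under the Euclidean inner product.
u = np.array([4.0, 3.0, 1.0, -2.0])
v = np.array([-2.0, 1.0, 2.0, 3.0])

cos_theta = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
# The closed form from the worked example: cos(theta) = -3 / (2*sqrt(15))
assert math.isclose(cos_theta, -3 / (2 * math.sqrt(15)))

# Generalized Pythagoras: for orthogonal vectors, ||a+b||^2 = ||a||^2 + ||b||^2.
a = np.array([1.0, 0.0, 1.0, 0.0])   # hypothetical orthogonal pair
b = np.array([0.0, 2.0, 0.0, -3.0])
assert math.isclose(a @ b, 0.0, abs_tol=1e-12)   # a and b are orthogonal
lhs = np.linalg.norm(a + b) ** 2
rhs = np.linalg.norm(a) ** 2 + np.linalg.norm(b) ** 2
assert math.isclose(lhs, rhs)
```

The same pattern works for any inner product <u,v> = v^T A^T A u: replace `u @ v` by `(A @ u) @ (A @ v)`.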
Theorem [Properties of Orthogonal Complements]: If W is a subspace of a finite-dimensional inner product space V, then:

(a) W⊥ is a subspace of V
(b) The only vector common to W and W⊥ is 0
(c) The orthogonal complement of W⊥ is W; that is, (W⊥)⊥ = W

A Geometric Link Between Nullspace and Row Space

Theorem: If A is an m×n matrix, then:

(a) The nullspace of A and the row space of A are orthogonal complements in Rn with respect to the Euclidean inner product
(b) The nullspace of A^T and the column space of A are orthogonal complements in Rm with respect to the Euclidean inner product

Example [Basis for an Orthogonal Complement]: Let W be the subspace of R5 spanned by the vectors

  w1 = (2,2,−1,0,1), w2 = (−1,−1,2,−3,1), w3 = (1,1,−2,0,−1), w4 = (0,0,1,1,1)

Find a basis for the orthogonal complement of W.

Solution: W is the row space of the matrix

  A = [  2  2 −1  0  1 ]
      [ −1 −1  2 −3  1 ]
      [  1  1 −2  0 −1 ]
      [  0  0  1  1  1 ]

so by the theorem above, W⊥ is the nullspace of A. Solving Ax = 0 gives

  v1 = (−1, 1, 0, 0, 0)  and  v2 = (−1, 0, −1, 0, 1)

The vectors v1 and v2 form a basis for this nullspace, and hence form a basis for the orthogonal complement of W.

Summary

Theorem [Equivalent Statements]: If A is an n×n matrix, and if TA: Rn → Rn is multiplication by A, then the following are equivalent:

(a) A is invertible
(b) Ax = 0 has only the trivial solution
(c) The reduced row-echelon form of A is In
(d) A is expressible as a product of elementary matrices
(e) Ax = b is consistent for every n×1 matrix b
(f) Ax = b has exactly one solution for every n×1 matrix b
(g) det(A) ≠ 0
(h) The range of TA is Rn
(i) TA is one-to-one
(j) The column vectors of A are linearly independent
(k) The row vectors of A are linearly independent
(l) The column vectors of A span Rn
(m) The row vectors of A span Rn
(n) The column vectors of A form a basis for Rn
(o) The row vectors of A form a basis for Rn
(p) A has rank n
(q) A has nullity 0
(r) The orthogonal complement of the nullspace of A is Rn
(s) The orthogonal complement of the row space
of A is {0}

Exercises

Find a basis for the orthogonal complement of the subspace of Rn spanned by the vectors:

(a) v1 = (2,0,−1), v2 = (4,0,−2)
(b) v1 = (1,4,5,2), v2 = (2,1,3,0), v3 = (−1,3,2,2)

6.3 Orthonormal Bases; Gram-Schmidt Process; QR-Decomposition

Definition: A set of vectors in an inner product space is called an orthogonal set if all pairs of distinct vectors in the set are orthogonal. An orthogonal set in which each vector has norm 1 is called orthonormal.

Example [An Orthogonal Set in R3]: Let u1 = (0,1,0), u2 = (1,0,1), u3 = (1,0,−1), and assume that R3 has the Euclidean inner product. The set of vectors S = {u1, u2, u3} is orthogonal since

  <u1,u2> = <u1,u3> = <u2,u3> = 0

If v is a nonzero vector in an inner product space, then the vector (1/║v║)v has norm 1 (this is called normalizing v), since

  ║(1/║v║)v║ = (1/║v║)║v║ = 1

An orthogonal set of nonzero vectors can always be converted to an orthonormal set by normalizing each of its vectors.

In the example above, the Euclidean norms of the vectors are ║u1║ = 1, ║u2║ = √2, ║u3║ = √2, so normalizing u1, u2, and u3 yields

  v1 = u1/║u1║ = (0, 1, 0),  v2 = u2/║u2║ = (1/√2, 0, 1/√2),  v3 = u3/║u3║ = (1/√2, 0, −1/√2)

In an inner product space, a basis consisting of orthonormal vectors is called an orthonormal basis, and a basis consisting of orthogonal vectors is called an orthogonal basis.
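The normalization in the example above can be reproduced numerically. A short sketch with numpy (the helper name `normalize` is our own, not from the slides):

```python
import math
import numpy as np

def normalize(v):
    """Scale a nonzero vector to unit Euclidean norm, i.e. (1/||v||) v."""
    return v / np.linalg.norm(v)

# The orthogonal set from the example.
u1 = np.array([0.0, 1.0, 0.0])
u2 = np.array([1.0, 0.0, 1.0])
u3 = np.array([1.0, 0.0, -1.0])

v1, v2, v3 = (normalize(u) for u in (u1, u2, u3))

# Each normalized vector has norm 1, and distinct pairs stay orthogonal,
# so {v1, v2, v3} is an orthonormal set (in fact an orthonormal basis for R3).
for w in (v1, v2, v3):
    assert math.isclose(np.linalg.norm(w), 1.0)
for a, b in ((v1, v2), (v1, v3), (v2, v3)):
    assert math.isclose(a @ b, 0.0, abs_tol=1e-12)
```

Note that normalizing only rescales each vector, so the pairwise inner products remain zero; orthogonality is never disturbed by the conversion.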
Orthonormal basis for R3: i = (1,0,0), j = (0,1,0), k = (0,0,1)
Orthonormal basis for Rn: e1 = (1,0,0,...,0), e2 = (0,1,0,...,0), ..., en = (0,0,0,...,1)

Coordinates Relative to Orthonormal Bases

Theorem: If S = {v1, v2, ..., vn} is an orthonormal basis for an inner product space V, and u is any vector in V, then

  u = <u,v1>v1 + <u,v2>v2 + ... + <u,vn>vn

The scalars <u,v1>, <u,v2>, ..., <u,vn> are the coordinates of the vector u relative to the orthonormal basis S = {v1, v2, ..., vn}, and

  (u)S = (<u,v1>, <u,v2>, ..., <u,vn>)

is the coordinate vector of u relative to this basis.

Example [Coordinate Vector Relative to an Orthonormal Basis]: Let

  v1 = (0,1,0), v2 = (−4/5, 0, 3/5), v3 = (3/5, 0, 4/5)

Then S = {v1, v2, v3} is an orthonormal basis for R3. Express the vector u = (1,1,1) as a linear combination of the vectors in S, and find the coordinate vector (u)S.

Solution: <u,v1> = 1, <u,v2> = −1/5, <u,v3> = 7/5, so

  u = v1 − (1/5)v2 + (7/5)v3

that is,

  (1,1,1) = (0,1,0) − (1/5)(−4/5, 0, 3/5) + (7/5)(3/5, 0, 4/5)

The coordinate vector of u relative to S is

  (u)S = (<u,v1>, <u,v2>, <u,v3>) = (1, −1/5, 7/5)

Theorem: If S is an orthonormal basis for an n-dimensional inner product space, and if (u)S = (u1, u2, ..., un) and (v)S = (v1, v2, ..., vn), then:

(a) ║u║ = (u1² + u2² + ... + un²)^(1/2)
(b) d(u,v) = ((u1−v1)² + (u2−v2)² + ... + (un−vn)²)^(1/2)
(c) <u,v> = u1v1 + u2v2 + ... + unvn

Example [Calculating Norms Using Orthonormal Bases]: If R3 has the Euclidean inner product, then the norm of the vector u = (1,1,1) is

  ║u║ = (u · u)^(1/2) = (1² + 1² + 1²)^(1/2) = √3

The coordinate vector of u relative to S (from the last example) is (u)S = (1, −1/5, 7/5), and computing the norm from these coordinates gives the same value:

  ║u║ = (1² + (−1/5)² + (7/5)²)^(1/2) = (75/25)^(1/2) = √3

Coordinates Relative to Orthogonal Bases

If S = {v1, v2, ..., vn} is an orthogonal basis for a vector space V, then normalizing each of these vectors yields the orthonormal basis

  S′ = { v1/║v1║, v2/║v2║, ..., vn/║vn║ }

If u is any vector
in V, then

  u = <u, v1/║v1║> v1/║v1║ + <u, v2/║v2║> v2/║v2║ + ... + <u, vn/║vn║> vn/║vn║
    = (<u,v1>/║v1║²) v1 + (<u,v2>/║v2║²) v2 + ... + (<u,vn>/║vn║²) vn

Theorem: If S = {v1, v2, ..., vn} is an orthogonal set of nonzero vectors in an inner product space, then S is linearly independent.

Orthogonal Projections

Theorem [Projection Theorem]: If W is a finite-dimensional subspace of an inner product space V, then every vector u in V can be expressed in exactly one way as u = w1 + w2, where w1 is in W and w2 is in W⊥.

Theorem: Let W be a finite-dimensional subspace of an inner product space V.

(a) If {v1, v2, ..., vr} is an orthonormal basis for W, and u is any vector in V, then

  projW u = <u,v1>v1 + <u,v2>v2 + ... + <u,vr>vr

(b) If {v1, v2, ..., vr} is an orthogonal basis for W, and u is any vector in V, then

  projW u = (<u,v1>/║v1║²)v1 + (<u,v2>/║v2║²)v2 + ... + (<u,vr>/║vr║²)vr

Finding Orthogonal and Orthonormal Bases

Theorem: Every nonzero finite-dimensional inner product space has an orthonormal basis. The proof is constructive: the Gram-Schmidt process converts any basis into an orthogonal basis, which can then be normalized.

QR-Decomposition

Theorem [QR-Decomposition]: If A is an m×n matrix with linearly independent column vectors, then A can be factored as

  A = QR

where Q is an m×n matrix with orthonormal column vectors, and R is an n×n invertible upper triangular matrix.

Exercises

Find the QR-decomposition of the matrix, where possible:

(a) [ 1 −1 ]      (b) [ 1 0 2 ]
    [ 2  3 ]          [ 0 1 1 ]
                      [ 1 2 0 ]
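The defining properties of a QR-decomposition can be checked numerically for exercise matrix (a). A sketch using numpy's built-in QR routine rather than a hand-run Gram-Schmidt, so the signs of Q's columns (and of R's rows) may differ from a textbook answer:

```python
import numpy as np

# Exercise matrix (a): columns are linearly independent, so A = QR exists.
A = np.array([[1.0, -1.0],
              [2.0,  3.0]])

Q, R = np.linalg.qr(A)

# Q has orthonormal columns: Q^T Q = I.
assert np.allclose(Q.T @ Q, np.eye(2))
# R is upper triangular and invertible (nonzero diagonal entries).
assert np.allclose(R, np.triu(R))
assert abs(np.linalg.det(R)) > 1e-12
# The factorization reproduces A.
assert np.allclose(Q @ R, A)
```

The same check applies to matrix (b), whose columns are also linearly independent, so its QR-decomposition exists as well.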