18.06.02: ‘Vectors’
Lecturer: Barwick
Wednesday 05 February 2016

Lines vs. vectors

A vector is a list of real numbers
$$\vec{a} = \begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{pmatrix}.$$
We draw this vector as an arrow pointing from the point $(0, 0, \dots, 0)$ to the point $(a_1, a_2, \dots, a_n)$.

By extending this vector, we obtain a line. This is the unique line between the origin and $(a_1, a_2, \dots, a_n)$.

Question. How can we write a formula for a parametrization of this line?

$$\ell(t) = (ta_1, ta_2, \dots, ta_n).$$

Note that there's one exception to this: the zero vector
$$\vec{0} = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix}$$
doesn't give us much of a line. This is the only special vector. We'll return to this.

Two vectors may determine the same line.

Question. When does this happen?

If there's a real number $c$ such that
$$\begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{pmatrix} = \begin{pmatrix} ca_1 \\ ca_2 \\ \vdots \\ ca_n \end{pmatrix},$$
then we say that $\vec{b}$ is a scalar multiple of $\vec{a}$, and we just write $\vec{b} = c\vec{a}$. In this case, as long as $\vec{a}$ and $\vec{b}$ are nonzero, they determine the same line. So lines through the origin are the "same thing" as nonzero vectors, up to scaling.

$\mathbf{R}^1$ and $\mathbf{R}^2$

We write $\mathbf{R}^n$ for the collection of all vectors
$$\begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{pmatrix}.$$
Thus $\mathbf{R}^1$ is simply the set of real numbers, but we regard those real numbers as arrows from the origin to the number as it sits on the number line.

Note that if $c \in \mathbf{R}^1$ is a nonzero real number, then any real number is a scalar multiple of $c$. In other words, $\mathbf{R}^1$ is 1-dimensional.

How about $\mathbf{R}^2$? This is the set of vectors $\begin{pmatrix} a_1 \\ a_2 \end{pmatrix}$ that point from the origin to points on the plane.

A nonzero vector $\vec{a} \in \mathbf{R}^2$ specifies a line through the origin. Two vectors $\vec{a}, \vec{b} \in \mathbf{R}^2$ specify two distinct lines through the origin if and only if there exists no real number $c$ such that $\vec{b} = c\vec{a}$.

Now remember that we translated each line along the other to end up with a parallelogram.
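The scalar-multiple test above is easy to carry out in code. Here is a minimal sketch (the function name is my own, and it assumes integer entries so that exact rational arithmetic applies): two nonzero vectors determine the same line exactly when the ratio $b_i / a_i$ is the same for every coordinate.

```python
from fractions import Fraction

def is_scalar_multiple(a, b):
    """Return True if b = c*a for some real scalar c, i.e. the nonzero
    vectors a and b determine the same line through the origin.
    Assumes integer entries (illustrative helper, not from the lecture)."""
    if len(a) != len(b):
        raise ValueError("vectors must have the same length")
    c = None  # candidate scalar, pinned down by the first nonzero entry of a
    for ai, bi in zip(a, b):
        if ai == 0:
            if bi != 0:
                return False  # b has a direction a lacks
            continue
        ratio = Fraction(bi, ai)
        if c is None:
            c = ratio
        elif ratio != c:
            return False  # ratios disagree, so b is not c*a
    return c is not None

# (1, 2, 1) and (2, 4, 2) lie on the same line; (1, 2, 1) and (2, 0, 0) do not.
print(is_scalar_multiple((1, 2, 1), (2, 4, 2)))  # True
print(is_scalar_multiple((1, 2, 1), (2, 0, 0)))  # False
```

Using `Fraction` avoids floating-point comparisons, so the test is exact for integer vectors.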
Let's do that here in such a way that the vectors meet head to tail. We thus get a new vector that points at the new vertex we created.

Now let's clear out the lines spanned by these vectors, and gaze lovingly at the vectors themselves. What we've drawn there is a way of taking two vectors $\vec{a}$ and $\vec{b}$, translating $\vec{a}$ along $\vec{b}$ and translating $\vec{b}$ along $\vec{a}$ to make a parallelogram. The diagonal vector of that parallelogram is
$$\vec{a} + \vec{b} = \begin{pmatrix} a_1 \\ a_2 \end{pmatrix} + \begin{pmatrix} b_1 \\ b_2 \end{pmatrix} = \begin{pmatrix} a_1 + b_1 \\ a_2 + b_2 \end{pmatrix}.$$
This is vector addition.

The fact that you can get that diagonal by translating $\vec{a}$ along $\vec{b}$ or by translating $\vec{b}$ along $\vec{a}$ is a way of visualizing the commutativity of vector addition. It turns out that all the algebraic properties of vectors have pictures to go along with them.

Question. What does associativity look like? Adding with $\vec{0}$? The formation of negatives? The distribution of scalar multiplication over vector addition?

$\mathbf{R}^3$

Of course $\mathbf{R}^3$ is the set of all vectors $\begin{pmatrix} a_1 \\ a_2 \\ a_3 \end{pmatrix}$. Consider two vectors $\vec{a}, \vec{b} \in \mathbf{R}^3$ such that there exists no real number $c$ such that $\vec{b} = c\vec{a}$. These two vectors define a plane: the set of vectors that can be written as a linear combination $s\vec{a} + t\vec{b}$; we call this plane the span of $\vec{a}$ and $\vec{b}$.

So a vector $\vec{c}$ does not lie in the span of $\vec{a}$ and $\vec{b}$ if and only if it can't be written as a linear combination of $\vec{a}$ and $\vec{b}$.

Once we've found such a $\vec{c}$, I claim that any vector $\vec{v} \in \mathbf{R}^3$ can be written as a linear combination of $\vec{a}$, $\vec{b}$, and $\vec{c}$. Geometrically, this means any point $(v_1, v_2, v_3)$ lies in the span of $\vec{a}$, $\vec{b}$, and $\vec{c}$.

Algebraically, this means that for any $\vec{v} = \begin{pmatrix} v_1 \\ v_2 \\ v_3 \end{pmatrix}$, there is a solution $(r, s, t)$ to the equation $\vec{v} = r\vec{a} + s\vec{b} + t\vec{c}$. But that equation is secretly the system of linear equations
$$v_1 = ra_1 + sb_1 + tc_1; \qquad v_2 = ra_2 + sb_2 + tc_2; \qquad v_3 = ra_3 + sb_3 + tc_3.$$
What we're saying is that this set of equations has a solution.
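The componentwise formula for vector addition, and the algebraic properties the pictures illustrate, can be checked directly. A minimal sketch (function names are my own):

```python
def vec_add(a, b):
    """Componentwise addition: the diagonal of the parallelogram on a and b."""
    return tuple(ai + bi for ai, bi in zip(a, b))

def scalar_mul(c, a):
    """Scale the vector a by the real number c."""
    return tuple(c * ai for ai in a)

a, b = (1, 2), (3, -1)
print(vec_add(a, b))  # (4, 1)
# Commutativity: translating a along b or b along a gives the same diagonal.
assert vec_add(a, b) == vec_add(b, a)
# Distribution of scalar multiplication over vector addition.
assert scalar_mul(2, vec_add(a, b)) == vec_add(scalar_mul(2, a), scalar_mul(2, b))
```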
But there's more: the vectors $\vec{a}$, $\vec{b}$, and $\vec{c}$ are linearly independent; that is, the lines they span are independent. What this means algebraically is that none of them are zero, and there's no way to write $\vec{c} = s\vec{a} + t\vec{b}$ or $\vec{b} = r\vec{a} + t\vec{c}$ or $\vec{a} = r\vec{b} + s\vec{c}$.

We can express this more efficiently: what it means for $\vec{a}$, $\vec{b}$, and $\vec{c}$ to be linearly independent is that if $r\vec{a} + s\vec{b} + t\vec{c} = \vec{0}$, then $r = s = t = 0$.

What we will learn is that for any $\vec{v} = \begin{pmatrix} v_1 \\ v_2 \\ v_3 \end{pmatrix}$, there is a unique solution $(r, s, t)$ to the equation $\vec{v} = r\vec{a} + s\vec{b} + t\vec{c}$.

Let's do an example. We'll find three independent vectors. We have to start with the nonzero vector $\vec{a} = \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix}$. For our next vector, we just need to select a vector that is not a multiple of $\vec{a}$. Here's one: $\vec{b} = \begin{pmatrix} 2 \\ 0 \\ 0 \end{pmatrix}$.

Now for the tricky bit. We want a third vector, $\vec{c} = \begin{pmatrix} c_1 \\ c_2 \\ c_3 \end{pmatrix}$, that doesn't lie in the plane spanned by $\vec{a}$ and $\vec{b}$. That's more than just making sure that $\vec{c}$ is not a scalar multiple of $\vec{a}$ or $\vec{b}$.

OK, does $\begin{pmatrix} -5/2 \\ 15 \\ 15/2 \end{pmatrix}$ lie in the plane spanned by $\vec{a} = \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix}$ and $\vec{b} = \begin{pmatrix} 2 \\ 0 \\ 0 \end{pmatrix}$?

It looks like it does!
$$\frac{15}{2} \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix} + (-5) \begin{pmatrix} 2 \\ 0 \\ 0 \end{pmatrix} = \begin{pmatrix} -5/2 \\ 15 \\ 15/2 \end{pmatrix}.$$

So how do we make sure that we get a vector that doesn't live on that plane? Here's one that will work: $\begin{pmatrix} 0 \\ 0 \\ 2 \end{pmatrix}$. What makes me so sure that will work?

See, a linear combination of $\vec{a} = \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix}$ and $\vec{b} = \begin{pmatrix} 2 \\ 0 \\ 0 \end{pmatrix}$ that gives $\begin{pmatrix} 0 \\ 0 \\ 2 \end{pmatrix}$ would have to have a nonzero coefficient on $\vec{a}$ (to produce the nonzero third coordinate), and that will give a nonzero second coordinate. So we have our example:
$$\vec{a} = \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix}, \quad \vec{b} = \begin{pmatrix} 2 \\ 0 \\ 0 \end{pmatrix}, \quad \vec{c} = \begin{pmatrix} 0 \\ 0 \\ 2 \end{pmatrix}$$
are three linearly independent vectors in $\mathbf{R}^3$.
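The two membership checks above (one vector in the plane, the other not) amount to solving $s\vec{a} + t\vec{b} = \vec{w}$ and seeing whether the third equation is consistent with the first two. Here is a minimal sketch of that procedure for vectors in $\mathbf{R}^3$ (the helper name is my own; it assumes the first two coordinate equations already pin down $s$ and $t$, as they do for this lecture's $\vec{a}$ and $\vec{b}$):

```python
from fractions import Fraction

def in_span2(a, b, w):
    """Check whether w = s*a + t*b has a solution, for a, b, w in R^3.
    Illustrative helper: solves the first two coordinate equations for
    (s, t) by Cramer's rule, then tests the third equation."""
    # [a1 b1] [s]   [w1]
    # [a2 b2] [t] = [w2]
    det = a[0] * b[1] - a[1] * b[0]
    if det == 0:
        raise ValueError("first two rows degenerate; this sketch needs other rows")
    s = Fraction(w[0] * b[1] - w[1] * b[0], det)
    t = Fraction(a[0] * w[1] - a[1] * w[0], det)
    return s * a[2] + t * b[2] == w[2]

a, b = (1, 2, 1), (2, 0, 0)
# (-5/2, 15, 15/2) = (15/2)a + (-5)b lies in the plane; (0, 0, 2) does not.
print(in_span2(a, b, (Fraction(-5, 2), 15, Fraction(15, 2))))  # True
print(in_span2(a, b, (0, 0, 2)))                               # False
```

The exact rational arithmetic reproduces the coefficients $s = 15/2$, $t = -5$ found in the slide.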
Now my claim is that any vector $\vec{v} = \begin{pmatrix} v_1 \\ v_2 \\ v_3 \end{pmatrix}$ can be written as a unique linear combination of $\vec{a}$, $\vec{b}$, and $\vec{c}$. If $\vec{v} = \begin{pmatrix} 2 \\ -6 \\ 14 \end{pmatrix}$, for example, we're trying to solve this system of linear equations:
$$2 = r + 2s + 0t; \qquad -6 = 2r + 0s + 0t; \qquad 14 = r + 0s + 2t.$$
So we straight away get $r = -3$, $s = 5/2$, and $t = 17/2$.

Here is the general algebraic picture:

Definition. If $\vec{a}_1, \vec{a}_2, \dots, \vec{a}_m$ are vectors of $\mathbf{R}^n$, then the span of $\vec{a}_1, \vec{a}_2, \dots, \vec{a}_m$ is the set of all linear combinations
$$c_1 \vec{a}_1 + c_2 \vec{a}_2 + \cdots + c_m \vec{a}_m.$$

So $\vec{v} = \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix}$ lies in the span of $\vec{a}_1 = \begin{pmatrix} a_{11} \\ a_{21} \\ \vdots \\ a_{n1} \end{pmatrix}, \dots, \vec{a}_m = \begin{pmatrix} a_{1m} \\ a_{2m} \\ \vdots \\ a_{nm} \end{pmatrix}$ if and only if the system of linear equations
$$\begin{aligned} v_1 &= c_1 a_{11} + c_2 a_{12} + \cdots + c_m a_{1m}; \\ v_2 &= c_1 a_{21} + c_2 a_{22} + \cdots + c_m a_{2m}; \\ &\ \ \vdots \\ v_n &= c_1 a_{n1} + c_2 a_{n2} + \cdots + c_m a_{nm} \end{aligned}$$
has at least one solution.

Geometrically, that means that a vector $\vec{v}$ lies in the span of $\vec{a}_1, \vec{a}_2, \dots, \vec{a}_m$ when you can build a multidimensional parallelepiped out of $\vec{a}_1, \vec{a}_2, \dots, \vec{a}_m$ to get $\vec{v}$ pointing from the origin across the diagonal.

Figure 1: Doesn't know how to make an image of multidimensional parallelepipeds.

Definition. We say vectors $\vec{a}_1, \vec{a}_2, \dots, \vec{a}_m$ of $\mathbf{R}^n$ are linearly independent if we're in the following situation: any vanishing linear combination
$$c_1 \vec{a}_1 + c_2 \vec{a}_2 + \cdots + c_m \vec{a}_m = \vec{0}$$
must be a trivial linear combination; that is, we must have $c_1 = c_2 = \cdots = c_m = 0$.

So the vectors $\vec{a}_1, \dots, \vec{a}_m$ are linearly independent if and only if, for every $\vec{v}$, the system of linear equations displayed above has at most one solution.

Geometrically, that means that none of the vectors $\vec{a}_i$ lie in the span of the remaining vectors $\vec{a}_1, \dots, \vec{a}_{i-1}, \vec{a}_{i+1}, \dots, \vec{a}_m$.
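The worked example can be reproduced mechanically. A minimal sketch (the solver is my own illustration, not the lecture's method; Gaussian elimination itself comes later in the course): put the three vectors in the columns of a matrix and eliminate, which finds the unique coefficients when the columns are independent.

```python
from fractions import Fraction

def solve3(A, v):
    """Solve the 3x3 system A x = v by Gaussian elimination over the
    rationals. Illustrative sketch: raises if the columns of A are
    linearly dependent, i.e. if there is no unique solution."""
    # Augmented matrix [A | v] with exact rational arithmetic.
    M = [[Fraction(A[i][j]) for j in range(3)] + [Fraction(v[i])] for i in range(3)]
    for col in range(3):
        # Find a pivot in this column and swap it into place.
        piv = next((r for r in range(col, 3) if M[r][col] != 0), None)
        if piv is None:
            raise ValueError("columns are linearly dependent: no unique solution")
        M[col], M[piv] = M[piv], M[col]
        # Eliminate this column from every other row.
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

# Columns are the lecture's a = (1,2,1), b = (2,0,0), c = (0,0,2); v = (2,-6,14).
A = [[1, 2, 0],
     [2, 0, 0],
     [1, 0, 2]]
print(solve3(A, [2, -6, 14]))  # [Fraction(-3, 1), Fraction(5, 2), Fraction(17, 2)]
```

The output matches the coefficients read off by hand: $r = -3$, $s = 5/2$, $t = 17/2$.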
Figure 2: Cool. Can we go play in the snow now?

We'd like a more efficient way of checking whether a vector lies in the span of a certain collection of vectors, and whether that collection of vectors is linearly independent. To do this, we'll want to introduce two fundamental manipulations of vectors: forming the dot product of two vectors, and combining a list of vectors into a matrix. The dot product extracts a bunch of helpful geometry, and matrices give us an efficient way to organize and conceptualize systems of linear equations.

For next week, please read §§1.2–2.1 of Strang. The first problem set will be posted on Monday.