Vector Spaces and Linear Independence

Definition: A vector space is a set V on which two operations + and · are defined, called vector
addition and scalar multiplication.
The operation + (vector addition) must satisfy the following conditions:
Closure: If u and v are any vectors in V, then the sum u + v belongs to V.
(1) Commutative law: For all vectors u and v in V, u + v = v + u
(2) Associative law: For all vectors u, v, w in V, u + (v + w) = (u + v) + w
(3) Additive identity: The set V contains an additive identity element, denoted by 0, such
that for any vector v in V, 0 + v = v and v + 0 = v.
(4) Additive inverses: For each vector v in V, the equations v + x = 0 and x + v =
0 have a solution x in V, called an additive inverse of v, and denoted by - v.
The operation · (scalar multiplication) is defined between real numbers (or scalars) and vectors,
and must satisfy the following conditions:
Closure: If v is any vector in V, and c is any real number, then the product c · v belongs
to V.
(5) Distributive law: For all real numbers c and all vectors u, v in V, c · (u + v) = c · u +
c·v
(6) Distributive law: For all real numbers c, d and all vectors v in V, (c+d) · v = c · v + d
·v
(7) Associative law: For all real numbers c,d and all vectors v in V, c · (d · v) = (cd) · v
(8) Unitary law: For all vectors v in V, 1 · v = v
Subspaces
Definition: Let V be a vector space, and let W be a subset of V. If W is a vector space with respect
to the operations in V, then W is called a subspace of V.
Theorem: Let V be a vector space, with operations + and ·, and let W be a subset of V. Then W
is a subspace of V if and only if the following conditions hold.
Sub0 W is nonempty: The zero vector belongs to W.
Sub1 Closure under +: If u and v are any vectors in W, then u + v is in W.
Sub2 Closure under ·: If v is any vector in W, and c is any real number, then c · v is in
W.
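For example, the set W = {(x, 2x) : x a real number} is a subspace of R^2: it contains the zero vector (0, 0); it is closed under +, since (x, 2x) + (y, 2y) = (x + y, 2(x + y)); and it is closed under ·, since c · (x, 2x) = (cx, 2(cx)).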
[Figure: Vector addition and scalar multiplication. A vector v (blue) is added to another vector w (red, upper illustration); below, w is stretched by a factor of 2, yielding the sum v + 2w.]
First example: arrows in the plane
The first example of a vector space consists of arrows in a fixed plane, starting at one fixed point.
This is used in physics to describe forces or velocities. Given any two such arrows, v and w, the
parallelogram spanned by these two arrows contains one diagonal arrow that starts at the origin,
too. This new arrow is called the sum of the two arrows and is denoted v + w. Another operation
that can be done with arrows is scaling: given any positive real number a, the arrow that has the
same direction as v, but is dilated or shrunk by multiplying its length by a, is called multiplication
of v by a. It is denoted av. When a is negative, av is defined as the arrow pointing in the opposite
direction, instead.
The following shows a few examples: if a = 2, the resulting vector aw has the same direction as
w, but is stretched to double the length of w (see the figure above). Equivalently, 2w is the sum w +
w. Moreover, (−1)v = −v has the opposite direction and the same length as v (the blue vector
pointing down in the figure).
Second example: ordered pairs of numbers
A second key example of a vector space is provided by pairs of real numbers x and y. (The order
of the components x and y is significant, so such a pair is also called an ordered pair.) Such a pair
is written as (x, y). The sum of two such pairs and multiplication of a pair with a number is
defined as follows:
(x1, y1) + (x2, y2) = (x1 + x2, y1 + y2)
and
a (x, y) = (ax, ay).
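These operations are easy to model in code. Here is a minimal Python sketch (the function names add and scale are illustrative, not standard):

# Minimal sketch of the vector space operations on ordered pairs.
def add(u, v):
    """Componentwise vector addition: (x1, y1) + (x2, y2) = (x1 + x2, y1 + y2)."""
    return (u[0] + v[0], u[1] + v[1])

def scale(a, v):
    """Scalar multiplication: a(x, y) = (ax, ay)."""
    return (a * v[0], a * v[1])

print(add((1, 2), (3, 4)))              # (4, 6)
print(scale(3, (1, 2)))                 # (3, 6)
print(add((1, 2), scale(-1, (1, 2))))   # (0, 0): additive inverses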
Linear independence
In the theory of vector spaces the concept of linear dependence and linear independence of
the vectors in a subset of the vector space is central to the definition of dimension. A set of vectors
is said to be linearly dependent if one of the vectors in the set can be defined as a linear
combination of the other vectors. If no vector in the set can be written in this way, then the vectors
are said to be linearly independent.
A vector space can be finite-dimensional or infinite-dimensional depending on the number of
linearly independent basis vectors. The definition of linear dependence, and the ability to determine
whether a subset of vectors in a vector space is linearly dependent, are central to determining a set
of basis vectors for a vector space.
[Figures: Linearly independent vectors in R^3; linearly dependent vectors in a plane in R^3.]
Definition:
The vectors in a subset S = {v1, v2, ..., vk} of a vector space V are said to be linearly dependent if
there exist a finite number of distinct vectors v1, v2, ..., vn in S and scalars a1, a2, ..., an, not all zero,
such that

a1·v1 + a2·v2 + ··· + an·vn = 0,

where 0 denotes the zero vector.

Notice that if not all of the scalars are zero, then at least one is non-zero, say a1, in which case
this equation can be written in the form

v1 = -(a2/a1)·v2 - ··· - (an/a1)·vn.
Example. Any set containing the zero vector is dependent. For if 0 is in S, then 1·0 = 0 is a
nontrivial linear relationship in S.
Example. Consider R^2 as a vector space over R in the usual way. The vectors (1, 0) and (0, 1)
are independent in R^2. To prove this, suppose a·(1, 0) + b·(0, 1) = (0, 0). Then (a, b) = (0, 0), so
a = 0 and b = 0.

More generally, if F is a field, the standard basis vectors e1 = (1, 0, ..., 0), e2 = (0, 1, ..., 0), ...,
en = (0, 0, ..., 1) are independent in F^n.
Example. Two vectors that are multiples of one another, say v and w = 2·v with v nonzero, are
dependent in R^2. To show this, I have to find numbers a and b, not both 0, such that a·v + b·w = 0.
There are many pairs of numbers that work. For example, a = 2 and b = -1 give 2·v - w = 0.
Likewise, a = -2 and b = 1 work. More generally, two vectors v and w in R^n are dependent if and
only if they are multiples of one another. For instance, (2, 4) = 2·(1, 2), so
2·(1, 2) + (-1)·(2, 4) = (0, 0).

Example. Suppose the vectors v1, v2, v3 in R^3 satisfy 3·v1 + 4·v2 + 1·v3 = 0. Since the
coefficients are not all zero, this is a nontrivial linear relation, so {v1, v2, v3} is a dependent set
in R^3.

(One of the things I'll discuss shortly is how you find numbers like 3, 4, and 1 which give a linear
combination which equals 0.)
Example. To show that vectors v1, v2, v3 are independent in R^3, suppose

a·v1 + b·v2 + c·v3 = 0.

Writing this out component by component gives a set of linear equations in a, b, and c. If you can
verify that the only solution is a = b = c = 0, then the vectors are independent.
The previous example illustrates an algorithm for determining whether a set of vectors is
independent. To determine whether vectors v1, v2, ..., vn in a vector space V are independent, I
try to solve

a1·v1 + a2·v2 + ··· + an·vn = 0.

If the only solution is a1 = a2 = ··· = an = 0, then the vectors are independent; otherwise, they
are dependent.
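Here is a minimal sympy sketch of this algorithm; the vectors are chosen for illustration:

from sympy import Matrix, symbols, linsolve

a, b, c = symbols('a b c')

# Vectors chosen for illustration.
v1 = Matrix([1, 0, 2])
v2 = Matrix([0, 1, 1])
v3 = Matrix([1, 1, 0])

# Solve a*v1 + b*v2 + c*v3 = 0 for a, b, c.
A = Matrix.hstack(v1, v2, v3)   # coefficient matrix with the vectors as columns
zero = Matrix([0, 0, 0])
print(linsolve((A, zero), a, b, c))   # {(0, 0, 0)}: only the trivial solution, so independent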
Here's the most important special case of this.

Suppose that v1, v2, ..., vn are vectors in F^m, where F is a field. The vector equation above is
equivalent to the matrix equation

[v1 v2 ··· vn]·(a1, a2, ..., an)^T = 0.

I'll obtain the solution a1 = a2 = ··· = an = 0 if and only if the coefficient matrix row reduces
to a matrix with a pivot in every column.

To test whether vectors v1, v2, ..., vn in F^m are independent, form the matrix with the vectors
as columns and row reduce.

The vectors are independent if and only if the row reduced echelon matrix has the
identity matrix as its upper block (with rows of zeros below):

[ I ]
[ 0 ]
I've drawn this picture as if n ≤ m, that is, as if the number of vectors is no greater than their
dimension. If n > m, row reduction as above cannot produce the identity matrix as an
upper block.
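In code, the same test can be phrased with sympy's rref; a sketch, with vectors chosen for illustration:

from sympy import Matrix

# Columns are the vectors being tested.
A = Matrix([[1, 0, 1],
            [0, 1, 1],
            [2, 1, 0]])

rref_matrix, pivot_cols = A.rref()
print(rref_matrix)                 # the identity matrix: every column has a pivot
print(len(pivot_cols) == A.cols)   # True exactly when the columns are independent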
Corollary. If m > n, a set of m vectors in F^n is dependent.

Example. I know that a set of three vectors in R^2 is dependent without doing any computation.
Any set of three (or more) vectors in R^2 is dependent.
Example. Determine whether a set of vectors {v1, v2, v3} is independent in R^3.

I'll work this example from scratch and show the steps in the reasoning, then connect the result
with the algorithm I gave above.

The question is whether you can find a, b, c, not all 0, such that

a·v1 + b·v2 + c·v3 = 0.

This amounts to solving a system of three linear equations in a, b, and c. The augmented matrix
has the vectors as its first three columns and the zero vector as its fourth column. However, I can
see that row operations will never change the fourth column, so I can omit it and row reduce the
three-column coefficient matrix.

Suppose the row reduced echelon matrix has pivots in the first two columns only. Remembering
that there is a fourth all-zero column that I'm not writing, this says that a and b are each
determined by c, while c is a free variable. The parametrized solution expresses a and b in terms
of c. So, for example, I can get a nonzero solution by setting c = 1. Then a and b take the
corresponding values, and in fact a·v1 + b·v2 + c·v3 = 0.

In the row-reduced echelon matrix, you can see that I didn't get a copy of the identity matrix.
If this had happened, the reduced system would say a = 0, b = 0, and c = 0, and the vectors would
have been independent.
You can just do the algorithm if you wish, but it's always better to understand where it's coming
from.
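Here is the dependent case as a sympy sketch; the vectors are chosen here for illustration, with v3 = v1 + v2:

from sympy import Matrix

# Columns: v1 = (1, 0, 1), v2 = (0, 1, 1), v3 = (1, 1, 2) = v1 + v2.
A = Matrix([[1, 0, 1],
            [0, 1, 1],
            [1, 1, 2]])

rref_matrix, pivot_cols = A.rref()
print(rref_matrix)    # no identity block: the third column has no pivot
print(pivot_cols)     # (0, 1): c is a free variable, so the set is dependent

# The parametrized solution is a = -c, b = -c; taking c = -1 gives v1 + v2 - v3 = 0.
v1, v2, v3 = A.col(0), A.col(1), A.col(2)
print(v1 + v2 - v3)   # the zero vector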
Example. The polynomials with real coefficients form a vector space over the reals. The set
{1, x, x^2, x^3, ...} is independent. For if

a0 + a1·x + a2·x^2 + ··· + an·x^n = 0 (the zero polynomial),

it follows that ai = 0 for all i.
The next lemma says that an independent set can be thought of as a set without "redundancy", in the
sense that you can't build any one of the vectors out of the others.
Lemma. Let V be an F-vector space, and let S be a subset of V. S is linearly independent if and
only if no v in S can be expressed as a linear combination of other vectors in S.
("Other" means vectors other than v itself.)
Proof. Suppose S is linearly independent. Suppose some v in S can be written as

v = a1·v1 + a2·v2 + ··· + an·vn

for vectors v1, v2, ..., vn in S with vi ≠ v for all i. Then

a1·v1 + a2·v2 + ··· + an·vn - v = 0

is a nontrivial linear relation among elements of S, since the coefficient of v is -1. This
contradicts linear independence. Hence, v cannot be a linear combination of other vectors in S.

Conversely, suppose no v in S can be expressed as a linear combination of other vectors in S.
Suppose

a1·v1 + a2·v2 + ··· + an·vn = 0,

where the vi are distinct vectors in S. I want to show that ai = 0 for all i.

Suppose that at least one ai is nonzero. Assume without loss of generality that a1 ≠ 0. Write

v1 = -(a2/a1)·v2 - ··· - (an/a1)·vn.

I've expressed v1 as a linear combination of other vectors in S, a contradiction. Hence, ai = 0
for all i, and S is independent.
Recall that if S is a set of vectors in a vector space V, the span of S is

span(S) = {a1·v1 + a2·v2 + ··· + an·vn : v1, v2, ..., vn in S and a1, a2, ..., an scalars}.

That is, the span consists of all linear combinations of vectors in S.
Definition. A set of vectors S spans a subspace W if span(S) = W; that is, if every element of W is a
linear combination of elements of S.

Example. Let S = {v1, v2} be a set of two vectors in R^2. Then span(S) consists of all vectors of
the form a·v1 + b·v2.

For example, consider a vector w in R^2. Is this vector in the span of S? I need numbers a and b
such that

a·v1 + b·v2 = w.

This is equivalent to the matrix equation [v1 v2]·(a, b)^T = w. Row reduce the augmented matrix
[v1 v2 | w]. If the reduction produces a solution a and b, that is, w = a·v1 + b·v2, then the
vector w is in the span of S.

On the other hand, try the same thing with a vector u for which the corresponding row reduction
produces a last row that says "0 = 1", a contradiction. The system is inconsistent, so there are no
such numbers a and b. Therefore, u is not in the span of S.
To determine whether the vector b is in the span of v1, v2, ..., vn in F^m, form the
augmented matrix

[v1 v2 ··· vn | b].

If the system has a solution, b is in the span, and the coefficients of a linear combination
of the v's which adds up to b are given by a solution to the system. If the system has no
solutions, then b is not in the span of the v's.
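Here is the span-membership test as a sympy sketch, with vectors chosen for illustration:

from sympy import Matrix, symbols, linsolve

a, b = symbols('a b')

v1 = Matrix([1, 2])
v2 = Matrix([3, 4])
w = Matrix([5, 6])

# Solve a*v1 + b*v2 = w using the augmented system.
A = Matrix.hstack(v1, v2)
print(linsolve((A, w), a, b))   # {(-1, 2)}: nonempty, so w is in the span
# An inconsistent system would return the empty set, meaning w is not in the span.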
Example. The span of a set of three independent vectors {v1, v2, v3} in R^3 is all of R^3. In other
words, if w is any vector in R^3, there are real numbers a, b, c such that

a·v1 + b·v2 + c·v3 = w.

In words, any vector is a linear combination of the three vectors in the set.
Obviously, there are sets of three vectors in R^3 which don't span. (For example, take three vectors
which are multiples of one another.) Geometrically, the span of a set of vectors in R^3 can be a line
through the origin, a plane through the origin, and so on. (The span must contain the origin,
because a subspace must contain the zero vector.)
Example. Determine whether a given vector is in the span of a set S = {v1, v2}.

(a) Suppose the vector is w. I want to find a and b such that a·v1 + b·v2 = w. This gives a system
of linear equations in a and b. Form the augmented matrix and row reduce. If the last matrix gives
values for a and b, then w = a·v1 + b·v2, and therefore w is in the span of S.

(b) Suppose instead the vector is u, and the same procedure leads to a row reduced echelon matrix
whose last row says "0 = 1". This contradiction implies that the system has no solutions.
Therefore, u is not in the span of S.
Returning to the definition of linear dependence: dividing the dependence equation by a nonzero
a1 shows that v1 is a linear combination of the remaining vectors. It is worth noting that if v1 is
nonzero, a nonzero a1 in the dependence equation forces at least one other scalar ai to be
non-zero as well.

The vectors in a set T = {v1, v2, ..., vn} are said to be linearly independent if the equation

a1·v1 + a2·v2 + ··· + an·vn = 0

can only be satisfied by ai = 0 for i = 1, ..., n. This implies that no vector in the set can be
represented as a linear combination of the remaining vectors in the set. In other words, a set of
vectors is linearly independent if the only representation of 0 as a linear combination of its
vectors is the trivial representation, in which all the scalars ai are zero.[2]