Lesson 2: Vector Spaces

September 27, 2014
Vector Spaces
Let V be a set, and let +, · be two operations, the first one (sum)
defined between elements of V , and the second one (product by scalars)
defined between the elements of R (resp. C) and the elements of V . We say
that (V , +, ·) is a vector space over R (resp. C) if the following
properties hold:
(i) (V , +) is a commutative group:
Internal Law: ∀~u , ~v ∈ V , ~u + ~v ∈ V .
Associative: ∀~u , ~v , ~w ∈ V , ~u + (~v + ~w ) = (~u + ~v ) + ~w .
Neutral element: there exists ~0 ∈ V such that ~u + ~0 = ~u .
Inverse element: for all ~u ∈ V , there exists −~u ∈ V such that
~u + (−~u ) = ~0.
Commutative: ∀~u , ~v ∈ V , ~u + ~v = ~v + ~u .
(ii) The operation · satisfies that:
∀λ ∈ R (resp. C), ∀~u , ~v ∈ V , λ · (~u + ~v ) = λ~u + λ~v .
∀λ, µ ∈ R (resp. C), ∀~u ∈ V , (λ + µ) · ~u = λ~u + µ~u .
∀λ, µ ∈ R (resp. C), ∀~u ∈ V , λ · (µ~u ) = (λ · µ) · ~u .
1 · ~u = ~u .
In order to make explicit whether we work over R or C, one writes
(V (R), +, ·) or (V (C), +, ·). In the sequel, we will assume that we work
over R, although the results also hold over C. In fact, vector spaces
can be defined over any field, not only R or C.
Examples: Khan Academy (click)
Proposition
Let (V (R), +, ·) be a vector space over R. Then for all λ ∈ R and for all
~u ∈ V , it holds that:
(1) λ · ~0 = ~0.
(2) 0 · ~u = ~0.
(3) λ · ~u = ~0 if and only if λ = 0 or ~u = ~0.
(4) λ · (−~u ) = (−λ) · ~u = −(λ · ~u ).
Linear Dependence. Bases.
Definition
A linear combination of vectors ~u1 , ~u2 , . . . , ~un is another vector of the
form
λ1~u1 + λ2~u2 + · · · + λn~un
where λi ∈ R for i = 1, . . . , n.
Remark The vector ~0 can be considered as a linear combination of any
vectors ~u1 , ~u2 , . . . , ~un , because ~0 = 0 · ~u1 + 0 · ~u2 + · · · + 0 · ~un .
Definition
We say that {~u1 , ~u2 , . . . , ~un } are linearly dependent (l.d.) if at least one of
them is a linear combination of the rest. If {~u1 , ~u2 , . . . , ~un } are not l.d.,
we say that they are linearly independent (l.i.).
For vectors in Rn : in order to check the linear independence of
several vectors, form a matrix with them as columns (or rows), and
compute its rank (Example: Khan Academy (click)); see the sketch below.
The above idea can be extended to other cases, but to do this we need
the notion of coordinates of a vector, which we will see later.
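As an illustration, here is a minimal sketch of the rank test using NumPy (the vectors are made-up example data, not taken from the lesson):

import numpy as np

# Three vectors of R^3, written as the rows of a matrix (made-up example)
vectors = np.array([
    [1.0, 2.0, 0.0],
    [0.0, 1.0, 1.0],
    [1.0, 3.0, 1.0],   # = first row + second row, so the set is dependent
])

rank = np.linalg.matrix_rank(vectors)
print("rank =", rank)                        # 2
print("independent?", rank == len(vectors))  # False

The vectors are linearly independent exactly when the rank equals the number of vectors.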
Theorem
The vectors {~u1 , ~u2 , . . . , ~un } are linearly independent if and only if the only
linear combination of them fulfilling λ1~u1 + · · · + λn~un = ~0 is the one with
λ1 = · · · = λn = 0.
Proof: Khan Academy (click)
Definition
We say that S = {~u1 , . . . , ~un } is a spanning set of V if any vector in V
can be written as a linear combination of the vectors in S.
Definition
We say that B = {~u1 , . . . , ~un } is a basis of V if it is a spanning set of V
and its vectors are linearly independent.
Examples: Khan Academy (click)
Important remark: vector spaces may have infinitely many bases!!
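As a small sketch (again with NumPy and invented example vectors), n vectors of Rn form a basis exactly when the square matrix having them as columns has rank n, equivalently nonzero determinant:

import numpy as np

# Candidate basis of R^3 (made-up example); after transposing,
# each column of B is one candidate basis vector
B = np.array([
    [1.0, 1.0, 0.0],
    [0.0, 1.0, 1.0],
    [1.0, 0.0, 1.0],
]).T

print(np.linalg.matrix_rank(B))   # 3 -> spanning and linearly independent
print(np.linalg.det(B))           # 2.0, nonzero, so the matrix is invertible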
Definition
We say that a vector space has finite dimension if it has a basis
consisting of finitely many vectors.
Examples:
1. Rn has dimension n, since it admits the basis (called the canonical basis)
{(1, 0, 0, . . . , 0), (0, 1, 0, . . . , 0), . . . , (0, 0, 0, . . . , 1)}
Notation: ~e1 = (1, 0, . . . , 0), ~e2 = (0, 1, . . . , 0), . . . , ~en = (0, 0, . . . , 1).
2. Matrices of a fixed size; polynomials of degree at most a fixed n.
3. Can you give an example of a vector space which is NOT of this kind?
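For instance, polynomials of degree at most 2 can be identified with their coefficient vectors in R3; a minimal sketch of this identification (the polynomials below are made-up examples):

import numpy as np

# p(x) = 1 + 2x and q(x) = 3 - x + x^2, stored as coefficient vectors (a0, a1, a2)
p = np.array([1.0, 2.0, 0.0])
q = np.array([3.0, -1.0, 1.0])

print(p + q)     # [4. 1. 1.]  -> the polynomial 4 + x + x^2
print(2.0 * p)   # [2. 4. 0.]  -> the polynomial 2 + 4x

Sum and product by scalars of polynomials become the usual operations of R3, which is why this space has dimension 3.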
Theorem
If V has finite dimension, then all the bases of V have the same number
of vectors (the dimension of V , dim(V )).
Intuitively, the dimension of a vector space is the number of parameters
that one needs to specify in order to identify a concrete element in the
space.
Theorem
Let V be a vector space with finite dimension, and let dim(V ) = n. Then
the following statements are true:
(i) If S spans V , then you can extract a basis from S (see the sketch after this list).
(ii) Every system consisting of more than n vectors is linearly dependent.
(iii) Every spanning system contains at least n vectors.
(iv) Given B = {~u1 , . . . , ~un } ⊂ V (a subset of exactly n vectors), the
following statements are equivalent: (a) B is a basis; (b) B is linearly
independent; (c) B is a spanning system.
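A possible way to carry out (i) numerically, sketched here under the assumption that the spanning vectors are given as the rows of a NumPy array: scan the vectors one by one and keep those that increase the rank.

import numpy as np

def extract_basis(rows):
    # Keep the vectors (rows) that increase the rank, discarding the rest
    basis = []
    for v in rows:
        candidate = np.array(basis + [v])
        if np.linalg.matrix_rank(candidate) > len(basis):
            basis.append(v)
    return np.array(basis)

# Spanning set of a plane in R^3 (made-up example; the third vector is redundant)
S = np.array([
    [1.0, 0.0, 1.0],
    [0.0, 1.0, 1.0],
    [1.0, 1.0, 2.0],   # sum of the first two
])
print(extract_basis(S))   # keeps only the first two vectors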
Definition
Let B = {~u1 , . . . , ~un } be a basis of V , and let ~v ∈ V . The coordinates
of ~v with respect to B are the scalars λ1 , . . . , λn ∈ R such that
~v = λ1~u1 + . . . + λn~un
Usually we write
~v = (λ1 , . . . , λn )B ; we also say ~v has coordinates (λ1 , . . . , λn ) in B.
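Computationally, finding the coordinates of ~v with respect to B amounts to solving a linear system whose columns are the basis vectors; a minimal sketch with invented example data:

import numpy as np

# Basis of R^3 (the columns of B) and a vector v, both made-up example data
B = np.array([
    [1.0, 1.0, 0.0],
    [0.0, 1.0, 1.0],
    [1.0, 0.0, 1.0],
]).T
v = np.array([2.0, 3.0, 1.0])

coords = np.linalg.solve(B, v)       # the lambda_i such that B @ coords = v
print(coords)
print(np.allclose(B @ coords, v))    # True: v = lambda_1*u_1 + ... + lambda_n*u_n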
Theorem
Let V be a vector space of finite dimension n, and let B = {~u1 , . . . , ~un }
be a basis of V . Then every vector ~v ∈ V has unique coordinates with
respect to B.
Proof. + Examples: Khan Academy (click)
Vector Subspaces.
Definition
Let (V (R), +, ·) be a vector space. We say that W ⊂ V is a vector
subspace of V if (W (R), +, ·), with the same operations, also has the structure of a vector space.
Theorem
W ⊂ V is a vector subspace if and only if ∀ ~u , ~v ∈ W , ∀λ, µ ∈ R,
λ~u + µ~v ∈ W (i.e. if and only if every linear combination of two vectors
in W , stays in W ).
Examples: Khan Academy (click)
Observation: If W is a vector subspace, then ~0 ∈ W .
Since a vector subspace is in fact a vector space sitting inside a bigger
vector space, it makes sense to speak about its dimension, its bases,
etc.
Again, the dimension of a vector subspace is the number of
parameters defining an element of the subspace. Examples:
Khan Academy (click)
Definition
Let S = {~u1 , . . . , ~un } be a subset of V . The linear variety spanned by S
(or simply the linear span of S) is the set consisting of all the vectors
which are linear combinations of the vectors in S, i.e.
L(S) = {~x ∈ V | ~x = λ1~u1 + · · · + λn~un , λi ∈ R}
L(S) is a vector subspace, and its dimension is the rank of the
system S, i.e. the maximum number of linearly independent vectors in S.
In R2 , the linear span of a vector is a line.
In R3 , the linear span of one vector is also a line; the linear span of
two independent vectors is a plane.
In Rn , n ≥ 4, one vector spans a line, two independent vectors span
a plane, three independent vectors span a 3-dimensional subspace, and so on.
Definition
Let S1 , S2 be two vector subspaces of V . Then we define:
(i) The intersection S1 ∩ S2 of S1 , S2 :
S1 ∩ S2 = {~x ∈ V |~x ∈ S1 and ~x ∈ S2 }
(ii) The sum S1 + S2 :
S1 + S2 = {~x ∈ V |~x = ~u + ~v , ~u ∈ S1 , ~v ∈ S2 }
Theorem
S1 ∩ S2 and S1 + S2 are vector subspaces. Moreover, it holds that
dim(S1 + S2 ) = dim(S1 ) + dim(S2 ) − dim(S1 ∩ S2 )
Question: Let S1 , S2 be two different planes in R3 containing ~0 and
sharing a same line. What is S1 + S2 ?
Observations:
S1 ∩ S2 always contains ~0; it can happen that S1 ∩ S2 = {~0}.
If B1 = {~u1 , . . . , ~um } is a basis of S1 , and B2 = {~u1′ , . . . , ~up′ } is a
basis of S2 , then
S1 + S2 = L(~u1 , . . . , ~um , ~u1′ , . . . , ~up′ )
Furthermore, a basis of S1 + S2 can be found by taking the linearly
independent vectors in ~u1 , . . . , ~um , ~u1′ , . . . , ~up′ .
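These facts can be checked numerically; a sketch with two made-up planes of R3 (stack bases of S1 and S2, take the rank to obtain dim(S1 + S2), and recover dim(S1 ∩ S2) from the dimension formula above):

import numpy as np

# Two planes in R^3 through the origin, given by bases as rows (made-up example)
B1 = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])   # the plane z = 0
B2 = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])   # the plane y = 0

stacked = np.vstack([B1, B2])
dim_sum = np.linalg.matrix_rank(stacked)             # dim(S1 + S2) = 3
dim_int = B1.shape[0] + B2.shape[0] - dim_sum        # dim(S1 ∩ S2) = 1 (a line)
print(dim_sum, dim_int)

A basis of S1 + S2 is obtained by keeping the linearly independent rows of the stacked matrix, as in the observation above.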
Definition
Two subspaces S1 , S2 are said to be independent if every vector
~x ∈ S1 + S2 can be expressed in a unique way as a sum of vectors of S1
and S2 , i.e. if the decomposition ~x = ~u + ~v where ~u ∈ S1 and ~v ∈ S2 , is
unique. In this situation, one also says that the sum is direct, which is
denoted S1 ⊕ S2 .
Theorem
The sum of S1 and S2 is direct (S1 ⊕ S2 ) if and only if S1 ∩ S2 = {~0}.
Proof + Example: Khan Academy (click)
Equations of a vector subspace: Khan Academy (click)
1. Vector equation.
2. Parametric equations.
3. Implicit equations.
Change of basis: Khan Academy (click)
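As a small numerical sketch of a change of basis (the bases below are invented examples): pass from the coordinates in B1 to the canonical coordinates, and then solve for the coordinates in B2.

import numpy as np

# Two bases of R^2, given by their vectors as columns (made-up example)
B1 = np.array([[1.0, 1.0], [0.0, 1.0]])   # columns (1,0) and (1,1)
B2 = np.array([[1.0, 0.0], [1.0, 1.0]])   # columns (1,1) and (0,1)

coords_B1 = np.array([2.0, 3.0])          # v = 2*(1,0) + 3*(1,1) = (5, 3)

v = B1 @ coords_B1                        # canonical coordinates of v
coords_B2 = np.linalg.solve(B2, v)        # coordinates of the same v in B2
print(v, coords_B2)                       # [5. 3.]  [ 5. -2.]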