
Linear Algebra: Vector Spaces, Axioms, and Spans

Linear Algebra – Week 1
Vector Space
Definition 1.1 — Vector Space
A vector space V over a field F is a set equipped with two operations, addition and
scalar multiplication:
(a) Addition + : V × V → V
(b) Scalar multiplication · : F × V → V
satisfying the following eight axioms for all u, v, w ∈ V and all a, b ∈ F :
VS1. u + (v + w) = (u + v) + w
VS2. u + v = v + u
VS3. There exists an element 0 ∈ V such that u + 0 = u for all u ∈ V
VS4. For each u ∈ V , there exists an element −u ∈ V such that u + (−u) = 0
VS5. a(bv) = (ab)v
VS6. 1v = v, where 1 is the multiplicative identity in F
VS7. a(u + v) = au + av
VS8. (a + b)v = av + bv
Definition 1.2 — Vector and Scalar
If V is a vector space over F , then the elements of V are called vectors, the elements
of F are called scalars.
Additionally, 0 ∈ V is called the zero vector, and −u is called the inverse element of
u in Definition 1.1.
Example.
a) V = F [x] is a vector space over F , where addition and scalar multiplication are the
usual polynomial operations.
b) V = F^n is a vector space over F , for each n ∈ N.
For u = (x1 , x2 , . . . , xn ), v = (y1 , y2 , . . . , yn ) ∈ V and a ∈ F , the operations are
u + v = (x1 + y1 , x2 + y2 , . . . , xn + yn ) and av = (ax1 , ax2 , . . . , axn ).
c) V = R>0 is a vector space over F = R.
For x, y ∈ V, α ∈ F :
Addition is x ⊕ y = xy. Scalar multiplication is α ⊙ x = x^α = e^(α log x).
The zero vector 0 is 1 ∈ V .
Checking that the eight axioms are satisfied for the above three examples is left as an exercise.
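As an illustration of this exercise, here is a minimal numerical spot check of the axioms for example c), assuming Python with NumPy is available; the helper names add, smul and zero are ad hoc, and checking a handful of sample values is of course not a proof.

    import numpy as np

    # Example c): V = R>0 over F = R, with "addition" x (+) y = xy and
    # "scalar multiplication" a (.) x = x^a. The zero vector is 1.
    def add(x, y):
        return x * y

    def smul(a, x):
        return x ** a

    zero = 1.0

    u, v, w = 2.0, 5.0, 0.3
    a, b = -1.5, 4.0

    assert np.isclose(add(u, add(v, w)), add(add(u, v), w))             # VS1
    assert np.isclose(add(u, v), add(v, u))                             # VS2
    assert np.isclose(add(u, zero), u)                                  # VS3
    assert np.isclose(add(u, smul(-1.0, u)), zero)                      # VS4: -u is 1/u
    assert np.isclose(smul(a, smul(b, v)), smul(a * b, v))              # VS5
    assert np.isclose(smul(1.0, v), v)                                  # VS6
    assert np.isclose(smul(a, add(u, v)), add(smul(a, u), smul(a, v)))  # VS7
    assert np.isclose(smul(a + b, v), add(smul(a, v), smul(b, v)))      # VS8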
We now prove some elementary theorems and corollaries about vectors:
Theorem 1.1 — Cancellation Law for Vector Addition
Let v, v1 , v2 ∈ V . If v + v1 = v + v2 , then v1 = v2 .
Proof.
Let w be an inverse of v. Adding w to both sides gives
w + (v + v1 ) = w + (v + v2 )
=⇒ (w + v) + v1 = (w + v) + v2 (VS1)
=⇒ 0 + v1 = 0 + v2 (VS2, VS4)
=⇒ v1 = v2 (VS2, VS3)
■
Corollary 1.2 — Uniqueness of the Zero Vector
The zero vector 0 (also called the origin) in VS3 is unique.
Proof.
Suppose there are two origins 0₁ , 0₂ .
We have
0₁ + 0₂ = 0₂ + 0₁ (VS2)
=⇒ 0₁ = 0₂ (VS3, applied to each side: 0₁ + 0₂ = 0₁ and 0₂ + 0₁ = 0₂)
■
Corollary 1.3 — Uniqueness of the Inverse Element
The inverse element of v is unique, for all v ∈ V .
Proof.
Suppose there are two inverses w1 , w2 of v.
We have 0 = v + w1 and 0 = v + w2 .
Thus v + w1 = v + w2 , and by Theorem 1.1 we have w1 = w2 . ■
Remark.
The unique inverse of v is −v := (−1) · v
We can perform the computation:
v + (−v) = v + (−1) · v (by definition)
= 1 · v + (−1) · v (VS6)
= (1 + (−1)) · v (VS8)
= 0 · v (field operation in F )
= 0 (see below)
so (−1) · v is indeed an inverse of v, and by Corollary 1.3 it is the unique one.
To show that 0v = 0, we compute (1 + 0) · v in two ways:
(1 + 0)v = 1v = v = v + 0
(1 + 0)v = 1v + 0v = v + 0v
This gives v + 0 = v + 0v, so by Theorem 1.1 we have 0v = 0.
■
Definition 1.3 — Linear Combination
Let V be a vector space and S ⊂ V .
v ∈ V is a linear combination of vectors in S if v can be expressed as:
v = α1 v1 + . . . + αn vn
for some n ∈ N, α1 , . . . , αn ∈ F and v1 , . . . , vn ∈ S.
Example.
a) If V = F^3 , S = {v1 = (1, 1, 0), v2 = (0, 1, 1)}.
Then v = (2, 5, 3) = 2v1 + 3v2 is a linear combination of vectors in S, but
w = (2, 6, 3) is not.
b) If V = F [x], S = {x + 1, x^2 + 3x, x^2 + 5}.
Then x^2 is a linear combination of vectors in S, since we can solve
x^2 = α1 (x + 1) + α2 (x^2 + 3x) + α3 (x^2 + 5) for α1 , α2 , α3 . We get
α1 = −15/8, α2 = 5/8, α3 = 3/8.
x^3 is NOT a linear combination of vectors in S: if
x^3 = α1 (x + 1) + α2 (x^2 + 3x) + α3 (x^2 + 5), then
x^3 − α1 (x + 1) − α2 (x^2 + 3x) − α3 (x^2 + 5) = 0 should have infinitely many roots in F .
But a nonzero polynomial of degree 3 can have at most 3 roots.
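The coefficients above can be recovered by comparing the coefficients of 1, x and x^2 and solving the resulting 3 × 3 linear system; a minimal sketch, assuming Python with NumPy:

    import numpy as np

    # Columns: coefficient vectors (constant, x, x^2) of x + 1, x^2 + 3x, x^2 + 5.
    A = np.array([[1.0, 0.0, 5.0],
                  [1.0, 3.0, 0.0],
                  [0.0, 1.0, 1.0]])
    b = np.array([0.0, 0.0, 1.0])   # coefficient vector of the target x^2

    alpha = np.linalg.solve(A, b)
    print(alpha)   # approximately [-1.875, 0.625, 0.375], i.e. [-15/8, 5/8, 3/8]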
Span
Definition 1.4 — Span
Let V be a vector space over F , and let S be a subset of V .
The subspace spanned by S over F is denoted by:
spanF (S) := {α1 v1 + . . . + αn vn | vi ∈ S, αi ∈ F }
spanF (S) can be interpreted as "the set of all linear combinations of vectors in S over F ".
Example.
a) Let V = R^2 , F = R, S = {v1 = (1, 2)}.
Then spanF (S) = {αv1 | α ∈ R} = {(α, 2α) | α ∈ R}
[Figure: spanF (S) drawn in the xy-plane as the line through the origin with direction (1, 2).]
b) Let V = R^3 , F = R, S = {v1 = (1, 2, 3), v2 = (0, 1, 1)}
Then the span is:
spanR {v1 , v2 } = {α1 v1 + α2 v2 | α1 , α2 ∈ R}
= {(α1 , 2α1 + α2 , 3α1 + α2 ) | α1 , α2 ∈ R}
We can notice that this is a plane in R^3 which passes through the origin.
c) Let V = F [x], S = {1, x, x^2 , x^3 , x^4 , . . . } (elements of S are called monomials).
Then the span is:
spanF (S) = {α0 · 1 + α1 x + · · · + αn x^n | αi ∈ F, n ∈ Z≥0 }
= F [x]
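To test whether a concrete vector lies in a span such as the one in example b), one can solve for the coefficients and check whether the best fit reproduces the vector; a minimal sketch, assuming Python with NumPy (the test vectors are chosen only for illustration):

    import numpy as np

    v1 = np.array([1.0, 2.0, 3.0])
    v2 = np.array([0.0, 1.0, 1.0])
    A = np.column_stack([v1, v2])   # 3x2 matrix whose columns are v1, v2

    def in_span(w):
        # Least-squares fit of w by a combination of the columns of A;
        # w lies in the span exactly when the best fit reproduces w.
        coeffs, *_ = np.linalg.lstsq(A, w, rcond=None)
        return np.allclose(A @ coeffs, w)

    print(in_span(np.array([1.0, 3.0, 4.0])))   # True:  (1, 3, 4) = v1 + v2
    print(in_span(np.array([1.0, 0.0, 0.0])))   # False: (1, 0, 0) is not on the plane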
Theorem 1.4 — Span is a Vector Space
Let V /F be a vector space, and S ⊂ V such that S ̸= ∅. Then spanF (S) is a vector space
with the addition and scalar multiplication induced from V .
Proof.
Let W = spanF (S) ⊂ V . To show W is a vector space, it suffices to show that:
a) x + y ∈ W , ∀x, y ∈ W .
b) αx ∈ W , ∀α ∈ F, x ∈ W .
c) 0 ∈ W .
Let x = α1 v1 + · · · + αn vn , y = β1 w1 + · · · + βm wm , where vi , wj ∈ S, αi , βj ∈ F .
We now verify each of the three properties:
a) x + y = α1 v1 + · · · + αn vn + β1 w1 + · · · + βm wm ∈ W , since by definition it is a
linear combination of vectors in S.
b) αx = (αα1 )v1 + · · · + (ααn )vn ∈ W
c) Since S ̸= ∅, pick any v ∈ S; then 0 = 0v ∈ W .
■
Definition 1.5 — Generating Set
Let V /F be a vector space, and let S ⊂ V be nonempty. We say S is a generating set of V , if
V = spanF (S).
The elements of S are called generators if S is a generating set of V .
Definition 1.6 — Finite Dimensional
Let V /F be a vector space. We say V is finite dimensional, if V is spanned by a finite
subset S ⊂ V .
Example.
a) Let V = {(x, y, z, w) ∈ R^4 | x + 2y − z + w = 0, 2x + y + z + 2w = 0}
Then V = spanR {(−1, 1, 1, 0), (0, 1, 1, −1)}.
Since V can be spanned by a set with two elements, V is finite dimensional over R
(a small numerical check is sketched after this example block).
b) Let V = C/R.
Then V = spanR {1, i}. Thus V is finite dimensional.
c) Let V = C/Q, then V is not finite dimensional.
d) Let V = F [x]/F , then V is not finite dimensional.
Proof.
Suppose V is spanned by a finite set S = {f1 , . . . , fn } ⊂ F [x]. Let
N = 1 + max {deg fi | i = 1, . . . , n}, and consider x^N ∈ V .
Since V is spanned by S, we can write
x^N = α1 f1 + · · · + αn fn , with αi ∈ F
=⇒ x^N − α1 f1 − α2 f2 − · · · − αn fn = 0
But the left-hand side has degree exactly N (every fi has degree less than N ), so it is
a nonzero polynomial and cannot equal 0 in F [x]. This contradicts the assumption
that V can be spanned by a finite subset S.
■
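As a sanity check of example a), one can verify numerically that the two listed generators satisfy both defining equations; the sketch below assumes Python with NumPy, and the final rank computation (a notion not yet introduced in these notes) is used only as a quick check that the two generators are not multiples of each other.

    import numpy as np

    # Coefficient matrix of x + 2y - z + w = 0 and 2x + y + z + 2w = 0.
    A = np.array([[1.0, 2.0, -1.0, 1.0],
                  [2.0, 1.0,  1.0, 2.0]])

    v1 = np.array([-1.0, 1.0, 1.0,  0.0])
    v2 = np.array([ 0.0, 1.0, 1.0, -1.0])

    print(np.allclose(A @ v1, 0.0), np.allclose(A @ v2, 0.0))  # True True: both generators lie in V
    print(np.linalg.matrix_rank(np.column_stack([v1, v2])))    # 2: the generators are not multiples of each other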
Definition 1.7 — Subspace
Let V /F be a vector space. A subset W ⊂ V is a subspace of V , if:
a) x + y ∈ W , for all x, y ∈ W .
b) αx ∈ W , for all α ∈ F, x ∈ W .
c) 0 ∈ W (equivalently, given b), W ̸= ∅).
Remark.
Let V /F be a vector space. If W is a subspace of V , then W/F is a vector space, with
addition and scalar multiplication induced from V .
Remark.
Let W1 , W2 ⊂ V be subspaces; then W1 ∩ W2 is also a subspace. But in general
W1 ∪ W2 is NOT a subspace, unless W1 ⊂ W2 or W2 ⊂ W1 .
(Check this)
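As a concrete instance of the union part of this check, the sketch below (assuming Python with NumPy; the choice of W1 and W2 as the two coordinate axes of R^2 is only an illustration) exhibits two vectors of W1 ∪ W2 whose sum lies in neither subspace:

    import numpy as np

    def in_W1(v):   # W1 = {(x, 0)}: the x-axis of R^2
        return np.isclose(v[1], 0.0)

    def in_W2(v):   # W2 = {(0, y)}: the y-axis of R^2
        return np.isclose(v[0], 0.0)

    u = np.array([1.0, 0.0])   # u lies in W1, hence in W1 ∪ W2
    w = np.array([0.0, 1.0])   # w lies in W2, hence in W1 ∪ W2
    s = u + w                  # s = (1, 1)

    print(in_W1(s) or in_W2(s))   # False: W1 ∪ W2 is not closed under addition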
Example.
a) Let V = C/R, then R ⊂ V is a subspace of V . Note that spanR {1} = R
b) Let V = R^2 . W1 = {(x, y) | y = 3x} ⊂ V is a subspace of V .
W2 = {(x, y) | y = 3x + 1} is NOT a subspace of V , as it does not contain the zero vector.
c) W = {(x, y) | y = x^2 } is NOT a subspace, since (1, 1) + (2, 4) = (3, 5) ∉ W .
d) Let V = F [x]. Then W1 = {f ∈ V | f (0) = 0} is a subspace, but
W2 = {f ∈ V | f (0) = 3} is NOT a subspace, since the zero vector f0 (x) = 0 is
not in W2 .
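The failures in examples b) and c) can also be seen with a quick membership check; a minimal sketch, assuming Python with NumPy:

    import numpy as np

    on_W1 = lambda v: np.isclose(v[1], 3 * v[0])        # W1 = {(x, y) | y = 3x}
    on_W2 = lambda v: np.isclose(v[1], 3 * v[0] + 1)    # W2 = {(x, y) | y = 3x + 1}
    on_W  = lambda v: np.isclose(v[1], v[0] ** 2)       # W  = {(x, y) | y = x^2}

    print(on_W1(np.array([0.0, 0.0])))   # True:  W1 contains the zero vector
    print(on_W2(np.array([0.0, 0.0])))   # False: W2 does not contain the zero vector
    print(on_W(np.array([1.0, 1.0]) + np.array([2.0, 4.0])))   # False: (3, 5) is not on y = x^2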
Corollary 1.5 — Redundant Vectors
Suppose V is a vector space spanned by a generating set S. In general, S may contain
"redundant" vectors.
Example.
If S = {v1 , v2 , v3 = 2v1 + 3v2 }. Then spanF (S) = spanF {v1 , v2 }. Since:
spanF (S) = {α1 v1 + α2 v2 + α3 v3 | αi ∈ F }
= {(α1 + 2α3 )v1 + (α2 + 3α3 ) v2 | αi ∈ F }
= spanF {v1 , v2 }
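The same observation can be tested numerically for concrete vectors: adjoining v3 = 2v1 + 3v2 to the generating set does not enlarge the span. A minimal sketch, assuming Python with NumPy and using made-up vectors in R^3; the matrix rank is used here only as a proxy for comparing the two spans:

    import numpy as np

    v1 = np.array([1.0, 0.0, 2.0])
    v2 = np.array([0.0, 1.0, 1.0])
    v3 = 2 * v1 + 3 * v2            # redundant by construction

    rank_without = np.linalg.matrix_rank(np.column_stack([v1, v2]))
    rank_with    = np.linalg.matrix_rank(np.column_stack([v1, v2, v3]))

    print(rank_without, rank_with)   # 2 2: adding v3 does not enlarge the span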
Definition 1.8 — Linearly Dependent
Let V /F be a vector space, and v1 , . . . , vn ∈ V . We say {v1 , . . . , vn } is linearly dependent, if there exist non-trivial α1 , . . . , αn ∈ F , such that:
α1 v1 + · · · + αn vn = 0
Here non-trivial means "not all zero".
Example.
a) Let V = C/R be a vector space. Then {1, √2} is linearly dependent, since:
1 · 1 + (−1/√2) · √2 = 0
b) Let V = C/Q be a vector space. Then {1, √2} is NOT linearly dependent: if
α1 · 1 + α2 · √2 = 0, with αi ∈ Q and α1 ̸= 0 or α2 ̸= 0,
then α2 ̸= 0 (otherwise α1 · 1 = 0 would force α1 = 0 as well), and so
√2 = −α1 /α2 ∈ Q
which is a contradiction, since √2 is irrational.
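For example a), the dependency coefficient can be confirmed numerically (a minimal sketch, assuming Python with NumPy); for example b) no rational coefficient can work, which is exactly the irrationality of √2:

    import numpy as np

    sqrt2 = np.sqrt(2.0)
    alpha = -1.0 / sqrt2   # the coefficient used in example a)

    # A non-trivial relation over R: 1*1 + alpha*sqrt(2) = 0.
    print(np.isclose(1.0 * 1.0 + alpha * sqrt2, 0.0))   # True
    # Over Q no such relation exists: alpha would have to be -1/sqrt(2), which is irrational.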
Definition 1.9 — Linearly Independent
Let V /F be a vector space, and v1 , . . . , vn ∈ V . We say {v1 , . . . , vn } is linearly independent, if the equation:
α1 v1 + · · · + αn vn = 0
only has the trivial solution α1 = α2 = · · · = αn = 0.
Corollary.
{v1 , . . . , vn } is linearly independent ⇐⇒ {v1 , . . . , vn } is NOT linearly dependent.
Example.
a) Let V = R^3 /R, v1 = (1, 0, −1) , v2 = (1, 2, 3) , v3 = (0, 1, 1). Then {v1 , v2 , v3 } are
linearly independent, since:
α1 v1 + α2 v2 + α3 v3 = 0
=⇒ (α1 + α2 , 2α2 + α3 , −α1 + 3α2 + α3 ) = (0, 0, 0)
=⇒ α1 = α2 = α3 = 0 (by solving the resulting homogeneous system)
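The computation in example a) amounts to checking that the associated homogeneous 3 × 3 system has only the zero solution; a minimal sketch, assuming Python with NumPy (the matrix rank is used as a shortcut for this check):

    import numpy as np

    v1 = np.array([1.0, 0.0, -1.0])
    v2 = np.array([1.0, 2.0,  3.0])
    v3 = np.array([0.0, 1.0,  1.0])
    A = np.column_stack([v1, v2, v3])

    # Full column rank means alpha1*v1 + alpha2*v2 + alpha3*v3 = 0 forces all alphas to be 0.
    print(np.linalg.matrix_rank(A) == 3)   # True: the three vectors are linearly independent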