Linear independence

• Solution of a m × n system
• Rank of a matrix
• Linear independence and dependence
• Spanning a subspace
• Basis for a subspace
• Dimension of a subspace


Example:

    A = [  1  3  3  2 ]
        [  2  6  9  5 ]
        [ −1 −3  3  0 ]

The echelon form:

    A → E = [ 1  3  3  2 ]
            [ 0  0  3  1 ]
            [ 0  0  0  0 ]
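The reduction above can be sketched in code (my illustration, not part of the notes; exact arithmetic happens to work for this example, while serious implementations need pivoting strategies and tolerances):

```python
def echelon(M):
    """Forward elimination to an echelon form (no row scaling)."""
    M = [row[:] for row in M]
    m, n = len(M), len(M[0])
    r = 0  # current pivot row
    for c in range(n):
        # find a pivot in column c at or below row r
        piv = next((i for i in range(r, m) if M[i][c] != 0), None)
        if piv is None:
            continue  # no pivot in this column: a free variable
        M[r], M[piv] = M[piv], M[r]
        for i in range(r + 1, m):
            factor = M[i][c] / M[r][c]
            M[i] = [M[i][j] - factor * M[r][j] for j in range(n)]
        r += 1
    return M

A = [[1, 3, 3, 2],
     [2, 6, 9, 5],
     [-1, -3, 3, 0]]
E = echelon(A)
# E reproduces the echelon form shown above
```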
A typical “Staircase pattern” of the echelon form:

    [ ∗ ∗ ∗ ∗ . . . ∗ ]
    [ 0 ∗ ∗ ∗ . . . ∗ ]
    [ 0 0 0 ∗ . . . ∗ ]
    [ 0 0 0 0 . . . ∗ ]
    [ . . . . . . . . ]
    [ 0 0 0 0 . . . 0 ]
Important example of subspaces: solutions of a homogeneous equation
Ax = 0
form a linear subspace. Note that 0 is always a solution, but there may be more;
this depends on the echelon form.
Lemma Ax = 0 has either exactly one solution, which is identically 0, or infinitely many.
Corollary If Ax = b has a solution, then it has either infinitely many or
only one.
Corollary “Algorithm” for solving Ax = b: find one solution of Ax = b, if possible;
then find “all” solutions of Ax = 0.
Note the use of vector spaces: it seems that we gained nothing, but it is useful.
Examples: ODEs, ∆u = f (x).
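The “particular + homogeneous” structure of the solutions can be checked numerically on the matrix A above; the vectors x_p and x_h below are chosen by hand for this example:

```python
import numpy as np

A = np.array([[1, 3, 3, 2],
              [2, 6, 9, 5],
              [-1, -3, 3, 0]])
x_p = np.array([1, 0, 0, 0])    # a particular solution of Ax = b
b = A @ x_p
x_h = np.array([-3, 0, -1, 3])  # a solution of Ax = 0

assert np.all(A @ x_h == 0)
for t in range(-2, 3):          # every x_p + t*x_h also solves Ax = b
    assert np.all(A @ (x_p + t * x_h) == b)
```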
Compare this with the cases when A ≠ PLE. Case 1: there are two consecutive
rows of E such that the leading nonzero entry in the higher row appears two or
more entries farther to the left than the leading nonzero entry in the lower row:

 

    [ 1 3 5 | 1 ]   [ 1 3 5 | 1 ]   [ 1 3 5 7 | 1 ]
    [ 0 0 3 | 2 ] , [ 0 0 3 | 2 ] , [ 0 0 1 1 | 2 ] ,
    [ 0 0 0 | 2 ]   [ 0 0 0 | 0 ]
or case 2: E has nonzero entries on the diagonal, but it has more rows than
columns:

    [ 1 3 | 1 ]   [ 1 3 | 1 ]
    [ 0 1 | 2 ] , [ 0 1 | 2 ] ,
    [ 0 0 | 2 ]   [ 0 0 | 0 ]

or case 3: E has nonzero entries on the diagonal, but it has more columns than
rows:

    [ 1 3 5 7 | 1 ]
    [ 0 1 3 3 | 2 ] .
    [ 0 0 1 1 | 2 ]
Lemma a) Ax = 0 has only the trivial solution if and only if its echelon form has
nonzero elements on the diagonal and the number of rows is greater than or equal to
the number of columns. If A is also square, then A−1 exists; otherwise only
a left inverse exists.
b) Ax = 0 has a nontrivial solution if and only if
i) its echelon form has a zero on the diagonal, or ii) the number of rows is smaller
than the number of columns, or, equivalently, Ax = 0 has more unknowns than
equations.
Note If Ax = 0 has a nontrivial solution, then all solutions can be found by the
basic/free variables method.
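A sketch of the basic/free variables method on the example above: the pivots of E sit in columns 1 and 3, so x2 and x4 are free; the two “special” solutions below were obtained by hand by setting each free variable to 1 in turn and back-substituting:

```python
import numpy as np

A = np.array([[1, 3, 3, 2],
              [2, 6, 9, 5],
              [-1, -3, 3, 0]])
s1 = np.array([-3.0, 1.0, 0.0, 0.0])   # x2 = 1, x4 = 0
s2 = np.array([-1.0, 0.0, -1/3, 1.0])  # x2 = 0, x4 = 1
for s in (s1, s2):
    assert np.allclose(A @ s, 0)
# every solution of Ax = 0 is a combination c1*s1 + c2*s2
```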
Definition The rank r of a matrix is the number of nonzero rows of its echelon
form.
Lemma a) r ≤ min(m, n).
b) If r = m (m rows), then there is always a solution of Ax = b (cf. right inverse
exists).
c) If r = n (n columns), there is no more than 1 solution (cf. left inverse exists).
Important note: in the “real world” either r = m or r = n. Why?
When are there no solutions?
If r < m, inconsistency is possible. Why?
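For the running example the rank can be checked numerically (numpy computes a numerical rank via singular values, which here agrees with the echelon count):

```python
import numpy as np

A = np.array([[1, 3, 3, 2],
              [2, 6, 9, 5],
              [-1, -3, 3, 0]])
r = np.linalg.matrix_rank(A)
# r = 2 < m = 3, so Ax = b can be inconsistent for some b
```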
The rank counts the number of genuinely independent rows in A.
Definition Vectors v1 , v2 , . . . vn are linearly dependent if there is a nontrivial
sequence of scalars c1 , c2 , . . . , cn such that
c1 v1 + c2 v2 + . . . + cn vn = 0.
Vectors v1 , v2 , . . . vn are linearly independent if there is no such nontrivial
sequence of scalars. Nontrivial means (c1 , c2 , . . . , cn ) ≠ (0, 0, . . . , 0).
“Nontrivial” is very important here.
Examples:
1. Any 3 vectors in R2 are linearly dependent.
2. The functions 1, sin2 x, cos2 x are linearly dependent, since 1 · 1 − sin2 x − cos2 x = 0.
3. The r nonzero rows of the echelon matrix E are linearly independent. Hence
there are r rows of the original matrix (which ones is not specified) that are
linearly independent. Why?
4. Any collection of vectors one of which is the zero vector is linearly dependent.
5. Consider (1, 0, . . . , 0, . . . ), (0, 1, . . . , 0, . . . ), . . . , (0, 0, . . . , 1, . . . ), . . . and
(1, 1, . . . , 1, . . . ): are they linearly independent?
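Example 2 above can be tested numerically: sample the three functions at many points, stack the samples as columns, and compute the rank; a column rank below 3 reveals the dependence (a sketch, assuming the sample points are generic enough):

```python
import numpy as np

x = np.linspace(0.0, 3.0, 50)
F = np.column_stack([np.ones_like(x), np.sin(x) ** 2, np.cos(x) ** 2])
rank = np.linalg.matrix_rank(F)
# rank = 2 < 3: the columns satisfy col1 - col2 - col3 = 0
```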
A set of linearly dependent vectors v1 , v2 , . . . vn has redundancy!
Definition The span of vectors v1 , v2 , . . . vn is the space V that consists of all
their possible linear combinations.
If v1 , v2 , . . . vn are linearly dependent, then their span equals the span of some
subset v′1 , v′2 , . . . v′m , m < n, of these vectors.
V is a vector space. Why?
Note: a general definition of a span may contain infinitely many vi . This is
useful in Functional Analysis; however, even there linear combinations contain
only finitely many vectors.
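Span membership can be tested numerically; for instance, the third row of the example matrix A lies in the span of the first two (consistent with r = 2). The least-squares call below recovers the coefficients:

```python
import numpy as np

r1 = np.array([1.0, 3.0, 3.0, 2.0])
r2 = np.array([2.0, 6.0, 9.0, 5.0])
r3 = np.array([-1.0, -3.0, 3.0, 0.0])
# solve [r1 r2] c = r3 in the least-squares sense; zero residual => in span
c, *_ = np.linalg.lstsq(np.column_stack([r1, r2]), r3, rcond=None)
# c = (-5, 2): r3 = -5*r1 + 2*r2
```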
When are no vectors wasted?
Definition For a vector space V , a set v1 , v2 , . . . vn is a basis if
a) they are linearly independent,
b) they span V .
Question: How many bases are there for a given vector space?
The characteristic number of a vector space is its dimension.
Dimension is a count of degrees of freedom.
Example: Dimension of the space of solutions of Ax = 0 is the number of free
variables.
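For the running example this dimension follows from rank-nullity: the number of free variables is n − r (a numerical sketch):

```python
import numpy as np

A = np.array([[1, 3, 3, 2],
              [2, 6, 9, 5],
              [-1, -3, 3, 0]], dtype=float)
n = A.shape[1]
r = np.linalg.matrix_rank(A)
dim = n - r
# dim = 4 - 2 = 2 free variables, matching the two special solutions
```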
Lemma Any two bases for a vector space V contain the same number of vectors.
This number is the dimension of V .
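A numerical basis test (my illustration): n vectors form a basis of R^n exactly when the matrix with those vectors as columns has full rank, i.e. is invertible:

```python
import numpy as np

# three candidate basis vectors of R^3, stacked as columns
B = np.array([[1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
is_basis = np.linalg.matrix_rank(B) == 3
# full rank: independent and spanning, hence a basis
```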
Question: How to find a basis for a vector space?