Homogeneous equations, Linear independence

1. Homogeneous equations:
Ex 1: Consider the system:

$$\begin{aligned}
x_1 + 2x_2 &= 0\\
x_1 - 2x_3 &= 0\\
x_2 + x_3 &= 0
\end{aligned}$$

Matrix equation:

$$\begin{bmatrix} 1 & 2 & 0\\ 1 & 0 & -2\\ 0 & 1 & 1 \end{bmatrix}
\begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}
= \begin{bmatrix} 0\\ 0\\ 0 \end{bmatrix} = \mathbf{0}. \qquad (3)$$
Homogeneous equation:

$$Ax = \mathbf{0}.$$

There is always at least one solution, $x = \mathbf{0}$ (the trivial solution). Other solutions are called nontrivial solutions.
Theorem 1: A nontrivial solution of (3) exists iff [if and only if] the system has at least one free variable in row echelon form. The same is true for any homogeneous system of equations.

Proof: If there are no free variables, there is only one solution, and that must be the trivial solution. Conversely, if there is a free variable, it can be given a nonzero value, which produces a nontrivial solution. ∎
Ex 2: Reduce the system above:

$$\left[\begin{array}{ccc|c} 1 & 2 & 0 & 0\\ 1 & 0 & -2 & 0\\ 0 & 1 & 1 & 0 \end{array}\right]
\xrightarrow{\text{as before}}
\left[\begin{array}{ccc|c} 1 & 2 & 0 & 0\\ 0 & 1 & 1 & 0\\ 0 & 0 & 0 & 0 \end{array}\right]$$

$$\Rightarrow\qquad x_1 + 2x_2 = 0;\qquad x_2 + x_3 = 0;\qquad 0 = 0.$$
Note that $x_3$ is a free variable (its column has no pivot); hence the general solution is

$$x_2 = -x_3;\qquad x_1 = -2x_2 = 2x_3.$$

$$x = \begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}
= \begin{bmatrix} 2x_3\\ -x_3\\ x_3 \end{bmatrix}
= x_3 \begin{bmatrix} 2\\ -1\\ 1 \end{bmatrix},$$

the parametric vector form of the solution. With $x_3$ arbitrary, the solution set is a straight line through the origin.
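As a machine check of this example (a quick sketch in Python using SymPy, not part of the original notes; `rref` and `nullspace` are standard SymPy methods):

```python
from sympy import Matrix

# Coefficient matrix from Ex 1 / Ex 2
A = Matrix([[1, 2,  0],
            [1, 0, -2],
            [0, 1,  1]])

R, pivots = A.rref()
print(pivots)          # (0, 1): pivots in columns 1 and 2; column 3 is free
print(A.nullspace())   # basis: the single vector (2, -1, 1), matching x3*(2, -1, 1)
```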
Theorem 2: A homogeneous system always has a nontrivial solution if the
number of equations is less than the number of unknowns.
Proof: If we perform Gaussian elimination on the system, the reduced augmented matrix has the form

$$\left[\begin{array}{ccccccc|c}
1 & a_{12} & a_{13} & \cdots & \cdots & \cdots & \cdots & 0\\
0 & 0 & 1 & a_{24} & \cdots & \cdots & \cdots & 0\\
0 & 0 & 0 & 1 & a_{35} & \cdots & \cdots & \vdots\\
\vdots & & & & & \ddots & & \vdots\\
0 & 0 & 0 & 0 & 0 & \cdots & \cdots & 0
\end{array}\right]$$

with the remaining rows zero on the left side. If the number of equations is less than the number of unknowns, then not every column can contain a leading 1, so there are free variables. By the previous theorem, there are nontrivial solutions. ∎
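To see Theorem 2 numerically, here is a minimal sketch (the 2×4 matrix below is invented for illustration):

```python
from sympy import Matrix

# Two equations, four unknowns: fewer equations than unknowns
A = Matrix([[1, 2, 3, 4],
            [5, 6, 7, 8]])

ns = A.nullspace()   # nonempty, as Theorem 2 guarantees
print(len(ns))       # 2 -- two free variables, hence two basis vectors
print(A * ns[0])     # the zero vector: a nontrivial solution of Ax = 0
```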
2. Inhomogeneous equations:
[We should briefly mention the relationship between homogeneous and inhomogeneous equations.]
Consider the general system:

$$Ax = b. \qquad (1)$$

Suppose $p$ is a particular solution of (1), so $Ap = b$. If $x$ is any other solution of (1), we still have $Ax = b$. Subtracting the two equations:

$$Ax - Ap = \mathbf{0} \quad\Rightarrow\quad A(x - p) = \mathbf{0}.$$

So $v_h = x - p$ satisfies the homogeneous equation. Generally:

Theorem 1: If $p$ is a particular solution of (1), then for any other solution $x$ we have that $v_h = x - p$ solves the homogeneous equation (i.e., with $b = \mathbf{0}$). Thus every solution $x$ of (1) can be written $x = p + v_h$, where $v_h$ is a solution of the homogeneous equation.
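A small check of this decomposition (a sketch; the right-hand side $b$ and the particular solution $p$ below are invented so that Ex 1's matrix gives a consistent system):

```python
from sympy import Matrix

A = Matrix([[1, 2,  0],
            [1, 0, -2],
            [0, 1,  1]])
b = Matrix([3, 1, 1])

p = Matrix([1, 1, 0])        # a particular solution: A*p == b
assert A * p == b

vh = A.nullspace()[0]        # homogeneous solution (2, -1, 1)

# Every x = p + t*vh is again a solution, as Theorem 1 asserts
for t in range(-2, 3):
    assert A * (p + t * vh) == b
print("all solutions of the form p + t*vh check out")
```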
3. Application: Network flows
Traffic pattern at Drummond Square [figure omitted]; quantities in cars/min. What are the flows on the inside streets? One equation for each node:

$$\begin{aligned}
x_1 - x_3 - x_4 &= 40\\
x_1 + x_2 &= 200\\
x_2 + x_3 - x_5 &= 100\\
x_4 + x_5 &= 60
\end{aligned}$$
Ô "
Ö "
Ö
!
Õ !
!
"
"
!
"
!
1
!
"
!
!
"
!
!
"
"
l
l
l
l
%! ×
#!! Ù
Ù
"!!
'! Ø
Ô"
Ö!
Ö
!
Õ!
!
"
"
!
" "
" "
1
!
!
"
! l
%! ×
! l "'! Ù
Ù
" l "!!
" l
'! Ø
Ô"
Ö!
Ö
!
Õ!
!
"
"
!
"
"
1
!
"
"
!
"
!
!
"
"
l
l
l
l
%! ×
"'! Ù
Ù
"!!
'! Ø
Ô"
Ö!
Ö
!
Õ!
!
"
!
!
"
"
0
!
"
"
"
"
!
!
"
"
l
l
l
l
%! ×
"'! Ù
Ù
'!
'! Ø
!
!
"
"
l
l
l
l
%! ×
"'! Ù
Ù
'!
'! Ø
Ô"
Ö!
Ö
!
Õ!
!
"
!
!
"
"
!
!
"
"
"
"
Ô"
Ö!
Ö
!
Õ!
!
"
!
!
"
"
!
!
!
!
"
!
"
"
"
!
l "!! ×
l "!! Ù
Ù
l '!
l
! Ø
So:
B" œ "!!  B$  B&
B# œ "!!  B$  B&
B% œ '!  B& ß
where B$ ß B& are free.
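The elimination can be double-checked in Python (a sketch using SymPy's `linsolve`, which reports the solution in terms of the free variables):

```python
from sympy import symbols, linsolve

x1, x2, x3, x4, x5 = symbols('x1:6')

eqs = [x1 - x3 - x4 - 40,      # node equations, written as (lhs - rhs) = 0
       x1 + x2 - 200,
       x2 + x3 - x5 - 100,
       x4 + x5 - 60]

print(linsolve(eqs, [x1, x2, x3, x4, x5]))
# x1 = 100 + x3 - x5, x2 = 100 - x3 + x5, x4 = 60 - x5, with x3, x5 free
```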
Constraint: if, for example, all flows have to be nonnegative, then we require $x_i \ge 0$ for all $i$. Therefore:

$$x_3, x_5 \ge 0;\qquad -100 \le x_3 - x_5 \le 100;\qquad x_5 \le 60.$$

This corresponds to a region in the $x_3 x_5$-plane, which can be plotted if desired (see the sketch below).
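A possible sketch of that plot (Python with NumPy/Matplotlib; the axis ranges are my choice, not from the notes):

```python
import numpy as np
import matplotlib.pyplot as plt

x3, x5 = np.meshgrid(np.linspace(0, 200, 400), np.linspace(0, 80, 400))

# x3, x5 >= 0 is built into the grid; remaining constraints below
feasible = (x5 <= 60) & (np.abs(x3 - x5) <= 100)

plt.contourf(x3, x5, feasible.astype(int), levels=[0.5, 1.5])
plt.xlabel('x3'); plt.ylabel('x5')
plt.title('Feasible region for the traffic flows')
plt.show()
```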
If the roads carrying $x_3$ and $x_5$ were closed off, then $x_3 = x_5 = 0$, so that

$$x_1 = 100,\qquad x_2 = 100,\qquad x_4 = 60;$$

note that the traffic flow then becomes uniquely determined.
Definition 1: A collection of vectors $v_1, v_2, \ldots, v_n$ is linearly independent if no vector in the collection is a linear combination of the others.
Equivalently,
Definition 2: A collection of vectors $v_1, \ldots, v_n$ is linearly independent if the only way we can have $c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = \mathbf{0}$ is if all of the $c_i = 0$.
Equivalence of the definitions:
Def 1 $\Rightarrow$ Def 2: Suppose no vector is a linear combination of the others, and suppose

$$c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = \mathbf{0};$$

we will show that $c_1, \ldots, c_n$ also have to be 0.
Proof: Suppose not (for contradiction). Without loss of generality, assume $c_1 \ne 0$ (the proof works the same way otherwise). Then we have

$$v_1 = -(c_2/c_1)\, v_2 - \cdots - (c_n/c_1)\, v_n,$$

contradicting that no vector is a combination of the others. Thus the $c_i$ all have to be 0, as desired. (Def 2 $\Rightarrow$ Def 1 is similar: if some $v_i$ were a linear combination of the others, moving everything to one side would give a relation with coefficient $-1 \ne 0$ on $v_i$.) ∎
Note: If $S_2$ is a collection of vectors and $S_1$ is a subcollection of $S_2$, then:
If $S_2$ is linearly independent
$\Rightarrow$ no vector in $S_2$ is a linear combination of the others
$\Rightarrow$ no vector in $S_1$ is a linear combination of the others (since every vector in $S_1$ is also in $S_2$)
$\Rightarrow$ $S_1$ is linearly independent.
Logically equivalent [contrapositive]:
If $S_1$ is linearly dependent (i.e., not independent)
$\Rightarrow$ $S_2$ is linearly dependent.
[These are stated more formally in the book as theorems.]
Theorem 2: P/> W œ Öv" ß á ß v8 × be a collection of vectors in ‘. . Then W
is linearly dependent if and only if one of the vectors v3 is a linear combination
of the previous ones v" ß á ß v3" Þ
Proof: ( Ê ) If W is linearly dependent, then there is a set of constnats -3 not all
! such that
-" v"  á  -8 v8 œ 0.
Let -5 be the last non-zero coefficient. Then the rest of the coefficients are
zero, and
-" v"  -# v#  á  -5" v5"  -5 v5 œ 0
Ä
v5 œ  -" Î-5 v"  -# Î-5 v #  á  -5" Î-5 v5"
i.e. one of the vectors is a linear combination of the previous ones.
( É ) Obvious.
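Theorem 2 also suggests a procedure: scan the vectors left to right and flag the first one whose addition does not increase the rank. A sketch (the helper name `first_dependent` is mine):

```python
from sympy import Matrix

def first_dependent(vectors):
    """Index of the first vector that is a linear combination of
    the previous ones, or None if the list is independent."""
    M = Matrix.zeros(len(vectors[0]), 0)       # start with no columns
    for i, v in enumerate(vectors):
        M2 = M.row_join(Matrix(v))             # append v as a new column
        if M2.rank() == M.rank():              # rank unchanged: v is dependent
            return i
        M = M2
    return None

print(first_dependent([[1, 0], [0, 1], [1, 1]]))  # 2, since v3 = v1 + v2
```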
4. Checking for linear independence:
Example 2: Consider the vectors

$$v_1 = \begin{bmatrix} 1\\ 1 \end{bmatrix},\quad
v_2 = \begin{bmatrix} 1\\ -1 \end{bmatrix},\quad
v_3 = \begin{bmatrix} 1\\ 0 \end{bmatrix}.$$

Are they linearly independent?

$$c_1 v_1 + c_2 v_2 + c_3 v_3 = \mathbf{0}
\quad\Rightarrow\quad
c_1 + c_2 + c_3 = 0;\qquad c_1 - c_2 = 0.$$
Reduced matrix:

$$\left[\begin{array}{ccc|c} 1 & 1 & 1 & 0\\ 1 & -1 & 0 & 0 \end{array}\right]
\rightarrow
\left[\begin{array}{ccc|c} 1 & 1 & 1 & 0\\ 0 & -2 & -1 & 0 \end{array}\right]
\rightarrow
\left[\begin{array}{ccc|c} 1 & 1 & 1 & 0\\ 0 & 1 & 1/2 & 0 \end{array}\right]$$
Conclude: there is a free variable ($c_3$ is in a non-pivot column). By the theorem on homogeneous equations there is a nontrivial solution, so the $c_i$ need not all be 0; for instance, $c_3 = 2$, $c_1 = c_2 = -1$, which gives $v_1 + v_2 = 2v_3$. Thus the vectors are not linearly independent.
[Note that if the number of vectors is greater than the size of the vectors, this will always happen.]
More generally, thus:
Theorem 3: In $\mathbb{R}^n$, if we have more than $n$ vectors, they cannot be linearly independent.
From the above we have:
Algorithm: To check whether vectors are linearly independent, form a matrix with them as columns, and row reduce.
(a) If the reduced matrix has free variables (i.e., there exists a non-pivot column), then they are not independent.
(b) If there are no free variables (i.e., there are no non-pivot columns), they are independent.
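In code, the algorithm reduces to a rank check (a sketch; the helper name `independent` is mine, not from the book):

```python
from sympy import Matrix

def independent(vectors):
    """Vectors are independent iff the matrix with them as columns
    has a pivot in every column, i.e. rank == number of vectors."""
    M = Matrix([list(v) for v in vectors]).T   # one vector per column
    return M.rank() == len(vectors)

print(independent([[1, 1], [1, -1], [1, 0]]))  # False -- Example 2 / Theorem 3
print(independent([[1, 1], [1, -1]]))          # True
```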