Chapter 1: Matrices and Systems of Linear Equations
§ 1.3 Consistent Systems of Linear Equations
§ 1.5 Matrix Operations
Lecture Linear Algebra - Math 2568M on January 14, 2013
MW 605
Oguz Kurt
oguz@math.ohio-state.edu
292-9659
Off. Hrs:
MWF 10:20-11:20
The Ohio State University
Outline
1. § 1.3 Consistent Systems of Linear Equations
2. § 1.5 Matrix Operations
Homogeneous Systems
1. Definition: A system of linear equations is called homogeneous if the constant term in each equation is zero. Otherwise, it is called non-homogeneous. If [A|b] is homogeneous, then b = 0.
2. Theorem 2.3: If [A|0] is a homogeneous system of m linear equations in n variables, where m < n, then the system has infinitely many solutions.
Proof.
After applying Gauss–Jordan elimination to [A|0], we obtain [B|0] in reduced row echelon form. We make two observations about [B|0]:
(1) [B|0] is consistent: because the system is homogeneous, an all-zero row of B corresponds to the equation 0 = 0, so no row can produce a contradiction. Hence the system has at least one solution.
(2) It has at least one free variable: since B has only m rows, it has at most m nonzero rows and hence at most m leading terms, so at least n − m > 0 variables are free.
(1) and (2) together imply that the system has infinitely many solutions, since each free variable introduces an independent parameter into the solution set.
A note on part (1) of the proof:
Remark 1: A friend of yours asked by email why (1) and (2) imply the existence of infinitely many solutions.
• Since it is impossible to obtain a row of the form 0 = d with d ≠ 0, let [B|0] be the reduced row echelon form of [A|0]. We can then write the dependent variables (those corresponding to leading terms) in terms of the independent variables (those corresponding to columns of the reduced row echelon form without a leading term).
• This directly gives at least one solution, and in fact infinitely many, since different assignments to the non-leading variables give different solutions.
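To see Theorem 2.3 in action computationally, here is a minimal sketch using SymPy (the 2 × 3 coefficient matrix below is a hypothetical example, not one from the notes); since m = 2 < n = 3, the homogeneous system Ax = 0 must have a nontrivial null space, i.e. infinitely many solutions.

from sympy import Matrix

# Hypothetical coefficient matrix: m = 2 equations, n = 3 unknowns.
A = Matrix([[1, 2, -1],
            [3, 0,  4]])

# Row-reducing [A | 0] only changes A, since the zero column stays zero.
R, pivots = A.rref()
print(R)                 # at most m = 2 leading 1s
print(pivots)            # pivot (dependent) columns
print(A.nullspace())     # n - len(pivots) >= 1 basis vectors: infinitely many solutions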
Theorem
If [A|b] is an m × n system of linear equations (m equations, n
unknowns) with m < n, then either it is inconsistent or it has infinitely
many solutions.
Proof.
If the system is inconsistent, there is nothing to prove. So we may assume that the system is consistent. Since m < n, the reduced row echelon form can have at most m leading terms, so it must have at least n − m ≥ 1 free variables. Hence the system has infinitely many solutions.
What if it is non-homogeneous?
The last row of the following system of linear equations gives 0 = 1, which is impossible. So it has no solution.

[ 1  0 | 1 ]
[ 0  1 | 2 ]
[ 0  0 | 1 ]

Remark 1: A system of linear equations has NO solution (it is inconsistent) if its augmented matrix [A|b] is row equivalent to some [B|c] such that one of the rows of [B|c] has the form

[ 0  0  · · ·  0  0 | d ]   with d ≠ 0.
Remark 2: It is enough to apply elementary row operations (Gauss–Jordan elimination) to [A|b]. The process itself will tell you whether the system is consistent or inconsistent.
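To illustrate Remark 2, here is a minimal sketch using SymPy: row-reducing the augmented matrix of the example above exposes the impossible row, because the last column becomes a pivot column.

from sympy import Matrix

# Augmented matrix [A | b] from the inconsistent example above.
Ab = Matrix([[1, 0, 1],
             [0, 1, 2],
             [0, 0, 1]])

R, pivots = Ab.rref()
print(R)
# If the last column is a pivot column, some row reads [0 ... 0 | 1],
# so the system is inconsistent.
print("inconsistent" if (Ab.cols - 1) in pivots else "consistent")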
Dependent/Independent Variables
Definition
Let [A|b] be a system of linear equations and [Rref (A)|c] be its
reduced row echelon form. The variables of [A|b] corresponding to
leading terms of [Rref (A)|c] are called the dependent variables of the
system of linear equations. If a column of [Rref (A)|c] has no leading
term, then the corresponding variable is called an independent (free)
variable of the system of linear equations.
Remark 1: Note that a variable is either dependent or independent.
Remark 2: # of dependent variables + # of independent variables =
# of all variables (unknowns)
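In computational terms (a sketch, not from the notes): the pivot columns reported by SymPy's rref() are exactly the dependent variables, and the remaining columns of the coefficient part correspond to the free variables. The matrix below is a hypothetical 2 × 3 system used only for illustration.

from sympy import Matrix

# Hypothetical augmented matrix [A | b] in the unknowns x1, x2, x3.
Ab = Matrix([[ 2,  2, -1, 1],
             [-2, -2,  4, 1]])

R, pivots = Ab.rref()
n = Ab.cols - 1                                        # number of unknowns
dependent = [f"x{j + 1}" for j in pivots if j < n]
free = [f"x{j + 1}" for j in range(n) if j not in pivots]
print(dependent, free)                                 # ['x1', 'x3'] ['x2']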
Example: Solve the following system of linear equations:
 2x1 + 2x2 −  x3 =  1
−2x1 − 2x2 + 4x3 =  1
 2x1 + 2x2 + 5x3 =  5
−2x1 − 2x2 − 2x3 = −3

Augmented matrix:

[  2   2  −1 |  1 ]
[ −2  −2   4 |  1 ]
[  2   2   5 |  5 ]
[ −2  −2  −2 | −3 ]

R3 := R3 + R2, R2 := R2 + R1, R4 := R4 + R1, R1 := R1/2:

[ 1  1  −1/2 | 1/2 ]
[ 0  0    3  |  2  ]
[ 0  0    9  |  6  ]
[ 0  0   −3  | −2  ]

R3 := R3 − 3R2, R4 := R4 + R2, R2 := R2/3, and finally R1 := R1 + R2/2:

  x1   x2   x3
  (d)  (f)  (d)
[  1    1    0 | 5/6 ]
[  0    0    1 | 2/3 ]
[  0    0    0 |  0  ]
[  0    0    0 |  0  ]

The dependent (d) variables are x1 and x3, and x2 is free (f). Setting x2 = t gives x3 = 2/3 and x1 = 5/6 − t, so

[ x1 ]       [ −1 ]   [ 5/6 ]
[ x2 ]  =  t [  1 ] + [  0  ]
[ x3 ]       [  0 ]   [ 2/3 ]
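As a check on the computation above (a minimal sketch using SymPy), the same reduced row echelon form comes out, and the one-parameter family t(−1, 1, 0) + (5/6, 0, 2/3) satisfies the system for every t.

from sympy import Matrix, Rational, symbols

A = Matrix([[ 2,  2, -1],
            [-2, -2,  4],
            [ 2,  2,  5],
            [-2, -2, -2]])
b = Matrix([1, 1, 5, -3])

R, pivots = A.row_join(b).rref()
print(R)          # rows [1 1 0 | 5/6] and [0 0 1 | 2/3], then two zero rows
print(pivots)     # (0, 2): x1 and x3 are dependent, x2 is free

t = symbols('t')
x = t * Matrix([-1, 1, 0]) + Matrix([Rational(5, 6), 0, Rational(2, 3)])
print((A * x - b).applyfunc(lambda e: e.simplify()))   # the zero vector, for every t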



 

1. Definition: A matrix is a rectangular array of numbers called the entries, or elements, of the matrix.
2. Example: matrices come in many sizes, and their entries may be any numbers (integers, fractions, or irrational numbers such as √5 and π); they include row matrices and column matrices.
3. A 1 × m matrix is called a row matrix, and an n × 1 matrix is called a column matrix.
4. A general m × n matrix A has the form

   A = [ a11  a12  · · ·  a1n ]
       [ a21  a22  · · ·  a2n ]
       [ ...  ...         ... ]
       [ am1  am2  · · ·  amn ]
5. The diagonal entries of A are a11, a22, a33, .... If m = n, then A is called a square matrix. A square matrix whose nondiagonal entries are all zero is called a diagonal matrix. A diagonal matrix all of whose diagonal entries are the same is called a scalar matrix. If the scalar on the diagonal is 1, the scalar matrix is called an identity matrix.
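To make these classes concrete, here is a minimal sketch building a diagonal, a scalar, and an identity matrix with NumPy (the entries are arbitrary illustrations):

import numpy as np

D = np.diag([2, 5, 7])   # diagonal matrix: all nondiagonal entries are zero
S = 3 * np.eye(3)        # scalar matrix: a diagonal matrix with equal diagonal entries
I = np.eye(3)            # identity matrix: the scalar on the diagonal is 1
print(D, S, I, sep="\n\n")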
Matrix Addition and Scalar Multiplication

A + B = [aij + bij],
cA = c[aij] = [caij], where c is a scalar.
Remark 1: A − B = A + (−B), A + O = A = O + A,
A − A = O = −A + A.
Remark 2: We can only add matrices A and B if they have the
same dimensions m × n.
Example: Let

A = [  1   5   3 ]        B = [ 7  2  1 ]        C = [ 1  4 ]
    [ −2  −3  −1 ]            [ 3  5  0 ]            [ 0  2 ]

Then

A + B = [  1   5   3 ]  +  [ 7  2  1 ]  =  [ 8  7   4 ]
        [ −2  −3  −1 ]     [ 3  5  0 ]     [ 1  2  −1 ]

A + C is not possible since A is a 2 × 3 matrix while C is a 2 × 2 matrix.
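A quick check of the example (a minimal sketch with NumPy); note that NumPy refuses A + C because the shapes (2, 3) and (2, 2) do not match.

import numpy as np

A = np.array([[1, 5, 3], [-2, -3, -1]])
B = np.array([[7, 2, 1], [3, 5, 0]])
C = np.array([[1, 4], [0, 2]])

print(A + B)    # entrywise sum: [[ 8  7  4], [ 1  2 -1]]
print(2 * A)    # scalar multiple: every entry is doubled

try:
    A + C       # shapes (2, 3) and (2, 2) are incompatible
except ValueError as err:
    print("A + C is not defined:", err)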
Matrix Multiplication
1. Definition: If A is an m × n matrix and B is an n × r matrix, then the product C = AB is an m × r matrix. The (i, j) entry of the product is computed as follows:

   cij = ai1 b1j + ai2 b2j + · · · + ain bnj = Σ(k = 1..n) aik bkj
2. Definition: If A is an m × n matrix with rows R1, R2, ..., Rm and B is an n × r matrix with columns C1, C2, ..., Cr, then the product C = AB is an m × r matrix. The (i, j) entry of the product is computed as follows:

   cij = Ri(A) · Cj(B).
Remark 1: Note that rows of A (Ri ’s) and columns of B (Cj ’s) are
vectors in Rn . So, their dot product makes total sense.
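The definition translates directly into code. Below is a minimal sketch of a helper (the name matmul_by_definition is made up for illustration) that computes each entry of AB as the dot product of a row of A with a column of B.

import numpy as np

def matmul_by_definition(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Compute C = AB entry by entry via c_ij = R_i(A) . C_j(B)."""
    m, n = A.shape
    n2, r = B.shape
    assert n == n2, "inner dimensions must agree"
    C = np.zeros((m, r))
    for i in range(m):
        for j in range(r):
            C[i, j] = np.dot(A[i, :], B[:, j])   # row i of A dot column j of B
    return C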
Example: Compute AB if

A = [  1   3  1 ]        and        B = [ −4   0   3  −1 ]
    [ −2  −1  1 ]                       [  5  −2  −1   1 ]
                                        [ −1   2   0   6 ]
c1,1 = (1)(−4) + (3)(5) + (1)(−1) = 10,       c1,2 = (1)(0) + (3)(−2) + (1)(2) = −4,
c1,3 = (1)(3) + (3)(−1) + (1)(0) = 0,         c1,4 = (1)(−1) + (3)(1) + (1)(6) = 8,
c2,1 = (−2)(−4) + (−1)(5) + (1)(−1) = 2,      c2,2 = (−2)(0) + (−1)(−2) + (1)(2) = 4,
c2,3 = (−2)(3) + (−1)(−1) + (1)(0) = −5,      c2,4 = (−2)(−1) + (−1)(1) + (1)(6) = 7

AB = [ 10  −4   0   8 ]
     [  2   4  −5   7 ]
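The same product, checked with NumPy (a minimal sketch):

import numpy as np

A = np.array([[1, 3, 1], [-2, -1, 1]])
B = np.array([[-4, 0, 3, -1], [5, -2, -1, 1], [-1, 2, 0, 6]])
print(A @ B)   # [[10 -4  0  8], [ 2  4 -5  7]]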
Partitioned Matrices

Theorem
Let A be an m × n matrix, ei a 1 × m standard unit vector, and ej an
n × 1 standard unit vector. Then
a. ei A is the ith row of A and
b. Aej is the jth column of A.
Proof.
ei is a row vector, so for all j, ei · Cj(A) = [ei A]j = [A]i,j, hence ei A = Ri(A).
ej is a column vector, so for all i, Ri(A) · ej = [A ej]i = [A]i,j, hence A ej = Cj(A).
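A small illustration of this theorem (a minimal sketch with NumPy, using an arbitrary 2 × 3 matrix): multiplying by a standard unit vector on the left picks out a row, and on the right picks out a column.

import numpy as np

A = np.array([[1, 3, 2], [0, -1, 1]])   # a 2 x 3 matrix

e2 = np.array([[0, 1]])                 # 1 x 2 standard unit vector e_2
print(e2 @ A)                           # [[ 0 -1  1]]  =  2nd row of A

e3 = np.array([[0], [0], [1]])          # 3 x 1 standard unit vector e_3
print(A @ e3)                           # [[2], [1]]  =  3rd column of A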
Theorem
Let A be an m × n matrix and B be an n × r matrix. Then
a. Ri(A)B is the ith row of AB and
b. A Cj(B) is the jth column of AB.
Proof.
We can write Ri(A) = ai,1 e1 + ai,2 e2 + · · · + ai,n en, where the ei's are 1 × n standard unit vectors. By the previous theorem this implies

Ri(A)B = ai,1 e1 B + ai,2 e2 B + · · · + ai,n en B
       = ai,1 R1(B) + ai,2 R2(B) + · · · + ai,n Rn(B),

so for every j,

[Ri(A)B]j = ai,1 b1,j + ai,2 b2,j + · · · + ai,n bn,j = Ri(A) · Cj(B) = [AB]i,j.

So Ri(A)B is the ith row of AB.
Part (b) is an exercise!
Examples

A = [ 1   3  2 ]        and        B = [ 4  −1 ]
    [ 0  −1  1 ]                       [ 1   2 ]
                                       [ 3   0 ]

A C1(B) = A [ 4 ]  =  [ 13 ]          A C2(B) = A [ −1 ]  =  [  5 ]
            [ 1 ]     [  2 ]                      [  2 ]     [ −2 ]
            [ 3 ]                                 [  0 ]

Therefore AB = [ A C1(B)  A C2(B) ] = [ 13   5 ]
                                      [  2  −2 ]

Remark 1: The above form is called the matrix-column representation of the product.

R1(A)B = [ 1  3  2 ] B = [ 13  5 ]          R2(A)B = [ 0  −1  1 ] B = [ 2  −2 ]

Therefore AB = [ R1(A)B ] = [ 13   5 ]
               [ R2(A)B ]   [  2  −2 ]

Remark 2: The above form is called the row-matrix representation of the product.
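Both representations are easy to reproduce in code (a minimal sketch with NumPy): stacking the products A Cj(B) column by column, or the products Ri(A) B row by row, rebuilds AB.

import numpy as np

A = np.array([[1, 3, 2], [0, -1, 1]])
B = np.array([[4, -1], [1, 2], [3, 0]])

# Matrix-column representation: AB = [ A C1(B)  A C2(B)  ... ]
by_columns = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])

# Row-matrix representation: the rows of AB are R1(A)B, R2(A)B, ...
by_rows = np.vstack([A[i, :] @ B for i in range(A.shape[0])])

print(by_columns)                                     # [[13  5], [ 2 -2]]
print(np.array_equal(by_columns, by_rows))            # True
print(np.array_equal(by_columns, A @ B))              # True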
Matrix Powers
Definition
If A is a square matrix, its powers are defined by A^0 = I and A^k = A A · · · A (k factors). For all nonnegative integers r and s:
1. A^r A^s = A^(r+s)
2. (A^r)^s = A^(rs)
Example
If A = [ 1  1 ] , then A^n = [ 2^(n−1)  2^(n−1) ] for all n ≥ 1.
       [ 1  1 ]              [ 2^(n−1)  2^(n−1) ]

Example
If A = [ 1  1 ] , then A^2 = [ 2  1 ] ,  A^3 = [ 3  2 ] ,  A^4 = [ 5  3 ] , and so on.
       [ 1  0 ]              [ 1  1 ]          [ 2  1 ]          [ 3  2 ]

In fact, A^n = [ f_n      f_(n−1) ] for n ≥ 2, where f_n is defined by the following sequence.
               [ f_(n−1)  f_(n−2) ]

Fibonacci sequence: f_0 = f_1 = 1, f_n = f_(n−1) + f_(n−2) for n ≥ 2.
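A quick numerical check of the Fibonacci example (a minimal sketch with NumPy):

import numpy as np

A = np.array([[1, 1], [1, 0]])

# f_0 = f_1 = 1, f_n = f_(n-1) + f_(n-2).
f = [1, 1]
for _ in range(10):
    f.append(f[-1] + f[-2])

# A^n should equal [[f_n, f_(n-1)], [f_(n-1), f_(n-2)]] for n >= 2.
for n in range(2, 8):
    expected = np.array([[f[n], f[n - 1]], [f[n - 1], f[n - 2]]])
    assert np.array_equal(np.linalg.matrix_power(A, n), expected)
print("A^n matches the Fibonacci pattern for n = 2, ..., 7")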
The Transpose of a Matrix

1. Definition: The transpose of an m × n matrix A is the n × m matrix A^T obtained by interchanging the rows and columns of A. That is, the ith column of A^T is the ith row of A for all i:

   [A^T]i,j = [A]j,i

2. Definition: A square matrix A is symmetric if A^T = A, that is, if A is equal to its own transpose.
Example: Let

A = [ 1  3  2 ]        and        B = [  1  2 ]
    [ 3  5  0 ]                       [ −1  3 ]
    [ 2  0  4 ]

Then A^T = A, so A is symmetric, while B^T = [ 1 −1 ; 2 3 ] ≠ B, so B is not symmetric.
Remark 1: A square matrix A is symmetric if and only if Aij = Aji
for all i and j.
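The same symmetry check in code (a minimal sketch with NumPy):

import numpy as np

A = np.array([[1, 3, 2], [3, 5, 0], [2, 0, 4]])
B = np.array([[1, 2], [-1, 3]])

print(np.array_equal(A.T, A))   # True: A equals its transpose, so A is symmetric
print(np.array_equal(B.T, B))   # False: B is not symmetric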