2.2 Multiplication of Vectors by Matrices – Linear Functions
In section 1.4 we discussed how to multiply a row vector a times a column vector x to get
a number ax. We saw that by holding one of the two vectors a or x fixed, we obtained a
linear function of the other. In this section we extend multiplication to a matrix times a
vector and again obtain a linear function.
Definition 1.
a. To multiply a matrix A by a column vector x one multiplies each row of A by x to get a column vector Ax. More precisely, the ith component of Ax is the ith row of A times x, i.e.

$(Ax)_i = (A_{i,\bullet})\,x = \sum_{j=1}^{n} A_{ij} x_j$

In order to do this the number of columns n in A must be equal to the number of components in x.
b. To multiply a row vector p by a matrix A one multiplies p by each column of A to get a row vector pA. More precisely, the jth component of pA is p times the jth column of A, i.e.

$(pA)_j = p\,(A_{\bullet,j}) = \sum_{i=1}^{m} p_i A_{ij}$

In order to do this the number of rows m in A must equal the number of components of p.
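As an aside not in the original notes, the two componentwise formulas in Definition 1 translate directly into code. The sketch below, in Python with NumPy (the names mat_times_col and row_times_mat are only illustrative), spells out both products and spot-checks them against NumPy's built-in product.

import numpy as np

def mat_times_col(A, x):
    # part a: (Ax)_i = sum over j of A_ij * x_j, i.e. each row of A times x
    m, n = A.shape
    return np.array([sum(A[i, j] * x[j] for j in range(n)) for i in range(m)])

def row_times_mat(p, A):
    # part b: (pA)_j = sum over i of p_i * A_ij, i.e. p times each column of A
    m, n = A.shape
    return np.array([sum(p[i] * A[i, j] for i in range(m)) for j in range(n)])

# spot-check against NumPy's own product on a random 3x2 matrix
A = np.random.rand(3, 2)
x = np.random.rand(2)
p = np.random.rand(3)
assert np.allclose(mat_times_col(A, x), A @ x)
assert np.allclose(row_times_mat(p, A), p @ A)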
Examples.

$Ax = \begin{pmatrix} 5 & 7 \\ 2 & 3 \\ 3 & 5 \end{pmatrix}\begin{pmatrix} 2 \\ 3 \end{pmatrix} = \begin{pmatrix} (5, 7)\begin{pmatrix} 2 \\ 3 \end{pmatrix} \\ (2, 3)\begin{pmatrix} 2 \\ 3 \end{pmatrix} \\ (3, 5)\begin{pmatrix} 2 \\ 3 \end{pmatrix} \end{pmatrix} = \begin{pmatrix} 10 + 21 \\ 4 + 9 \\ 6 + 15 \end{pmatrix} = \begin{pmatrix} 31 \\ 13 \\ 21 \end{pmatrix}$
For each row of A you go across that row of A and down x multiplying corresponding
components and adding to get the corresponding component of Ax.
5 7
5
7




pA = (2, 3, 5)  2 3  = ((2, 3, 5)  2  , (2, 3, 5)  3 )
3 5
3
5
= (10 + 6 + 15, 14 + 9 + 25) = (31, 48)
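The same two products can also be checked with NumPy's @ operator; this is only an illustrative sketch, not part of the original notes.

import numpy as np

A = np.array([[5, 7], [2, 3], [3, 5]])
x = np.array([2, 3])      # the column vector with components 2 and 3
p = np.array([2, 3, 5])   # the row vector (2, 3, 5)

print(A @ x)   # [31 13 21]
print(p @ A)   # [31 48]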
Example 1. In Example 1 in section 1.3 an electronics company made two types of circuit boards for computers, namely ethernet cards and sound cards. Each of these boards requires a certain number of resistors, capacitors and transistors, as follows.

                  resistors   capacitors   transistors
  ethernet card       5            2            3
  sound card          7            3            5
Let
e = # of ethernet cards the company makes in a certain day
s = # of sound cards the company makes in a certain day
r = # of resistors needed to produce the e ethernet cards and s sound cards
c = # of capacitors needed to produce the e ethernet cards and s sound cards
t = # of transistors needed to produce the e ethernet cards and s sound cards
$p_r$ = price of a resistor
$p_c$ = price of a capacitor
$p_t$ = price of a transistor
$p_e$ = cost of all the resistors, capacitors and transistors in an ethernet card
$p_s$ = cost of all the resistors, capacitors and transistors in a sound card
Then we had the following linear functions.

$r = 5e + 7s = (5, 7)\begin{pmatrix} e \\ s \end{pmatrix} = (5, 7)\,x$

$c = 2e + 3s = (2, 3)\begin{pmatrix} e \\ s \end{pmatrix} = (2, 3)\,x$

$t = 3e + 5s = (3, 5)\begin{pmatrix} e \\ s \end{pmatrix} = (3, 5)\,x$

$p_e = 5p_r + 2p_c + 3p_t = (p_r, p_c, p_t)\begin{pmatrix} 5 \\ 2 \\ 3 \end{pmatrix} = p\begin{pmatrix} 5 \\ 2 \\ 3 \end{pmatrix}$

$p_s = 7p_r + 3p_c + 5p_t = (p_r, p_c, p_t)\begin{pmatrix} 7 \\ 3 \\ 5 \end{pmatrix} = p\begin{pmatrix} 7 \\ 3 \\ 5 \end{pmatrix}$
where

$x = \begin{pmatrix} e \\ s \end{pmatrix} \qquad p = (p_r, p_c, p_t)$
If we group r, c and t into a column vector y then we have

$y = \begin{pmatrix} r \\ c \\ t \end{pmatrix} = \begin{pmatrix} 5e + 7s \\ 2e + 3s \\ 3e + 5s \end{pmatrix} = \begin{pmatrix} (5, 7)\begin{pmatrix} e \\ s \end{pmatrix} \\ (2, 3)\begin{pmatrix} e \\ s \end{pmatrix} \\ (3, 5)\begin{pmatrix} e \\ s \end{pmatrix} \end{pmatrix} = \begin{pmatrix} 5 & 7 \\ 2 & 3 \\ 3 & 5 \end{pmatrix}\begin{pmatrix} e \\ s \end{pmatrix} = Ax$

where

$A = \begin{pmatrix} 5 & 7 \\ 2 & 3 \\ 3 & 5 \end{pmatrix}$
The point is that a set of linear equations can be represented compactly by a single vector-matrix equation y = Ax. Similarly, if we group $p_e$ and $p_s$ into a vector q then one has
5
7
q = (pe, ps) = (5pr + 2pe + 3pt, 7pr + 3pe + 2pt) = ((pr, pc, pt)  2 , (pr, pc, pt)  3  )
3
2
5
7


= (pr, pc, pt)  2 3  = pA
3 5
Identity Matrices. There is a special group of matrices called the identity matrices. These are square matrices with the property that they have 1's on the main diagonal and 0's everywhere else. A square matrix is one where the number of rows and columns are equal. The main diagonal of a matrix A consists of those entries whose row and column subscripts are equal, i.e. the entries $A_{ii}$. The identity matrices are denoted by I. Here are some identity matrices.
$I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$ = the 2×2 identity matrix

$I = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$ = the 3×3 identity matrix

$I = \begin{pmatrix} 1 & 0 & 0 & \cdots & 0 \\ 0 & 1 & 0 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 1 \end{pmatrix}$ = the n×n identity matrix
Thus

$I_{ij} = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \neq j \end{cases}$
These are called the identity matrices because they act like the number 1 for matrix multiplication. If x is a column vector and p is a row vector then

Ix = x
pI = p

To see the first of these two relations, consider the ith component of Ix.

$(Ix)_i = \sum_{j=1}^{n} I_{ij} x_j$

Since $I_{ij} = 0$ unless j = i, one has $(Ix)_i = I_{ii} x_i = x_i$. So Ix = x.
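A quick numerical check of Ix = x and pI = p (an illustrative sketch, assuming NumPy):

import numpy as np

I = np.eye(3)              # the 3x3 identity matrix
x = np.array([4, -1, 7])   # any column vector with 3 components
p = np.array([2, 3, 5])    # any row vector with 3 components

print(I @ x)   # [ 4. -1.  7.]  same as x
print(p @ I)   # [2. 3. 5.]     same as p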
Another Way to Multiply a Vector by a Matrix. The following proposition gives
another way of viewing multiplication of a vector by a matrix.
Proposition 1.
a. Ax is the linear combination of the columns of A using the components of x as the coefficients, i.e.

$Ax = x_1 A_{\bullet,1} + x_2 A_{\bullet,2} + \cdots + x_n A_{\bullet,n} = \sum_{j=1}^{n} x_j A_{\bullet,j}$

b. pA is the linear combination of the rows of A using the components of p as the coefficients, i.e.

$pA = p_1 A_{1,\bullet} + p_2 A_{2,\bullet} + \cdots + p_m A_{m,\bullet} = \sum_{i=1}^{m} p_i A_{i,\bullet}$
Proof. To prove part a note that

$\left( \sum_{j=1}^{n} x_j A_{\bullet,j} \right)_i = \sum_{j=1}^{n} x_j (A_{\bullet,j})_i = \sum_{j=1}^{n} x_j A_{ij} = (Ax)_i$

The proof of part b is similar. //
Examples.

$Ax = \begin{pmatrix} 5 & 7 \\ 2 & 3 \\ 3 & 5 \end{pmatrix}\begin{pmatrix} 2 \\ 3 \end{pmatrix} = 2\begin{pmatrix} 5 \\ 2 \\ 3 \end{pmatrix} + 3\begin{pmatrix} 7 \\ 3 \\ 5 \end{pmatrix} = \begin{pmatrix} 10 \\ 4 \\ 6 \end{pmatrix} + \begin{pmatrix} 21 \\ 9 \\ 15 \end{pmatrix} = \begin{pmatrix} 31 \\ 13 \\ 21 \end{pmatrix}$

$pA = (2, 3, 5)\begin{pmatrix} 5 & 7 \\ 2 & 3 \\ 3 & 5 \end{pmatrix} = 2(5, 7) + 3(2, 3) + 5(3, 5) = (10, 14) + (6, 9) + (15, 25) = (31, 48)$
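The column and row views of Proposition 1 can be spelled out directly in code; the sketch below (assuming NumPy) reproduces the two computations above as linear combinations.

import numpy as np

A = np.array([[5, 7], [2, 3], [3, 5]])
x = np.array([2, 3])
p = np.array([2, 3, 5])

# Ax as a linear combination of the columns of A (part a)
Ax = sum(x[j] * A[:, j] for j in range(A.shape[1]))
# pA as a linear combination of the rows of A (part b)
pA = sum(p[i] * A[i, :] for i in range(A.shape[0]))

print(Ax)   # [31 13 21]
print(pA)   # [31 48]
print(np.array_equal(Ax, A @ x), np.array_equal(pA, p @ A))   # True True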
Algebraic Properties of Multiplication. The product of a vector and a matrix satisfies
many of the familiar algebraic properties of multiplication.
Proposition 2. If A and B are matrices, x and y are column vectors, p and q are row
vectors and a is a number then the following are true.
$(A + B)x = Ax + Bx$
$A(x + y) = Ax + Ay$
$p(A + B) = pA + pB$
$(p + q)A = pA + qA$
$A(ax) = a(Ax)$
$(aA)x = a(Ax)$
$(ap)A = a(pA)$
$p(aA) = a(pA)$
$(Ax)^T = x^T A^T$
$p(Ax) = (pA)x$
$(pA)^T = A^T p^T$
Proof. These are all pretty easy. We prove $(Ax)^T = x^T A^T$ and $p(Ax) = (pA)x$ as illustrations. To prove the first note that $((Ax)^T)_j = (Ax)_j = (A_{j,\bullet})x = x^T (A_{j,\bullet})^T = x^T ((A^T)_{\bullet,j}) = (x^T A^T)_j$. Note that we used the fact that the jth column of $A^T$ is the same as the transpose of the jth row of A. To prove $p(Ax) = (pA)x$ note that
$p(Ax) = \sum_{i=1}^{m} p_i (Ax)_i = \sum_{i=1}^{m} p_i \left( \sum_{j=1}^{n} A_{ij} x_j \right) = \sum_{i=1}^{m} \sum_{j=1}^{n} p_i A_{ij} x_j = \sum_{i=1}^{m} \sum_{j=1}^{n} b_{ij}$

$(pA)x = \sum_{j=1}^{n} (pA)_j x_j = \sum_{j=1}^{n} \left( \sum_{i=1}^{m} p_i A_{ij} \right) x_j = \sum_{j=1}^{n} \sum_{i=1}^{m} p_i A_{ij} x_j = \sum_{j=1}^{n} \sum_{i=1}^{m} b_{ij}$

where $b_{ij} = p_i A_{ij} x_j$. In general $\sum_{i=1}^{m} \sum_{j=1}^{n} b_{ij} = \sum_{j=1}^{n} \sum_{i=1}^{m} b_{ij}$ because in both cases one is summing $b_{ij}$ over all combinations of i and j where i runs from 1 to m and j runs from 1 to n. //
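As a final sanity check, one can verify p(Ax) = (pA)x and the transpose rule numerically for a randomly chosen A, x and p; this sketch (assuming NumPy) is illustrative only and not part of the proof.

import numpy as np

rng = np.random.default_rng(0)
A = rng.random((3, 2))   # a 3x2 matrix
x = rng.random(2)        # a column vector with 2 components
p = rng.random(3)        # a row vector with 3 components

# p(Ax) = (pA)x
assert np.isclose(p @ (A @ x), (p @ A) @ x)

# (Ax)^T = x^T A^T: for 1-D NumPy arrays transposition is implicit,
# so we simply compare A @ x with x @ A.T
assert np.allclose(A @ x, x @ A.T)

print("all checks passed")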