Matrix Theory, Math6304
Lecture Notes from November 20, 2012
taken by Andy Chang
Last Time:
• Example for induced matrix norms
• Matrix norms and spectral radius
5.3 Equivalence of norms and Gelfand formula
5.3.1 Theorem. Any two norms ‖·‖ and |||·||| on C^n are equivalent, that is, there exists c > 1
such that for every x ∈ C^n,
(1/c) ‖x‖ ≤ |||x||| ≤ c ‖x‖.
Proof. For later.
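To make the statement concrete, here is a small numerical sanity check (not from the lecture; it assumes NumPy) for one particular pair of norms on C^n: for the ∞-norm and the 1-norm the constant c = n works, since ‖x‖_∞ ≤ ‖x‖_1 ≤ n ‖x‖_∞ for every x.

    import numpy as np

    # Check that 1 <= ||x||_1 / ||x||_inf <= n on random complex vectors (n = 5).
    rng = np.random.default_rng(0)
    n = 5
    for _ in range(1000):
        x = rng.normal(size=n) + 1j * rng.normal(size=n)
        ratio = np.linalg.norm(x, 1) / np.linalg.norm(x, np.inf)
        assert 1.0 - 1e-12 <= ratio <= n + 1e-12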
5.3.2 Theorem (Gelfand formula). If |||·||| is any matrix norm on Mn and A ∈ Mn, then
ρ(A) = lim_{k→∞} |||A^k|||^{1/k}.
For the proof, we want to establish inequalities in both directions.
Proof. We have ρ(A)^k as the maximum modulus occurring amongst the eigenvalues raised to the
k-th power, and since the eigenvalues of A^k are exactly the k-th powers of the eigenvalues of A,
this is the same as taking the spectral radius of A^k; thus we obtain ρ(A)^k = ρ(A^k).
By the usual inequality between the spectral radius and a matrix norm, we get the inequality
ρ(A)^k = ρ(A^k) ≤ |||A^k|||,
so taking the k-th root,
ρ(A) ≤ |||A^k|||^{1/k}.
Conversely, we prove that for every ε > 0, there exists K ∈ N such that for all k ≥ K,
|||A^k|||^{1/k} ≤ ρ(A) + ε.
We recall that there is a matrix norm ‖·‖ on Mn with ‖A‖ ≤ ρ(A) + ε/2, and there is a c ≥ 1 such
that |||x||| ≤ c‖x‖ for all x ∈ Mn.
Utilizing the previous theorem and submultiplicativity, we obtain,
|||A^k||| ≤ c ‖A^k‖ ≤ c ‖A‖^k ≤ c (ρ(A) + ε/2)^k,
and taking the k-th root,
|||A^k|||^{1/k} ≤ c^{1/k} (ρ(A) + ε/2).
Note that as k → ∞, c^{1/k} → 1. Thus, choosing K sufficiently large so that
c^{1/k} (ρ(A) + ε/2) ≤ ρ(A) + ε for all k ≥ K gives that, for all k ≥ K,
|||A^k|||^{1/k} ≤ ρ(A) + ε.
The first statement recalled above says that there is a matrix norm whose value at A is within ε of
the spectral radius. The second statement, the Gelfand formula, says that no matter which matrix
norm we use, raising the matrix to the k-th power, applying the norm, and then taking the k-th root
yields an expression which, as k → ∞, converges to the spectral radius.
Thus, one can always find the largest eigenvalue modulus of a matrix by looking at its powers and
applying any matrix norm. This is useful when working with matrix norms whose explicit
expressions are difficult.
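As an illustration (not from the lecture; it assumes NumPy and an arbitrary example matrix), here is a small sketch of the Gelfand formula using the induced ∞-norm.

    import numpy as np

    # Arbitrary example matrix A in M_3.
    A = np.array([[0.5, 2.0, 0.0],
                  [0.0, 0.3, 1.0],
                  [0.1, 0.0, 0.4]])

    rho = np.max(np.abs(np.linalg.eigvals(A)))   # spectral radius of A
    for k in (1, 5, 20, 100, 500):
        Ak = np.linalg.matrix_power(A, k)
        # induced infinity-norm (maximum absolute row sum) of A^k, then the k-th root
        print(k, np.linalg.norm(Ak, np.inf) ** (1.0 / k))
    print("rho(A) =", rho)

The printed values approach ρ(A) as k grows; convergence can be slow, but the point is only that any matrix norm works.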
5.4 Eigenvalue Perturbation
Question: What can we say about the spectrum of a matrix A with A_{i,i} = 1 and "small"
off-diagonal entries?
Essentially, we have something that is close to the identity matrix, so we expect that all eigenvalues are close to 1.
Answer: Let A = I + B, with B_{j,j} = 0 for all j ∈ {1, 2, · · · , n}.
Assume we have an eigenvalue λ with Ax = λx; then
(I + B)x = λx ⇒ Bx = (λ − 1)x,
so λ − 1 is an eigenvalue of B and we have λ ∈ {z ∈ C : |z − 1| ≤ ρ(B)}.
From ρ(B) ≤ ‖B‖_{∞→∞} = max_j Σ_{l=1}^{n} |B_{j,l}|, in terms of A this reads
|λ − 1| ≤ max_j Σ_{l=1, l≠j}^{n} |A_{j,l}|.
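A quick numerical check of this bound (not from the notes; it assumes NumPy, and the matrix below is a made-up example with unit diagonal and small off-diagonal entries):

    import numpy as np

    # Made-up example: identity plus a small off-diagonal perturbation B.
    A = np.array([[1.00, 0.02, -0.01],
                  [0.03, 1.00, 0.01],
                  [-0.02, 0.01, 1.00]])

    B = A - np.diag(np.diag(A))                   # off-diagonal part of A
    bound = np.max(np.sum(np.abs(B), axis=1))     # max_j sum_{l != j} |A_{j,l}|

    for lam in np.linalg.eigvals(A):
        assert abs(lam - 1.0) <= bound + 1e-12    # |lambda - 1| <= bound
    print("every eigenvalue lies within", bound, "of 1")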
Later, we will see how to relate all eigenvalues and diagonal entries with a similar expression.
To prepare for this, we start with a qualitative result on eigenvalue perturbation.
5.4.3 Theorem. Let {A_k} be a sequence in Mn with A_k → A entry-wise for some A ∈ Mn, and let
λ_1(A_k), . . . , λ_n(A_k) denote the eigenvalues of A_k and λ_1(A), . . . , λ_n(A) those of A. Then
there exist permutations {π_k} on {1, 2, · · · , n} such that, for each j,
lim_{k→+∞} λ_{π_k(j)}(A_k) = λ_j(A).
Proof. Assume the contrary.
If the eigenvalues of a sequence {A_k}, where A_k → A, do not converge in this sense, then there
exist ε > 0, an eigenvalue λ of A of multiplicity m, and a subsequence {A_{k_i}} such that the
number of eigenvalues of A_{k_i} in D_ε(λ) = {z ∈ C : |z − λ| ≤ ε} is smaller than m.
Using Schur's theorem, we triangularize the A_{k_i}, so T_i = U_i^* A_{k_i} U_i, and choose T_i and
the columns of U_i such that the first n − m + 1 diagonal entries of T_i are outside of the disc D_ε(λ).
(Schur triangularization allows us to put the eigenvalues in any order.) By compactness of the
unitaries, we can choose a subsequence U_{i_l} → U with U unitary and
T_{i_l} = U_{i_l}^* A_{k_{i_l}} U_{i_l} −→ U^* A U = T,
with T upper triangular (because each T_{i_l} is upper triangular). The diagonal of T consists of the
eigenvalues of A, so λ must appear on it m times; but the first n − m + 1 diagonal entries of T are
limits of the corresponding entries of the T_{i_l}, hence satisfy |T_{j,j} − λ| ≥ ε, so at most m − 1
diagonal entries of T can equal λ. This contradiction proves the claim.
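A small numerical illustration (not from the notes; it assumes NumPy): perturb a fixed matrix by (1/k)·E and watch how far the eigenvalues of A_k = A + E/k can be from the nearest eigenvalue of A. Matching each eigenvalue to its nearest target is a crude stand-in for the permutations π_k in the theorem.

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.normal(size=(4, 4))       # arbitrary limit matrix
    E = rng.normal(size=(4, 4))       # arbitrary perturbation direction
    target = np.linalg.eigvals(A)

    for k in (1, 10, 100, 1000):
        eigs_k = np.linalg.eigvals(A + E / k)     # eigenvalues of A_k = A + E/k
        # largest distance from an eigenvalue of A_k to the nearest eigenvalue of A
        dist = max(min(abs(mu - lam) for lam in target) for mu in eigs_k)
        print(k, dist)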
Next, we arrive at a quantitative result on perturbations.
5.4.4 Theorem (Geršgorin). Let A = [A_{i,j}]_{i,j=1}^{n} ∈ Mn. For j ∈ {1, 2, · · · , n}, let
R_j(A) = Σ_{l=1, l≠j}^{n} |A_{j,l}|,   G_j(A) = {z ∈ C : |z − A_{j,j}| ≤ R_j(A)};
then if λ is an eigenvalue of A, we have
λ ∈ ⋃_{j=1}^{n} G_j(A).
Furthermore, if the union of k such discs G_j(A) is disjoint from the remaining discs, then it contains
exactly k eigenvalues (counted with multiplicity).
Proof. Take λ as an eigenvalue, Ax = λx, where x ≠ 0. Pick p such that |x_p| = ‖x‖_∞;
then
λ x_p = [Ax]_p = Σ_{j=1}^{n} A_{p,j} x_j
and
x_p (λ − A_{p,p}) = Σ_{j=1, j≠p}^{n} A_{p,j} x_j.
We estimate
|x_p| |λ − A_{p,p}| ≤ Σ_{j=1, j≠p}^{n} |A_{p,j}| |x_j| ≤ Σ_{j=1, j≠p}^{n} |A_{p,j}| |x_p| = R_p(A) |x_p|,
using |x_j| ≤ |x_p| for all j. Dividing by |x_p| > 0, we see that λ is in the disc centered at A_{p,p}
with radius R_p(A).
To prove the eigenvalue count, we use continuity.
Sketch:
Figure 1: (figure not reproduced)
Write A = D + B, with D diagonal, D_{j,j} = A_{j,j} for all j, and B = A − D. Set A_ε = D + εB,
for 0 ≤ ε ≤ 1. Note that R_j(A_ε) = ε R_j(A).
WLOG, assume that the first k discs G_1(A), G_2(A), · · · , G_k(A) form a connected region, disjoint
from the other discs. Then
H(ε) = ⋃_{j=1}^{k} G_j(A_ε) ⊂ H(1)
is also disjoint from G_j(A_ε), j > k. Furthermore, by continuity of eigenvalues, ε ↦ λ_j(A_ε) is
continuous, starting at λ_j(A_0) = A_{j,j}; for j ≤ k these paths start in H(1), and since every
eigenvalue of A_ε lies in the union of its Geršgorin discs while H(1) is disjoint from the remaining
discs, a continuous path starting in H(1) must remain in H(1). This implies that there are at least
k eigenvalues of A_1 ≡ A in H(1).
Moreover, by the same argument applied to the indices j > k and the fact that
(⋃_{j>k} G_j(A_ε)) ∩ H(1) = ∅,
we have that at most k eigenvalues of A are in H(1).
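A short numerical sketch of Geršgorin's theorem (not from the notes; it assumes NumPy, and the matrix below is an arbitrary example): compute the centers A_{j,j} and radii R_j(A) and check that every eigenvalue lies in the union of the discs.

    import numpy as np

    # Arbitrary example matrix.
    A = np.array([[4.0, 0.5, 0.2],
                  [0.1, -1.0, 0.3],
                  [0.2, 0.1, 2.0]])

    centers = np.diag(A)                                   # A_{j,j}
    radii = np.sum(np.abs(A - np.diag(centers)), axis=1)   # R_j(A)

    for lam in np.linalg.eigvals(A):
        # each eigenvalue must lie in at least one disc G_j(A)
        assert any(abs(lam - c) <= r + 1e-12 for c, r in zip(centers, radii))
    print("centers:", centers)
    print("radii:  ", radii)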