Spectral radius


In mathematics, the spectral radius of a square matrix is the maximum of the absolute values of its eigenvalues.[1] More generally, the spectral radius of a bounded linear operator is the supremum of the absolute values of the elements of its spectrum. The spectral radius is often denoted by ρ(·).

Definition

Matrices

Let λ1, …, λn be the eigenvalues of a matrix A ∈ ℂ^{n×n}. The spectral radius of A is defined as

ρ(A) = max{|λ1|, …, |λn|}.

The spectral radius can be thought of as an infimum of all norms of a matrix. Indeed, on the one hand, ρ(A) ≤ ‖A‖ for every natural matrix norm ‖·‖; and on the other hand, Gelfand's formula states that ρ(A) = lim_{k→∞} ‖A^k‖^{1/k}. Both of these results are shown below.

However, the spectral radius does not necessarily satisfy ‖A𝐯‖ ≤ ρ(A)‖𝐯‖ for arbitrary vectors 𝐯 ∈ ℂ^n. To see why, let r > 1 be arbitrary and consider the matrix

Cr = ( 0     r^{-1}
       r     0      ).

The characteristic polynomial of Cr is λ^2 − 1, so its eigenvalues are {−1, 1} and thus ρ(Cr) = 1. However, Cr 𝐞1 = r𝐞2. As a result,

‖Cr 𝐞1‖ = r > 1 = ρ(Cr)‖𝐞1‖.

As an illustration of Gelfand's formula, note that ‖Cr^k‖^{1/k} → 1 as k → ∞, since Cr^k = I if k is even and Cr^k = Cr if k is odd.
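This example is easy to verify numerically; a sketch using NumPy, with r chosen arbitrarily as 2:

```python
import numpy as np

r = 2.0
Cr = np.array([[0.0, 1.0 / r],
               [r,   0.0]])

# The eigenvalues are +1 and -1, so the spectral radius is 1.
rho = max(abs(np.linalg.eigvals(Cr)))

# Yet Cr stretches the basis vector e1 by a factor r > 1.
e1 = np.array([1.0, 0.0])
stretch = np.linalg.norm(Cr @ e1)

# Gelfand's formula: ||Cr^k||^(1/k) -> 1, since Cr^2 = I.
gelfand_50 = np.linalg.norm(np.linalg.matrix_power(Cr, 50), 2) ** (1 / 50)
```

Here `np.linalg.norm(·, 2)` is the spectral norm (largest singular value); since Cr^50 = I, the Gelfand estimate at k = 50 is exactly 1.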

A special case in which ‖A𝐯‖ ≤ ρ(A)‖𝐯‖ for all 𝐯 ∈ ℂ^n is when A is a Hermitian matrix and ‖·‖ is the Euclidean norm. This is because any Hermitian matrix is diagonalizable by a unitary matrix as A = U*DU with D diagonal, and unitary matrices preserve vector length. As a result,

‖A𝐯‖ = ‖U*DU𝐯‖ = ‖DU𝐯‖ ≤ ρ(A)‖U𝐯‖ = ρ(A)‖𝐯‖.
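The bound can be observed numerically; a sketch with an arbitrarily chosen real symmetric (hence Hermitian) matrix and random test vectors:

```python
import numpy as np

# An arbitrary symmetric matrix; its eigenvalues are (5 ± sqrt(5)) / 2.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

rho = max(abs(np.linalg.eigvals(A)))

# For a Hermitian matrix, ||A v|| <= rho(A) ||v|| in the Euclidean norm.
rng = np.random.default_rng(0)
ok = all(
    np.linalg.norm(A @ v) <= rho * np.linalg.norm(v) + 1e-12
    for v in rng.standard_normal((100, 2))
)
```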

Bounded linear operators

In the context of a bounded linear operator A on a Banach space, the eigenvalues need to be replaced with the elements of the spectrum of the operator, i.e. the values λ for which A − λI is not bijective. We denote the spectrum by

σ(A) = {λ ∈ ℂ : A − λI is not bijective}.

The spectral radius is then defined as the supremum of the magnitudes of the elements of the spectrum:

ρ(A) = sup_{λ∈σ(A)} |λ|.

Gelfand's formula, also known as the spectral radius formula, also holds for bounded linear operators: letting ‖·‖ denote the operator norm, we have

ρ(A) = lim_{k→∞} ‖A^k‖^{1/k} = inf_{k ≥ 1} ‖A^k‖^{1/k}.

A bounded operator (on a complex Hilbert space) is called a spectraloid operator if its spectral radius coincides with its numerical radius. An example of such an operator is a normal operator.

Graphs

The spectral radius of a finite graph is defined to be the spectral radius of its adjacency matrix.

This definition extends to the case of infinite graphs with bounded degrees of vertices (i.e. there exists some real number C such that the degree of every vertex of the graph is smaller than C). In this case, for the graph G define:

ℓ^2(G) = { f : V(G) → ℝ : Σ_{v∈V(G)} f(v)^2 < ∞ }.

Let γ be the adjacency operator of G:

γ : ℓ^2(G) → ℓ^2(G),   (γf)(v) = Σ_{(u,v)∈E(G)} f(u).

The spectral radius of G is defined to be the spectral radius of the bounded linear operator γ.
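For a finite graph this reduces to an eigenvalue computation on the adjacency matrix; a sketch using the 4-cycle, a 2-regular graph whose spectral radius equals its degree:

```python
import numpy as np

# Adjacency matrix of the cycle graph C4 (vertices 0-1-2-3-0).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

# Eigenvalues of C4 are 2, 0, 0, -2, so the spectral radius is 2,
# the common vertex degree of this regular graph.
rho = max(abs(np.linalg.eigvals(A)))
```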

Upper bounds

Upper bounds on the spectral radius of a matrix

The following proposition gives simple yet useful upper bounds on the spectral radius of a matrix.

Proposition. Let A ∈ ℂ^{n×n} with spectral radius ρ(A), and let ‖·‖ be a sub-multiplicative matrix norm. Then for each integer k ≥ 1:

ρ(A) ≤ ‖A^k‖^{1/k}.

Proof

Let (v, λ) be an eigenvector-eigenvalue pair for a matrix A. By the sub-multiplicativity of the matrix norm, we get:

|λ|^k ‖𝐯‖ = ‖λ^k 𝐯‖ = ‖A^k 𝐯‖ ≤ ‖A^k‖ · ‖𝐯‖.

Since v ≠ 0, we have

|λ|^k ≤ ‖A^k‖

and therefore

ρ(A) ≤ ‖A^k‖^{1/k},

concluding the proof.
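The proposition is easy to check numerically; a sketch using the Frobenius norm, which is sub-multiplicative:

```python
import numpy as np

# An arbitrary triangular matrix with eigenvalues 1 and 3, so rho(A) = 3.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
rho = max(abs(np.linalg.eigvals(A)))

# ||A^k||^(1/k) in the Frobenius norm is an upper bound on rho(A) for every k.
bounds = [np.linalg.norm(np.linalg.matrix_power(A, k), 'fro') ** (1 / k)
          for k in range(1, 20)]
all_above = all(b >= rho - 1e-9 for b in bounds)
```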

Upper bounds for spectral radius of a graph

There are many upper bounds for the spectral radius of a graph in terms of its number n of vertices and its number m of edges. For instance, if

(k − 2)(k − 3)/2 ≤ m − n ≤ k(k − 3)/2

where 3 ≤ k ≤ n is an integer, then[2]

ρ(G) ≤ √( 2m − n − k + 5/2 + √(2m − 2n + 9/4) ).
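This bound is attained for complete graphs; a sketch for K4 (n = 4, m = 6, spectral radius 3), taking k = 4 so that the hypothesis 1 ≤ m − n ≤ 2 holds:

```python
import math
import numpy as np

# Complete graph K4: n = 4 vertices, m = 6 edges, spectral radius n - 1 = 3.
n, m, k = 4, 6, 4
A = np.ones((n, n)) - np.eye(n)          # adjacency matrix of K4
rho = max(abs(np.linalg.eigvals(A)))     # eigenvalues are 3, -1, -1, -1

# The condition (k-2)(k-3)/2 <= m - n <= k(k-3)/2 reads 1 <= 2 <= 2 here.
bound = math.sqrt(2 * m - n - k + 5 / 2 + math.sqrt(2 * m - 2 * n + 9 / 4))
```

For these values the bound evaluates to exactly 3, matching ρ(K4).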

Symmetric matrices

In particular, for real-valued matrices A the inequality ρ(A) ≤ ‖A‖2 holds, where ‖·‖2 denotes the spectral norm. In the case where A is symmetric, this inequality is tight:

Theorem. Let A ∈ ℝ^{n×n} be symmetric, i.e., A = A^T. Then it holds that ρ(A) = ‖A‖2.

Proof

Let (v_i, λ_i)_{i=1}^n be the eigenpairs of A. Due to the symmetry of A, all λ_i are real and the eigenvectors v_i can be chosen real and orthonormal. By the definition of the spectral norm, there exists an x ∈ ℝ^n with ‖x‖2 = 1 such that ‖A‖2 = ‖Ax‖2. Since the eigenvectors v_i form a basis of ℝ^n, there exist coefficients δ1, …, δn ∈ ℝ such that x = Σ_{i=1}^n δ_i v_i, which implies that

Ax = Σ_{i=1}^n δ_i A v_i = Σ_{i=1}^n δ_i λ_i v_i.

From the orthonormality of the eigenvectors vi it follows that

‖Ax‖2^2 = ‖Σ_{i=1}^n δ_i λ_i v_i‖2^2 = Σ_{i=1}^n δ_i^2 λ_i^2 ‖v_i‖2^2 = Σ_{i=1}^n δ_i^2 λ_i^2

and

‖x‖2^2 = ‖Σ_{i=1}^n δ_i v_i‖2^2 = Σ_{i=1}^n δ_i^2 ‖v_i‖2^2 = Σ_{i=1}^n δ_i^2.

Since x is chosen to maximize ‖Ax‖2 subject to ‖x‖2 = 1, the coefficients δ_i must maximize Σ_{i=1}^n δ_i^2 λ_i^2 subject to Σ_{i=1}^n δ_i^2 = 1. This is achieved by setting δ_k = 1 for k = argmax_i |λ_i| and δ_i = 0 otherwise, yielding ‖Ax‖2 = |λ_k| = ρ(A).
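The equality can be observed numerically; a sketch with an arbitrarily chosen symmetric tridiagonal matrix, where `np.linalg.norm(A, 2)` computes the spectral norm:

```python
import numpy as np

# Symmetric tridiagonal matrix with eigenvalues 2 - sqrt(2), 2, 2 + sqrt(2).
A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

rho = max(abs(np.linalg.eigvals(A)))  # spectral radius
spec_norm = np.linalg.norm(A, 2)      # spectral norm (largest singular value)
```

For a symmetric matrix the two coincide, here both equal to 2 + √2.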

Power sequence

The spectral radius is closely related to the convergence behavior of the power sequence of a matrix, as shown by the following theorem.

Theorem. Let A ∈ ℂ^{n×n} with spectral radius ρ(A). Then ρ(A) < 1 if and only if

lim_{k→∞} A^k = 0.

On the other hand, if ρ(A) > 1, then lim_{k→∞} ‖A^k‖ = ∞. The statement holds for any choice of matrix norm on ℂ^{n×n}.
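A small numerical illustration of the theorem; a sketch with a non-symmetric matrix whose spectral radius is 0.9 < 1, so its powers vanish:

```python
import numpy as np

# Triangular matrix with eigenvalues 0.5 and 0.9, so rho(A) = 0.9 < 1.
A = np.array([[0.5, 1.0],
              [0.0, 0.9]])
rho = max(abs(np.linalg.eigvals(A)))

# The Frobenius norms of A^k decay to zero as k grows.
norms = [np.linalg.norm(np.linalg.matrix_power(A, k)) for k in (1, 50, 200)]
```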

Proof

Assume that A^k goes to zero as k goes to infinity. We will show that ρ(A) < 1. Let (𝐯, λ) be an eigenvector-eigenvalue pair for A. Since A^k 𝐯 = λ^k 𝐯, we have

0 = (lim_{k→∞} A^k)𝐯 = lim_{k→∞} (A^k 𝐯) = lim_{k→∞} λ^k 𝐯 = 𝐯 lim_{k→∞} λ^k

Since v ≠ 0 by hypothesis, we must have

lim_{k→∞} λ^k = 0,

which implies |λ|<1. Since this must be true for any eigenvalue λ, we can conclude that ρ(A) < 1.

Now, assume the spectral radius of A is less than 1. From the Jordan normal form theorem, we know that for all A ∈ ℂ^{n×n}, there exist V, J ∈ ℂ^{n×n} with V non-singular and J block diagonal such that:

A = V J V^{-1}

with

J = ( J_{m1}(λ1)       0           ⋯        0
          0        J_{m2}(λ2)      ⋯        0
          ⋮             ⋮           ⋱        ⋮
          0             0           ⋯    J_{ms}(λs) )

where

J_{mi}(λi) = ( λi   1    0   ⋯   0
                0   λi   1   ⋯   0
                ⋮         ⋱   ⋱   ⋮
                0   0    ⋯   λi  1
                0   0    ⋯   0   λi ) ∈ ℂ^{mi×mi},   1 ≤ i ≤ s.

It is easy to see that

A^k = V J^k V^{-1}

and, since J is block-diagonal,

J^k = ( J_{m1}^k(λ1)       0            ⋯        0
            0         J_{m2}^k(λ2)      ⋯        0
            ⋮              ⋮             ⋱        ⋮
            0              0             ⋯    J_{ms}^k(λs) )

Now, a standard result on the k-th power of an mi × mi Jordan block states that, for k ≥ mi − 1:

J_{mi}^k(λi) = ( λi^k   C(k,1)λi^{k−1}   C(k,2)λi^{k−2}   ⋯   C(k,mi−1)λi^{k−mi+1}
                  0     λi^k             C(k,1)λi^{k−1}   ⋯   C(k,mi−2)λi^{k−mi+2}
                  ⋮                       ⋱                ⋱   ⋮
                  0     0                ⋯               λi^k  C(k,1)λi^{k−1}
                  0     0                ⋯                0    λi^k )

where C(k,j) denotes the binomial coefficient.

Thus, if ρ(A) < 1 then |λi| < 1 for all i. Hence for all i we have:

lim_{k→∞} J_{mi}^k = 0,

which implies

lim_{k→∞} J^k = 0.

Therefore,

lim_{k→∞} A^k = lim_{k→∞} V J^k V^{-1} = V (lim_{k→∞} J^k) V^{-1} = 0.

On the other hand, if ρ(A) > 1, at least one entry of J^k does not remain bounded as k increases (the diagonal entry λi^k with |λi| > 1 already grows without bound), thereby proving the second part of the statement.

Gelfand's formula

Gelfand's formula, named after Israel Gelfand, gives the spectral radius as a limit of matrix norms.

Theorem

For any matrix norm ‖·‖, we have[3]

ρ(A) = lim_{k→∞} ‖A^k‖^{1/k}.

Moreover, in the case of a consistent matrix norm, ‖A^k‖^{1/k} approaches ρ(A) from above (indeed, in that case ρ(A) ≤ ‖A^k‖^{1/k} for all k).

Proof

For any ε > 0, let us define the following two matrices:

A± = (1 / (ρ(A) ± ε)) A.

Thus,

ρ(A±) = ρ(A) / (ρ(A) ± ε),   so that   ρ(A+) < 1 < ρ(A−).

We start by applying the previous theorem on limits of power sequences to A+:

lim_{k→∞} A+^k = 0.

This shows the existence of N+ ∈ ℕ such that, for all k ≥ N+,

‖A+^k‖ < 1.

Therefore,

‖A^k‖^{1/k} < ρ(A) + ε.

Similarly, the theorem on power sequences implies that ‖A−^k‖ is not bounded and that there exists N− ∈ ℕ such that, for all k ≥ N−,

‖A−^k‖ > 1.

Therefore,

‖A^k‖^{1/k} > ρ(A) − ε.

Let N = max{N+, N−}. Then,

∀ε > 0, ∃N ∈ ℕ such that ∀k ≥ N:   ρ(A) − ε < ‖A^k‖^{1/k} < ρ(A) + ε,

that is,

lim_{k→∞} ‖A^k‖^{1/k} = ρ(A).

This concludes the proof.

Corollary

Gelfand's formula yields a bound on the spectral radius of a product of commuting matrices: if A1, …, An are matrices that all commute, then

ρ(A1 ⋯ An) ≤ ρ(A1) ⋯ ρ(An).
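A quick numerical sketch: powers of a matrix always commute with it, so the corollary applies to them; a pair of nilpotent matrices that do not commute shows the commutativity hypothesis cannot be dropped.

```python
import numpy as np

def rho(M):
    """Spectral radius: largest eigenvalue magnitude."""
    return max(abs(np.linalg.eigvals(M)))

# A and B = A^2 commute, so rho(AB) <= rho(A) * rho(B) must hold.
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])
B = A @ A
commuting_ok = rho(A @ B) <= rho(A) * rho(B) + 1e-9  # here 8 <= 2 * 4

# Without commutativity the inequality can fail badly:
# both factors are nilpotent (spectral radius 0), yet the product is not.
N1 = np.array([[0.0, 2.0],
               [0.0, 0.0]])
N2 = N1.T
violation = rho(N1 @ N2) > rho(N1) * rho(N2)  # 4 > 0
```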

Numerical example

(Figure: the convergence of all 3 matrix norms to the spectral radius.)

Consider the matrix

A = (  9   −1    2
      −2    8    4
       1    1    8 )

whose eigenvalues are 5, 10, 10; by definition, ρ(A) = 10. The values of ‖A^k‖^{1/k} for the four most used norms all approach 10 from above as k increases (note that, due to the particular form of this matrix, ‖·‖1 = ‖·‖∞).
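These values can be reproduced with a short script; a sketch using NumPy's 1-, 2-, ∞-, and Frobenius norms:

```python
import numpy as np

A = np.array([[ 9.0, -1.0, 2.0],
              [-2.0,  8.0, 4.0],
              [ 1.0,  1.0, 8.0]])
rho = max(abs(np.linalg.eigvals(A)))  # 10

# ||A^k||^(1/k) for each norm at increasing k; every sequence tends to rho(A).
gelfand = {}
for name, ord_ in [("1-norm", 1), ("2-norm", 2),
                   ("inf-norm", np.inf), ("Frobenius", "fro")]:
    gelfand[name] = [np.linalg.norm(np.linalg.matrix_power(A, k), ord_) ** (1 / k)
                     for k in (1, 10, 100)]
```

At k = 100 all four estimates are already within a few percent of ρ(A) = 10, and the 1-norm and ∞-norm sequences coincide for this matrix.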

Notes and references

  1. Gradshteĭn, I. S. (1980). Table of integrals, series, and products. I. M. Ryzhik, Alan Jeffrey (Corr. and enl. ed.). New York: Academic Press. ISBN 0-12-294760-6. OCLC 5892996. 
  2. Guo, Ji-Ming; Wang, Zhi-Wen; Li, Xin (2019). "Sharp upper bounds of the spectral radius of a graph" (in en). Discrete Mathematics 342 (9): 2559–2563. doi:10.1016/j.disc.2019.05.017. 
  3. The formula holds for any Banach algebra; see Lemma IX.1.8 in Dunford & Schwartz 1963 and Lax 2002, pp. 195–197

Bibliography

  • Dunford, Nelson; Schwartz, Jacob (1963), Linear operators II. Spectral Theory: Self Adjoint Operators in Hilbert Space, Interscience Publishers, Inc. 
  • Lax, Peter D. (2002), Functional Analysis, Wiley-Interscience, ISBN 0-471-55604-1 
