In mathematics, the spectral radius of a square matrix is the maximum of the absolute values of its eigenvalues.[1] More generally, the spectral radius of a bounded linear operator is the supremum of the absolute values of the elements of its spectrum. The spectral radius is often denoted by ρ(·).

## Definition

### Matrices

Let λ1, ..., λn be the eigenvalues of a matrix $\displaystyle{ A \in \mathbb{C}^{n \times n} }$. The spectral radius of $\displaystyle{ A }$ is defined as

$\displaystyle{ \rho(A) = \max \left \{ |\lambda_1|, \dotsc, |\lambda_n| \right \}. }$

The spectral radius can be thought of as an infimum of all norms of a matrix. Indeed, on the one hand, $\displaystyle{ \rho(A) \leqslant \|A\| }$ for every natural matrix norm $\displaystyle{ \|\cdot\| }$; and on the other hand, Gelfand's formula states that $\displaystyle{ \rho(A) = \lim_{k\to\infty} \|A^k\|^{1/k} }$. Both of these results are shown below.
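Both facts can be checked numerically. The following sketch (an illustration added here, using an ad-hoc upper-triangular sample matrix whose eigenvalues can be read off the diagonal) compares $\displaystyle{ \rho(A) }$ with three common matrix norms and with a truncated Gelfand limit:

```python
# Illustrative check: rho(A) <= ||A|| for several norms, and
# ||A^k||^(1/k) approaches rho(A) for large k (Gelfand's formula).
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])                    # triangular: eigenvalues 2 and 3

rho = max(abs(np.linalg.eigvals(A)))          # spectral radius = 3

norm_1 = np.linalg.norm(A, 1)                 # max absolute column sum
norm_inf = np.linalg.norm(A, np.inf)          # max absolute row sum
norm_2 = np.linalg.norm(A, 2)                 # largest singular value

# rho(A) is a lower bound for every one of these norms
assert all(rho <= n + 1e-12 for n in (norm_1, norm_inf, norm_2))

# truncated Gelfand limit at k = 60
gelfand = np.linalg.norm(np.linalg.matrix_power(A, 60), 2) ** (1.0 / 60)
```

Raising `k` further drives `gelfand` arbitrarily close to `rho`.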

However, the spectral radius does not necessarily satisfy $\displaystyle{ \|A\mathbf{v}\| \leqslant \rho(A) \|\mathbf{v}\| }$ for arbitrary vectors $\displaystyle{ \mathbf{v} \in \mathbb{C}^n }$. To see why, let $\displaystyle{ r \gt 1 }$ be arbitrary and consider the matrix

$\displaystyle{ C_r = \begin{pmatrix} 0 & r^{-1} \\ r & 0 \end{pmatrix} }$.

The characteristic polynomial of $\displaystyle{ C_r }$ is $\displaystyle{ \lambda^2 - 1 }$, so its eigenvalues are $\displaystyle{ \{-1, 1\} }$ and thus $\displaystyle{ \rho(C_r) = 1 }$. However, $\displaystyle{ C_r \mathbf{e}_1 = r \mathbf{e}_2 }$. As a result,

$\displaystyle{ \| C_r \mathbf{e}_1 \| = r \gt 1 = \rho(C_r) \|\mathbf{e}_1\|. }$

As an illustration of Gelfand's formula, note that $\displaystyle{ \|C_r^k\|^{1/k} \to 1 }$ as $\displaystyle{ k \to \infty }$, since $\displaystyle{ C_r^k = I }$ if $\displaystyle{ k }$ is even and $\displaystyle{ C_r^k = C_r }$ if $\displaystyle{ k }$ is odd.
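The $\displaystyle{ C_r }$ example above can be reproduced directly; this sketch (illustrative, with the arbitrary choice $\displaystyle{ r = 2 }$) confirms that $\displaystyle{ \rho(C_r) = 1 }$ while $\displaystyle{ C_r }$ stretches $\displaystyle{ \mathbf{e}_1 }$ by a factor of $\displaystyle{ r }$:

```python
# Illustrative check of the C_r counterexample with r = 2.
import numpy as np

r = 2.0
C = np.array([[0.0, 1.0 / r],
              [r,   0.0]])

rho = max(abs(np.linalg.eigvals(C)))          # eigenvalues are ±1, so rho = 1

e1 = np.array([1.0, 0.0])
stretched = np.linalg.norm(C @ e1)            # = r, which exceeds rho * ||e1|| = 1
```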

A special case in which $\displaystyle{ \|A\mathbf{v}\| \leqslant \rho(A) \|\mathbf{v}\| }$ for all $\displaystyle{ \mathbf{v} \in \mathbb{C}^n }$ is when $\displaystyle{ A }$ is a Hermitian matrix and $\displaystyle{ \|\cdot\| }$ is the Euclidean norm. This is because any Hermitian matrix is diagonalizable by a unitary matrix: we may write $\displaystyle{ A = U^* D U }$ with $\displaystyle{ U }$ unitary and $\displaystyle{ D }$ diagonal, carrying the (real) eigenvalues of $\displaystyle{ A }$ on its diagonal, and unitary matrices preserve vector length. As a result,

$\displaystyle{ \|A\mathbf{v}\| = \|U^*DU\mathbf{v}\| = \|DU\mathbf{v}\| \leqslant \rho(A) \|U\mathbf{v}\| = \rho(A) \|\mathbf{v}\| . }$
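This inequality can be tested empirically; the sketch below (illustrative, with a randomly generated Hermitian matrix and random test vectors) checks it for the Euclidean norm:

```python
# Illustrative check: for Hermitian A and the Euclidean norm,
# ||A v|| <= rho(A) ||v|| holds for every vector v.
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (B + B.conj().T) / 2                      # Hermitian by construction

rho = max(abs(np.linalg.eigvalsh(A)))         # eigvalsh: Hermitian eigensolver

ok = True
for _ in range(100):                          # random test vectors
    v = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    ok = ok and np.linalg.norm(A @ v) <= rho * np.linalg.norm(v) + 1e-9
```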

### Bounded linear operators

In the context of a bounded linear operator A on a Banach space, the eigenvalues need to be replaced with the elements of the spectrum of the operator, i.e. the values $\displaystyle{ \lambda }$ for which $\displaystyle{ A - \lambda I }$ is not bijective. We denote the spectrum by

$\displaystyle{ \sigma(A) = \left\{ \lambda \in \Complex: A - \lambda I \; \text{is not bijective} \right\} }$

The spectral radius is then defined as the supremum of the magnitudes of the elements of the spectrum:

$\displaystyle{ \rho(A) = \sup_{\lambda \in \sigma(A)} |\lambda| }$

Gelfand's formula, also known as the spectral radius formula, also holds for bounded linear operators: letting $\displaystyle{ \|\cdot\| }$ denote the operator norm, we have

$\displaystyle{ \rho(A) = \lim_{k \to \infty}\|A^k\|^{\frac{1}{k}}=\inf_{k\in\mathbb{N}^*} \|A^k\|^{\frac{1}{k}}. }$

A bounded operator (on a complex Hilbert space) is called a spectraloid operator if its spectral radius coincides with its numerical radius. An example of such an operator is a normal operator.
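For matrices, the numerical radius can be computed as $\displaystyle{ \max_\theta \lambda_{\max}\bigl((e^{i\theta}A + e^{-i\theta}A^*)/2\bigr) }$; the sketch below (illustrative, using a grid over $\displaystyle{ \theta }$ and an ad-hoc normal matrix) checks that the two radii coincide for a normal operator:

```python
# Illustrative sketch: estimate the numerical radius w(A) via the formula
# w(A) = max over theta of the largest eigenvalue of (e^{i theta} A + e^{-i theta} A*)/2,
# and compare it with the spectral radius for a normal (diagonal) matrix.
import numpy as np

A = np.diag([1.0 + 0j, -0.5 + 0.5j, 0.25j])   # diagonal, hence normal

rho = max(abs(np.diag(A)))                    # spectral radius = 1

thetas = np.linspace(0.0, 2.0 * np.pi, 720, endpoint=False)
w = max(
    np.linalg.eigvalsh((np.exp(1j * t) * A + np.exp(-1j * t) * A.conj().T) / 2).max()
    for t in thetas
)
# For a normal operator (spectraloid in particular), w(A) = rho(A).
```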

### Graphs

The spectral radius of a finite graph is defined to be the spectral radius of its adjacency matrix.

This definition extends to the case of infinite graphs with bounded degrees of vertices (i.e. there exists some real number C such that the degree of every vertex of the graph is smaller than C). In this case, for the graph G define:

$\displaystyle{ \ell^2(G) = \left \{ f : V(G) \to \mathbf{R} \ : \ \sum\nolimits_{v \in V(G)} \left |f(v) \right |^2 \lt \infty \right \}. }$

Let γ be the adjacency operator of G:

$\displaystyle{ \begin{cases} \gamma : \ell^2(G) \to \ell^2(G) \\ (\gamma f)(v) = \sum_{(u,v) \in E(G)} f(u) \end{cases} }$

The spectral radius of G is defined to be the spectral radius of the bounded linear operator γ.
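For a finite graph this reduces to an eigenvalue computation on the adjacency matrix; the sketch below (illustrative, using the cycle graph $\displaystyle{ C_8 }$ as an arbitrary example) recovers the known value 2 for a 2-regular graph:

```python
# Illustrative: spectral radius of a finite graph via its adjacency matrix.
# For any cycle graph (2-regular), the spectral radius is 2.
import numpy as np

n = 8
adj = np.zeros((n, n))
for i in range(n):                            # edges of the cycle i -- i+1 (mod n)
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1.0

rho_cycle = max(abs(np.linalg.eigvalsh(adj)))  # adjacency matrix is symmetric
```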

## Upper bounds

### Upper bounds on the spectral radius of a matrix

The following proposition gives simple yet useful upper bounds on the spectral radius of a matrix.

Proposition. Let $\displaystyle{ A \in \mathbb{C}^{n \times n} }$ with spectral radius ρ(A), and let $\displaystyle{ \|\cdot\| }$ be a consistent matrix norm. Then for each integer $\displaystyle{ k \geqslant 1 }$:

$\displaystyle{ \rho(A)\leq \|A^k\|^{\frac{1}{k}}. }$

Proof

Let (v, λ) be an eigenvector-eigenvalue pair for a matrix A. By the sub-multiplicativity of the matrix norm, we get:

$\displaystyle{ |\lambda|^k\|\mathbf{v}\| = \|\lambda^k \mathbf{v}\| = \|A^k \mathbf{v}\| \leq \|A^k\|\cdot\|\mathbf{v}\|. }$

Since v ≠ 0, we have

$\displaystyle{ |\lambda|^k \leq \|A^k\| }$

and therefore

$\displaystyle{ \rho(A)\leq \|A^k\|^{\frac{1}{k}}. }$

concluding the proof.
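The proposition is easy to test numerically; this sketch (illustrative, using the Frobenius norm, which is sub-multiplicative, and an ad-hoc matrix with $\displaystyle{ \rho(A) = 1 }$) checks the bound for a range of powers:

```python
# Illustrative check of the proposition: rho(A) <= ||A^k||^(1/k) for all k >= 1.
import numpy as np

A = np.array([[0.0, 2.0],
              [0.5, 0.0]])                    # eigenvalues ±1, so rho(A) = 1

rho = max(abs(np.linalg.eigvals(A)))

bounds = [
    np.linalg.norm(np.linalg.matrix_power(A, k), 'fro') ** (1.0 / k)
    for k in range(1, 30)
]
ok = all(rho <= b + 1e-12 for b in bounds)    # every power gives an upper bound
```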

### Upper bounds for spectral radius of a graph

There are many upper bounds for the spectral radius of a graph in terms of its number n of vertices and its number m of edges. For instance, if

$\displaystyle{ \frac{(k-2)(k-3)}{2} \leq m-n \leq \frac{k(k-3)}{2} }$

where $\displaystyle{ 3 \le k \le n }$ is an integer, then[2]

$\displaystyle{ \rho(G) \leq \sqrt{2 m-n-k+\frac{5}{2}+\sqrt{2 m-2 n+\frac{9}{4}}} }$
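As a worked instance of this bound (an illustration added here), take the complete graph $\displaystyle{ K_4 }$: then $\displaystyle{ n = 4 }$, $\displaystyle{ m = 6 }$, and $\displaystyle{ k = 4 }$ satisfies the hypothesis, with the bound evaluating to exactly $\displaystyle{ 3 = \rho(K_4) }$:

```python
# Illustrative check of the graph bound for K4 (n = 4 vertices, m = 6 edges).
import numpy as np

n, m, k = 4, 6, 4
assert (k - 2) * (k - 3) / 2 <= m - n <= k * (k - 3) / 2   # hypothesis holds

# bound = sqrt(2m - n - k + 5/2 + sqrt(2m - 2n + 9/4))
bound = np.sqrt(2 * m - n - k + 2.5 + np.sqrt(2 * m - 2 * n + 2.25))

adj = np.ones((n, n)) - np.eye(n)             # adjacency matrix of K4
rho = max(abs(np.linalg.eigvalsh(adj)))       # spectral radius of K4 is n - 1 = 3
```

Here the bound is tight: both sides equal 3.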

## Power sequence

The spectral radius is closely related to the convergence of the power sequence of a matrix, as the following theorem shows.

Theorem. Let $\displaystyle{ A \in \mathbb{C}^{n \times n} }$ with spectral radius ρ(A). Then ρ(A) < 1 if and only if

$\displaystyle{ \lim_{k \to \infty} A^k = 0. }$

On the other hand, if ρ(A) > 1, $\displaystyle{ \lim_{k \to \infty} \|A^k\| = \infty }$. The statement holds for any choice of matrix norm on Cn×n.
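Both behaviors are easy to observe numerically; the following sketch (illustrative, with two ad-hoc matrices on either side of the threshold) contrasts them:

```python
# Illustrative: powers of a matrix with rho(A) < 1 vanish, while powers of
# a matrix with rho(A) > 1 blow up in norm.
import numpy as np

A_small = np.array([[0.5, 1.0],
                    [0.0, 0.5]])              # rho = 0.5 < 1 (Jordan block)
A_big = np.array([[1.1, 0.0],
                  [0.0, 0.9]])                # rho = 1.1 > 1

decay = np.linalg.norm(np.linalg.matrix_power(A_small, 100))
growth = np.linalg.norm(np.linalg.matrix_power(A_big, 100))
```

Note that `A_small` is itself a Jordan block: its powers first grow (because of the off-diagonal 1) before the geometric factor $\displaystyle{ 0.5^k }$ takes over, matching the block-power formula in the proof below.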

Proof

Assume that $\displaystyle{ A^k }$ goes to zero as $\displaystyle{ k }$ goes to infinity. We will show that ρ(A) < 1. Let (v, λ) be an eigenvector-eigenvalue pair for A. Since Akv = λkv, we have

$\displaystyle{ \begin{align} 0 &= \left(\lim_{k \to \infty} A^k \right) \mathbf{v} \\ &= \lim_{k \to \infty} \left(A^k\mathbf{v} \right ) \\ &= \lim_{k \to \infty} \lambda^k\mathbf{v} \\ &= \mathbf{v} \lim_{k \to \infty} \lambda^k \end{align} }$

Since v ≠ 0 by hypothesis, we must have

$\displaystyle{ \lim_{k \to \infty}\lambda^k = 0, }$

which implies |λ| < 1. Since this must be true for any eigenvalue λ, we can conclude that ρ(A) < 1.

Now, assume that the spectral radius of A is less than 1. From the Jordan normal form theorem, we know that for all $\displaystyle{ A \in \mathbb{C}^{n \times n} }$, there exist $\displaystyle{ V, J \in \mathbb{C}^{n \times n} }$ with V non-singular and J block diagonal such that:

$\displaystyle{ A = VJV^{-1} }$

with

$\displaystyle{ J=\begin{bmatrix} J_{m_1}(\lambda_1) & 0 & 0 & \cdots & 0 \\ 0 & J_{m_2}(\lambda_2) & 0 & \cdots & 0 \\ \vdots & \cdots & \ddots & \cdots & \vdots \\ 0 & \cdots & 0 & J_{m_{s-1}}(\lambda_{s-1}) & 0 \\ 0 & \cdots & \cdots & 0 & J_{m_s}(\lambda_s) \end{bmatrix} }$

where

$\displaystyle{ J_{m_i}(\lambda_i)=\begin{bmatrix} \lambda_i & 1 & 0 & \cdots & 0 \\ 0 & \lambda_i & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_i & 1 \\ 0 & 0 & \cdots & 0 & \lambda_i \end{bmatrix}\in \mathbf{C}^{m_i \times m_i}, 1\leq i\leq s. }$

It is easy to see that

$\displaystyle{ A^k=VJ^kV^{-1} }$

and, since J is block-diagonal,

$\displaystyle{ J^k=\begin{bmatrix} J_{m_1}^k(\lambda_1) & 0 & 0 & \cdots & 0 \\ 0 & J_{m_2}^k(\lambda_2) & 0 & \cdots & 0 \\ \vdots & \cdots & \ddots & \cdots & \vdots \\ 0 & \cdots & 0 & J_{m_{s-1}}^k(\lambda_{s-1}) & 0 \\ 0 & \cdots & \cdots & 0 & J_{m_s}^k(\lambda_s) \end{bmatrix} }$

Now, a standard result on the k-power of an $\displaystyle{ m_i \times m_i }$ Jordan block states that, for $\displaystyle{ k \geq m_i-1 }$:

$\displaystyle{ J_{m_i}^k(\lambda_i)=\begin{bmatrix} \lambda_i^k & {k \choose 1}\lambda_i^{k-1} & {k \choose 2}\lambda_i^{k-2} & \cdots & {k \choose m_i-1}\lambda_i^{k-m_i+1} \\ 0 & \lambda_i^k & {k \choose 1}\lambda_i^{k-1} & \cdots & {k \choose m_i-2}\lambda_i^{k-m_i+2} \\ \vdots & \vdots & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_i^k & {k \choose 1}\lambda_i^{k-1} \\ 0 & 0 & \cdots & 0 & \lambda_i^k \end{bmatrix} }$

Thus, if $\displaystyle{ \rho(A) \lt 1 }$, then $\displaystyle{ |\lambda_i| \lt 1 }$ for all $\displaystyle{ i }$. Hence for all $\displaystyle{ i }$ we have:

$\displaystyle{ \lim_{k \to \infty}J_{m_i}^k=0 }$

which implies

$\displaystyle{ \lim_{k \to \infty} J^k = 0. }$

Therefore,

$\displaystyle{ \lim_{k \to \infty}A^k=\lim_{k \to \infty}VJ^kV^{-1}=V \left (\lim_{k \to \infty}J^k \right )V^{-1}=0 }$

On the other hand, if $\displaystyle{ \rho(A)\gt 1 }$, at least one entry of $\displaystyle{ J^k }$ does not remain bounded as k increases, thereby proving the second part of the statement.
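The Jordan-block power formula used in the proof can be checked numerically; this sketch (illustrative, with an ad-hoc $\displaystyle{ 3 \times 3 }$ block and arbitrary parameters $\displaystyle{ \lambda = 0.5 }$, $\displaystyle{ k = 7 }$) compares direct matrix powering against the binomial-coefficient formula:

```python
# Illustrative check of the Jordan-block power formula: the (i, j) entry of
# J^k is binomial(k, j - i) * lambda^(k - (j - i)) for j >= i, else 0.
import numpy as np
from math import comb

lam, m, k = 0.5, 3, 7
J = lam * np.eye(m) + np.diag(np.ones(m - 1), 1)   # m x m Jordan block

Jk = np.linalg.matrix_power(J, k)
formula = np.array([[comb(k, j - i) * lam ** (k - (j - i)) if j >= i else 0.0
                     for j in range(m)] for i in range(m)])

match = np.allclose(Jk, formula)
```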

## Gelfand's formula

Gelfand's formula, named after Israel Gelfand, gives the spectral radius as a limit of matrix norms.

### Theorem

For any matrix norm ||⋅||, we have[3]

$\displaystyle{ \rho(A)=\lim_{k \to \infty} \left \|A^k \right \|^{\frac{1}{k}} }$.

Moreover, in the case of a consistent matrix norm $\displaystyle{ \lim_{k \to \infty} \left \|A^k \right \|^{\frac{1}{k}} }$ approaches $\displaystyle{ \rho(A) }$ from above (indeed, in that case $\displaystyle{ \rho(A) \leq \left \|A^k \right \|^{\frac{1}{k}} }$ for all $\displaystyle{ k }$).

### Proof

For any ε > 0, let us define the following two matrices:

$\displaystyle{ A_{\pm}= \frac{1}{\rho(A) \pm\varepsilon}A. }$

Thus,

$\displaystyle{ \rho \left (A_{\pm} \right ) = \frac{\rho(A)}{\rho(A) \pm \varepsilon}, \qquad \rho (A_+) \lt 1 \lt \rho (A_-). }$

We start by applying the previous theorem on limits of power sequences to A+:

$\displaystyle{ \lim_{k \to \infty} A_+^k=0. }$

This shows the existence of $\displaystyle{ N_+ \in \mathbf{N} }$ such that, for all $\displaystyle{ k \geq N_+ }$,

$\displaystyle{ \left\|A_+^k \right \| \lt 1. }$

Therefore, since $\displaystyle{ \|A_+^k\| = \|A^k\| / (\rho(A)+\varepsilon)^k }$, for all $\displaystyle{ k \geq N_+ }$,

$\displaystyle{ \left \|A^k \right \|^{\frac{1}{k}} \lt \rho(A)+\varepsilon. }$

Similarly, the theorem on power sequences implies that $\displaystyle{ \|A_-^k\| }$ is not bounded and that there exists $\displaystyle{ N_- \in \mathbf{N} }$ such that, for all $\displaystyle{ k \geq N_- }$,

$\displaystyle{ \left\|A_-^k \right \| \gt 1. }$

Therefore,

$\displaystyle{ \left\|A^k \right\|^{\frac{1}{k}} \gt \rho(A)-\varepsilon. }$

Let $\displaystyle{ N = \max\{N_+, N_-\} }$. Then,

$\displaystyle{ \forall \varepsilon\gt 0\quad \exists N\in\mathbf{N} \quad \forall k\geq N \quad \rho(A)-\varepsilon \lt \left \|A^k \right \|^{\frac{1}{k}} \lt \rho(A)+\varepsilon, }$

that is,

$\displaystyle{ \lim_{k \to \infty} \left \|A^k \right \|^{\frac{1}{k}} = \rho(A). }$

This concludes the proof.

### Corollary

Gelfand's formula yields a bound on the spectral radius of a product of commuting matrices: if $\displaystyle{ A_1, \ldots, A_n }$ are matrices that all commute, then

$\displaystyle{ \rho(A_1 \cdots A_n) \leq \rho(A_1) \cdots \rho(A_n). }$
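A quick numerical check of the corollary (an illustration added here; powers of the same matrix always commute, so we take $\displaystyle{ A_1 = B }$ and $\displaystyle{ A_2 = B^2 }$ for an ad-hoc matrix $\displaystyle{ B }$):

```python
# Illustrative check: rho(A1 A2) <= rho(A1) rho(A2) for commuting A1, A2.
import numpy as np

B = np.array([[0.0, 1.0],
              [-2.0, 3.0]])                   # eigenvalues 1 and 2
A1, A2 = B, B @ B                             # A1 A2 = A2 A1

rho = lambda M: max(abs(np.linalg.eigvals(M)))

commute = np.allclose(A1 @ A2, A2 @ A1)
lhs = rho(A1 @ A2)                            # rho(B^3) = 8
rhs = rho(A1) * rho(A2)                       # rho(B) * rho(B^2) = 2 * 4 = 8
```

For powers of a single matrix the bound holds with equality; for general commuting pairs it can be strict.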

### Numerical example

Consider the matrix

$\displaystyle{ A=\begin{bmatrix} 9 & -1 & 2\\ -2 & 8 & 4\\ 1 & 1 & 8 \end{bmatrix} }$

whose eigenvalues are 5, 10, 10; by definition, ρ(A) = 10. The following table lists the values of $\displaystyle{ \|A^k\|^{\frac{1}{k}} }$ for the four most used norms against several increasing values of k (note that, due to the particular form of this matrix, $\displaystyle{ \|\cdot\|_1=\|\cdot\|_\infty }$, so these two columns coincide):

| $k$ | $\displaystyle{ \Vert\cdot\Vert_1 = \Vert\cdot\Vert_\infty }$ | $\displaystyle{ \Vert\cdot\Vert_F }$ | $\displaystyle{ \Vert\cdot\Vert_2 }$ |
|---|---|---|---|
| 1 | 14 | 15.362291496 | 10.681145748 |
| 2 | 12.649110641 | 12.328294348 | 10.595665162 |
| 3 | 11.934831919 | 11.532450664 | 10.500980846 |
| 4 | 11.501633169 | 11.151002986 | 10.418165779 |
| 5 | 11.216043151 | 10.921242235 | 10.351918183 |
| $\vdots$ | $\vdots$ | $\vdots$ | $\vdots$ |
| 10 | 10.604944422 | 10.455910430 | 10.183690042 |
| 11 | 10.548677680 | 10.413702213 | 10.166990229 |
| 12 | 10.501921835 | 10.378620930 | 10.153031596 |
| $\vdots$ | $\vdots$ | $\vdots$ | $\vdots$ |
| 20 | 10.298254399 | 10.225504447 | 10.091577411 |
| 30 | 10.197860892 | 10.149776921 | 10.060958900 |
| 40 | 10.148031640 | 10.112123681 | 10.045684426 |
| 50 | 10.118251035 | 10.089598820 | 10.036530875 |
| $\vdots$ | $\vdots$ | $\vdots$ | $\vdots$ |
| 100 | 10.058951752 | 10.044699508 | 10.018248786 |
| 200 | 10.029432562 | 10.022324834 | 10.009120234 |
| 300 | 10.019612095 | 10.014877690 | 10.006079232 |
| 400 | 10.014705469 | 10.011156194 | 10.004559078 |
| $\vdots$ | $\vdots$ | $\vdots$ | $\vdots$ |
| 1000 | 10.005879594 | 10.004460985 | 10.001823382 |
| 2000 | 10.002939365 | 10.002230244 | 10.000911649 |
| 3000 | 10.001959481 | 10.001486774 | 10.000607757 |
| $\vdots$ | $\vdots$ | $\vdots$ | $\vdots$ |
| 10000 | 10.000587804 | 10.000446009 | 10.000182323 |
| 20000 | 10.000293898 | 10.000223002 | 10.000091161 |
| 30000 | 10.000195931 | 10.000148667 | 10.000060774 |
| $\vdots$ | $\vdots$ | $\vdots$ | $\vdots$ |
| 100000 | 10.000058779 | 10.000044600 | 10.000018232 |
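Values like these can be reproduced with a few lines of numpy; in this sketch (an illustration added here, with a hypothetical helper `gelfand`) we recompute the k = 1 row:

```python
# Illustrative sketch: recompute ||A^k||^(1/k) for the example matrix above.
import numpy as np

A = np.array([[9.0, -1.0, 2.0],
              [-2.0, 8.0, 4.0],
              [1.0, 1.0, 8.0]])

rho = max(abs(np.linalg.eigvals(A)))          # eigenvalues 5, 10, 10 -> rho = 10

def gelfand(norm_ord, k):
    """||A^k||^(1/k) for the given numpy norm order ('fro', 1, 2, np.inf)."""
    return np.linalg.norm(np.linalg.matrix_power(A, k), norm_ord) ** (1.0 / k)

# first row of the table: k = 1
row_k1 = (gelfand(1, 1), gelfand('fro', 1), gelfand(2, 1))
```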

## Notes and references

1. Gradshteĭn, I. S. (1980). Table of integrals, series, and products. I. M. Ryzhik, Alan Jeffrey (Corr. and enl. ed.). New York: Academic Press. ISBN 0-12-294760-6. OCLC 5892996.
2. Guo, Ji-Ming; Wang, Zhi-Wen; Li, Xin (2019). "Sharp upper bounds of the spectral radius of a graph" (in en). Discrete Mathematics 342 (9): 2559–2563. doi:10.1016/j.disc.2019.05.017.
3. The formula holds for any Banach algebra; see Lemma IX.1.8 in Dunford & Schwartz 1963 and Lax 2002, pp. 195–197

## Bibliography

• Dunford, Nelson; Schwartz, Jacob (1963), Linear operators II. Spectral Theory: Self Adjoint Operators in Hilbert Space, Interscience Publishers, Inc.
• Lax, Peter D. (2002), Functional Analysis, Wiley-Interscience, ISBN 0-471-55604-1