Shift matrix

In mathematics, a shift matrix is a binary matrix with ones only on the superdiagonal or subdiagonal and zeroes elsewhere. A shift matrix U with ones on the superdiagonal is an upper shift matrix; its counterpart L, with ones on the subdiagonal, is a lower shift matrix. The (i, j)-th components of U and L are

[math]\displaystyle{ U_{ij} = \delta_{i+1,j}, \quad L_{ij} = \delta_{i,j+1}, }[/math]

where [math]\displaystyle{ \delta_{ij} }[/math] is the Kronecker delta symbol.

For example, the 5 × 5 shift matrices are

[math]\displaystyle{ U_5 = \begin{pmatrix} 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 & 0 \end{pmatrix} \quad L_5 = \begin{pmatrix} 0 & 0 & 0 & 0 & 0 \\ 1 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 \end{pmatrix}. }[/math]
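
For illustration, a minimal NumPy sketch (the use of numpy.eye with a diagonal offset is an implementation choice, not part of the definition) constructs these two matrices directly:

  import numpy as np

  n = 5
  U = np.eye(n, k=1)   # ones on the superdiagonal: U[i, j] = 1 iff j = i + 1
  L = np.eye(n, k=-1)  # ones on the subdiagonal:   L[i, j] = 1 iff i = j + 1

  print(U)                       # the 5 x 5 upper shift matrix
  print(L)                       # the 5 x 5 lower shift matrix
  print(np.array_equal(L, U.T))  # True: L is the transpose of U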

Clearly, the transpose of a lower shift matrix is an upper shift matrix and vice versa.

As a linear transformation, a lower shift matrix shifts the components of a column vector one position down, with a zero appearing in the first position. An upper shift matrix shifts the components of a column vector one position up, with a zero appearing in the last position.[1]

Premultiplying a matrix A by a lower shift matrix shifts the elements of A downward by one position, with zeroes appearing in the top row. Postmultiplying A by a lower shift matrix shifts its elements left by one position, with zeroes appearing in the rightmost column. The corresponding operations with an upper shift matrix produce the opposite shifts: up for premultiplication and right for postmultiplication.
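
These shift actions can be checked with a short NumPy sketch (an illustrative example; the vector x and matrix A below are arbitrary):

  import numpy as np

  n = 4
  U = np.eye(n, k=1)
  L = np.eye(n, k=-1)

  x = np.array([1.0, 2.0, 3.0, 4.0])
  print(L @ x)   # [0. 1. 2. 3.]  components shifted one position down
  print(U @ x)   # [2. 3. 4. 0.]  components shifted one position up

  A = np.arange(1.0, 17.0).reshape(n, n)
  print(L @ A)   # rows of A shifted down by one, zero row on top
  print(A @ L)   # columns of A shifted left by one, zero column on the right
  print(U @ A)   # rows of A shifted up by one, zero row at the bottom
  print(A @ U)   # columns of A shifted right by one, zero column on the left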

Clearly all finite-dimensional shift matrices are nilpotent; an n × n shift matrix S becomes the zero matrix when raised to the power of its dimension n.
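
A quick nilpotency check (NumPy sketch, using numpy.linalg.matrix_power):

  import numpy as np

  n = 5
  U = np.eye(n, k=1)
  print(np.linalg.matrix_power(U, n - 1))  # nonzero: a single 1 in the top-right corner
  print(np.linalg.matrix_power(U, n))      # the zero matrix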

Shift matrices act on shift spaces. The infinite-dimensional shift matrices are particularly important for the study of ergodic systems. Important examples of infinite-dimensional shifts are the Bernoulli shift, which acts as a shift on Cantor space, and the Gauss map, which acts as a shift on the space of continued fractions (that is, on Baire space).

Properties

Let L and U be the n × n lower and upper shift matrices, respectively. The following properties hold for both U and L; we therefore list them only for U:

  • [math]\displaystyle{ U^n = 0 }[/math] (nilpotency of index n);
  • [math]\displaystyle{ \operatorname{rank}(U) = n - 1 }[/math];
  • [math]\displaystyle{ \operatorname{tr}(U) = 0 }[/math] and [math]\displaystyle{ \det(U) = 0 }[/math].

The following properties show how U and L are related:

  • [math]\displaystyle{ L^\mathsf{T} = U; \quad U^\mathsf{T} = L. }[/math]
  • The null spaces of U and L are
    [math]\displaystyle{ N(U) = \operatorname{span}\left\{ (1, 0, \ldots, 0)^\mathsf{T} \right\}, }[/math]
    [math]\displaystyle{ N(L) = \operatorname{span}\left\{ (0, \ldots, 0, 1)^\mathsf{T} \right\}. }[/math]
  • The spectrum of U and L is [math]\displaystyle{ \{0\} }[/math]. The algebraic multiplicity of 0 is n, and its geometric multiplicity is 1. From the expressions for the null spaces, it follows that (up to a scaling) the only eigenvector for U is [math]\displaystyle{ (1, 0, \ldots, 0)^\mathsf{T} }[/math], and the only eigenvector for L is [math]\displaystyle{ (0, \ldots, 0, 1)^\mathsf{T} }[/math].
  • For LU and UL we have
    [math]\displaystyle{ UL = I - \operatorname{diag}(0, \ldots, 0, 1), }[/math]
    [math]\displaystyle{ LU = I - \operatorname{diag}(1, 0, \ldots, 0). }[/math]
    These matrices are both idempotent and symmetric, and they have the same rank as U and L.
  • [math]\displaystyle{ L^{n-a} U^{n-a} + U^a L^a = U^{n-a} L^{n-a} + L^a U^a = I }[/math] (the identity matrix), for any integer a between 0 and n inclusive; see the numerical check after this list.
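
These relations can be verified numerically; the following NumPy sketch checks the expressions for UL and LU and the partition-of-identity property for every admissible a:

  import numpy as np
  from numpy.linalg import matrix_power as mpow

  n = 5
  U = np.eye(n, k=1)
  L = np.eye(n, k=-1)
  I = np.eye(n)

  # UL and LU are symmetric, idempotent diagonal projections of rank n - 1
  assert np.array_equal(U @ L, I - np.diag([0.0] * (n - 1) + [1.0]))
  assert np.array_equal(L @ U, I - np.diag([1.0] + [0.0] * (n - 1)))

  # L^(n-a) U^(n-a) + U^a L^a = U^(n-a) L^(n-a) + L^a U^a = I for 0 <= a <= n
  for a in range(n + 1):
      assert np.array_equal(mpow(L, n - a) @ mpow(U, n - a) + mpow(U, a) @ mpow(L, a), I)
      assert np.array_equal(mpow(U, n - a) @ mpow(L, n - a) + mpow(L, a) @ mpow(U, a), I)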

If N is any nilpotent matrix, then N is similar to a block diagonal matrix of the form

[math]\displaystyle{ \begin{pmatrix} S_1 & 0 & \ldots & 0 \\ 0 & S_2 & \ldots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \ldots & S_r \end{pmatrix} }[/math]

where each of the blocks S1, S2, ..., Sr is a shift matrix (possibly of different sizes).[2][3]
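
To illustrate the decomposition, the following sketch (assuming SciPy's block_diag is available; the block sizes and the change of basis are arbitrary choices) assembles a nilpotent matrix from shift blocks and conjugates it by an invertible matrix:

  import numpy as np
  from scipy.linalg import block_diag

  # lower shift blocks of sizes 3 and 2
  S1 = np.eye(3, k=-1)
  S2 = np.eye(2, k=-1)
  D = block_diag(S1, S2)           # block-diagonal nilpotent matrix

  rng = np.random.default_rng(0)
  P = rng.standard_normal((5, 5))  # almost surely invertible
  N = P @ D @ np.linalg.inv(P)     # N is nilpotent and similar to D by construction

  print(np.allclose(np.linalg.matrix_power(N, 3), 0))  # True: the largest block is 3 x 3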

Examples

Let S be the 5 × 5 lower shift matrix and let A be the 5 × 5 matrix

[math]\displaystyle{ S = \begin{pmatrix} 0 & 0 & 0 & 0 & 0 \\ 1 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 \end{pmatrix}; \quad A = \begin{pmatrix} 1 & 1 & 1 & 1 & 1 \\ 1 & 2 & 2 & 2 & 1 \\ 1 & 2 & 3 & 2 & 1 \\ 1 & 2 & 2 & 2 & 1 \\ 1 & 1 & 1 & 1 & 1 \end{pmatrix}. }[/math]

Then,

[math]\displaystyle{ SA = \begin{pmatrix} 0 & 0 & 0 & 0 & 0 \\ 1 & 1 & 1 & 1 & 1 \\ 1 & 2 & 2 & 2 & 1 \\ 1 & 2 & 3 & 2 & 1 \\ 1 & 2 & 2 & 2 & 1 \end{pmatrix}; \quad AS = \begin{pmatrix} 1 & 1 & 1 & 1 & 0 \\ 2 & 2 & 2 & 1 & 0 \\ 2 & 3 & 2 & 1 & 0 \\ 2 & 2 & 2 & 1 & 0 \\ 1 & 1 & 1 & 1 & 0 \end{pmatrix}. }[/math]

Clearly, many combinations of these shifts are possible. For example, [math]\displaystyle{ S^\mathsf{T} A S }[/math] equals the matrix A shifted one position up and to the left along the main diagonal.

[math]\displaystyle{ S^\mathsf{T}AS=\begin{pmatrix} 2 & 2 & 2 & 1 & 0 \\ 2 & 3 & 2 & 1 & 0 \\ 2 & 2 & 2 & 1 & 0 \\ 1 & 1 & 1 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{pmatrix}. }[/math]
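
The three products above can be reproduced with a short NumPy sketch:

  import numpy as np

  S = np.eye(5, k=-1)
  A = np.array([[1, 1, 1, 1, 1],
                [1, 2, 2, 2, 1],
                [1, 2, 3, 2, 1],
                [1, 2, 2, 2, 1],
                [1, 1, 1, 1, 1]], dtype=float)

  print(S @ A)        # rows of A shifted down by one
  print(A @ S)        # columns of A shifted left by one
  print(S.T @ A @ S)  # A shifted up and to the left along the main diagonal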

Notes

  1. (Beauregard & Fraleigh)
  2. (Beauregard & Fraleigh)
  3. (Herstein 1964)
