Centering matrix

In mathematics and multivariate statistics, the centering matrix[1] is a symmetric and idempotent matrix which, when multiplied with a vector, has the same effect as subtracting the mean of the components of the vector from every component of that vector.

Definition

The centering matrix of size n is defined as the n-by-n matrix

[math]\displaystyle{ C_n = I_n - \tfrac{1}{n}J_n }[/math]

where [math]\displaystyle{ I_n\, }[/math] is the identity matrix of size n and [math]\displaystyle{ J_n }[/math] is an n-by-n matrix of all 1's.

For example,

[math]\displaystyle{ C_1 = \begin{bmatrix} 0 \end{bmatrix} }[/math],
[math]\displaystyle{ C_2 = \left[ \begin{array}{rr} 1 & 0 \\ 0 & 1 \end{array} \right] - \frac{1}{2}\left[ \begin{array}{rr} 1 & 1 \\ 1 & 1 \end{array} \right] = \left[ \begin{array}{rr} \frac{1}{2} & -\frac{1}{2} \\ -\frac{1}{2} & \frac{1}{2} \end{array} \right] }[/math],
[math]\displaystyle{ C_3 = \left[ \begin{array}{rrr} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{array} \right] - \frac{1}{3}\left[ \begin{array}{rrr} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{array} \right] = \left[ \begin{array}{rrr} \frac{2}{3} & -\frac{1}{3} & -\frac{1}{3} \\ -\frac{1}{3} & \frac{2}{3} & -\frac{1}{3} \\ -\frac{1}{3} & -\frac{1}{3} & \frac{2}{3} \end{array} \right] }[/math]
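
A minimal sketch of the definition, using NumPy (the library choice and the helper name centering_matrix are illustrative, not part of the source):

```python
import numpy as np

def centering_matrix(n):
    """Return the n-by-n centering matrix C_n = I_n - (1/n) J_n."""
    return np.eye(n) - np.ones((n, n)) / n

# Reproduces the C_3 example above: entries 2/3 on the diagonal
# and -1/3 off the diagonal.
C3 = centering_matrix(3)
print(C3)
```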

Properties

Given a column vector [math]\displaystyle{ \mathbf{v}\, }[/math] of size n, the centering property of [math]\displaystyle{ C_n\, }[/math] can be expressed as

[math]\displaystyle{ C_n\,\mathbf{v} = \mathbf{v} - (\tfrac{1}{n}J_{n,1}^\textrm{T}\mathbf{v})J_{n,1} }[/math]

where [math]\displaystyle{ J_{n,1} }[/math] is a column vector of ones and [math]\displaystyle{ \tfrac{1}{n}J_{n,1}^\textrm{T}\mathbf{v} }[/math] is the mean of the components of [math]\displaystyle{ \mathbf{v}\, }[/math].
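
This property can be checked numerically; the sketch below uses an arbitrary example vector with mean 5:

```python
import numpy as np

v = np.array([2.0, 4.0, 9.0])           # arbitrary sample vector, mean 5
n = len(v)
C_n = np.eye(n) - np.ones((n, n)) / n   # C_n = I_n - (1/n) J_n

# C_n v equals v with its component mean subtracted from every entry.
assert np.allclose(C_n @ v, v - v.mean())
```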

[math]\displaystyle{ C_n\, }[/math] is symmetric positive semi-definite.

[math]\displaystyle{ C_n\, }[/math] is idempotent, so that [math]\displaystyle{ C_n^k=C_n }[/math], for [math]\displaystyle{ k=1,2,\ldots }[/math]. Once the mean has been subtracted, the resulting vector has zero mean, so applying [math]\displaystyle{ C_n\, }[/math] again has no further effect.

[math]\displaystyle{ C_n\, }[/math] is singular. The effects of applying the transformation [math]\displaystyle{ C_n\,\mathbf{v} }[/math] cannot be reversed.

[math]\displaystyle{ C_n\, }[/math] has the eigenvalue 1 of multiplicity n − 1 and eigenvalue 0 of multiplicity 1.

[math]\displaystyle{ C_n\, }[/math] has a nullspace of dimension 1, along the vector [math]\displaystyle{ J_{n,1} }[/math].

[math]\displaystyle{ C_n\, }[/math] is an orthogonal projection matrix. That is, [math]\displaystyle{ C_n\mathbf{v} }[/math] is a projection of [math]\displaystyle{ \mathbf{v}\, }[/math] onto the (n − 1)-dimensional subspace that is orthogonal to the nullspace, which is spanned by [math]\displaystyle{ J_{n,1} }[/math]. (This subspace consists of all n-vectors whose components sum to zero.)

The trace of [math]\displaystyle{ C_n }[/math] is [math]\displaystyle{ n(n-1)/n = n-1 }[/math].
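
The properties above can be verified numerically for a small instance; the following sketch checks symmetry, idempotence, the nullspace, the eigenvalues, and the trace for n = 5 (the choice of n is arbitrary):

```python
import numpy as np

n = 5
C = np.eye(n) - np.ones((n, n)) / n

assert np.allclose(C, C.T)                        # symmetric
assert np.allclose(C @ C, C)                      # idempotent: C^2 = C
assert np.allclose(C @ np.ones(n), np.zeros(n))   # the all-ones vector spans the nullspace
assert np.linalg.matrix_rank(C) == n - 1          # singular, rank n - 1

eigenvalues = np.linalg.eigvalsh(C)               # returned in ascending order
assert np.allclose(eigenvalues, [0.0] + [1.0] * (n - 1))  # 0 once, 1 with multiplicity n - 1
assert np.isclose(np.trace(C), n - 1)             # trace equals n - 1
```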

Application

Although multiplication by the centering matrix is not a computationally efficient way of removing the mean from a vector, it is a convenient analytical tool. It can be used to remove not only the mean of a single vector, but also the means of multiple vectors stored in the rows or columns of an m-by-n matrix [math]\displaystyle{ X }[/math].

Left multiplication by [math]\displaystyle{ C_m }[/math] subtracts the corresponding mean value from each of the n columns, so that each column of the product [math]\displaystyle{ C_m\,X }[/math] has zero mean. Similarly, right multiplication by [math]\displaystyle{ C_n }[/math] subtracts the corresponding mean value from each of the m rows, so that each row of the product [math]\displaystyle{ X\,C_n }[/math] has zero mean. Multiplication on both sides yields a doubly centered matrix [math]\displaystyle{ C_m\,X\,C_n }[/math], whose row and column means are all equal to zero.
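
A short sketch of left, right, and double centering on a random m-by-n matrix (the dimensions and the random seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 6
X = rng.normal(size=(m, n))

C_m = np.eye(m) - np.ones((m, m)) / m
C_n = np.eye(n) - np.ones((n, n)) / n

assert np.allclose((C_m @ X).mean(axis=0), 0)   # every column of C_m X has zero mean
assert np.allclose((X @ C_n).mean(axis=1), 0)   # every row of X C_n has zero mean

doubly_centered = C_m @ X @ C_n                 # row and column means are all zero
assert np.allclose(doubly_centered.mean(axis=0), 0)
assert np.allclose(doubly_centered.mean(axis=1), 0)
```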

In particular, the centering matrix provides a succinct way to express the scatter matrix [math]\displaystyle{ S=(X-\mu J_{n,1}^{\mathrm{T}})(X-\mu J_{n,1}^{\mathrm{T}})^{\mathrm{T}} }[/math] of a data sample [math]\displaystyle{ X\, }[/math] whose n columns are the observations, where [math]\displaystyle{ \mu=\tfrac{1}{n}X J_{n,1} }[/math] is the sample mean:

[math]\displaystyle{ S=X\,C_n(X\,C_n)^{\mathrm{T}}=X\,C_n\,C_n\,X\,^{\mathrm{T}}=X\,C_n\,X\,^{\mathrm{T}}. }[/math]
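
The identity can be checked on random data; in the sketch below the d-by-n matrix X stores one observation per column, matching the definition of the sample mean above (the dimensions and the seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
d, n = 3, 10                          # d variables, n observations (columns of X)
X = rng.normal(size=(d, n))

C_n = np.eye(n) - np.ones((n, n)) / n
mu = X.mean(axis=1, keepdims=True)    # sample mean, mu = (1/n) X J_{n,1}

S_definition = (X - mu) @ (X - mu).T  # (X - mu J^T)(X - mu J^T)^T
S_centering = X @ C_n @ X.T           # X C_n X^T
assert np.allclose(S_definition, S_centering)
```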

[math]\displaystyle{ C_n }[/math] is the covariance matrix of the multinomial distribution, in the special case where the parameters of that distribution are [math]\displaystyle{ k=n }[/math], and [math]\displaystyle{ p_1=p_2=\cdots=p_n=\frac{1}{n} }[/math].
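
Reading this as a multinomial distribution with n trials over n equally likely categories (so each [math]\displaystyle{ p_i = \tfrac{1}{n} }[/math]), its covariance matrix [math]\displaystyle{ n(\operatorname{diag}(p) - p p^{\mathrm{T}}) }[/math] reduces to [math]\displaystyle{ C_n }[/math]; a quick numerical check under that reading:

```python
import numpy as np

n = 4
p = np.full(n, 1.0 / n)                         # n equally likely categories

# Covariance of a multinomial with n trials and cell probabilities p.
covariance = n * (np.diag(p) - np.outer(p, p))

C_n = np.eye(n) - np.ones((n, n)) / n
assert np.allclose(covariance, C_n)
```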

References

  1. John I. Marden, Analyzing and Modeling Rank Data, Chapman & Hall, 1995, ISBN 0-412-99521-2, p. 59.