Schur product theorem


In mathematics, particularly in linear algebra, the Schur product theorem states that the Hadamard (entrywise) product of two positive definite matrices is also a positive definite matrix. The result is named after Issai Schur[1] (Schur 1911, p. 14, Theorem VII); note that Schur signed as J. Schur in Journal für die reine und angewandte Mathematik.[2][3]

We remark that the converse of the theorem holds in the following sense: if $M$ is a symmetric matrix and the Hadamard product $M \circ N$ is positive definite for all positive definite matrices $N$, then $M$ itself is positive definite.
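As a quick numerical illustration (not part of the original article), the theorem can be checked on small matrices in pure Python. The sketch below tests positive definiteness via Sylvester's criterion (all leading principal minors positive); the matrices `M` and `N` are illustrative choices.

```python
def det(A):
    """Determinant by Laplace expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0.0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

def is_positive_definite(A):
    """Sylvester's criterion: every leading principal minor is > 0."""
    n = len(A)
    return all(det([row[:k] for row in A[:k]]) > 0 for k in range(1, n + 1))

def hadamard(M, N):
    """Entrywise (Hadamard) product M ∘ N."""
    return [[M[i][j] * N[i][j] for j in range(len(M))] for i in range(len(M))]

# Illustrative symmetric positive definite matrices (not from the source)
M = [[2.0, 1.0, 0.0],
     [1.0, 2.0, 1.0],
     [0.0, 1.0, 2.0]]          # eigenvalues 2, 2 ± sqrt(2), all > 0
N = [[3.0, -1.0, 1.0],
     [-1.0, 3.0, -1.0],
     [1.0, -1.0, 3.0]]         # symmetric and strictly diagonally dominant

assert is_positive_definite(M) and is_positive_definite(N)
assert is_positive_definite(hadamard(M, N))   # Schur product theorem
```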

Proof

Proof using the trace formula

For any matrices $M$ and $N$, the Hadamard product $M \circ N$, considered as a bilinear form, acts on vectors $a, b$ as

$$a^* (M \circ N) b = \operatorname{tr}\left(M^{\mathsf T} \operatorname{diag}(a^*) \, N \operatorname{diag}(b)\right)$$

where $\operatorname{tr}$ is the matrix trace and $\operatorname{diag}(a)$ is the diagonal matrix whose diagonal entries are the elements of $a$.

Suppose $M$ and $N$ are positive definite, and so Hermitian. We can consider their square roots $M^{1/2}$ and $N^{1/2}$, which are also Hermitian, and write

$$\operatorname{tr}\left(M^{\mathsf T} \operatorname{diag}(a^*) \, N \operatorname{diag}(b)\right) = \operatorname{tr}\left(M^{\mathsf T/2} M^{\mathsf T/2} \operatorname{diag}(a^*) \, N^{1/2} N^{1/2} \operatorname{diag}(b)\right) = \operatorname{tr}\left(M^{\mathsf T/2} \operatorname{diag}(a^*) \, N^{1/2} N^{1/2} \operatorname{diag}(b) \, M^{\mathsf T/2}\right)$$

Then, for $a = b$, this can be written as $\operatorname{tr}(A^* A)$ for $A = N^{1/2} \operatorname{diag}(a) \, M^{\mathsf T/2}$, and is therefore strictly positive for $A \neq 0$, which occurs if and only if $a \neq 0$. This shows that $M \circ N$ is a positive definite matrix.
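The trace identity above can be checked numerically for real matrices (where $a^* = a^{\mathsf T}$). The following sketch, with illustrative $2 \times 2$ matrices and vectors, compares the bilinear form $a^{\mathsf T}(M \circ N)b$ against $\operatorname{tr}(M^{\mathsf T}\operatorname{diag}(a)\,N\operatorname{diag}(b))$:

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

def diag(v):
    return [[v[i] if i == j else 0.0 for j in range(len(v))] for i in range(len(v))]

def transpose(A):
    return [list(row) for row in zip(*A)]

# Illustrative real matrices and vectors (not from the source)
M = [[2.0, 1.0], [1.0, 3.0]]
N = [[4.0, -1.0], [-1.0, 2.0]]
a = [1.0, -2.0]
b = [0.5, 3.0]

# Left-hand side: a^T (M ∘ N) b
H = [[M[i][j] * N[i][j] for j in range(2)] for i in range(2)]
lhs = sum(a[i] * H[i][j] * b[j] for i in range(2) for j in range(2))

# Right-hand side: tr(M^T diag(a) N diag(b))
rhs = trace(matmul(matmul(matmul(transpose(M), diag(a)), N), diag(b)))

assert abs(lhs - rhs) < 1e-9
```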

Proof using Gaussian integration

Case of M = N

Let $X$ be an $n$-dimensional centered Gaussian random vector with covariance $\langle X_i X_j \rangle = M_{ij}$. Then the covariance matrix of $X_i^2$ and $X_j^2$ is

$$\operatorname{Cov}(X_i^2, X_j^2) = \langle X_i^2 X_j^2 \rangle - \langle X_i^2 \rangle \langle X_j^2 \rangle$$

Using Wick's theorem to develop $\langle X_i^2 X_j^2 \rangle = 2 \langle X_i X_j \rangle^2 + \langle X_i^2 \rangle \langle X_j^2 \rangle$, we have

$$\operatorname{Cov}(X_i^2, X_j^2) = 2 \langle X_i X_j \rangle^2 = 2 M_{ij}^2$$

Since a covariance matrix is positive definite, this proves that the matrix with elements $M_{ij}^2$ is a positive definite matrix.
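The identity $\operatorname{Cov}(X_i^2, X_j^2) = 2 M_{ij}^2$ can be checked by Monte Carlo simulation. The sketch below (illustrative, with a hand-coded $2 \times 2$ Cholesky factor and a fixed seed) samples a centered Gaussian vector with covariance $M$ and estimates the covariance of the squared components:

```python
import math
import random

random.seed(42)
M = [[1.0, 0.5], [0.5, 1.0]]                 # illustrative covariance matrix
L = [[1.0, 0.0], [0.5, math.sqrt(0.75)]]     # Cholesky factor: L L^T = M

n = 200_000
s1 = s2 = s12 = 0.0
for _ in range(n):
    # X = L z with z standard Gaussian, so Cov(X_i, X_j) = M_ij
    z0, z1 = random.gauss(0, 1), random.gauss(0, 1)
    x0 = L[0][0] * z0
    x1 = L[1][0] * z0 + L[1][1] * z1
    s1 += x0 * x0
    s2 += x1 * x1
    s12 += (x0 * x0) * (x1 * x1)

# Empirical covariance of X_0^2 and X_1^2
cov = s12 / n - (s1 / n) * (s2 / n)
assert abs(cov - 2 * M[0][1] ** 2) < 0.05    # Wick: Cov(X_i^2, X_j^2) = 2 M_ij^2
```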

General case

Let $X$ and $Y$ be $n$-dimensional centered Gaussian random vectors with covariances $\langle X_i X_j \rangle = M_{ij}$, $\langle Y_i Y_j \rangle = N_{ij}$, independent of each other, so that we have

$$\langle X_i Y_j \rangle = 0 \quad \text{for any } i, j$$

Then the covariance matrix of $X_i Y_i$ and $X_j Y_j$ is

$$\operatorname{Cov}(X_i Y_i, X_j Y_j) = \langle X_i Y_i X_j Y_j \rangle - \langle X_i Y_i \rangle \langle X_j Y_j \rangle$$

Using Wick's theorem to develop

$$\langle X_i Y_i X_j Y_j \rangle = \langle X_i X_j \rangle \langle Y_i Y_j \rangle + \langle X_i Y_i \rangle \langle X_j Y_j \rangle + \langle X_i Y_j \rangle \langle X_j Y_i \rangle$$

and also using the independence of $X$ and $Y$, we have

$$\operatorname{Cov}(X_i Y_i, X_j Y_j) = \langle X_i X_j \rangle \langle Y_i Y_j \rangle = M_{ij} N_{ij}$$

Since a covariance matrix is positive definite, this proves that the matrix with elements $M_{ij} N_{ij}$ is a positive definite matrix.
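Analogously, the general-case identity $\operatorname{Cov}(X_i Y_i, X_j Y_j) = M_{ij} N_{ij}$ can be checked by sampling two independent Gaussian vectors. This is an illustrative sketch with a fixed seed and hand-coded $2 \times 2$ Cholesky factors:

```python
import math
import random

random.seed(7)
M = [[1.0, 0.6], [0.6, 1.0]]                      # illustrative covariance of X
N = [[1.0, -0.4], [-0.4, 1.0]]                    # illustrative covariance of Y
lm = math.sqrt(1 - 0.36)                          # Cholesky entry for M
ln = math.sqrt(1 - 0.16)                          # Cholesky entry for N

n = 200_000
s1 = s2 = s12 = 0.0
for _ in range(n):
    zx0, zx1 = random.gauss(0, 1), random.gauss(0, 1)
    zy0, zy1 = random.gauss(0, 1), random.gauss(0, 1)
    x0, x1 = zx0, 0.6 * zx0 + lm * zx1            # Cov(X_i, X_j) = M_ij
    y0, y1 = zy0, -0.4 * zy0 + ln * zy1           # Cov(Y_i, Y_j) = N_ij
    s1 += x0 * y0
    s2 += x1 * y1
    s12 += (x0 * y0) * (x1 * y1)

# Empirical covariance of X_0 Y_0 and X_1 Y_1
cov = s12 / n - (s1 / n) * (s2 / n)
assert abs(cov - M[0][1] * N[0][1]) < 0.05        # expected M_01 * N_01
```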

Proof using eigendecomposition

Proof of positive semidefiniteness

Let $M = \sum_i \mu_i m_i m_i^{\mathsf T}$ and $N = \sum_i \nu_i n_i n_i^{\mathsf T}$ be eigendecompositions of $M$ and $N$. Then

$$M \circ N = \sum_{ij} \mu_i \nu_j \left(m_i m_i^{\mathsf T}\right) \circ \left(n_j n_j^{\mathsf T}\right) = \sum_{ij} \mu_i \nu_j \left(m_i \circ n_j\right)\left(m_i \circ n_j\right)^{\mathsf T}$$

Each $(m_i \circ n_j)(m_i \circ n_j)^{\mathsf T}$ is positive semidefinite (but, except in the one-dimensional case, not positive definite, since it is a rank-one matrix). Also, $\mu_i \nu_j > 0$; thus the sum $M \circ N$ is also positive semidefinite.
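The key identity used above, $(m m^{\mathsf T}) \circ (n n^{\mathsf T}) = (m \circ n)(m \circ n)^{\mathsf T}$, can be verified entrywise: both sides have $(i,j)$ entry $m_i m_j n_i n_j$. A small numerical check with illustrative vectors:

```python
def outer(u, v):
    """Outer product u v^T."""
    return [[ui * vj for vj in v] for ui in u]

def hadamard(A, B):
    """Entrywise (Hadamard) product."""
    return [[A[i][j] * B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

# Illustrative vectors (not from the source)
m = [1.0, -2.0, 3.0]
n = [0.5, 4.0, -1.0]

lhs = hadamard(outer(m, m), outer(n, n))     # (m m^T) ∘ (n n^T)
mn = [mi * ni for mi, ni in zip(m, n)]       # m ∘ n
rhs = outer(mn, mn)                          # (m ∘ n)(m ∘ n)^T

assert lhs == rhs
```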

Proof of definiteness

To show that the result is positive definite requires a further argument. We shall show that for any vector $a \neq 0$ we have $a^{\mathsf T} (M \circ N) a > 0$. Continuing as above, each $a^{\mathsf T} (m_i \circ n_j)(m_i \circ n_j)^{\mathsf T} a \geq 0$, so it remains to show that there exist $i$ and $j$ for which the corresponding term is strictly positive. For this we observe that

$$a^{\mathsf T} \left(m_i \circ n_j\right)\left(m_i \circ n_j\right)^{\mathsf T} a = \left(\sum_k m_{i,k} \, n_{j,k} \, a_k\right)^2$$

Since $N$ is positive definite, there is a $j$ for which $n_j \circ a \neq 0$ (since otherwise $n_j^{\mathsf T} a = \sum_k (n_j \circ a)_k = 0$ for all $j$, which would force $a = 0$ because the eigenvectors $n_j$ form a basis). Likewise, since $M$ is positive definite, there exists an $i$ for which $\sum_k m_{i,k} (n_j \circ a)_k = m_i^{\mathsf T} (n_j \circ a) \neq 0$. But this last sum is just $\sum_k m_{i,k} \, n_{j,k} \, a_k$, so its square is positive. This completes the proof.

References

  1. Schur, J. (1911). "Bemerkungen zur Theorie der beschränkten Bilinearformen mit unendlich vielen Veränderlichen". Journal für die reine und angewandte Mathematik 1911 (140): 1–28. doi:10.1515/crll.1911.140.1. 
  2. Zhang, Fuzhen, ed. (2005). The Schur Complement and Its Applications. Numerical Methods and Algorithms. 4. doi:10.1007/b105056. ISBN 0-387-24271-6. See page 9, Ch. 0.6, on publication under J. Schur.
  3. Ledermann, W. (1983). "Issai Schur and His School in Berlin". Bulletin of the London Mathematical Society 15 (2): 97–106. doi:10.1112/blms/15.2.97.