Matrix determinant lemma
In mathematics, in particular linear algebra, the matrix determinant lemma computes the determinant of the sum of an invertible matrix A and the dyadic product, u vT, of a column vector u and a row vector vT.[1][2]
Statement
Suppose A is an invertible square matrix and u, v are column vectors. Then the matrix determinant lemma states that
- [math]\displaystyle{ \det\left(\mathbf{A} + \mathbf{uv}^\textsf{T}\right) = \left(1 + \mathbf{v}^\textsf{T}\mathbf{A}^{-1}\mathbf{u}\right)\,\det\left(\mathbf{A}\right)\,. }[/math]
Here, uvT is the outer product of two vectors u and v.
The theorem can also be stated in terms of the adjugate matrix of A:
- [math]\displaystyle{ \det\left(\mathbf{A} + \mathbf{uv}^\textsf{T}\right) = \det\left(\mathbf{A}\right) + \mathbf{v}^\textsf{T}\mathrm{adj}\left(\mathbf{A}\right)\mathbf{u}\,, }[/math]
in which case it applies whether or not the square matrix A is invertible.
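Both forms are straightforward to verify numerically. The following NumPy sketch (the matrix size and random data are illustrative assumptions, not part of the lemma) compares both forms of the statement against a direct determinant computation:

```python
import numpy as np

# Illustrative numerical check of the matrix determinant lemma.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n)) + n * np.eye(n)  # well-conditioned, invertible
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))

lhs = np.linalg.det(A + u @ v.T)
rhs = (1.0 + v.T @ np.linalg.inv(A) @ u).item() * np.linalg.det(A)

# Adjugate form: adj(A) = det(A) * inv(A) when A is invertible.
adj_A = np.linalg.det(A) * np.linalg.inv(A)
rhs_adj = np.linalg.det(A) + (v.T @ adj_A @ u).item()

print(np.isclose(lhs, rhs), np.isclose(lhs, rhs_adj))  # True True
```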
Proof
First, the special case A = I follows from the equality:[3]
- [math]\displaystyle{ \begin{pmatrix} \mathbf{I} & 0 \\ \mathbf{v}^\textsf{T} & 1 \end{pmatrix} \begin{pmatrix} \mathbf{I} + \mathbf{uv}^\textsf{T} & \mathbf{u} \\ 0 & 1 \end{pmatrix} \begin{pmatrix} \mathbf{I} & 0 \\ -\mathbf{v}^\textsf{T} & 1 \end{pmatrix} = \begin{pmatrix} \mathbf{I} & \mathbf{u} \\ 0 & 1 + \mathbf{v}^\textsf{T}\mathbf{u} \end{pmatrix}. }[/math]
The determinant of the left-hand side is the product of the determinants of the three matrices. Since the first and third matrices are triangular with unit diagonal, their determinants are 1, so the determinant of the left-hand side equals the determinant of the middle matrix, which is the desired value. The determinant of the right-hand side is simply 1 + vTu. This gives the result:
- [math]\displaystyle{ \det\left(\mathbf{I} + \mathbf{uv}^\textsf{T}\right) = \left(1 + \mathbf{v}^\textsf{T}\mathbf{u}\right). }[/math]
Then the general case can be found as:
- [math]\displaystyle{ \begin{align} \det\left(\mathbf{A} + \mathbf{uv}^\textsf{T}\right) &= \det\left(\mathbf{A}\right) \det\left(\mathbf{I} + \left(\mathbf{A}^{-1}\mathbf{u}\right)\mathbf{v}^\textsf{T}\right)\\ &= \det\left(\mathbf{A}\right) \left(1 + \mathbf{v}^\textsf{T} \left(\mathbf{A}^{-1}\mathbf{u}\right)\right). \end{align} }[/math]
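The block factorization underlying the special case can also be checked numerically. The sketch below (with illustrative random data) builds the three block matrices explicitly and confirms both the factorization and the resulting determinant:

```python
import numpy as np

# Illustrative check of the block factorization used in the proof.
rng = np.random.default_rng(1)
n = 3
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))
I = np.eye(n)

L = np.block([[I, np.zeros((n, 1))], [v.T, np.ones((1, 1))]])
M = np.block([[I + u @ v.T, u], [np.zeros((1, n)), np.ones((1, 1))]])
R = np.block([[I, np.zeros((n, 1))], [-v.T, np.ones((1, 1))]])
RHS = np.block([[I, u], [np.zeros((1, n)), 1.0 + v.T @ u]])

print(np.allclose(L @ M @ R, RHS))                            # True
print(np.isclose(np.linalg.det(M), (1.0 + v.T @ u).item()))   # True
```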
Application
If the determinant and inverse of A are already known, the formula provides a numerically cheap way to compute the determinant of A corrected by the matrix uvT. The computation is relatively cheap because the determinant of A + uvT does not have to be computed from scratch (which in general is expensive). Using unit vectors for u and/or v, individual columns, rows or elements[4] of A may be manipulated and a correspondingly updated determinant computed relatively cheaply in this way.
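For instance, the following sketch (the entry indices and data are illustrative) updates the determinant after a single entry of A is changed, using unit vectors u = δ ei and v = ej so that vTA−1u reduces to a single entry of A−1:

```python
import numpy as np

# Sketch: given det(A) and inv(A), the determinant after changing one entry
# A[i, j] += delta follows cheaply from the lemma with unit vectors.
rng = np.random.default_rng(2)
n = 5
A = rng.standard_normal((n, n)) + n * np.eye(n)
det_A = np.linalg.det(A)
A_inv = np.linalg.inv(A)

i, j, delta = 1, 3, 0.25
det_updated = (1.0 + delta * A_inv[j, i]) * det_A  # v^T A^{-1} u = delta * (A^{-1})[j, i]

A_new = A.copy()
A_new[i, j] += delta
print(np.isclose(det_updated, np.linalg.det(A_new)))  # True
```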
When the matrix determinant lemma is used in conjunction with the Sherman–Morrison formula, both the inverse and determinant may be conveniently updated together.
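A minimal sketch of such a joint update, assuming det(A) and A−1 are already available (the data below is illustrative):

```python
import numpy as np

# Maintain inv(A) and det(A) together under a rank-one update A -> A + u v^T,
# using Sherman-Morrison for the inverse and the determinant lemma for det.
rng = np.random.default_rng(3)
n = 4
A = rng.standard_normal((n, n)) + n * np.eye(n)
A_inv, det_A = np.linalg.inv(A), np.linalg.det(A)
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))

s = 1.0 + (v.T @ A_inv @ u).item()                  # shared scalar 1 + v^T A^{-1} u
det_new = s * det_A                                  # matrix determinant lemma
A_inv_new = A_inv - (A_inv @ u @ v.T @ A_inv) / s    # Sherman-Morrison formula

B = A + u @ v.T
print(np.isclose(det_new, np.linalg.det(B)),
      np.allclose(A_inv_new, np.linalg.inv(B)))      # True True
```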
Generalization
Suppose A is an invertible n-by-n matrix and U, V are n-by-m matrices. Then
- [math]\displaystyle{ \det\left(\mathbf{A} + \mathbf{UV}^\textsf{T}\right) = \det\left(\mathbf{I_m} + \mathbf{V}^\textsf{T}\mathbf{A}^{-1}\mathbf{U}\right)\det(\mathbf{A}). }[/math]
In the special case [math]\displaystyle{ \mathbf{A}=\mathbf{I_n} }[/math] this is the Weinstein–Aronszajn identity.
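The rank-m generalization can be checked in the same way; the sketch below uses the illustrative sizes n = 6 and m = 2:

```python
import numpy as np

# Illustrative check of the rank-m matrix determinant lemma.
rng = np.random.default_rng(4)
n, m = 6, 2
A = rng.standard_normal((n, n)) + n * np.eye(n)
U = rng.standard_normal((n, m))
V = rng.standard_normal((n, m))

lhs = np.linalg.det(A + U @ V.T)
rhs = np.linalg.det(np.eye(m) + V.T @ np.linalg.inv(A) @ U) * np.linalg.det(A)
print(np.isclose(lhs, rhs))  # True
```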
Given additionally an invertible m-by-m matrix W, the relationship can also be expressed as
- [math]\displaystyle{ \det\left(\mathbf{A} + \mathbf{UWV}^\textsf{T}\right) = \det\left(\mathbf{W}^{-1} + \mathbf{V}^\textsf{T}\mathbf{A}^{-1}\mathbf{U}\right)\det\left(\mathbf{W}\right)\det\left(\mathbf{A}\right). }[/math]
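A corresponding sketch for the weighted form with an invertible W (again with illustrative sizes and data):

```python
import numpy as np

# Illustrative check of the weighted form with an m-by-m matrix W.
rng = np.random.default_rng(5)
n, m = 6, 2
A = rng.standard_normal((n, n)) + n * np.eye(n)
U = rng.standard_normal((n, m))
V = rng.standard_normal((n, m))
W = rng.standard_normal((m, m)) + m * np.eye(m)

lhs = np.linalg.det(A + U @ W @ V.T)
rhs = (np.linalg.det(np.linalg.inv(W) + V.T @ np.linalg.inv(A) @ U)
       * np.linalg.det(W) * np.linalg.det(A))
print(np.isclose(lhs, rhs))  # True
```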
See also
- The Sherman–Morrison formula, which shows how to update the inverse, A−1, to obtain (A + uvT)−1.
- The Woodbury formula, which shows how to update the inverse, A−1, to obtain (A + UCVT)−1.
- The binomial inverse theorem for (A + UCVT)−1.
References
- ↑ Harville, D. A. (1997). Matrix Algebra From a Statistician's Perspective. New York: Springer-Verlag. ISBN 0-387-94978-X.
- ↑ Brookes, M. (2005). "The Matrix Reference Manual (online)". http://www.ee.ic.ac.uk/hp/staff/dmb/matrix/intro.html.
- ↑ Ding, J.; Zhou, A. (2007). "Eigenvalues of rank-one updated matrices with some applications". Applied Mathematics Letters 20 (12): 1223–1226. doi:10.1016/j.aml.2006.11.016. ISSN 0893-9659.
- ↑ William H. Press; Brian P. Flannery; Saul A. Teukolsky; William T. Vetterling (1992). Numerical Recipes in C: The Art of Scientific Computing. Cambridge University Press. p. 73. ISBN 0-521-43108-5.