Complete orthogonal decomposition
In linear algebra, the complete orthogonal decomposition is a matrix decomposition.[1][2] It is similar to the singular value decomposition, but typically somewhat[3] cheaper to compute and, in particular, much cheaper and easier to update when the original matrix is slightly altered.[1] Specifically, the complete orthogonal decomposition factorizes an arbitrary complex matrix [math]\displaystyle{ A }[/math] into a product of three matrices, [math]\displaystyle{ A = U T V^* }[/math], where [math]\displaystyle{ U }[/math] and [math]\displaystyle{ V }[/math] are unitary matrices and [math]\displaystyle{ T }[/math] is a triangular matrix. For a matrix [math]\displaystyle{ A }[/math] of rank [math]\displaystyle{ r }[/math], the triangular matrix [math]\displaystyle{ T }[/math] can be chosen such that only its top-left [math]\displaystyle{ r\times r }[/math] block is nonzero, making the decomposition rank-revealing.
For a matrix of size [math]\displaystyle{ m\times n }[/math], assuming [math]\displaystyle{ m \ge n }[/math], the complete orthogonal decomposition requires [math]\displaystyle{ O(mn^2) }[/math] floating point operations and [math]\displaystyle{ O(m^2) }[/math] auxiliary memory to compute, similar to other rank-revealing decompositions.[1] Crucially, however, if a row or column is added or removed, or the matrix is perturbed by a rank-one matrix, the decomposition can be updated in [math]\displaystyle{ O(mn) }[/math] operations.[1]
Because of its form, [math]\displaystyle{ A = U T V^* }[/math], the decomposition is also known as UTV decomposition.[4] Depending on whether a left-triangular or right-triangular matrix is used in place of [math]\displaystyle{ T }[/math], it is also referred to as ULV decomposition[3] or URV decomposition,[5] respectively.
Construction
The UTV decomposition is usually[6][7] computed by means of a pair of QR decompositions: one QR decomposition is applied to the matrix from the left, which yields [math]\displaystyle{ U }[/math], and another is applied from the right, which yields [math]\displaystyle{ V^* }[/math]; together they "sandwich" the triangular matrix [math]\displaystyle{ T }[/math] in the middle.
Let [math]\displaystyle{ A }[/math] be an [math]\displaystyle{ m\times n }[/math] matrix of rank [math]\displaystyle{ r }[/math]. One first performs a QR decomposition with column pivoting:
- [math]\displaystyle{ A\Pi = U \begin{bmatrix} R_{11} & R_{12} \\ 0 & 0 \end{bmatrix} }[/math],
where [math]\displaystyle{ \Pi }[/math] is an [math]\displaystyle{ n\times n }[/math] permutation matrix, [math]\displaystyle{ U }[/math] is an [math]\displaystyle{ m\times m }[/math] unitary matrix, [math]\displaystyle{ R_{11} }[/math] is an [math]\displaystyle{ r\times r }[/math] upper triangular matrix and [math]\displaystyle{ R_{12} }[/math] is an [math]\displaystyle{ r\times(n-r) }[/math] matrix. One then performs another QR decomposition, this time on the adjoint of the nonzero block [math]\displaystyle{ \begin{bmatrix} R_{11} & R_{12} \end{bmatrix} }[/math]:
- [math]\displaystyle{ \begin{bmatrix} R^*_{11} \\ R^*_{12}\end{bmatrix} = V' \begin{bmatrix} T^* \\ 0\end{bmatrix} }[/math],
where [math]\displaystyle{ V' }[/math] is an [math]\displaystyle{ n\times n }[/math] unitary matrix and [math]\displaystyle{ T }[/math] is an [math]\displaystyle{ r\times r }[/math] lower (left) triangular matrix. Setting [math]\displaystyle{ V = \Pi V' }[/math] yields the complete orthogonal (UTV) decomposition:[1]
- [math]\displaystyle{ A = U \begin{bmatrix} T & 0 \\ 0 & 0 \end{bmatrix} V^* }[/math].
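The following is a minimal NumPy/SciPy sketch of this two-QR procedure. The function name utv_decomposition, the tolerance tol, and the rank-estimation rule (thresholding the diagonal of the pivoted R factor) are illustrative assumptions for this sketch, not part of any standard library interface.

```python
import numpy as np
from scipy.linalg import qr

def utv_decomposition(A, tol=1e-12):
    """Sketch of a complete orthogonal (UTV) decomposition via two QR factorizations.

    Returns U (m x m), T (r x r, lower triangular), V (n x n) and the estimated
    rank r, such that A ≈ U @ [[T, 0], [0, 0]] @ V.conj().T.
    """
    m, n = A.shape
    # Step 1: QR with column pivoting, A @ Pi = U @ [[R11, R12], [0, 0]].
    U, R, piv = qr(A, pivoting=True)
    # Estimate the numerical rank r by thresholding the diagonal of R
    # (an illustrative heuristic, not a robust rank decision).
    diag = np.abs(np.diag(R))
    r = int(np.sum(diag > tol * diag[0])) if diag.size and diag[0] > 0 else 0
    # Step 2: QR of the adjoint of the nonzero block [R11 R12] (size r x n):
    # [R11 R12]^* = V' @ [[T^*], [0]], so T is r x r lower triangular.
    Vprime, Tstar_padded = qr(R[:r, :].conj().T)
    T = Tstar_padded[:r, :].conj().T
    # Undo the column pivoting: V = Pi @ V'.
    Pi = np.eye(n)[:, piv]
    V = Pi @ Vprime
    return U, T, V, r

# Example usage: a 6 x 4 real matrix of rank 2.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 4))
U, T, V, r = utv_decomposition(A)
middle = np.zeros((6, 4))
middle[:r, :r] = T
print(r)                                        # 2
print(np.allclose(A, U @ middle @ V.conj().T))  # True
print(np.allclose(T, np.tril(T)))               # True: T is lower triangular
```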
Since any diagonal matrix is in particular triangular, the singular value decomposition, [math]\displaystyle{ A = USV^* }[/math], where [math]\displaystyle{ S_{11} \ge S_{22} \ge \ldots \ge 0 }[/math], is a special case of the UTV decomposition. Computing the SVD is slightly more expensive than computing the UTV decomposition,[3] but the SVD has a stronger rank-revealing property.
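A short continuation of the sketch above (it reuses A, T and r from that example) illustrates the relationship: because U and V are unitary, the singular values of A are exactly those of the block T together with zeros, so T carries the same spectral information as the SVD's diagonal factor, just not in ordered diagonal form.

```python
# Continuing the example above (reuses A, T, r from the utv_decomposition sketch):
# unitary factors leave singular values unchanged, so the singular values of A
# are those of the r x r block T padded with zeros.
sv_A = np.linalg.svd(A, compute_uv=False)   # singular values of A (descending)
sv_T = np.linalg.svd(T, compute_uv=False)   # singular values of T (descending)
print(np.allclose(sv_A[:r], sv_T))          # True: nonzero singular values agree
print(np.allclose(sv_A[r:], 0.0))           # True: the rest are numerically zero
```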
References
- ↑ 1.0 1.1 1.2 1.3 1.4 Golub, Gene; Van Loan, Charles F. (15 October 1996). Matrix Computations (Third ed.). Johns Hopkins University Press. ISBN 0-8018-5414-8.
- ↑ Björck, Åke (December 1996). Numerical methods for least squares problems. SIAM. ISBN 0-89871-360-9.
- ↑ 3.0 3.1 3.2 Chandrasekaran, S.; Gu, M.; Pals, T. (January 2006). "A Fast ULV Decomposition Solver for Hierarchically Semiseparable Representations". SIAM Journal on Matrix Analysis and Applications 28 (3): 603–622. doi:10.1137/S0895479803436652.
- ↑ Fierro, Ricardo D.; Hansen, Per Christian; Hansen, Peter Søren Kirk (1999). "UTV Tools: Matlab templates for rank-revealing UTV decompositions". Numerical Algorithms 20 (2/3): 165–194. doi:10.1023/A:1019112103049. https://backend.orbit.dtu.dk/ws/files/57025531/10.1023_A_1019112103049.pdf.
- ↑ Adams, G.; Griffin, M.F.; Stewart, G.W. (1991). "Direction-of-arrival estimation using the rank-revealing URV decomposition". [Proceedings] ICASSP 91: 1991 International Conference on Acoustics, Speech, and Signal Processing. Proc. of IEEE Internat. Conf. on Acoustics, Speech, and Signal Processing. pp. 1385–1388, vol. 2. doi:10.1109/icassp.1991.150681. ISBN 0-7803-0003-3.
- ↑ "LAPACK – Complete Orthogonal Factorization". https://netlib.org/lapack/lug/node43.html.
- ↑ "Eigen::CompleteOrthogonalDecomposition". https://eigen.tuxfamily.org/dox-3.3/classEigen_1_1CompleteOrthogonalDecomposition.html.
Original source: https://en.wikipedia.org/wiki/Complete_orthogonal_decomposition.