Mutual coherence (linear algebra)
In linear algebra, the coherence or mutual coherence of a matrix A is defined as the maximum absolute value of the cross-correlations between the columns of A.[1][2] Formally, let [math]\displaystyle{ a_1, \ldots, a_m\in {\mathbb C}^d }[/math] be the columns of the matrix A, which are assumed to be normalized such that [math]\displaystyle{ a_i^H a_i = 1. }[/math] The mutual coherence of A is then defined as[1][2]
- [math]\displaystyle{ M = \max_{1 \le i \ne j \le m} \left| a_i^H a_j \right|. }[/math]
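As a minimal computational sketch of this definition (the helper name `mutual_coherence` and the identity-plus-Fourier example dictionary are illustrative choices, not part of any standard library), the coherence can be read off as the largest off-diagonal entry of the Gram matrix of the normalized columns:

```python
import numpy as np

def mutual_coherence(A):
    """Mutual coherence of a matrix A whose columns are the atoms.

    Columns are normalized to unit l2 norm, then the largest absolute
    off-diagonal entry of the Gram matrix A^H A is returned.
    """
    A = A / np.linalg.norm(A, axis=0, keepdims=True)   # enforce a_i^H a_i = 1
    G = np.abs(A.conj().T @ A)                          # |cross-correlations|
    np.fill_diagonal(G, 0.0)                            # ignore self-correlations
    return G.max()

# Example: two-ortho dictionary formed by the identity and the unitary Fourier
# basis in C^d, the setting studied in the two-ortho case.[6]
d = 8
dictionary = np.hstack([np.eye(d), np.fft.fft(np.eye(d)) / np.sqrt(d)])
print(mutual_coherence(dictionary))   # = 1/sqrt(8) ≈ 0.3536
```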
A lower bound, known as the Welch bound, is[3]
- [math]\displaystyle{ M\ge \sqrt{\frac{m-d}{d(m-1)}} }[/math]
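The bound depends only on the number of columns m and the ambient dimension d. A short sketch (the helper name `welch_bound` is illustrative) evaluates it for the identity-plus-Fourier dictionary from the sketch above, where the observed coherence 1/sqrt(8) ≈ 0.354 sits above the bound:

```python
import math

def welch_bound(m, d):
    """Welch lower bound on the mutual coherence of m unit-norm columns in C^d."""
    return math.sqrt((m - d) / (d * (m - 1)))

# Identity-plus-Fourier dictionary from the sketch above: d = 8, m = 2*d = 16.
print(welch_bound(16, 8))   # ≈ 0.2582, below the observed coherence ≈ 0.3536
```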
A deterministic matrix whose mutual coherence nearly meets this lower bound can be constructed using Weil's theorem.[4]
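One classical deterministic family in this spirit, shown here purely as an illustration and not necessarily the exact construction analysed in [4], is the quadratic-phase "chirp" dictionary: for a prime p it packs p^2 unit-norm columns into C^p with mutual coherence exactly 1/sqrt(p), close to the Welch bound 1/sqrt(p+1). A minimal sketch:

```python
import numpy as np

p = 11                      # a prime; the dictionary has p**2 columns in C**p
t = np.arange(p)
# Quadratic-phase chirp columns exp(2*pi*i*(a*t**2 + b*t)/p) / sqrt(p),
# indexed by (a, b) with 0 <= a, b < p.
A = np.stack([np.exp(2j * np.pi * (a * t**2 + b * t) / p) / np.sqrt(p)
              for a in range(p) for b in range(p)], axis=1)

G = np.abs(A.conj().T @ A)                      # |cross-correlations| between columns
np.fill_diagonal(G, 0.0)
print(G.max())                                  # mutual coherence = 1/sqrt(p)   ≈ 0.3015
print(np.sqrt((p**2 - p) / (p * (p**2 - 1))))   # Welch bound      = 1/sqrt(p+1) ≈ 0.2887
```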
This concept was reintroduced by David Donoho and Michael Elad in the context of sparse representations.[5] A special case of this definition for the two-ortho case appeared earlier in a paper by Donoho and Huo.[6] The mutual coherence has since been used extensively in the field of sparse representations of signals. In particular, it is used as a measure of the ability of suboptimal algorithms such as matching pursuit and basis pursuit to correctly identify the true representation of a sparse signal.[1][2][7] Joel Tropp introduced a useful extension of the mutual coherence, known as the Babel function, which generalizes the cross-correlation between pairs of columns to the cumulative cross-correlation between a single column and a set of other columns. Evaluated on a single pair of columns, the Babel function reduces to the mutual coherence, and it quantifies the coherence of the dictionary for any number of columns.[8]
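A minimal NumPy sketch of the Babel function in this form (the helper name `babel` is illustrative): for each unit-norm column it sums the k largest absolute cross-correlations with the remaining columns and takes the worst case, so that k = 1 recovers the mutual coherence.

```python
import numpy as np

def babel(A, k):
    """Babel function mu_1(k): worst-case sum of the k largest absolute
    cross-correlations between one (unit-norm) column and the other columns.
    babel(A, 1) coincides with the mutual coherence."""
    A = A / np.linalg.norm(A, axis=0, keepdims=True)
    G = np.abs(A.conj().T @ A)
    np.fill_diagonal(G, 0.0)
    top_k = -np.sort(-G, axis=1)[:, :k]   # k largest entries of each row
    return top_k.sum(axis=1).max()

# Example: for the identity-plus-Fourier dictionary in C^8, each column has
# exactly 8 cross-correlations equal to 1/sqrt(8), so babel(A, k) = k/sqrt(8)
# for k <= 8.
d = 8
A = np.hstack([np.eye(d), np.fft.fft(np.eye(d)) / np.sqrt(d)])
print(babel(A, 1), babel(A, 3))   # ≈ 0.3536, ≈ 1.0607
```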
References
1. Tropp, J.A. (March 2006). "Just relax: Convex programming methods for identifying sparse signals in noise". IEEE Transactions on Information Theory 52 (3): 1030–1051. doi:10.1109/TIT.2005.864420. https://authors.library.caltech.edu/9040/1/TROieeetit06.pdf.
2. Donoho, D.L.; Elad, M.; Temlyakov, V.N. (January 2006). "Stable recovery of sparse overcomplete representations in the presence of noise". IEEE Transactions on Information Theory 52 (1): 6–18. doi:10.1109/TIT.2005.860430.
3. Welch, L. R. (1974). "Lower bounds on the maximum cross-correlation of signals". IEEE Transactions on Information Theory 20 (3): 397–399. doi:10.1109/TIT.1974.1055219.
4. Xu, Zhiqiang (April 2011). "Deterministic Sampling of Sparse Trigonometric Polynomials". Journal of Complexity 27 (2): 133–140. doi:10.1016/j.jco.2011.01.007.
5. Donoho, D.L.; Elad, Michael (March 2003). "Optimally sparse representation in general (nonorthogonal) dictionaries via L1 minimization". Proc. Natl. Acad. Sci. 100 (5): 2197–2202. doi:10.1073/pnas.0437847100. PMID 16576749. Bibcode: 2003PNAS..100.2197D.
6. Donoho, D.L.; Huo, Xiaoming (November 2001). "Uncertainty principles and ideal atomic decomposition". IEEE Transactions on Information Theory 47 (7): 2845–2862. doi:10.1109/18.959265.
7. Fuchs, J.-J. (June 2004). "On sparse representations in arbitrary redundant bases". IEEE Transactions on Information Theory 50 (6): 1341–1344. doi:10.1109/TIT.2004.828141.
8. Tropp, Joel A. (2004). "Greed is good: Algorithmic results for sparse approximation". http://web.math.princeton.edu/tfbb/spring03/greed-ticam0304.pdf.
Further reading
- R1magic: R package providing mutual coherence computation.
Original source: https://en.wikipedia.org/wiki/Mutual_coherence_(linear_algebra).