Diffusion wavelets


Diffusion wavelets are a fast multiscale framework for the analysis of functions on discrete (or discretized continuous) structures such as graphs, manifolds, and point clouds in Euclidean space. They extend classical wavelet theory from harmonic analysis: unlike classical wavelets, whose basis functions are fixed in advance, diffusion wavelets are adapted to the geometry of a given diffusion operator [math]\displaystyle{ T }[/math] (e.g., a heat kernel or a random walk). The diffusion wavelet basis functions are constructed by dilation using the dyadic powers (powers of two) of [math]\displaystyle{ T }[/math]. These dyadic powers diffuse over the space and propagate local relationships in a function until they become global. If the rank of the higher powers of [math]\displaystyle{ T }[/math] decreases (i.e., its spectrum decays), these higher powers become compressible, and the decaying dyadic powers yield a chain of decreasing subspaces. These subspaces are the scaling-function approximation subspaces, and the differences between consecutive subspaces in the chain are the wavelet subspaces. Diffusion wavelets were first introduced in 2004 by Ronald Coifman and Mauro Maggioni at Yale University.[1]
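
As a concrete illustration of such an operator (the small weight matrix below is hypothetical, chosen only for this sketch), a random walk on a weighted graph can be obtained by normalizing the adjacency matrix by the node degrees; the dyadic powers of the resulting operator spread mass over the graph and eventually become numerically low-rank:

 import numpy as np
 
 # Hypothetical 4-node weighted graph (illustrative only).
 W = np.array([[0., 1., 1., 0.],
               [1., 0., 1., 0.],
               [1., 1., 0., 1.],
               [0., 0., 1., 0.]])
 
 # Random-walk normalization T = D^{-1} W, where D is the diagonal degree
 # matrix.  Each row of T sums to 1, so T is a diffusion (Markov) operator
 # on functions defined at the nodes of the graph.
 T = np.diag(1.0 / W.sum(axis=1)) @ W
 
 # The dyadic powers T, T^2, T^4, ... propagate local averages further at
 # every step.  This graph is connected and non-bipartite, so the walk is
 # ergodic: T^(2^j) approaches the rank-one projection onto the stationary
 # distribution, and its numerical rank eventually drops to 1.
 for j in range(7):
     P = np.linalg.matrix_power(T, 2 ** j)
     print(2 ** j, np.linalg.matrix_rank(P, tol=1e-6))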

Algorithm

This algorithm constructs the scaling basis functions and the wavelet basis functions along with the representations of the diffusion operator [math]\displaystyle{ T }[/math] at these scales.

In the algorithm below, the subscript notation [math]\displaystyle{ \Phi_a }[/math] and [math]\displaystyle{ \Psi_b }[/math] denotes the scaling basis functions at scale [math]\displaystyle{ a }[/math] and the wavelet basis functions at scale [math]\displaystyle{ b }[/math], respectively. The notation [math]\displaystyle{ [\Phi_b]_{\Phi_a} }[/math] denotes the matrix representation of the scaling basis [math]\displaystyle{ \Phi_b }[/math] with respect to the basis [math]\displaystyle{ \Phi_a }[/math]. Lastly, the notation [math]\displaystyle{ [T]_{\Phi_a}^{\Phi_b} }[/math] denotes the matrix representation of the operator [math]\displaystyle{ T }[/math] in which the row space of [math]\displaystyle{ T }[/math] is represented with respect to the basis [math]\displaystyle{ \Phi_a }[/math] and the column space of [math]\displaystyle{ T }[/math] is represented with respect to the basis [math]\displaystyle{ \Phi_b }[/math]. In other words, the domain of [math]\displaystyle{ T }[/math] is represented with respect to the basis [math]\displaystyle{ \Phi_a }[/math] and the range with respect to the basis [math]\displaystyle{ \Phi_b }[/math]. The function [math]\displaystyle{ QR }[/math] is a sparse QR decomposition with precision [math]\displaystyle{ \epsilon }[/math].[2]

// Input:
//    [math]\displaystyle{ T }[/math] is the matrix representation of the diffusion operator.
//    [math]\displaystyle{ \epsilon }[/math] is the precision of the QR decomposition, e.g., 1e-6.
//    [math]\displaystyle{ J }[/math] is the maximum number of scale levels (an optional upper bound; the construction may converge sooner).
// Output:
//    [math]\displaystyle{ \lbrace\Phi_j\rbrace }[/math] is the set of scaling basis functions indexed by scale [math]\displaystyle{ j }[/math].
//    [math]\displaystyle{ \lbrace\Psi_j\rbrace }[/math] is the set of wavelet basis functions indexed by scale [math]\displaystyle{ j }[/math].
  
[math]\displaystyle{ \lbrace\Phi_j\rbrace, \lbrace\Psi_j\rbrace \leftarrow \text{function DiffusionWaveletTree} ( T , \epsilon , J ): }[/math]
    [math]\displaystyle{ \textbf{for } j\leftarrow 0 \text{ to } J-1 }[/math]:
        [math]\displaystyle{ [\Phi_{j+1}]_{\Phi_j}, [T^{2^j}]_{\Phi_j}^{\Phi_{j+1}} \leftarrow QR\left([T^{2^j}]_{\Phi_j}^{\Phi_{j}}, \epsilon\right)   }[/math]
        [math]\displaystyle{ [T^{2^{j+1}}]_{\Phi_{j+1}}^{\Phi_{j+1}} \leftarrow  \left([T^{2^j}]_{\Phi_j}^{\Phi_{j+1}} [\Phi_{j+1}]_{\Phi_j}\right)^2 }[/math]
        [math]\displaystyle{ [\Psi_j]_{\Phi_j} \leftarrow QR\left(I_{\langle\Phi_j\rangle}-[\Phi_{j+1}]_{\Phi_j}\left([\Phi_{j+1}]_{\Phi_j}\right)^*, \epsilon\right) }[/math] 
    [math]\displaystyle{ \textbf{end for} }[/math]
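
A minimal dense-matrix sketch of this construction in Python is given below. It assumes NumPy and SciPy, approximates the [math]\displaystyle{ \epsilon }[/math]-precision sparse QR of the published algorithm by a column-pivoted QR truncated at a relative precision [math]\displaystyle{ \epsilon }[/math], and uses illustrative names (diffusion_wavelet_tree, eps_qr) that do not come from the original paper.

 import numpy as np
 from scipy.linalg import qr
 
 def eps_qr(A, eps):
     """Column-pivoted QR truncated at (relative) precision eps: returns Q
     with orthonormal columns spanning the range of A up to eps, a dense
     stand-in for the sparse rank-revealing QR used in the literature."""
     Q, R, _ = qr(A, mode='economic', pivoting=True)
     d = np.abs(np.diag(R))
     rank = int(np.sum(d > eps * d[0])) if d.size and d[0] > 0 else 0
     return Q[:, :rank]
 
 def diffusion_wavelet_tree(T, eps=1e-6, J=10):
     """Return the scaling bases Phi[j] = [Phi_{j+1}]_{Phi_j} and wavelet
     bases Psi[j] = [Psi_j]_{Phi_j}, following the pseudocode above."""
     Phi, Psi = [], []
     Tj = np.asarray(T, dtype=float)            # [T^{2^j}]_{Phi_j}^{Phi_j}
     for j in range(J):
         Q = eps_qr(Tj, eps)                    # [Phi_{j+1}]_{Phi_j}
         if Q.shape[1] == 0:                    # operator numerically zero
             break
         R = Q.T @ Tj                           # [T^{2^j}]_{Phi_j}^{Phi_{j+1}}
         Phi.append(Q)
         # Wavelets: an eps-basis for the complement of Phi_{j+1} in Phi_j.
         Psi.append(eps_qr(np.eye(Tj.shape[0]) - Q @ Q.T, eps))
         # Compress onto the new basis and square:
         # [T^{2^{j+1}}]_{Phi_{j+1}}^{Phi_{j+1}} = (R Q)^2.
         RQ = R @ Q
         Tj = RQ @ RQ
     return Phi, Psi

Applied to a diffusion operator such as the random-walk matrix constructed earlier, the number of columns of each [math]\displaystyle{ [\Phi_{j+1}]_{\Phi_j} }[/math] shrinks once the corresponding power of [math]\displaystyle{ T }[/math] becomes numerically low-rank.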

Applications

Mathematics

Diffusion wavelets are of general interest in mathematics. Specifically, they allow for the direct calculation of the Green's function and the inverse of the graph Laplacian.
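
The connection can be sketched with a standard identity (stated here as background rather than as the papers' full derivation): when the spectral radius of [math]\displaystyle{ T }[/math] on the relevant subspace is strictly less than one, the Neumann series for the Green's function telescopes into a product of exactly the dyadic powers that the construction above compresses,

[math]\displaystyle{ (I-T)^{-1} \;=\; \sum_{k=0}^{\infty} T^{k} \;=\; \prod_{j=0}^{\infty}\left(I + T^{2^{j}}\right), }[/math]

so the inverse can be applied by multiplying, scale by scale, with the compressed dyadic powers produced by the construction above (together with the corresponding changes of basis).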

Computer science

Diffusion wavelets have been used extensively in computer science, especially in machine learning. They have been applied to the following areas:

  • policy evaluation and value function approximation in Markov decision processes[2][3][5]
  • multiscale manifold alignment[4]
  • dimensionality reduction[6]
  • mesh compression in 3D computer graphics[7]
  • multiscale analysis of document corpora[8]
  • relation extraction in natural language processing[9]

See also

  • Wavelets

References

  1. Coifman, Ronald; Mauro Maggioni (May 2008). "Diffusion Wavelets". Applied and Computational Harmonic Analysis 24 (3): 329–353. Archived from the original on 2012-04-22. https://web.archive.org/web/20120422151024/http://www.math.duke.edu/~mauro/Papers/DiffusionWavelets.pdf. 
  2. Maggioni, Mauro; Mahadevan, Sridhar (2006). "Fast Direct Policy Evaluation using Multiscale Analysis of Markov Diffusion Processes". The 23rd International Conference on Machine Learning. http://www.cs.umass.edu/~mahadeva/papers/icml2006.pdf. 
  3. Mahadevan, Sridhar (2008). "Learning Representation and Control in Markov Decision Processes". Foundations and Trends in Machine Learning 1 (4). 
  4. Wang, Chang; Mahadevan, Sridhar (2010). "Multiscale Manifold Alignment". Univ. Of Massachusetts Technical Report (UM-CS-2010-049). http://www.cs.umass.edu/~mahadeva/papers/UM-CS-2010-049.pdf. 
  5. Mahadevan, Sridhar; Maggioni, Mauro (2006). "Value Function Approximation using Diffusion Wavelets and Laplacian Eigenfunctions". Advances in Neural Information Processing Systems. http://www.cs.umass.edu/~mahadeva/papers/nips-paper1-v5.pdf. 
  6. Wang, Chang; Mahadevan, Sridhar (2009). "Multiscale Dimensionality Reduction with Diffusion Wavelets". Univ. Of Massachusetts Technical Report (UM-CS-2009-030). http://www.cs.umass.edu/~mahadeva/papers/TR-2009-DP.pdf. 
  7. Mahadevan, Sridhar (2007). "Adaptive Mesh Compression in 3D Computer Graphics using Multiresolution Manifold Learning". The 24th International Conference on Machine Learning. http://www.cs.umass.edu/~mahadeva/papers/sridhar-icml07.pdf. 
  8. Wang, Chang; Mahadevan, Sridhar (2009). "Multiscale Analysis of Document Corpora Based on Diffusion Models". The 21st International Joint Conference on Artificial Intelligence. http://www.cs.umass.edu/~chwang/papers/IJCAI-2009-TD.pdf. (dead link)
  9. Wang, Chang; James Fan; Aditya A. Kalyanpur; David Gondek (2011). "Relation Extraction with Relation Topics". The 2011 Conference on Empirical Methods in Natural Language Processing. http://acl.eldoc.ub.rug.nl/mirror/D/D11/D11-1132.pdf. (dead link)
