Algebraic signal processing


Algebraic signal processing (ASP) is an emerging area of theoretical signal processing (SP). In the algebraic theory of signal processing, a set of filters is treated as an (abstract) algebra, a set of signals is treated as a module or vector space, and convolution is treated as an algebra representation. The advantage of algebraic signal processing is its generality and portability.

History

In the original formulation of algebraic signal processing by Puschel and Moura, the signals are collected in an [math]\displaystyle{ \mathcal{A} }[/math]-module for some algebra [math]\displaystyle{ \mathcal{A} }[/math] of filters, and filtering is given by the action of [math]\displaystyle{ \mathcal{A} }[/math] on the [math]\displaystyle{ \mathcal{A} }[/math]-module.[1]

Definitions

Let [math]\displaystyle{ K }[/math] be a field, for instance the complex numbers, and [math]\displaystyle{ \mathcal{A} }[/math] be a [math]\displaystyle{ K }[/math]-algebra (i.e. a vector space over [math]\displaystyle{ K }[/math] with a binary operation [math]\displaystyle{ \ast: \mathcal{A} \otimes \mathcal{A} \to \mathcal{A} }[/math] that is linear in both arguments) treated as a set of filters. Suppose [math]\displaystyle{ \mathcal{M} }[/math] is a vector space representing a set of signals. A representation of [math]\displaystyle{ \mathcal{A} }[/math] consists of an algebra homomorphism [math]\displaystyle{ \rho: \mathcal{A} \to \mathrm{End}(\mathcal{M}) }[/math], where [math]\displaystyle{ \mathrm{End}(\mathcal{M}) }[/math] is the algebra of linear transformations [math]\displaystyle{ T: \mathcal{M} \to \mathcal{M} }[/math] under composition (equivalent, in the finite-dimensional case, to matrix multiplication). For convenience, we write [math]\displaystyle{ \rho_{a} }[/math] for the endomorphism [math]\displaystyle{ \rho(a) }[/math]. To be an algebra homomorphism, [math]\displaystyle{ \rho }[/math] must not only be a linear transformation, but must also satisfy the property

[math]\displaystyle{ \rho_{a \ast b} = \rho_{a}\circ\rho_b \quad \forall a, b \in \mathcal{A}. }[/math]

Given a signal [math]\displaystyle{ x \in \mathcal{M} }[/math], convolution of the signal by a filter [math]\displaystyle{ a \in \mathcal{A} }[/math] yields a new signal [math]\displaystyle{ \rho_a(x) }[/math]. Some additional terminology is needed from the representation theory of algebras. A subset [math]\displaystyle{ \mathcal{G} \subseteq \mathcal{A} }[/math] is said to generate the algebra if every element of [math]\displaystyle{ \mathcal{A} }[/math] can be written as a polynomial in the elements of [math]\displaystyle{ \mathcal{G} }[/math]. The image [math]\displaystyle{ \rho_g }[/math] of a generator [math]\displaystyle{ g \in \mathcal{G} }[/math] is called a shift operator. In practically all examples, convolutions are formed as polynomials in the shift operators; however, this is not necessarily the case for a representation of an arbitrary algebra.
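
As a concrete illustration, the following minimal sketch (in Python with NumPy) realizes the polynomial algebra [math]\displaystyle{ \mathbb{C}[t] }[/math] acting on [math]\displaystyle{ \mathbb{C}^n }[/math] by sending the generator [math]\displaystyle{ t }[/math] to a cyclic shift matrix, and checks the homomorphism property numerically. The cyclic-shift setup and the filter coefficients are illustrative assumptions, not constructions taken from the cited references.

```python
import numpy as np

# Sketch: the algebra A = C[t] acts on the signal space M = C^n through a
# representation that sends the generator t to a cyclic shift matrix S.

n = 8
S = np.roll(np.eye(n), 1, axis=0)          # shift operator: (S x)_k = x_{k-1 mod n}

def rho(coeffs):
    """Map a polynomial a(t) = sum_k coeffs[k] t^k to the endomorphism sum_k coeffs[k] S^k."""
    return sum(c * np.linalg.matrix_power(S, k) for k, c in enumerate(coeffs))

# Two filters a(t) and b(t), given by their (illustrative) coefficient lists.
a = [1.0, 2.0, 0.5]
b = [0.0, -1.0, 0.0, 3.0]

# Coefficients of the product polynomial a(t) * b(t).
ab = np.polymul(a[::-1], b[::-1])[::-1]

# The homomorphism property rho_{a*b} = rho_a o rho_b holds up to floating-point error.
assert np.allclose(rho(ab), rho(a) @ rho(b))

# Filtering a signal x by the filter a(t) is the matrix-vector product rho_a(x).
x = np.random.randn(n)
y = rho(a) @ x
```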

Examples

Discrete Signal Processing

In discrete signal processing (DSP), the signal space is the set of square-summable complex-valued sequences [math]\displaystyle{ \mathcal{M} = \ell^2(\mathbb{Z}) }[/math], i.e. sequences with bounded energy. This means the infinite series [math]\displaystyle{ \sum_{n = -\infty}^{\infty} |(x)_n|^2 \lt \infty }[/math], where [math]\displaystyle{ |\cdot| }[/math] is the modulus of a complex number. The shift operator is given by the linear endomorphism [math]\displaystyle{ (S x)_n = (x)_{n-1} }[/math]. The filter space is the algebra of Laurent polynomials with complex coefficients [math]\displaystyle{ \mathcal{A} = \mathbb{C}[z^{-1},z] }[/math], and convolution is given by [math]\displaystyle{ \rho_{h} = \sum_{k = -\infty}^{\infty} h_k S^k }[/math], where [math]\displaystyle{ h(z) = \sum_{k=-\infty}^{\infty} h_k z^k }[/math] is an element of the algebra. Filtering a signal [math]\displaystyle{ x }[/math] by [math]\displaystyle{ h }[/math] then yields [math]\displaystyle{ (y)_n = \sum_{k=-\infty}^{\infty} h_k (x)_{n-k} }[/math] because [math]\displaystyle{ (S^k x)_n = (x)_{n-k} }[/math].
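
The sketch below illustrates this on a finite window, under the assumption that the doubly infinite signal is truncated to [math]\displaystyle{ n }[/math] samples with zeros outside the window: a causal FIR filter is applied as a polynomial in the shift operator, and the result is checked against the usual convolution sum. The filter taps are illustrative placeholders.

```python
import numpy as np

n = 16
x = np.random.randn(n)

# Delay operator on the window: (S x)_k = x_{k-1}, with x_{-1} treated as 0.
S = np.diag(np.ones(n - 1), -1)

# A causal FIR filter h(z) = h_0 + h_1 z + h_2 z^2 acting as rho_h = sum_k h_k S^k.
h = np.array([0.5, 1.0, -0.25])
rho_h = sum(hk * np.linalg.matrix_power(S, k) for k, hk in enumerate(h))

y = rho_h @ x

# Inside the window this agrees with the usual convolution sum y_n = sum_k h_k x_{n-k}.
assert np.allclose(y, np.convolve(h, x)[:n])
```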

Graph Signal Processing

A weighted graph is an undirected graph [math]\displaystyle{ \mathcal{G} = (\mathcal{V}, \mathcal{E}) }[/math] with a nonnegative weight [math]\displaystyle{ a_{ij} }[/math] assigned to each pair of nodes [math]\displaystyle{ i, j \in \mathcal{V} }[/math]. A graph signal is simply a real-valued function on the set of nodes of the graph. In graph neural networks, graph signals are sometimes called features. The signal space is the set of all graph signals [math]\displaystyle{ \mathcal{M} = \R^\mathcal{V} }[/math], where [math]\displaystyle{ \mathcal{V} }[/math] is the set of [math]\displaystyle{ n = |\mathcal{V}| }[/math] nodes of [math]\displaystyle{ \mathcal{G} = (\mathcal{V}, \mathcal{E}) }[/math]. The filter algebra is the algebra of polynomials in one indeterminate [math]\displaystyle{ \mathcal{A} = \mathbb{R}[t] }[/math]. There are a few possible choices for a graph shift operator (GSO). The (un)normalized weighted adjacency matrix [math]\displaystyle{ [A]_{ij} = a_{ij} }[/math] is a popular choice, as is the (un)normalized graph Laplacian [math]\displaystyle{ [L]_{ij} = \begin{cases} \sum_{k = 1}^{n}a_{ik} & i = j \\ -a_{ij} & i \neq j \end{cases} }[/math]. The choice depends on performance and design considerations. If [math]\displaystyle{ S }[/math] is the GSO, then a graph convolution is the linear transformation [math]\displaystyle{ \rho_h = \sum_{k=0}^{K} h_k S^k }[/math] for some polynomial [math]\displaystyle{ h(t) = \sum_{k=0}^{K} h_k t^k }[/math], and convolution of a graph signal [math]\displaystyle{ \mathbf{x}: \mathcal{V} \to \mathbb{R} }[/math] by a filter [math]\displaystyle{ h(t) }[/math] yields a new graph signal [math]\displaystyle{ \mathbf{y} = \left(\sum_{k=0}^{K} h_k S^k \right) \mathbf{x} }[/math].
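
A minimal sketch of a graph convolution might look as follows; the small graph, the choice of the unnormalized Laplacian as GSO, and the filter coefficients are illustrative assumptions rather than an example from the references.

```python
import numpy as np

# Weighted adjacency matrix of a small undirected graph on 4 nodes.
A = np.array([[0.0, 1.0, 0.0, 2.0],
              [1.0, 0.0, 3.0, 0.0],
              [0.0, 3.0, 0.0, 1.0],
              [2.0, 0.0, 1.0, 0.0]])

# Unnormalized graph Laplacian L = D - A, one common choice of graph shift operator.
L = np.diag(A.sum(axis=1)) - A

# A degree-2 polynomial filter h(t) = h_0 + h_1 t + h_2 t^2.
h = [1.0, -0.5, 0.1]

def graph_convolution(S, coeffs, x):
    """Apply rho_h = sum_k coeffs[k] S^k to a graph signal x."""
    y = np.zeros_like(x)
    Sk_x = x.copy()                 # holds S^k x, starting from S^0 x = x
    for c in coeffs:
        y += c * Sk_x
        Sk_x = S @ Sk_x             # advance to S^{k+1} x
    return y

x = np.array([1.0, 0.0, -1.0, 2.0])   # a graph signal, one value per node
y = graph_convolution(L, h, x)
```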

Other Examples

Signal-processing frameworks proposed for other mathematical objects are likewise algebraic signal models. These objects include quivers,[2] graphons,[3] semilattices,[4] finite groups, and Lie groups,[5] among others.

Intertwining Maps

In the framework of representation theory, relationships between two representations of the same algebra are described with intertwining maps, which in the context of signal processing translate to transformations of signals that respect the algebra structure. Suppose [math]\displaystyle{ \rho: \mathcal{A} \to \mathrm{End}(\mathcal{M}) }[/math] and [math]\displaystyle{ \rho': \mathcal{A} \to \mathrm{End}(\mathcal{M}') }[/math] are two different representations of [math]\displaystyle{ \mathcal{A} }[/math]. An intertwining map is a linear transformation [math]\displaystyle{ \alpha: \mathcal{M} \to \mathcal{M}' }[/math] such that

[math]\displaystyle{ \alpha \circ \rho_a = \rho'_a \circ \alpha \quad \forall a \in \mathcal{A} }[/math]

Intuitively, this means that filtering a signal by [math]\displaystyle{ a }[/math] and then transforming it with [math]\displaystyle{ \alpha }[/math] is equivalent to first transforming the signal with [math]\displaystyle{ \alpha }[/math] and then filtering by [math]\displaystyle{ a }[/math] in the second representation. The z-transform[1] is a prototypical example of an intertwining map.
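
The following finite-dimensional sketch illustrates the intertwining property using the DFT, a finite analogue of the z-transform: the DFT matrix intertwines the time-domain representation generated by the cyclic shift with a frequency-domain representation by diagonal matrices. The specific filter coefficients are illustrative placeholders.

```python
import numpy as np

n = 8
S = np.roll(np.eye(n), 1, axis=0)                  # cyclic shift on C^n
F = np.fft.fft(np.eye(n))                          # DFT matrix, playing the role of alpha

# A filter a(z) = a_0 + a_1 z + a_2 z^2 and its two representations.
a = [1.0, 0.5, -0.25]
rho_a = sum(c * np.linalg.matrix_power(S, k) for k, c in enumerate(a))   # time domain
freq_response = F @ rho_a @ np.linalg.inv(F)
rho_prime_a = np.diag(np.diag(freq_response))      # frequency domain: a diagonal matrix

# The DFT diagonalizes rho_a, so rho'_a really is diagonal ...
assert np.allclose(freq_response, rho_prime_a)
# ... and the intertwining property alpha o rho_a = rho'_a o alpha holds.
assert np.allclose(F @ rho_a, rho_prime_a @ F)
```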

Algebraic Neural Networks

Inspired by the recent perspective that popular graph neural network (GNN) architectures are in fact convolutional neural networks (CNNs),[6] recent work has focused on developing novel neural network architectures from the algebraic point of view.[7][8] An algebraic neural network is a composition of algebraic convolutions, possibly with multiple features and feature aggregations, and nonlinearities.
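
A minimal sketch of a single-feature algebraic neural network in this spirit follows: each layer applies an algebraic convolution (a polynomial in the shift operator) followed by a pointwise nonlinearity. The graph, filter taps, and choice of nonlinearity are illustrative assumptions, not the architecture of any cited work.

```python
import numpy as np

def algebraic_nn_layer(S, coeffs, x, sigma=np.tanh):
    """One layer: y = sigma( sum_k coeffs[k] S^k x )."""
    y = np.zeros_like(x)
    Sk_x = x.copy()                 # holds S^k x, starting from x
    for c in coeffs:
        y += c * Sk_x
        Sk_x = S @ Sk_x
    return sigma(y)

# Example: two stacked layers on a 4-node graph with the adjacency matrix as GSO.
A = np.array([[0.0, 1.0, 0.0, 2.0],
              [1.0, 0.0, 3.0, 0.0],
              [0.0, 3.0, 0.0, 1.0],
              [2.0, 0.0, 1.0, 0.0]])
x = np.array([1.0, 0.0, -1.0, 2.0])
h1, h2 = [0.2, 0.4, 0.1], [1.0, -0.3]     # per-layer filter coefficients
out = algebraic_nn_layer(A, h2, algebraic_nn_layer(A, h1, x))
```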

References

  1. Puschel, M.; Moura, J. (2008). "Algebraic Signal Processing Theory: Foundation and 1-D Time". IEEE Transactions on Signal Processing 56 (8): 3572–3585. doi:10.1109/TSP.2008.925261. ISSN 1053-587X. Bibcode: 2008ITSP...56.3572P. https://ieeexplore.ieee.org/document/4520147.
  2. Parada-Mayorga, Alejandro; Riess, Hans; Ribeiro, Alejandro; Ghrist, Robert (2020-10-22). "Quiver Signal Processing (QSP)". arXiv:2010.11525 [eess.SP].
  3. Ruiz, Luana; Chamon, Luiz F. O.; Ribeiro, Alejandro (2021). "Graphon Signal Processing". IEEE Transactions on Signal Processing 69: 4961–4976. doi:10.1109/TSP.2021.3106857. ISSN 1053-587X. Bibcode: 2021ITSP...69.4961R. https://ieeexplore.ieee.org/document/9521757.
  4. Puschel, Markus; Seifert, Bastian; Wendler, Chris (2021). "Discrete Signal Processing on Meet/Join Lattices". IEEE Transactions on Signal Processing 69: 3571–3584. doi:10.1109/TSP.2021.3081036. ISSN 1053-587X. Bibcode: 2021ITSP...69.3571P. https://ieeexplore.ieee.org/document/9435014.
  5. Bernardini, Riccardo; Rinaldo, Roberto (2021). "Demystifying Lie Group Methods for Signal Processing: A Tutorial". IEEE Signal Processing Magazine 38 (2): 45–64. doi:10.1109/MSP.2020.3023540. ISSN 1053-5888. Bibcode: 2021ISPM...38b..45B. https://ieeexplore.ieee.org/document/9363498.
  6. Gama, Fernando; Isufi, Elvin; Leus, Geert; Ribeiro, Alejandro (2020). "Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks". IEEE Signal Processing Magazine 37 (6): 128–138. doi:10.1109/MSP.2020.3016143. ISSN 1053-5888. Bibcode: 2020ISPM...37f.128G. https://ieeexplore.ieee.org/document/9244191.
  7. Parada-Mayorga, Alejandro; Ribeiro, Alejandro (2021). "Algebraic Neural Networks: Stability to Deformations". IEEE Transactions on Signal Processing 69: 3351–3366. doi:10.1109/TSP.2021.3084537. ISSN 1053-587X. Bibcode: 2021ITSP...69.3351P. https://ieeexplore.ieee.org/document/9442929.
  8. Parada-Mayorga, Alejandro; Butler, Landon; Ribeiro, Alejandro (2022). "Convolutional Filtering and Neural Networks with Non Commutative Algebras". arXiv:2108.09923 [cs.LG].