Wrapped distribution


In probability theory and directional statistics, a wrapped probability distribution is a continuous probability distribution that describes data points that lie on a unit n-sphere. In one dimension, a wrapped distribution consists of points on the unit circle. If [math]\displaystyle{ \phi }[/math] is a random variate in the interval [math]\displaystyle{ (-\infty,\infty) }[/math] with probability density function (PDF) [math]\displaystyle{ p(\phi) }[/math], then [math]\displaystyle{ z = e^{i\phi} }[/math] is a circular variable distributed according to the wrapped distribution [math]\displaystyle{ p_{wz}(\theta) }[/math] and [math]\displaystyle{ \theta = \arg(z) }[/math] is an angular variable in the interval [math]\displaystyle{ (-\pi,\pi] }[/math] distributed according to the wrapped distribution [math]\displaystyle{ p_w(\theta) }[/math]. Any probability density function [math]\displaystyle{ p(\phi) }[/math] on the line can be "wrapped" around the circumference of a circle of unit radius.[1] That is, the PDF of the wrapped variable

[math]\displaystyle{ \theta=\phi \mod 2\pi }[/math] in some interval of length [math]\displaystyle{ 2\pi }[/math]

is

[math]\displaystyle{ p_w(\theta)=\sum_{k=-\infty}^\infty {p(\theta+2\pi k)} }[/math]

which is a periodic sum of period [math]\displaystyle{ 2\pi }[/math]. The preferred interval is generally [math]\displaystyle{ (-\pi\lt \theta\le\pi) }[/math], for which [math]\displaystyle{ \arg(e^{i\theta})=\theta }[/math].
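As a concrete illustration, the wrapping sum can be truncated numerically. The following sketch (the choice of a normal density with standard deviation 2 for [math]\displaystyle{ p }[/math] is arbitrary) checks that the wrapped density still integrates to 1 over an interval of length [math]\displaystyle{ 2\pi }[/math]:

```python
import math

def wrapped_pdf(p, theta, K=50):
    """Truncated wrapping sum: p_w(theta) ~ sum_{k=-K}^{K} p(theta + 2*pi*k)."""
    return sum(p(theta + 2 * math.pi * k) for k in range(-K, K + 1))

# Unwrapped density: a normal with mean 0 and standard deviation 2
# (an arbitrary illustrative choice).
sigma = 2.0
def p(x):
    return math.exp(-x * x / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# The wrapped density must integrate to 1 over any interval of length 2*pi;
# check on (-pi, pi] by midpoint quadrature.
n = 1000
h = 2 * math.pi / n
total = sum(wrapped_pdf(p, -math.pi + h * (i + 0.5)) for i in range(n)) * h
print(total)  # ≈ 1
```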

Theory

In most situations, a process involving circular statistics produces angles ([math]\displaystyle{ \phi }[/math]) which lie in the interval [math]\displaystyle{ (-\infty,\infty) }[/math], and are described by an "unwrapped" probability density function [math]\displaystyle{ p(\phi) }[/math]. However, a measurement will yield an angle [math]\displaystyle{ \theta }[/math] which lies in some interval of length [math]\displaystyle{ 2\pi }[/math] (for example, 0 to [math]\displaystyle{ 2\pi }[/math]). In other words, a measurement cannot tell whether the true angle [math]\displaystyle{ \phi }[/math] or a wrapped angle [math]\displaystyle{ \theta = \phi+2\pi a }[/math], where [math]\displaystyle{ a }[/math] is some unknown integer, has been measured.

If we wish to calculate the expected value of some function of the measured angle, it will be:

[math]\displaystyle{ \langle f(\theta)\rangle=\int_{-\infty}^\infty p(\phi)f(\phi+2\pi a)d\phi }[/math].

We can express the integral as a sum of integrals over periods of [math]\displaystyle{ 2\pi }[/math]:

[math]\displaystyle{ \langle f(\theta)\rangle=\sum_{k=-\infty}^\infty \int_{2\pi k}^{2\pi(k+1)} p(\phi)f(\phi+2\pi a)d\phi }[/math].

Changing the variable of integration to [math]\displaystyle{ \theta'=\phi-2\pi k }[/math] and exchanging the order of integration and summation, we have

[math]\displaystyle{ \langle f(\theta)\rangle= \int_0^{2\pi} p_w(\theta')f(\theta'+2\pi a')d\theta' }[/math]

where [math]\displaystyle{ p_w(\theta') }[/math] is the PDF of the wrapped distribution and [math]\displaystyle{ a' }[/math] is another unknown integer [math]\displaystyle{ (a'=a+k) }[/math]. The unknown integer [math]\displaystyle{ a' }[/math] introduces an ambiguity into the expected value of [math]\displaystyle{ f(\theta) }[/math], similar to the problem of calculating angular mean. This can be resolved by introducing the parameter [math]\displaystyle{ z=e^{i\theta} }[/math], since [math]\displaystyle{ z }[/math] has an unambiguous relationship to the true angle [math]\displaystyle{ \phi }[/math]:

[math]\displaystyle{ z=e^{i\theta}=e^{i\phi} }[/math].

Calculating the expected value of a function of [math]\displaystyle{ z }[/math] will yield unambiguous answers:

[math]\displaystyle{ \langle f(z)\rangle= \int_0^{2\pi} p_w(\theta')f(e^{i\theta'})d\theta' }[/math].
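This can be demonstrated numerically: the arithmetic mean of the measured angles depends on which interval of length [math]\displaystyle{ 2\pi }[/math] the measurements are reported in, while the sample mean of [math]\displaystyle{ z=e^{i\theta} }[/math] does not. A sketch, using an arbitrarily chosen normal distribution of unwrapped angles centered near the wrap point:

```python
import cmath
import math
import random

random.seed(0)
# Unwrapped angles: a normal centered near the wrap point pi
# (an arbitrary illustrative choice).
phi = [random.gauss(math.pi, 0.5) for _ in range(100000)]

# Two different measurement conventions, i.e. two intervals of length 2*pi:
theta_a = [f % (2 * math.pi) for f in phi]                        # [0, 2*pi)
theta_b = [(f + math.pi) % (2 * math.pi) - math.pi for f in phi]  # (-pi, pi]

# The naive arithmetic means of the measured angles disagree...
mean_a = sum(theta_a) / len(theta_a)
mean_b = sum(theta_b) / len(theta_b)

# ...but the sample expectation of z = e^{i*theta} is the same either way,
# because z is unaffected by shifts of 2*pi.
z_a = sum(cmath.exp(1j * t) for t in theta_a) / len(theta_a)
z_b = sum(cmath.exp(1j * t) for t in theta_b) / len(theta_b)
print(abs(mean_a - mean_b), abs(z_a - z_b))
```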

For this reason, the [math]\displaystyle{ z }[/math] parameter is preferred over measured angles [math]\displaystyle{ \theta }[/math] in circular statistical analysis. This suggests that the wrapped distribution function may itself be expressed as a function of [math]\displaystyle{ z }[/math] such that:

[math]\displaystyle{ \langle f(z)\rangle= \oint p_{wz}(z)f(z)\,|dz| }[/math]

where [math]\displaystyle{ p_{wz}(z) }[/math] is defined such that [math]\displaystyle{ p_w(\theta)\,|d\theta|=p_{wz}(z)\,|dz| }[/math]. This concept can be extended to the multivariate context by replacing the single sum with [math]\displaystyle{ F }[/math] sums, one for each dimension of the feature space:

[math]\displaystyle{ p_w(\vec\theta)=\sum_{k_1,...,k_F=-\infty}^{\infty}{p(\vec\theta+2\pi k_1\mathbf{e}_1+\dots+2\pi k_F\mathbf{e}_F)} }[/math]

where [math]\displaystyle{ \mathbf{e}_k=(0,\dots,0,1,0,\dots,0)^{\mathsf{T}} }[/math] is the [math]\displaystyle{ k }[/math]th Euclidean basis vector.
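The multivariate sum can be sketched directly in the same way as the one-dimensional case. Here the unwrapped density is taken to be an independent bivariate standard normal (an arbitrary illustrative choice), and normalization is checked over the square [math]\displaystyle{ (-\pi,\pi]\times(-\pi,\pi] }[/math]:

```python
import math

def wrapped_pdf_2d(p, t1, t2, K=3):
    """p_w(theta1, theta2): sum of p over all 2*pi shifts in each coordinate."""
    return sum(p(t1 + 2 * math.pi * k1, t2 + 2 * math.pi * k2)
               for k1 in range(-K, K + 1) for k2 in range(-K, K + 1))

# Independent bivariate standard normal (an arbitrary illustrative choice).
def p(x, y):
    return math.exp(-(x * x + y * y) / 2) / (2 * math.pi)

# Normalization over the square (-pi, pi] x (-pi, pi], by midpoint quadrature.
n = 80
h = 2 * math.pi / n
grid = [-math.pi + h * (i + 0.5) for i in range(n)]
total = sum(wrapped_pdf_2d(p, a, b) for a in grid for b in grid) * h * h
print(total)  # ≈ 1
```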

Expression in terms of characteristic functions

A fundamental wrapped distribution is the Dirac comb, which is a wrapped Dirac delta function:

[math]\displaystyle{ \Delta_{2\pi}(\theta)=\sum_{k=-\infty}^{\infty}{\delta(\theta+2\pi k)} }[/math].

Using the delta function, a general wrapped distribution can be written

[math]\displaystyle{ p_w(\theta)=\sum_{k= -\infty}^{\infty}\int_{-\infty}^\infty p(\theta')\delta(\theta-\theta'+2\pi k)\,d\theta' }[/math].

Exchanging the order of summation and integration, any wrapped distribution can be written as the convolution of the unwrapped distribution and a Dirac comb:

[math]\displaystyle{ p_w(\theta)=\int_{-\infty}^\infty p(\theta')\Delta_{2\pi}(\theta-\theta')\,d\theta' }[/math].

The Dirac comb may also be expressed as a sum of exponentials, so we may write:

[math]\displaystyle{ p_w(\theta)=\frac{1}{2\pi}\,\int_{-\infty}^\infty p(\theta')\sum_{n=-\infty}^{\infty}e^{in(\theta-\theta')}\,d\theta' }[/math].

Again exchanging the order of summation and integration:

[math]\displaystyle{ p_w(\theta)=\frac{1}{2\pi}\,\sum_{n=-\infty}^{\infty}\int_{-\infty}^\infty p(\theta')e^{in(\theta-\theta')}\,d\theta' }[/math].

The inner integral is [math]\displaystyle{ e^{in\theta}\phi(-n) }[/math], where [math]\displaystyle{ \phi(s)=\int_{-\infty}^\infty p(\theta')e^{is\theta'}\,d\theta' }[/math] is the characteristic function of [math]\displaystyle{ p(\theta) }[/math]. Re-indexing the sum ([math]\displaystyle{ n \to -n }[/math]) then yields a Laurent series about zero for the wrapped distribution in terms of the characteristic function of the unwrapped distribution:

[math]\displaystyle{ p_w(\theta)=\frac{1}{2\pi}\,\sum_{n=-\infty}^{\infty} \phi(n)\,e^{-in\theta} }[/math]

or

[math]\displaystyle{ p_{wz}(z)=\frac{1}{2\pi}\,\sum_{n=-\infty}^{\infty} \phi(n)\,z^{-n} }[/math]

By analogy with linear distributions, [math]\displaystyle{ \phi(n) }[/math] is referred to as the characteristic function of the wrapped distribution (or, more accurately, the characteristic sequence).[2] This is an instance of the Poisson summation formula: the coefficients of the Fourier series for the wrapped distribution are simply the values of the Fourier transform of the unwrapped distribution at integer arguments.
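The wrapped normal distribution gives a convenient concrete check of this series, since the characteristic function of [math]\displaystyle{ N(\mu,\sigma^2) }[/math] has the closed form [math]\displaystyle{ \phi(s)=e^{is\mu-s^2\sigma^2/2} }[/math]. The sketch below (parameters chosen arbitrarily) compares the truncated Fourier form against direct wrapping:

```python
import cmath
import math

mu, sigma = 1.0, 0.8  # arbitrary parameters for the unwrapped normal

def char(s):
    """Characteristic function of N(mu, sigma^2): exp(i*s*mu - s^2*sigma^2/2)."""
    return cmath.exp(1j * s * mu - 0.5 * (s * sigma) ** 2)

def pw_fourier(theta, N=30):
    """Wrapped density from the characteristic-sequence (Fourier) form."""
    s = sum(char(n) * cmath.exp(-1j * n * theta) for n in range(-N, N + 1))
    return (s / (2 * math.pi)).real

def pw_direct(theta, K=30):
    """Wrapped density by direct summation of shifted normal pdfs."""
    return sum(math.exp(-(theta + 2 * math.pi * k - mu) ** 2 / (2 * sigma ** 2))
               for k in range(-K, K + 1)) / (sigma * math.sqrt(2 * math.pi))

diff = max(abs(pw_fourier(t) - pw_direct(t))
           for t in (-math.pi + 0.1 * i for i in range(63)))
print(diff)  # the two forms agree to high precision
```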

Moments

The moments of the wrapped distribution [math]\displaystyle{ p_{wz}(z) }[/math] are defined as:

[math]\displaystyle{ \langle z^m \rangle = \oint p_{wz}(z)z^m \, |dz| }[/math].

Expressing [math]\displaystyle{ p_{wz}(z) }[/math] in terms of the characteristic function and exchanging the order of integration and summation yields:

[math]\displaystyle{ \langle z^m \rangle = \frac{1}{2\pi}\sum_{n=-\infty}^\infty \phi(n)\oint z^{m-n}\,|dz| }[/math].

Since [math]\displaystyle{ z=e^{i\theta} }[/math] lies on the unit circle, [math]\displaystyle{ |dz|=d\theta }[/math], and the orthogonality of the complex exponentials gives

[math]\displaystyle{ \oint z^{m-n}\,|dz| = \int_0^{2\pi} e^{i(m-n)\theta}\,d\theta = 2\pi \delta_{m-n} }[/math]

where [math]\displaystyle{ \delta_k }[/math] is the Kronecker delta function. It follows that the moments are simply equal to the characteristic function of the unwrapped distribution for integer arguments:

[math]\displaystyle{ \langle z^m \rangle = \phi(m) }[/math].
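This identity is easy to verify by Monte Carlo: the sample circular moments of wrapped normal draws should approach the characteristic function of the unwrapped normal at integer arguments. A sketch (parameters and sample size are arbitrary):

```python
import cmath
import random

random.seed(1)
mu, sigma = 0.7, 1.2  # arbitrary parameters for the unwrapped normal
x = [random.gauss(mu, sigma) for _ in range(200000)]

errs = []
for m in (1, 2, 3):
    # Sample estimate of the circular moment <z^m> from the wrapped variates...
    zm = sum(cmath.exp(1j * m * xi) for xi in x) / len(x)
    # ...compared with the characteristic function of N(mu, sigma^2) at s = m.
    phi_m = cmath.exp(1j * m * mu - 0.5 * (m * sigma) ** 2)
    errs.append(abs(zm - phi_m))
print(errs)  # small Monte Carlo errors
```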

Generation of random variates

If [math]\displaystyle{ X }[/math] is a random variate drawn from a linear probability distribution [math]\displaystyle{ P }[/math], then [math]\displaystyle{ Z=e^{i X} }[/math] is a circular variate distributed according to the wrapped [math]\displaystyle{ P }[/math] distribution, and [math]\displaystyle{ \theta=\arg(Z) }[/math] is an angular variate distributed according to the wrapped [math]\displaystyle{ P }[/math] distribution, taking values in [math]\displaystyle{ (-\pi, \pi] }[/math].
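For example, a wrapped Cauchy variate can be generated by drawing a linear Cauchy variate via inverse-transform sampling and wrapping it (the location and scale below are arbitrary). The first circular moment can then be checked against the Cauchy characteristic function [math]\displaystyle{ \phi(s)=e^{isx_0-\gamma|s|} }[/math]:

```python
import cmath
import math
import random

random.seed(2)

def wrapped_cauchy_sample(x0, gamma):
    """Draw X from Cauchy(x0, gamma) by inverse-transform sampling, then wrap."""
    u = random.random()
    x = x0 + gamma * math.tan(math.pi * (u - 0.5))  # linear Cauchy variate X
    z = cmath.exp(1j * x)                           # Z = e^{iX} on the unit circle
    return cmath.phase(z)                           # theta = arg(Z)

samples = [wrapped_cauchy_sample(0.5, 0.3) for _ in range(100000)]

# First circular moment versus the Cauchy characteristic function
# phi(1) = exp(i*x0 - gamma).
z1 = sum(cmath.exp(1j * t) for t in samples) / len(samples)
print(abs(z1 - cmath.exp(1j * 0.5 - 0.3)))  # small Monte Carlo error
```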

Entropy

The information entropy of a circular distribution with probability density [math]\displaystyle{ p_w(\theta) }[/math] is defined as:

[math]\displaystyle{ H = -\int_\Gamma p_w(\theta)\,\ln(p_w(\theta))\,d\theta }[/math]

where [math]\displaystyle{ \Gamma }[/math] is any interval of length [math]\displaystyle{ 2\pi }[/math].[1] If both the probability density and its logarithm can be expressed as a Fourier series (or more generally, any integral transform on the circle), the orthogonal basis of the series can be used to obtain a closed form expression for the entropy.

The moments of the distribution, [math]\displaystyle{ \phi_n=\phi(n) }[/math], are the Fourier coefficients for the Fourier series expansion of the probability density:

[math]\displaystyle{ p_w(\theta)=\frac{1}{2\pi}\sum_{n=-\infty}^\infty \phi_n e^{-in\theta} }[/math].

If the logarithm of the probability density can also be expressed as a Fourier series:

[math]\displaystyle{ \ln(p_w(\theta))=\sum_{m=-\infty}^\infty c_m e^{im\theta} }[/math]

where

[math]\displaystyle{ c_m=\frac{1}{2\pi}\int_\Gamma \ln(p_w(\theta))e^{-i m \theta}\,d\theta }[/math].

Then, exchanging the order of integration and summation, the entropy may be written as:

[math]\displaystyle{ H=-\frac{1}{2\pi}\sum_{m=-\infty}^\infty\sum_{n=-\infty}^\infty c_m \phi_n \int_\Gamma e^{i(m-n)\theta}\,d\theta }[/math].

Using the orthogonality of the Fourier basis, the integral may be reduced to:

[math]\displaystyle{ H=-\sum_{n=-\infty}^\infty c_n \phi_n }[/math].

For the particular case when the probability density is symmetric about the mean, [math]\displaystyle{ c_{-m}=c_m }[/math] and the logarithm may be written:

[math]\displaystyle{ \ln(p_w(\theta))= c_0 + 2\sum_{m=1}^\infty c_m \cos(m\theta) }[/math]

and

[math]\displaystyle{ c_m=\frac{1}{2\pi}\int_\Gamma \ln(p_w(\theta))\cos(m\theta)\,d\theta }[/math]

and, since normalization requires that [math]\displaystyle{ \phi_0=1 }[/math], the entropy may be written:

[math]\displaystyle{ H=-c_0-2\sum_{n=1}^\infty c_n \phi_n }[/math].
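As a numerical check of this formula (a sketch, assuming a zero-mean wrapped normal with an arbitrarily chosen [math]\displaystyle{ \sigma }[/math], for which [math]\displaystyle{ \phi_n=e^{-n^2\sigma^2/2} }[/math]), the entropy computed directly from its definition can be compared with the Fourier-coefficient expression:

```python
import math

sigma = 0.8  # arbitrary scale for a wrapped normal with mean zero
N = 400      # grid points over (-pi, pi]
h = 2 * math.pi / N
grid = [-math.pi + h * (i + 0.5) for i in range(N)]

def pw(theta, K=20):
    """Wrapped normal density via direct summation of shifted normal pdfs."""
    return sum(math.exp(-(theta + 2 * math.pi * k) ** 2 / (2 * sigma ** 2))
               for k in range(-K, K + 1)) / (sigma * math.sqrt(2 * math.pi))

p = [pw(t) for t in grid]

# Entropy directly from the definition, by midpoint quadrature.
H_direct = -sum(q * math.log(q) for q in p) * h

# Fourier coefficients c_m of ln(p_w), using the symmetric (cosine) form.
def c(m):
    return sum(math.log(p[i]) * math.cos(m * grid[i])
               for i in range(N)) * h / (2 * math.pi)

# Moments phi_n = exp(-n^2 sigma^2 / 2) for the zero-mean wrapped normal.
M = 30
H_fourier = -c(0) - 2 * sum(c(n) * math.exp(-0.5 * (n * sigma) ** 2)
                            for n in range(1, M + 1))
print(H_direct, H_fourier)  # the two estimates agree closely
```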
