Probabilistic metric space

In mathematics, a probabilistic metric space is a generalization of a metric space in which the distance no longer takes values in the non-negative real numbers R≥0, but in distribution functions. Let D+ be the set of all probability distribution functions F such that F(0) = 0 (F is a nondecreasing, left-continuous mapping from R into [0, 1] such that max(F) = 1).

Then given a non-empty set S and a function F: S × S → D+, where we denote F(p, q) by Fp,q for every (p, q) ∈ S × S, the ordered pair (S, F) is said to be a probabilistic metric space if (as illustrated by the sketch after the list):

  • For all u and v in S, u = v if and only if Fu,v(x) = 1 for all x > 0.
  • For all u and v in S, Fu,v = Fv,u.
  • For all u, v and w in S, Fu,v(x) = 1 and Fv,w(y) = 1 ⇒ Fu,w(x + y) = 1 for x, y > 0.
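
One standard way to see the definition at work: an ordinary metric space (S, d) induces distribution functions in D+ by letting Fp,q jump from 0 to 1 at d(p, q), and these step functions satisfy the three conditions above. The following Python sketch checks this on a toy example; the point set, the grid, and the helper names are illustrative assumptions, not part of the formal definition.

```python
# Minimal sketch: a metric space (S, d) induces F_{p,q}(x) = 1 if x > d(p, q), else 0,
# and these step distributions satisfy the three probabilistic-metric-space axioms.
from itertools import product

S = [(0.0, 0.0), (3.0, 4.0), (6.0, 8.0)]              # a toy set of points

def d(p, q):
    """Ordinary Euclidean distance on S."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def F(p, q):
    """F_{p,q} in D+: a step distribution jumping from 0 to 1 at d(p, q)."""
    return lambda x: 1.0 if x > d(p, q) else 0.0

xs = [0.5 * k for k in range(1, 41)]                  # coarse grid of positive x values
for u, v in product(S, repeat=2):
    # 1. F_{u,v}(x) = 1 for all x > 0 exactly when u = v (checked on the grid)
    assert all(F(u, v)(x) == 1.0 for x in xs) == (u == v)
    # 2. symmetry: F_{u,v} = F_{v,u}
    assert all(F(u, v)(x) == F(v, u)(x) for x in xs)
for u, v, w in product(S, repeat=3):
    # 3. triangle-like condition
    for x, y in product(xs, repeat=2):
        if F(u, v)(x) == 1.0 and F(v, w)(y) == 1.0:
            assert F(u, w)(x + y) == 1.0
```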

Probability metric of random variables

A probability metric D between two random variables X and Y may be defined, for example, as

[math]\displaystyle{ D(X, Y) = \int_{-\infty}^\infty \int_{-\infty}^\infty |x-y|F(x, y) \, dx dy }[/math]

where F(x, y) denotes the joint probability density function of the random variables X and Y. If X and Y are independent of each other, then the equation above becomes

[math]\displaystyle{ D(X, Y) = \int_{-\infty}^\infty \int_{-\infty}^\infty |x-y|f(x)g(y) \, dx dy }[/math]

where f(x) and g(y) are probability density functions of X and Y respectively.
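
As a rough numerical illustration, the double integral above can be approximated on a finite grid. The helper name prob_metric_indep, the grid bounds, and the chosen normal densities below are assumptions for this sketch, not a standard library interface.

```python
# Sketch: approximate D(X, Y) = ∬ |x - y| f(x) g(y) dx dy for independent X and Y
# by evaluating the double integral on a finite grid.
import numpy as np
from scipy.stats import norm

def prob_metric_indep(f, g, lo=-10.0, hi=10.0, n=2001):
    """Grid approximation of D(X, Y) for independent densities f and g."""
    x = np.linspace(lo, hi, n)
    fx, gy = f(x), g(x)                            # densities evaluated on the grid
    diff = np.abs(x[:, None] - x[None, :])         # |x - y| for every grid pair
    integrand = diff * fx[:, None] * gy[None, :]
    inner = np.trapezoid(integrand, x, axis=1)     # np.trapz on older NumPy
    return np.trapezoid(inner, x)

# Two independent normals with means 1 and 0 and common sigma = 0.5:
f = norm(loc=1.0, scale=0.5).pdf
g = norm(loc=0.0, scale=0.5).pdf
print(prob_metric_indep(f, g))   # larger than |mu_x - mu_y| = 1
print(prob_metric_indep(f, f))   # strictly positive for two independent copies
```

Note that the second call evaluates the metric between two independent copies of the same distribution and is strictly positive, which previews the failure of the identity of indiscernibles discussed below.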

One may easily show that such probability metrics do not satisfy the first metric axiom, or rather satisfy it if and only if both arguments X and Y are certain events described by Dirac delta probability density functions. In this case:

[math]\displaystyle{ D(X, Y) = \int_{-\infty}^\infty \int_{-\infty}^\infty |x-y|\delta(x-\mu_x)\delta(y-\mu_y) \, dx dy = |\mu_x-\mu_y| }[/math]

the probability metric simply reduces to the metric between the expected values [math]\displaystyle{ \mu_x }[/math], [math]\displaystyle{ \mu_y }[/math] of the variables X and Y.

For all other random variables X, Y the probability metric does not satisfy the identity of indiscernibles condition required of a metric, that is:

[math]\displaystyle{ D\left(X, X\right) \gt 0. }[/math]

Figure: Probability metric between two random variables X and Y, both having normal distributions with the same standard deviation [math]\displaystyle{ \sigma = 0, \sigma = 0.2, \sigma = 0.4, \sigma = 0.6, \sigma = 0.8, \sigma = 1 }[/math] (beginning with the bottom curve). [math]\displaystyle{ m_{xy} = |\mu_x-\mu_y| }[/math] denotes the distance between the means of X and Y.

Example

For example, if the probability distribution functions of the random variables X and Y are both normal distributions (N) with the same standard deviation [math]\displaystyle{ \sigma }[/math], integrating [math]\displaystyle{ D\left(X, Y\right) }[/math] yields:

[math]\displaystyle{ D_{NN}(X, Y) = \mu_{xy} + \frac{2\sigma}{\sqrt\pi}\operatorname{exp}\left(-\frac{\mu_{xy}^2}{4\sigma^2}\right)-\mu_{xy} \operatorname{erfc} \left(\frac{\mu_{xy}}{2\sigma}\right) }[/math]

where

[math]\displaystyle{ \mu_{xy} = \left|\mu_x-\mu_y\right| }[/math],

and [math]\displaystyle{ \operatorname{erfc}(x) }[/math] is the complementary error function.

In this case:

[math]\displaystyle{ \lim_{\mu_{xy}\to 0} D_{NN}(X, Y) = D_{NN}(X, X) = \frac{2\sigma}{\sqrt\pi}. }[/math]
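
For a quick sanity check of this closed form, one can evaluate D_NN with scipy.special.erfc and observe that it approaches 2σ/√π as μ_xy → 0 and approaches μ_xy once the means are well separated. The helper name d_nn and the specific σ below are illustrative assumptions.

```python
# Sketch: evaluate the closed-form D_NN above and check its mu_xy -> 0 limit.
import numpy as np
from scipy.special import erfc

def d_nn(mu_xy, sigma):
    """Probability metric between two normals with common standard deviation sigma."""
    mu_xy = abs(mu_xy)
    return (mu_xy
            + 2.0 * sigma / np.sqrt(np.pi) * np.exp(-mu_xy**2 / (4.0 * sigma**2))
            - mu_xy * erfc(mu_xy / (2.0 * sigma)))

sigma = 0.5
print(d_nn(0.0, sigma))                 # equals 2*sigma/sqrt(pi), about 0.5642
print(2.0 * sigma / np.sqrt(np.pi))     # the limiting value quoted above
print(d_nn(3.0, sigma))                 # approximately mu_xy = 3 for well-separated means
```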

Probability metric of random vectors

The probability metric of random variables may be extended to a metric D(X, Y) of random vectors X, Y by substituting [math]\displaystyle{ |x-y| }[/math] with any metric operator d(x, y):

[math]\displaystyle{ D(\mathbf{X}, \mathbf{Y}) =\int_{\Omega} \int_{\Omega} d(\mathbf{x}, \mathbf{y})F(\mathbf{x}, \mathbf{y}) \, d\Omega_x d\Omega_y }[/math]

where F(X, Y) is the joint probability density function of the random vectors X and Y. For example, substituting d(x, y) with the Euclidean metric and assuming that the vectors X and Y are mutually independent yields:

[math]\displaystyle{ D(\mathbf{X}, \mathbf{Y}) =\int_{\Omega} \int_{\Omega} \sqrt{\sum_i|x_i-y_i|^2} F(\mathbf{x})G(\mathbf{y}) \, d\Omega_x d\Omega_y. }[/math]
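
Such integrals are rarely tractable in closed form, so a Monte Carlo estimate is a natural workaround: for independent X and Y the integral above is just the expected distance E[d(X, Y)]. The sketch below estimates it for independent Gaussian random vectors with the Euclidean metric; the sample size, dimension, means, and the Gaussian choice are illustrative assumptions.

```python
# Monte Carlo sketch: estimate D(X, Y) = ∬ ||x - y|| F(x) G(y) dΩ dΩ, i.e. the
# expected Euclidean distance between independent random vectors X and Y.
import numpy as np

rng = np.random.default_rng(0)
n_samples, dim = 200_000, 3

# Independent Gaussian vectors with different means and unit variance per component.
x = rng.normal(loc=0.0, scale=1.0, size=(n_samples, dim))
y = rng.normal(loc=1.0, scale=1.0, size=(n_samples, dim))

d_euclid = np.linalg.norm(x - y, axis=1)   # Euclidean metric d(x, y) for each sampled pair
print(d_euclid.mean())                     # Monte Carlo estimate of D(X, Y)
```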