Earth mover's distance

Short description: Distance between probability distributions

In computer science, the earth mover's distance (EMD)[1] is a measure of dissimilarity between two frequency distributions, densities, or measures, over a metric space D. Informally, if the distributions are interpreted as two different ways of piling up earth (dirt) over D, the EMD captures the minimum cost of building the smaller pile using dirt taken from the larger, where cost is defined as the amount of dirt moved multiplied by the distance over which it is moved.

The earth mover's distance is also known as the Wasserstein metric [math]\displaystyle{ W_1 }[/math], the Kantorovich–Rubinstein metric, or Mallows's distance.[2] It is the solution of the optimal transport problem, which in turn is also known as the Monge–Kantorovich problem, or sometimes the Hitchcock–Koopmans transportation problem;[3] when the measures are uniform over a set of discrete elements, the same optimization problem is known as minimum weight bipartite matching.

Formal definitions

The EMD between probability distributions [math]\displaystyle{ P }[/math] and [math]\displaystyle{ Q }[/math] can be defined as an infimum over joint probabilities:

[math]\displaystyle{ \text{EMD}(P,Q) = \inf\limits_{\gamma \in \Pi(P, Q)} \mathbb{E}_{(x, y) \sim \gamma}\left[d(x, y)\right]\, }[/math]

where [math]\displaystyle{ \Pi(P, Q) }[/math] is the set of all joint distributions whose marginals are [math]\displaystyle{ P }[/math] and [math]\displaystyle{ Q }[/math].
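
For distributions on the real line the infimum can be evaluated directly; the following minimal sketch assumes SciPy is available, whose scipy.stats.wasserstein_distance routine computes [math]\displaystyle{ W_1 }[/math] between one-dimensional distributions (the point locations and weights are illustrative):

<syntaxhighlight lang="python">
from scipy.stats import wasserstein_distance

# P puts mass 0.5 at 0 and 0.5 at 1; Q puts all of its mass at 3.
# The optimal plan moves everything to 3: cost 0.5*|0-3| + 0.5*|1-3| = 2.5.
d = wasserstein_distance([0.0, 1.0], [3.0], u_weights=[0.5, 0.5], v_weights=[1.0])
print(d)  # 2.5
</syntaxhighlight>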

By Kantorovich–Rubinstein duality, this can also be expressed as:

[math]\displaystyle{ \text{EMD}(P,Q) = \sup\limits_{\| f \|_L \leq 1} \, \mathbb{E}_{x \sim P}[f(x)] - \mathbb{E}_{y \sim Q}[f(y)]\, }[/math]

where the supremum is taken over all 1-Lipschitz continuous functions, i.e. functions [math]\displaystyle{ f }[/math] satisfying [math]\displaystyle{ |f(x) - f(y)| \leq d(x, y) }[/math] for all [math]\displaystyle{ x, y }[/math] (for differentiable [math]\displaystyle{ f }[/math] on Euclidean space, this is the condition [math]\displaystyle{ \| \nabla f(x)\| \leq 1 \quad \forall x }[/math]).


EMD between signatures

In some applications, it is convenient to represent a distribution [math]\displaystyle{ P }[/math] as a signature, or collection of clusters, where the [math]\displaystyle{ i }[/math]-th cluster represents a mass of [math]\displaystyle{ w_{i} }[/math] centered at [math]\displaystyle{ p_i }[/math]. In this formulation, consider signatures [math]\displaystyle{ P=\{(p_1,w_{p1}),(p_2,w_{p2}),...,(p_m,w_{pm})\} }[/math] and [math]\displaystyle{ Q=\{(q_1,w_{q1}),(q_2,w_{q2}),...,(q_n,w_{qn})\} }[/math]. Let [math]\displaystyle{ D=[d_{i,j}] }[/math] be the matrix of ground distances, where [math]\displaystyle{ d_{i,j} }[/math] is the ground distance between clusters [math]\displaystyle{ p_i }[/math] and [math]\displaystyle{ q_j }[/math]. Then the EMD between [math]\displaystyle{ P }[/math] and [math]\displaystyle{ Q }[/math] is given by the optimal flow [math]\displaystyle{ F=[f_{i,j}] }[/math], with [math]\displaystyle{ f_{i,j} }[/math] the flow between [math]\displaystyle{ p_i }[/math] and [math]\displaystyle{ q_j }[/math], that minimizes the overall cost:

[math]\displaystyle{ \min\limits_F {\sum_{i=1}^m\sum_{j=1}^n f_{i,j}d_{i,j}} }[/math]

subject to the constraints:

[math]\displaystyle{ f_{i,j}\ge0, 1\le i \le m, 1\le j \le n }[/math]
[math]\displaystyle{ \sum_{j=1}^n {f_{i,j}} \le w_{pi}, 1 \le i \le m }[/math]
[math]\displaystyle{ \sum_{i=1}^m {f_{i,j}} \le w_{qj}, 1 \le j \le n }[/math]
[math]\displaystyle{ \sum_{i=1}^m\sum_{j=1}^n f_{i,j} = \min \left\{ \ \sum_{i=1}^m w_{pi}, \quad \sum_{j=1}^n w_{q j} \ \right\} }[/math]

The optimal flow [math]\displaystyle{ F }[/math] is found by solving this linear optimization problem. The earth mover's distance is defined as the work normalized by the total flow:

[math]\displaystyle{ \text{EMD}(P,Q) = \frac{\sum_{i=1}^m \sum_{j=1}^n f_{i,j}d_{i,j}}{\sum_{i=1}^m \sum_{j=1}^n f_{i,j}} }[/math]
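
The linear program above can be solved with any general-purpose LP solver. The following is a sketch, assuming SciPy is available, that builds the flow variables and constraints explicitly and passes them to scipy.optimize.linprog; the cluster centres and weights are illustrative and the Euclidean distance serves as the ground distance. Because the constraints only force the total flow to equal the smaller total weight, the same code also handles signatures of unequal total mass:

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import linprog

def emd_signatures(p, wp, q, wq):
    """EMD between signatures (p, wp) and (q, wq) with Euclidean ground distance."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    wp, wq = np.asarray(wp, float), np.asarray(wq, float)
    m, n = len(wp), len(wq)

    # Ground distance matrix d[i, j] = ||p_i - q_j||.
    d = np.linalg.norm(p[:, None, :] - q[None, :, :], axis=2)

    # Flows are flattened row-major: variable i*n + j is f_ij.
    # Inequality constraints: each row sum <= w_pi, each column sum <= w_qj.
    A_ub = np.zeros((m + n, m * n))
    for i in range(m):
        A_ub[i, i * n:(i + 1) * n] = 1.0      # sum_j f_ij <= w_pi
    for j in range(n):
        A_ub[m + j, j::n] = 1.0               # sum_i f_ij <= w_qj
    b_ub = np.concatenate([wp, wq])

    # Equality constraint: total flow = min(total weight of P, total weight of Q).
    A_eq = np.ones((1, m * n))
    b_eq = [min(wp.sum(), wq.sum())]

    res = linprog(d.ravel(), A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    total_flow = res.x.sum()
    return d.ravel() @ res.x / total_flow     # work normalized by the total flow

# Two small signatures in the plane with unequal cluster weights.
print(emd_signatures([[0, 0], [1, 0]], [0.4, 0.6],
                     [[0, 1], [1, 1]], [0.5, 0.5]))
</syntaxhighlight>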

Variants and extensions

Unequal probability mass

Some applications may require the comparison of distributions with different total masses. One approach is to allow for partial matching, where dirt from the more massive distribution is rearranged to reproduce the less massive one, and any leftover "dirt" is discarded at no cost. Formally, let [math]\displaystyle{ w_P }[/math] be the total weight of [math]\displaystyle{ P }[/math], and [math]\displaystyle{ w_Q }[/math] be the total weight of [math]\displaystyle{ Q }[/math]. We have:

[math]\displaystyle{ \text{EMD}(P,Q) = \tfrac{1}{\min(w_P, w_Q)} \inf\limits_{\gamma \in \Pi_\geq(P, Q)} \int d(x,y) \, \mathrm{d} \gamma(x,y) }[/math]

where [math]\displaystyle{ \Pi_\geq(P, Q) }[/math] is the set of all measures whose projections are [math]\displaystyle{ \geq P }[/math] and [math]\displaystyle{ \geq Q }[/math]. Note that this generalization of EMD is not a true distance between distributions, as it does not satisfy the triangle inequality.

An alternative approach allows mass to be created or destroyed, on a global or local level, instead of being transported, but with a cost penalty. In that case one must specify a real parameter [math]\displaystyle{ \alpha }[/math], the ratio between the cost of creating or destroying one unit of "dirt" and the cost of transporting it by a unit distance. This is equivalent to minimizing the sum of the earth moving cost plus [math]\displaystyle{ \alpha }[/math] times the L1 distance between the rearranged pile and the second distribution. The resulting measure [math]\displaystyle{ \widehat{EMD}_\alpha }[/math] is a true distance function.[4]
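
Under this description, the variant can also be posed as a linear program: all of [math]\displaystyle{ P }[/math]'s mass must be placed somewhere, and slack variables charge the L1 gap between the rearranged pile and [math]\displaystyle{ Q }[/math] at rate [math]\displaystyle{ \alpha }[/math]. The sketch below is an illustration of that idea, assuming SciPy's linprog; it is not the exact formulation of the cited paper, and the bin masses and distances are illustrative:

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import linprog

def emd_hat(wp, wq, d, alpha):
    """Transport cost plus alpha times the L1 gap between the rearranged pile and Q.

    wp, wq: bin masses of P and Q; d: ground-distance matrix; alpha: penalty rate.
    """
    wp, wq, d = np.asarray(wp, float), np.asarray(wq, float), np.asarray(d, float)
    m, n = d.shape
    # Variables: m*n flows f_ij, then n slacks s+_j, then n slacks s-_j.
    c = np.concatenate([d.ravel(), alpha * np.ones(2 * n)])
    A_eq = np.zeros((m + n, m * n + 2 * n))
    for i in range(m):                   # every unit of P's mass is placed: sum_j f_ij = w_pi
        A_eq[i, i * n:(i + 1) * n] = 1.0
    for j in range(n):                   # sum_i f_ij - s+_j + s-_j = w_qj
        A_eq[m + j, j:m * n:n] = 1.0
        A_eq[m + j, m * n + j] = -1.0
        A_eq[m + j, m * n + n + j] = 1.0
    b_eq = np.concatenate([wp, wq])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    return res.fun                       # at the optimum, s+_j + s-_j = |R_j - w_qj|

# Move half of P's unit of dirt one bin over and destroy the rest at rate alpha = 2.
print(emd_hat([1.0, 0.0], [0.0, 0.5], [[0.0, 1.0], [1.0, 0.0]], alpha=2.0))  # 1.5
</syntaxhighlight>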

More than two distributions

The EMD can be extended naturally to the case where more than two distributions are compared. In this case, the "distance" between the many distributions is defined as the optimal value of a linear program. This generalized EMD may be computed exactly using a greedy algorithm, and the resulting functional has been shown to be Minkowski additive and convex monotone.[5]

Computing the EMD

The EMD can be computed by solving an instance of the transportation problem, using any algorithm for the minimum-cost flow problem, e.g. the network simplex algorithm.

The Hungarian algorithm can be used to obtain the solution if the domain D is the set {0, 1}. If the domain is integral, the problem can be translated into this setting by representing each integral bin as multiple binary bins.
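
When both distributions place equal (unit) mass on the same number of discrete points, the optimal flow is a one-to-one assignment, i.e. the minimum weight bipartite matching mentioned in the introduction. A minimal sketch, assuming SciPy's linear_sum_assignment routine for the assignment problem and illustrative point sets:

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import linear_sum_assignment

# Two distributions, each placing unit mass on three points in the plane.
p = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0]])
q = np.array([[1.0, 0.0], [2.0, 1.0], [0.0, 3.0]])

cost = np.linalg.norm(p[:, None, :] - q[None, :, :], axis=2)  # ground distances
rows, cols = linear_sum_assignment(cost)                      # minimum-weight matching
emd = cost[rows, cols].mean()       # matched cost normalized by the total flow n
print(emd)  # 1.0 here: each point is matched to a neighbour at distance 1
</syntaxhighlight>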

As a special case, if D is a one-dimensional array of n "bins", the EMD can be efficiently computed by scanning the array and keeping track of how much dirt needs to be transported between consecutive bins. Here the bins are zero-indexed:

[math]\displaystyle{ \begin{align} \text{EMD}_0 &= 0 \\ \text{EMD}_{i+1} &= P_i + \text{EMD}_i - Q_i \\ \text{Total Distance} &= \sum_{i=0}^{n}|\text{EMD}_i| \end{align} }[/math]
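
The following is a direct transcription of this recurrence, a minimal sketch assuming two equal-length histograms over unit-spaced bins:

<syntaxhighlight lang="python">
def emd_1d(P, Q):
    """EMD between two equal-length histograms over unit-spaced, zero-indexed bins."""
    total, carry = 0.0, 0.0            # carry holds EMD_i, the dirt currently in transit
    for p_i, q_i in zip(P, Q):
        carry = p_i + carry - q_i      # EMD_{i+1} = P_i + EMD_i - Q_i
        total += abs(carry)            # accumulate |EMD_{i+1}| (EMD_0 = 0 adds nothing)
    return total

# Moving 0.3 units of dirt from bin 2 to bin 0 costs 0.3 * 2 = 0.6.
print(emd_1d([0.2, 0.3, 0.5], [0.5, 0.3, 0.2]))  # 0.6
</syntaxhighlight>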

EMD-based similarity analysis

EMD-based similarity analysis (EMDSA) is an important and effective tool in many multimedia information retrieval[6] and pattern recognition[7] applications. However, the computational cost of the EMD is super-cubic in the number of "bins" for an arbitrary ground distance "D". Efficient and scalable EMD computation techniques for large-scale data have been investigated using MapReduce,[8][9] as well as bulk synchronous parallel and resilient distributed dataset.[10]

Applications

An early application of the EMD in computer science was to compare two grayscale images that may differ due to dithering, blurring, or local deformations.[11] In this case, the region is the image's domain, and the total amount of light (or ink) is the "dirt" to be rearranged.

The EMD is widely used in content-based image retrieval to compute distances between the color histograms of two digital images.[citation needed] In this case, the region is the RGB color cube, and each image pixel is a parcel of "dirt". The same technique can be used for any other quantitative pixel attribute, such as luminance, gradient, or apparent motion in a video frame.

More generally, the EMD is used in pattern recognition to compare generic summaries or surrogates of data records called signatures.[1] A typical signature consists of a list of pairs ((x1,m1), ... (xn,mn)), where each xi is a certain "feature" (e.g., color in an image, letter in a text, etc.), and mi is its "mass" (how many times that feature occurs in the record). Alternatively, xi may be the centroid of a data cluster, and mi the number of entities in that cluster. To compare two such signatures with the EMD, one must define a distance between features, which is interpreted as the cost of turning a unit mass of one feature into a unit mass of the other. The EMD between two signatures is then the minimum cost of turning one of them into the other.

EMD analysis has been used for quantitating multivariate changes in biomarkers measured by flow cytometry, with potential applications to other technologies that report distributions of measurements.[12]

History

The concept was first introduced by Gaspard Monge in 1781,[13] in the context of transportation theory. The use of the EMD as a distance measure for monochromatic images was described in 1989 by S. Peleg, M. Werman and H. Rom.[11] The name "earth mover's distance" was proposed by J. Stolfi in 1994,[14] and was used in print in 1998 by Y. Rubner, C. Tomasi and L. G. Guibas.[15]

References

  1. Rubner, Y.; Tomasi, C.; Guibas, L.J. (1998). "A metric for distributions with applications to image databases". Sixth International Conference on Computer Vision (IEEE Cat. No.98CH36271). Narosa Publishing House. pp. 59–66. doi:10.1109/iccv.1998.710701. ISBN 81-7319-221-9. 
  2. C. L. Mallows (1972). "A note on asymptotic joint normality". Annals of Mathematical Statistics 43 (2): 508–515. doi:10.1214/aoms/1177692631. 
  3. Singiresu S. Rao (2009). Engineering Optimization: Theory and Practice (4th ed.). John Wiley & Sons. pp. 221. ISBN 978-0-470-18352-6. 
  4. Pele, Ofir; Werman, Michael (2008). "A Linear Time Histogram Metric for Improved SIFT Matching". Computer Vision – ECCV 2008. Lecture Notes in Computer Science. 5304. Springer Berlin Heidelberg. pp. 495–508. doi:10.1007/978-3-540-88690-7_37. ISBN 978-3-540-88689-1. 
  5. Kline, Jeffery (2019). "Properties of the d-dimensional earth mover's problem". Discrete Applied Mathematics 265: 128–141. doi:10.1016/j.dam.2019.02.042. 
  6. Mark A. Ruzon; Carlo Tomasi (2001). "Edge, Junction, and Corner Detection Using Color Distributions". IEEE Transactions on Pattern Analysis and Machine Intelligence. 
  7. Kristen Grauman; Trevor Darrel (2004). "Fast contour matching using approximate earth mover's distance". Proceedings of CVPR 2004. 
  8. Jin Huang; Rui Zhang; Rajkumar Buyya; Jian Chen (2014). "MELODY-Join: Efficient Earth Mover's Distance Similarity Joins Using MapReduce". 
  9. Jia Xu; Bin Lei; Yu Gu; Winslett, M.; Ge Yu; Zhenjie Zhang (2015). "Efficient Similarity Join Based on Earth Mover's Distance Using MapReduce". IEEE Transactions on Knowledge and Data Engineering. 
  10. Jin Huang; Rui Zhang; Rajkumar Buyya; Jian Chen; Yongwei Wu (2015). "Heads-Join: Efficient Earth Mover's Distance Join on Hadoop". IEEE Transactions on Parallel and Distributed Systems. 
  11. S. Peleg; M. Werman; H. Rom (1989). "A unified approach to the change of resolution: Space and gray-level". IEEE Transactions on Pattern Analysis and Machine Intelligence 11 (7): 739–742. doi:10.1109/34.192468. 
  12. Orlova, DY; Zimmerman, N; Meehan, C; Meehan, S; Waters, J; Ghosn, EEB (23 March 2016). "Earth Mover's Distance (EMD): A True Metric for Comparing Biomarker Expression Levels in Cell Populations". PLOS One 11 (3): e0151859. doi:10.1371/journal.pone.0151859. PMID 27008164. Bibcode2016PLoSO..1151859O. 
  13. "Mémoire sur la théorie des déblais et des remblais". Histoire de l'Académie Royale des Science, Année 1781, avec les Mémoires de Mathématique et de Physique. 1781. 
  14. J. Stolfi, personal communication to L. J. Guibas, 1994, as cited by Rubner, Yossi; Tomasi, Carlo; Guibas, Leonidas J. (2000). "The earth mover's distance as a metric for image retrieval". International Journal of Computer Vision 40 (2): 99–121. doi:10.1023/A:1026543900054. http://robotics.stanford.edu/~rubner/papers/rubnerIjcv00.pdf. 
  15. Yossi Rubner; Carlo Tomasi; Leonidas J. Guibas (1998). "A metric for distributions with applications to image databases". Sixth International Conference on Computer Vision (IEEE Cat. No.98CH36271). pp. 59–66. doi:10.1109/ICCV.1998.710701. ISBN 81-7319-221-9. 
