Maximum inner-product search

Maximum inner-product search (MIPS) is a search problem, with a corresponding class of search algorithms that attempt to maximize the inner product between a query and the data items to be retrieved. MIPS algorithms are used in a wide variety of big data applications, including recommendation algorithms and machine learning.[1]

Formally, for a database of vectors [math]\displaystyle{ x_i }[/math] indexed by a set of labels [math]\displaystyle{ S }[/math] and lying in an inner product space equipped with an inner product [math]\displaystyle{ \langle \cdot, \cdot \rangle }[/math], MIPS is the problem of determining

[math]\displaystyle{ \underset{i \in S}{\operatorname{arg\,max}}\ \langle x_i, q \rangle }[/math]

for a given query [math]\displaystyle{ q }[/math].
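As an illustrative toy instance (not taken from the cited sources), consider the two-dimensional database [math]\displaystyle{ x_1 = (10, 0) }[/math], [math]\displaystyle{ x_2 = (1, 1) }[/math] under the standard dot product and the query [math]\displaystyle{ q = (1, 1) }[/math]. Then [math]\displaystyle{ \langle x_1, q \rangle = 10 }[/math] and [math]\displaystyle{ \langle x_2, q \rangle = 2 }[/math], so the MIPS answer is [math]\displaystyle{ i = 1 }[/math].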

Although there is an obvious linear-time implementation (computing the inner product of the query with every vector in the database and keeping the largest), it is generally too slow for practical problems. However, efficient algorithms exist to speed up the search.[1][2]
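The linear scan can be written in a few lines. The following is a minimal NumPy sketch of that baseline; the function name mips_brute_force and the toy data are illustrative and not taken from the cited works:

```python
import numpy as np

def mips_brute_force(X: np.ndarray, q: np.ndarray) -> int:
    """Exhaustive MIPS: return the index i maximizing <x_i, q>.

    X is an (n, d) array whose rows are the database vectors x_i,
    and q is a length-d query vector.
    """
    scores = X @ q              # all n inner products <x_i, q> in one pass
    return int(np.argmax(scores))

# Toy instance from the text: x_1 = (10, 0), x_2 = (1, 1), q = (1, 1).
X = np.array([[10.0, 0.0],
              [1.0, 1.0]])
q = np.array([1.0, 1.0])
print(mips_brute_force(X, q))   # prints 0, i.e. x_1 wins
```

Every inner product is computed, so the cost is [math]\displaystyle{ O(nd) }[/math] per query for [math]\displaystyle{ n }[/math] vectors of dimension [math]\displaystyle{ d }[/math]; the specialized MIPS algorithms cited above aim to avoid this full scan.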

Under the assumption that all vectors in the set have the same constant norm, MIPS can be viewed as equivalent to a nearest neighbor search (NNS) problem, in which maximizing the inner product is equivalent to minimizing the corresponding distance metric in the NNS problem.[3] Like other forms of NNS, MIPS algorithms may be approximate or exact.[4]
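The equivalence can be seen directly from the expansion of the squared Euclidean distance (shown here for the standard dot product):

[math]\displaystyle{ \lVert x_i - q \rVert^2 = \lVert x_i \rVert^2 + \lVert q \rVert^2 - 2\langle x_i, q \rangle. }[/math]

When [math]\displaystyle{ \lVert x_i \rVert }[/math] is the same for every [math]\displaystyle{ i }[/math], the first two terms do not affect the minimization over [math]\displaystyle{ i }[/math], so minimizing the distance is the same as maximizing the inner product. Without that assumption the two problems can disagree: in the toy instance above, [math]\displaystyle{ x_1 }[/math] maximizes the inner product even though [math]\displaystyle{ x_2 }[/math] is the nearer point.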

MIPS search is used as part of DeepMind's RETRO algorithm.[5]

References

  1. Abuzaid, Firas; Sethi, Geet; Bailis, Peter; Zaharia, Matei (2019-03-14). "To Index or Not to Index: Optimizing Exact Maximum Inner Product Search". arXiv:1706.01449 [cs.IR].
  2. Mussmann, Steve; Ermon, Stefano (2016). "Learning and Inference via Maximum Inner Product Search". Proceedings of the 33rd International Conference on Machine Learning (ICML).
  3. Shrivastava, Anshumali; Li, Ping (2015-07-12). "Improved asymmetric locality sensitive hashing (ALSH) for Maximum Inner Product Search (MIPS)". Proceedings of the Thirty-First Conference on Uncertainty in Artificial Intelligence. UAI'15 (Arlington, Virginia, USA: AUAI Press): 812–821. ISBN 978-0-9966431-0-8. https://dl.acm.org/doi/abs/10.5555/3020847.3020931. 
  4. Keivani, Omid; Sinha, Kaushik; Ram, Parikshit (May 2017). "Improved maximum inner product search with better theoretical guarantees". 2017 International Joint Conference on Neural Networks (IJCNN). pp. 2927–2934. doi:10.1109/IJCNN.2017.7966218. ISBN 978-1-5090-6182-2. https://ieeexplore.ieee.org/document/7966218. 
  5. "RETRO Is Blazingly Fast" (in en). 2022-07-01. http://mitchgordon.me/ml/2022/07/01/retro-is-blazing.html. 
