In this paper we address the problem of Maximum Inner Product Search (MIPS), which is currently the computational bottleneck in a large number of machine learning applications. While similar to nearest neighbor search (NNS), the MIPS problem has been shown to be more challenging, as the inner product is not a proper metric function. We propose to solve the MIPS problem using similarity graphs, i.e., graphs where each vertex is connected to the vertices that are most similar to it in terms of some similarity function. Originally, the framework of similarity graphs was proposed for metric spaces; in this paper we naturally extend it to the non-metric MIPS scenario. We demonstrate that, unlike existing approaches, similarity graphs do not require any data transformation to reduce MIPS to the NNS problem and can be applied directly to the original data. Moreover, we explain why such a reduction is detrimental for similarity graphs. Through an extensive comparison with existing approaches, we show that the proposed method substantially improves the runtime/accuracy trade-off for the MIPS problem.
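The core idea behind searching a similarity graph is a greedy walk: starting from some vertex, repeatedly move to the neighbor with the highest similarity to the query, and stop when no neighbor improves. The sketch below illustrates this for the MIPS setting, where the similarity is the raw inner product rather than a metric distance. The graph layout, function names, and toy data are illustrative assumptions, not the paper's actual implementation, which uses more elaborate graph construction and search routines.

```python
# Illustrative sketch of a greedy graph walk for MIPS (not the paper's code).
# `graph` maps vertex id -> list of neighbor ids; `vectors` maps id -> vector.

def dot(u, v):
    """Inner product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def greedy_mips(graph, vectors, query, start):
    """Walk the graph, always moving to the neighbor whose inner product
    with the query is largest, until no neighbor improves the score."""
    current = start
    best = dot(vectors[current], query)
    improved = True
    while improved:
        improved = False
        for nb in graph[current]:
            score = dot(vectors[nb], query)
            if score > best:
                best, current, improved = score, nb, True
    return current, best

# Toy usage: vertex 2 has the largest inner product with the query.
vectors = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [2.0, 2.0]}
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
idx, score = greedy_mips(graph, vectors, [1.0, 1.0], start=0)
```

Note that this greedy walk is a local search and can stop at a local optimum; practical similarity-graph methods mitigate this with careful graph construction and search heuristics such as priority-queue (beam) search.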
Author Information
Stanislav Morozov (Yandex)
Artem Babenko (MIPT/Yandex)
More from the Same Authors
- 2022 Poster: On Embeddings for Numerical Features in Tabular Deep Learning
  Yury Gorishniy · Ivan Rubachev · Artem Babenko
- 2021: Billion-Scale Approximate Nearest Neighbor Search Challenge + Q&A
  Harsha Vardhan Simhadri · George Williams · Martin Aumüller · Artem Babenko · Dmitry Baranchuk · Qi Chen · Matthijs Douze · Ravishankar Krishnaswamy · Gopal Srinivasa · Suhas Jayaram Subramanya · Jingdong Wang
- 2021 Poster: Revisiting Deep Learning Models for Tabular Data
  Yury Gorishniy · Ivan Rubachev · Valentin Khrulkov · Artem Babenko
- 2019 Poster: Beyond Vector Spaces: Compact Data Representation as Differentiable Weighted Graphs
  Denis Mazur · Vage Egiazarian · Stanislav Morozov · Artem Babenko