

Talk in Competition: Practical Vector Search (Big ANN) Challenge 2023

[OOD Track] RoarANN: Projected Bipartite Graph for Efficient Cross-Modal Approximate Nearest Neighbor Search

Kai Zhang

Sat 16 Dec 7:30 a.m. PST — 7:45 a.m. PST

Abstract:

Approximate Nearest Neighbor Search (ANNS) is a fundamental and critical component in many applications, including recommendation systems and large language model-based applications. Cross-modal ANNS embeds data from different modalities into one high-dimensional space as feature vectors and uses the data from one modality to query another. However, there is an inherent distribution gap between embeddings from different modalities: queries in cross-modal ANNS become Out-of-Distribution (OOD) with respect to the base data, and state-of-the-art ANNS approaches suffer from low performance on such workloads.
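To make the distribution gap concrete, here is a minimal Python sketch on synthetic data (array sizes, the shift used to mimic another modality, and the brute-force k-NN are illustrative assumptions, not the competition datasets or the authors' code). It compares the average distance from queries to their k nearest base vectors for in-distribution versus OOD-like queries.

import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
d, n_base, n_query, k = 128, 10_000, 100, 10

# Base vectors stand in for one modality; shifted queries mimic embeddings
# produced from a different modality (an OOD-like distribution).
base = rng.normal(size=(n_base, d))
ood_queries = rng.normal(loc=0.5, size=(n_query, d))

def mean_knn_distance(points, data, k):
    # Average Euclidean distance from each point to its k nearest data vectors.
    dists = np.sort(cdist(points, data), axis=1)[:, :k]
    return dists.mean()

# In-distribution reference: held-out base vectors queried against the rest.
in_dist = mean_knn_distance(base[:n_query], base[n_query:], k)
ood = mean_knn_distance(ood_queries, base[n_query:], k)
print(f"mean k-NN distance  in-distribution: {in_dist:.3f}  OOD: {ood:.3f}")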

We quantitatively analyze the properties of cross-modal datasets to understand ANNS efficiency. Unlike in single-modal workloads, we find that OOD queries are distant from the base data, and the k-nearest neighbors of an OOD query are distant from each other in the embedding space. This property breaks the assumptions of existing ANNS approaches and makes their designs ill-suited for efficient search. Building on these insights from OOD workloads, we propose the pROjected bipARtite graph for ANNS (RoarANN), an ANNS approach that builds a graph index under the guidance of the query distribution. Extensive experiments show that RoarANN significantly outperforms state-of-the-art approaches on modern cross-modal datasets, achieving up to 3.3X faster search at a 90% recall rate for OOD queries. In addition, RoarANN demonstrates robust performance for in-distribution queries.
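The abstract does not spell out the index construction, but the name suggests a rough shape: connect a sample of training queries to their nearest base vectors (a bipartite graph), then project those query-to-base edges onto base-to-base edges so that the final index contains only base vectors. The toy Python sketch below renders that idea under these assumptions; the function names, parameters, and projection rule are hypothetical and are not the RoarANN implementation.

import numpy as np
from scipy.spatial.distance import cdist

def build_projected_edges(base, sample_queries, k=8):
    # Adjacency lists over base vectors only: base ids that co-occur in a
    # sample query's k-NN list become neighbors of each other.
    edges = {i: set() for i in range(len(base))}
    knn_ids = np.argsort(cdist(sample_queries, base), axis=1)[:, :k]
    for neighbor_ids in knn_ids:
        # Project bipartite (query -> base) edges onto base -> base edges.
        for a in neighbor_ids:
            for b in neighbor_ids:
                if a != b:
                    edges[int(a)].add(int(b))
    return edges

def greedy_search(query, base, edges, entry=0, steps=50):
    # Plain greedy graph walk: move to the closest neighbor until no
    # neighbor improves on the current node.
    current = entry
    for _ in range(steps):
        candidates = [current] + list(edges[current])
        best = min(candidates, key=lambda i: np.linalg.norm(base[i] - query))
        if best == current:
            return current
        current = best
    return current

A real graph index would bound the out-degree, prune edges by distance, and use beam search rather than a single greedy walk; the sketch only conveys how query samples can shape which base-to-base edges exist.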

Authors: Meng Chen, Fudan University; Yue Chen, Fudan University; Rui Ma, Fudan University; Kai Zhang, Fudan University; Yuzheng Cai, Fudan University; Jiayang Shi, Fudan University; Yizhuo Chen, Fudan University; Weiguo Zheng, Fudan University
