

Spotlight in Workshop: Optimal Transport and Machine Learning

Linear-Time Gromov Wasserstein Distances using Low Rank Couplings and Costs

Meyer Scetbon · Gabriel Peyré · Marco Cuturi


Abstract: The ability to compare and align points across datasets that are known to be related, yet incomparable because they live in heterogeneous spaces, plays an increasingly important role in machine learning. The Gromov-Wasserstein (GW) formalism can help tackle this problem. Its goal is to seek a low-distortion assignment between points taken across these incomparable datasets. As a non-convex, quadratic generalization of optimal transport (OT), GW is NP-hard. Although heuristics are known to work reasonably well (e.g. by solving a sequence of nested regularized OT problems), they remain too costly to scale, with \emph{cubic} complexity in the number of samples $n$. As a result, GW is used far less in practice than standard OT. We examine in this work how a recent variant of the OT problem, which narrows down the set of admissible couplings to those admitting a low-rank factorization into two sub-couplings, is particularly well suited to the resolution of GW. By alternately updating each sub-coupling, our algorithm computes a stationary point of the problem in time $O(n^2)$. When the cost matrices have low rank, our algorithm becomes linear in $n$. We demonstrate the efficiency of our method on simulated and real data.
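To make the complexity claim concrete, below is a minimal NumPy sketch (not the authors' released code or solver) showing that, for the square-loss GW objective, a coupling restricted to the factored form $P = Q\,\mathrm{diag}(1/g)\,R^T$ combined with low-rank, symmetric cost factorizations $A = A_1 A_2^T$ and $B = B_1 B_2^T$ can be evaluated in time linear in the number of samples. The function and variable names are illustrative assumptions, and the alternating updates of the sub-couplings described in the abstract are not reproduced here; only the linear-time evaluation of the objective is shown.

```python
# Illustrative sketch (not the authors' implementation): evaluating the
# square-loss GW objective  sum_{i,j,k,l} (A_ik - B_jl)^2 P_ij P_kl
# for a low-rank coupling P = Q diag(1/g) R^T and low-rank, symmetric
# cost matrices A = A1 @ A2.T, B = B1 @ B2.T, without ever forming the
# n x m coupling or the n x n / m x m cost matrices.
import numpy as np


def lowrank_gw_objective(Q, R, g, A1, A2, B1, B2):
    """Evaluate the square-loss GW objective in time linear in n and m.

    Q: (n, r), R: (m, r), g: (r,)   -- factors of P = Q diag(1/g) R^T
    A1, A2: (n, d)                  -- factors of the source cost A = A1 A2^T
    B1, B2: (m, d')                 -- factors of the target cost B = B1 B2^T
    Assumes A and B are symmetric (as is standard for GW cost matrices).
    """
    # Marginals of P, computed from the factors: P 1 = Q ((R^T 1) / g), etc.
    a = Q @ (R.sum(axis=0) / g)
    b = R @ (Q.sum(axis=0) / g)

    # a^T (A ∘ A) a = sum_{p,q} (A1^T diag(a) A1)_{pq} (A2^T diag(a) A2)_{pq}
    GA1 = A1.T @ (a[:, None] * A1)
    GA2 = A2.T @ (a[:, None] * A2)
    c_A = np.sum(GA1 * GA2)

    # b^T (B ∘ B) b, same trick on the target side
    GB1 = B1.T @ (b[:, None] * B1)
    GB2 = B2.T @ (b[:, None] * B2)
    c_B = np.sum(GB1 * GB2)

    # Cross term tr(P^T A P B), re-associated so every product is small:
    # tr( D (Q^T A1)(A2^T Q) D (R^T B1)(B2^T R) ) with D = diag(1/g)
    M1 = Q.T @ A1   # (r, d)
    M2 = A2.T @ Q   # (d, r)
    M3 = R.T @ B1   # (r, d')
    M4 = B2.T @ R   # (d', r)
    cross = np.trace(((M1 / g[:, None]) @ M2) @ ((M3 / g[:, None]) @ M4))

    return c_A + c_B - 2.0 * cross


# Sanity check on tiny random inputs against the dense O(n^2 m^2) formula.
rng = np.random.default_rng(0)
n, m, r, d = 6, 5, 3, 2
X, Y = rng.normal(size=(n, d)), rng.normal(size=(m, d))
A, B = X @ X.T, Y @ Y.T                       # rank-d symmetric cost matrices
Q, R = rng.random((n, r)), rng.random((m, r))
g = rng.random(r) + 0.5
P = Q @ np.diag(1.0 / g) @ R.T

dense = ((A[:, None, :, None] - B[None, :, None, :]) ** 2
         * P[:, :, None, None] * P[None, None, :, :]).sum()
fast = lowrank_gw_objective(Q, R, g, X, X, Y, Y)
assert np.allclose(dense, fast)
```

Every operation above is a product of an $n \times r$ or $n \times d$ matrix with a small matrix, so the cost is $O(nrd + mrd' + r^2(d + d'))$, which illustrates the linear-in-$n$ regime claimed in the abstract when both the couplings and the costs are low rank.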
