We are interested in supervised metric learning of Mahalanobis-like distances. Existing approaches mainly focus on learning a new distance using similarity and dissimilarity constraints between examples. In this paper, instead of bringing examples of the same class closer together and pushing examples of different classes far apart, we propose to move the examples with respect to virtual points. Hence, each example is brought closer to an a priori defined virtual point, reducing the number of constraints to satisfy. We show that our approach admits a closed-form solution which can be kernelized. We provide a theoretical analysis showing the consistency of the approach and establishing some links with other classical metric learning methods. Furthermore, we propose an efficient solution to the difficult problem of selecting virtual points, based in part on recent work in optimal transport. Lastly, we evaluate our approach on several state-of-the-art datasets.
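To make the idea concrete, here is a minimal sketch of learning a linear map that pulls each example toward a virtual point assigned to its class, solved in closed form as a ridge least-squares problem. This only illustrates the general principle described in the abstract; the paper's exact objective, regularization, virtual-point selection, and kernelization are not reproduced here, and the virtual points below are chosen arbitrarily for the toy example.

```python
import numpy as np

def learn_map(X, V, lam=1e-3):
    """Closed-form solution of min_L sum_i ||L x_i - v_i||^2 + lam ||L||_F^2.

    X: (n, d) examples; V: (n, d') virtual point assigned to each example.
    Returns L of shape (d', d); the induced Mahalanobis metric is M = L^T L.
    """
    d = X.shape[1]
    # Normal equations: L = (V^T X)(X^T X + lam I)^{-1}
    return V.T @ X @ np.linalg.inv(X.T @ X + lam * np.eye(d))

# Toy data: two Gaussian classes, each assigned a single virtual point.
rng = np.random.default_rng(0)
X0 = rng.normal(loc=-3.0, size=(20, 2))  # class 0
X1 = rng.normal(loc=3.0, size=(20, 2))   # class 1
X = np.vstack([X0, X1])
V = np.vstack([np.tile([-1.0, 0.0], (20, 1)),   # virtual point for class 0
               np.tile([+1.0, 0.0], (20, 1))])  # virtual point for class 1

L = learn_map(X, V)
Z = X @ L.T  # mapped examples, now gathered around their virtual points
```

In the learned space, each class concentrates around its own virtual point, so a single target per class replaces the quadratic number of pairwise similarity/dissimilarity constraints used by classical methods.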
Author Information
Michaël Perrot (University of Saint-Etienne)
Amaury Habrard (University of Saint-Etienne)
More from the Same Authors
- 2021 Poster: A PAC-Bayes Analysis of Adversarial Robustness
  Paul Viallard · Eric Guillaume VIDOT · Amaury Habrard · Emilie Morvant
- 2021 Poster: Learning Stochastic Majority Votes by Minimizing a PAC-Bayes Generalization Bound
  Valentina Zantedeschi · Paul Viallard · Emilie Morvant · Rémi Emonet · Amaury Habrard · Pascal Germain · Benjamin Guedj
- 2017 Poster: Joint distribution optimal transportation for domain adaptation
  Nicolas Courty · Rémi Flamary · Amaury Habrard · Alain Rakotomamonjy
- 2016 Poster: Mapping Estimation for Discrete Optimal Transport
  Michaël Perrot · Nicolas Courty · Rémi Flamary · Amaury Habrard