

Nearly Isometric Embedding by Relaxation

James McQueen · Marina Meila · Dominique Perrault-Joncas

Area 5+6+7+8 #30

Keywords: [ Nonlinear Dimension Reduction and Manifold Learning ] [ Large Scale Learning and Big Data ] [ Kernel Methods ] [ Spectral Methods ]


Many manifold learning algorithms aim to create embeddings with low or no distortion (i.e., isometric embeddings). If the data has intrinsic dimension d, it is often impossible to obtain an isometric embedding in d dimensions, but possible in s > d dimensions. Yet most geometry-preserving algorithms cannot do the latter. This paper proposes an embedding algorithm that overcomes this limitation. The algorithm directly computes, for any data embedding Y, a distortion Loss(Y), and iteratively updates Y in order to decrease it. The distortion measure we propose is based on the push-forward Riemannian metric associated with the coordinates Y. The experiments confirm the superiority of our algorithm in obtaining low-distortion embeddings.
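The relaxation idea described in the abstract, decreasing a metric-based distortion loss by iteratively updating the embedding Y, can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's actual algorithm or metric estimator: it stands in a weighted local covariance of embedding differences for the push-forward Riemannian metric, penalizes its squared Frobenius deviation from the identity (the isometry condition), and relaxes Y by finite-difference gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_metrics(Y, neighbors, weights):
    """Weighted local covariance of embedding differences at each point.
    A hypothetical stand-in for the push-forward Riemannian metric."""
    H = []
    for i, (nbrs, w) in enumerate(zip(neighbors, weights)):
        D = Y[nbrs] - Y[i]                          # (k, s) local differences
        H.append((w[:, None] * D).T @ D / w.sum())  # (s, s) local metric
    return np.array(H)

def distortion_loss(Y, neighbors, weights):
    """Isometry means each local metric equals the identity; sum the
    squared Frobenius deviations over all points."""
    I = np.eye(Y.shape[1])
    return sum(np.sum((Hi - I) ** 2)
               for Hi in local_metrics(Y, neighbors, weights))

def relax(Y, neighbors, weights, steps=10, lr=0.01, eps=1e-5):
    """Decrease the distortion loss by gradient descent on the embedding
    coordinates Y (finite differences keep the sketch dependency-free)."""
    Y = Y.copy()
    for _ in range(steps):
        base = distortion_loss(Y, neighbors, weights)
        grad = np.zeros_like(Y)
        for idx in np.ndindex(Y.shape):
            Y[idx] += eps
            grad[idx] = (distortion_loss(Y, neighbors, weights) - base) / eps
            Y[idx] -= eps
        Y -= lr * grad
    return Y

# Toy usage: a noisy circle in the plane with ring-graph neighbors.
n, s = 10, 2
theta = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
Y0 = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.standard_normal((n, s))
neighbors = [[(i - 1) % n, (i + 1) % n] for i in range(n)]
weights = [np.ones(2)] * n

before = distortion_loss(Y0, neighbors, weights)
Y1 = relax(Y0, neighbors, weights, steps=10, lr=0.01)
after = distortion_loss(Y1, neighbors, weights)
```

Because the loss is a differentiable function of the coordinates themselves, any dimension s > d can be used directly, which is the flexibility the abstract highlights; the paper's own update rule and metric estimator differ from this sketch.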
