

Poster

Communication Efficient Parallel Algorithms for Optimization on Manifolds

Bayan Saparbayeva · Michael Zhang · Lizhen Lin

Room 210 #27

Keywords: [ Distributed Inference ] [ Recommender Systems ] [ Nonlinear Dimensionality Reduction and Manifold Learning ] [ Convex Optimization ]


Abstract:

The last decade has witnessed an explosion in the development of models, theory, and computational algorithms for "big data" analysis. In particular, distributed inference has served as a natural and dominant paradigm for statistical inference. However, the existing literature on parallel inference focuses almost exclusively on Euclidean data and parameters. While this assumption is valid for many applications, it is increasingly common to encounter problems where the data or the parameters lie on a non-Euclidean space, such as a manifold. Our work aims to fill a critical gap in the literature by generalizing parallel inference algorithms to optimization on manifolds. We show that our proposed algorithm is communication efficient and carries theoretical convergence guarantees. In addition, we demonstrate the performance of our algorithm on the estimation of Fréchet means on simulated spherical data and on the low-rank matrix completion problem over Grassmann manifolds applied to the Netflix prize data set.
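The abstract does not spell out the algorithm itself, but the following minimal Python sketch illustrates the kind of communication pattern it describes, on the spherical Fréchet mean example: each worker sends back a single tangent vector (a local Riemannian gradient) rather than its raw data, so per-round communication is one vector per worker. The function names `log_map`, `exp_map`, and `frechet_mean_parallel` are illustrative assumptions, not the authors' code.

```python
# A minimal sketch (not the authors' actual algorithm) of communication-
# efficient Frechet mean estimation on the unit sphere S^{d-1}: each shard
# contributes one tangent vector per round, which the driver averages.
import numpy as np

def log_map(m, x):
    """Riemannian log map on the sphere: tangent vector at m pointing to x."""
    cos_t = np.clip(m @ x, -1.0, 1.0)
    theta = np.arccos(cos_t)
    if theta < 1e-12:                 # x coincides with m
        return np.zeros_like(m)
    v = x - cos_t * m                 # component of x tangent to the sphere at m
    return theta * v / np.linalg.norm(v)

def exp_map(m, v):
    """Riemannian exp map on the sphere: move from m along tangent vector v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return m
    return np.cos(nv) * m + np.sin(nv) * v / nv

def frechet_mean_parallel(shards, n_iter=100, step=1.0):
    """Riemannian gradient descent; one tangent vector per shard per round."""
    m = shards[0][0] / np.linalg.norm(shards[0][0])   # initialize at a data point
    n_total = sum(len(s) for s in shards)
    for _ in range(n_iter):
        # Each "worker" returns only the sum of log maps over its local data.
        local = [sum(log_map(m, x) for x in s) for s in shards]
        grad = sum(local) / n_total                   # aggregate on the driver
        m = exp_map(m, step * grad)                   # descent step on the sphere
    return m

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated spherical data clustered near the north pole, split into 4 shards.
    pole = np.array([0.0, 0.0, 1.0])
    data = pole + 0.3 * rng.standard_normal((400, 3))
    data /= np.linalg.norm(data, axis=1, keepdims=True)
    shards = np.array_split(data, 4)
    print("Frechet mean estimate:", frechet_mean_parallel(shards))
```

With `step=1.0` this reduces to the classical fixed-point iteration for the Fréchet mean, which converges for data concentrated in a geodesic ball; the paper's contribution concerns how to organize such manifold updates across machines with few communication rounds.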
