Arguably, the two most popular accelerated or momentum-based optimization methods are Nesterov's accelerated gradient and Polyak's heavy ball, each corresponding to a different discretization of a particular second-order differential equation with a friction term. Such connections with continuous-time dynamical systems have been instrumental in demystifying acceleration phenomena in optimization. Here we study structure-preserving discretizations for a certain class of dissipative (conformal) Hamiltonian systems, which allow us to analyze the symplectic structure of both Nesterov and heavy ball while providing several new insights into these methods. Moreover, we propose a new algorithm based on a dissipative relativistic system that normalizes the momentum and may result in more stable and faster optimization. Importantly, this method generalizes both Nesterov and heavy ball, recovering each as a distinct limiting case, and has potential advantages at no additional cost.
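To make the idea concrete, below is a minimal sketch (not the authors' exact algorithm) of a momentum update in the spirit the abstract describes: the momentum is damped by a conformal factor exp(-gamma*h), and the position step is divided by a relativistic factor sqrt(1 + delta*||p||^2), which normalizes the momentum and bounds the effective step size. The function name `relativistic_gd` and the parameters `h`, `gamma`, `delta` are illustrative choices, not taken from the paper.

```python
import numpy as np

def relativistic_gd(grad, x0, h=0.1, gamma=1.0, delta=1.0, steps=100):
    """Hedged sketch of a dissipative relativistic momentum update.

    - exp(-gamma*h) damps the momentum (the conformal/friction factor).
    - sqrt(1 + delta*||p||^2) normalizes the position step, so large
      momenta cannot produce arbitrarily large moves.
    - As delta -> 0 the normalization factor -> 1 and the iteration
      reduces to a heavy-ball-style update.
    """
    x = np.asarray(x0, dtype=float)
    p = np.zeros_like(x)
    for _ in range(steps):
        p = np.exp(-gamma * h) * p - h * grad(x)       # damped momentum kick
        x = x + h * p / np.sqrt(1.0 + delta * (p @ p))  # normalized drift
    return x

# Example: minimize f(x) = ||x||^2 / 2, whose gradient is x.
x_star = relativistic_gd(lambda x: x, np.array([3.0, -2.0]))
```

On this quadratic the iterates spiral toward the origin; the normalization only matters while the momentum is large, after which the update behaves like damped heavy ball.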
Author Information
Guilherme Franca (UC Berkeley)
Jeremias Sulam (Johns Hopkins University)
Daniel Robinson (Johns Hopkins University)
Rene Vidal (Mathematical Institute for Data Science, Johns Hopkins University, USA)
Related Events (a corresponding poster, oral, or spotlight)
- 2020 Spotlight: Conformal Symplectic and Relativistic Optimization
  Thu. Dec 10th, 03:20 -- 03:30 PM, Orals & Spotlights: Optimization/Theory
More from the Same Authors
- 2021 Spotlight: A Geometric Analysis of Neural Collapse with Unconstrained Features
  Zhihui Zhu · Tianyu Ding · Jinxin Zhou · Xiao Li · Chong You · Jeremias Sulam · Qing Qu
- 2022: DeepSTI: Towards Tensor Reconstruction using Fewer Orientations in Susceptibility Tensor Imaging
  Zhenghan Fang · Kuo-Wei Lai · Peter van Zijl · Xu Li · Jeremias Sulam
- 2022 Poster: Recovery and Generalization in Over-Realized Dictionary Learning
  Jeremias Sulam · Chong You · Zhihui Zhu
- 2022 Poster: Global Linear and Local Superlinear Convergence of IRLS for Non-Smooth Robust Regression
  Liangzu Peng · Christian Kümmerle · Rene Vidal
- 2021 Poster: A Geometric Analysis of Neural Collapse with Unconstrained Features
  Zhihui Zhu · Tianyu Ding · Jinxin Zhou · Xiao Li · Chong You · Jeremias Sulam · Qing Qu
- 2020 Poster: Learning to solve TV regularised problems with unrolled algorithms
  Hamza Cherkaoui · Jeremias Sulam · Thomas Moreau
- 2020 Poster: A novel variational form of the Schatten-$p$ quasi-norm
  Paris Giampouras · Rene Vidal · Athanasios Rontogiannis · Benjamin Haeffele
- 2020 Poster: Adversarial Robustness of Supervised Sparse Coding
  Jeremias Sulam · Ramchandran Muthukumar · Raman Arora
- 2020 Poster: A Game Theoretic Analysis of Additive Adversarial Attacks and Defenses
  Ambar Pal · Rene Vidal
- 2019: Keynote I – Rene Vidal (Johns Hopkins University)
  René Vidal
- 2019 Poster: A Linearly Convergent Method for Non-Smooth Non-Convex Optimization on the Grassmannian with Applications to Robust Subspace and Dictionary Learning
  Zhihui Zhu · Tianyu Ding · Daniel Robinson · Manolis Tsakiris · René Vidal
- 2018 Poster: Dual Principal Component Pursuit: Improved Analysis and Efficient Algorithms
  Zhihui Zhu · Yifan Wang · Daniel Robinson · Daniel Naiman · René Vidal · Manolis Tsakiris
- 2012 Poster: Finding Exemplars from Pairwise Dissimilarities via Simultaneous Sparse Recovery
  Ehsan Elhamifar · Guillermo Sapiro · René Vidal
- 2011 Poster: Sparse Manifold Clustering and Embedding
  Ehsan Elhamifar · René Vidal
- 2006 Poster: Online Clustering of Moving Subspaces
  René Vidal