We study the problem of minimizing a strongly convex, smooth function given only noisy estimates of its gradient. We propose a novel multistage accelerated algorithm that is universally optimal in the sense that it achieves the optimal rate in both the deterministic and stochastic cases, and it operates without knowledge of the noise characteristics. The algorithm consists of stages that run a stochastic version of Nesterov's method with a specific restart schedule and with parameters selected to achieve the fastest reduction of the bias-variance terms in the convergence rate bounds.
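To make the multistage structure concrete, below is a minimal NumPy sketch of a restarted stochastic Nesterov scheme in the spirit of the abstract. The function names (`stochastic_agd`, `multistage_asg`), the 4x-per-stage stepsize decay, the 2x stage-length growth, and the toy Gaussian noise model are all illustrative assumptions, not the paper's actual parameter schedule.

```python
import numpy as np


def stochastic_agd(grad_oracle, x0, mu, alpha, n_iters):
    """Run one stage of Nesterov's accelerated method with a noisy gradient
    oracle, using the standard momentum for mu-strongly convex problems:
    beta = (1 - sqrt(mu * alpha)) / (1 + sqrt(mu * alpha))."""
    root = np.sqrt(mu * alpha)
    beta = (1.0 - root) / (1.0 + root)
    x_prev = x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        y = x + beta * (x - x_prev)                 # momentum (extrapolation) step
        x_prev, x = x, y - alpha * grad_oracle(y)   # step along the noisy gradient
    return x


def multistage_asg(grad_oracle, x0, mu, L, n_stages, base_len=100):
    """Multistage scheme: each stage restarts the accelerated method from the
    previous stage's last iterate, with a geometrically shrinking stepsize
    (driving the variance term down) and a geometrically growing stage length
    (so the bias term keeps contracting). The 4x stepsize decay and 2x length
    growth below are placeholders, not the paper's exact schedule."""
    x = np.asarray(x0, dtype=float)
    for k in range(n_stages):
        alpha_k = (1.0 / L) / (4.0 ** k)   # stage stepsize: 1/L, then shrinking
        n_k = base_len * (2 ** k)          # stage length: grows across stages
        x = stochastic_agd(grad_oracle, x, mu, alpha_k, n_k)
    return x


# Toy usage: a strongly convex quadratic (mu = 1, L = 10) with additive
# Gaussian gradient noise standing in for the stochastic oracle.
rng = np.random.default_rng(0)
A = np.diag([1.0, 10.0])
x_star = np.array([1.0, -2.0])
noisy_grad = lambda x: A @ (x - x_star) + 0.1 * rng.standard_normal(2)

x_hat = multistage_asg(noisy_grad, x0=np.zeros(2), mu=1.0, L=10.0, n_stages=5)
print("distance to optimum:", np.linalg.norm(x_hat - x_star))
```

Note that no noise statistics enter the schedule: shrinking the stepsize geometrically is what lets the same parameter choices cover both the noiseless and the noisy regime.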
Author Information
Necdet Serhat Aybat (Penn State University)
Alireza Fallah (MIT)
Mert Gurbuzbalaban (Rutgers)
Asuman Ozdaglar (Massachusetts Institute of Technology)
Asu Ozdaglar received the B.S. degree in electrical engineering from the Middle East Technical University, Ankara, Turkey, in 1996, and the S.M. and Ph.D. degrees in electrical engineering and computer science from the Massachusetts Institute of Technology, Cambridge, in 1998 and 2003, respectively. She is currently a professor in the Electrical Engineering and Computer Science Department at the Massachusetts Institute of Technology, where she also directs the Laboratory for Information and Decision Systems. Her research expertise includes optimization theory, with emphasis on nonlinear programming and convex analysis; game theory, with applications in communication, social, and economic networks; distributed optimization and control; and network analysis, with special emphasis on contagious processes, systemic risk, and dynamic control. Professor Ozdaglar is the recipient of a Microsoft fellowship, the MIT Graduate Student Council Teaching Award, the NSF CAREER Award, the 2008 Donald P. Eckman Award of the American Automatic Control Council, the Class of 1943 Career Development Chair, the inaugural Steven and Renee Innovation Fellowship, and the 2014 Spira Teaching Award. She served on the Board of Governors of the IEEE Control Systems Society in 2010 and was an associate editor for IEEE Transactions on Automatic Control. She is currently the area co-editor for a new area of the journal Operations Research, entitled "Games, Information and Networks." She is the co-author of the book "Convex Analysis and Optimization" (Athena Scientific, 2003).
More from the Same Authors
- 2022: Smoothed-SGDmax: A Stability-Inspired Algorithm to Improve Adversarial Generalization »
  Jiancong Xiao · Jiawei Zhang · Zhiquan Luo · Asuman Ozdaglar
- 2022 Poster: What is a Good Metric to Study Generalization of Minimax Learners? »
  Asuman Ozdaglar · Sarath Pattathil · Jiawei Zhang · Kaiqing Zhang
- 2022 Poster: Bridging Central and Local Differential Privacy in Data Acquisition Mechanisms »
  Alireza Fallah · Ali Makhdoumi · Azarakhsh Malekian · Asuman Ozdaglar
- 2022 Poster: SAPD+: An Accelerated Stochastic Method for Nonconvex-Concave Minimax Problems »
  Xuan Zhang · Necdet Serhat Aybat · Mert Gurbuzbalaban
- 2021: Q&A with Professor Asu Ozdaglar »
  Asuman Ozdaglar
- 2021: Keynote Talk: Personalization in Federated Learning: Adaptation and Clustering (Asu Ozdaglar) »
  Asuman Ozdaglar
- 2021 Poster: Decentralized Q-learning in Zero-sum Markov Games »
  Muhammed Sayin · Kaiqing Zhang · David Leslie · Tamer Basar · Asuman Ozdaglar
- 2021 Poster: Generalization of Model-Agnostic Meta-Learning Algorithms: Recurring and Unseen Tasks »
  Alireza Fallah · Aryan Mokhtari · Asuman Ozdaglar
- 2021 Poster: On the Convergence Theory of Debiased Model-Agnostic Meta-Reinforcement Learning »
  Alireza Fallah · Kristian Georgiev · Aryan Mokhtari · Asuman Ozdaglar
- 2020 Poster: Breaking Reversibility Accelerates Langevin Dynamics for Non-Convex Optimization »
  Xuefeng Gao · Mert Gurbuzbalaban · Lingjiong Zhu
- 2020 Poster: IDEAL: Inexact DEcentralized Accelerated Augmented Lagrangian Method »
  Yossi Arjevani · Joan Bruna · Bugra Can · Mert Gurbuzbalaban · Stefanie Jegelka · Hongzhou Lin
- 2020 Poster: Personalized Federated Learning with Theoretical Guarantees: A Model-Agnostic Meta-Learning Approach »
  Alireza Fallah · Aryan Mokhtari · Asuman Ozdaglar
- 2020 Spotlight: IDEAL: Inexact DEcentralized Accelerated Augmented Lagrangian Method »
  Yossi Arjevani · Joan Bruna · Bugra Can · Mert Gurbuzbalaban · Stefanie Jegelka · Hongzhou Lin
- 2019 Poster: First Exit Time Analysis of Stochastic Gradient Descent Under Heavy-Tailed Gradient Noise »
  Thanh Huy Nguyen · Umut Simsekli · Mert Gurbuzbalaban · Gaël Richard
- 2018 Poster: Escaping Saddle Points in Constrained Optimization »
  Aryan Mokhtari · Asuman Ozdaglar · Ali Jadbabaie
- 2018 Spotlight: Escaping Saddle Points in Constrained Optimization »
  Aryan Mokhtari · Asuman Ozdaglar · Ali Jadbabaie
- 2017 Poster: When Cyclic Coordinate Descent Outperforms Randomized Coordinate Descent »
  Mert Gurbuzbalaban · Asuman Ozdaglar · Pablo A Parrilo · Nuri Vanli
- 2017 Spotlight: When Cyclic Coordinate Descent Outperforms Randomized Coordinate Descent »
  Mert Gurbuzbalaban · Asuman Ozdaglar · Pablo A Parrilo · Nuri Vanli
- 2016 Poster: A primal-dual method for conic constrained distributed optimization problems »
  Necdet Serhat Aybat · Erfan Yazdandoost Hamedani
- 2015 Invited Talk: Incremental Methods for Additive Cost Convex Optimization »
  Asuman Ozdaglar
- 2013 Poster: Computing the Stationary Distribution Locally »
  Christina Lee · Asuman Ozdaglar · Devavrat Shah