Spotlight
Projection Efficient Subgradient Method and Optimal Nonsmooth Frank-Wolfe Method
Kiran Thekumparampil · Prateek Jain · Praneeth Netrapalli · Sewoong Oh
Thu Dec 10 07:10 PM -- 07:20 PM (PST) @ Orals & Spotlights: Optimization
We consider the classical setting of optimizing a nonsmooth, Lipschitz continuous convex function over a convex constraint set, given access to a (stochastic) first-order oracle (FO) for the function and a projection oracle (PO) for the constraint set. It is well known that achieving $\epsilon$-suboptimality in high dimensions requires $\Theta(\epsilon^{-2})$ FO calls, and this rate is attained by the projected subgradient method (PGD). However, PGD also requires $O(\epsilon^{-2})$ PO calls, which may be far costlier than FO calls (e.g., under nuclear norm constraints). Despite the fundamental nature of this problem and an extensive literature, improving the PO call complexity of PGD has remained largely unexplored. We present the first such improvement. It requires only the mild assumption that the objective function, when extended to a slightly larger neighborhood of the constraint set, remains Lipschitz and accessible via the FO. In particular, we introduce the MOPES method, which carefully combines Moreau-Yosida smoothing with accelerated first-order schemes and is guaranteed to find a feasible $\epsilon$-suboptimal solution using only $O(\epsilon^{-1})$ PO calls and the optimal $O(\epsilon^{-2})$ FO calls. Further, if instead of a PO we only have a linear minimization oracle (LMO, a la Frank-Wolfe) for the constraint set, an extension of our method, MOLES, finds a feasible $\epsilon$-suboptimal solution using $O(\epsilon^{-2})$ LMO calls and FO calls; both match known lower bounds, resolving a question left open since White (1993). Our experiments confirm that these methods achieve significant speedups over the state of the art on a problem with costly PO and LMO calls.
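The abstract contrasts MOPES/MOLES with the classical projected subgradient baseline. The sketch below illustrates that baseline only (not the paper's MOPES or MOLES algorithms) and shows why PGD spends one PO call for every FO call; the oracle names `fo`/`po`, the step-size rule, and the toy L1-regression problem are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def projected_subgradient(fo, po, x0, lipschitz, diameter, num_steps):
    """Classical projected subgradient method (PGD) sketch.

    fo(x) -> a subgradient of the nonsmooth convex objective at x  (one FO call)
    po(x) -> Euclidean projection of x onto the constraint set      (one PO call)

    With step size diameter / (lipschitz * sqrt(T)), the averaged iterate is
    O(diameter * lipschitz / sqrt(T))-suboptimal, so reaching accuracy eps takes
    Theta(eps^-2) FO calls *and* O(eps^-2) PO calls -- the PO cost that the
    abstract says MOPES reduces to O(eps^-1).
    """
    x = po(np.asarray(x0, dtype=float))
    x_avg = np.zeros_like(x)
    step = diameter / (lipschitz * np.sqrt(num_steps))
    for _ in range(num_steps):
        g = fo(x)                 # FO call
        x = po(x - step * g)      # PO call
        x_avg += x / num_steps
    return x_avg


# Toy usage: minimize ||Ax - b||_1 over the Euclidean ball of radius R.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b, R = rng.standard_normal((20, 5)), rng.standard_normal(20), 1.0
    fo = lambda x: A.T @ np.sign(A @ x - b)            # subgradient of ||Ax - b||_1
    po = lambda x: x if np.linalg.norm(x) <= R else x * (R / np.linalg.norm(x))
    lip = np.sqrt(len(b)) * np.linalg.norm(A, 2)       # Lipschitz constant bound
    x_hat = projected_subgradient(fo, po, np.zeros(5), lip, 2 * R, num_steps=2000)
    print("objective:", np.abs(A @ x_hat - b).sum())
```

Per the abstract, MOPES avoids this one-projection-per-step pattern by running accelerated first-order updates on a Moreau-Yosida smoothed version of the objective, so that only $O(\epsilon^{-1})$ of the iterations need a PO call while the FO call count stays at the optimal $O(\epsilon^{-2})$.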
Author Information
Kiran Thekumparampil (University of Illinois at Urbana-Champaign)
Prateek Jain (Microsoft Research)
Praneeth Netrapalli (Microsoft Research)
Sewoong Oh (University of Washington)
Related Events (a corresponding poster, oral, or spotlight)
- 2020 Poster: Projection Efficient Subgradient Method and Optimal Nonsmooth Frank-Wolfe Method »
  Fri Dec 11th 05:00 -- 07:00 AM Room Poster Session 6
More from the Same Authors
- 2020 Poster: RNNPool: Efficient Non-linear Pooling for RAM Constrained Inference »
  Oindrila Saha · Aditya Kusupati · Harsha Vardhan Simhadri · Manik Varma · Prateek Jain
- 2020 Poster: Robust Meta-learning for Mixed Linear Regression with Small Batches »
  Weihao Kong · Raghav Somani · Sham Kakade · Sewoong Oh
- 2020 Spotlight: RNNPool: Efficient Non-linear Pooling for RAM Constrained Inference »
  Oindrila Saha · Aditya Kusupati · Harsha Vardhan Simhadri · Manik Varma · Prateek Jain
- 2020 Poster: The Pitfalls of Simplicity Bias in Neural Networks »
  Harshay Shah · Kaustav Tamuly · Aditi Raghunathan · Prateek Jain · Praneeth Netrapalli
- 2020 Poster: Least Squares Regression with Markovian Data: Fundamental Limits and Algorithms »
  Dheeraj Nagaraj · Xian Wu · Guy Bresler · Prateek Jain · Praneeth Netrapalli
- 2020 Poster: Follow the Perturbed Leader: Optimism and Fast Parallel Algorithms for Smooth Minimax Games »
  Arun Suggala · Praneeth Netrapalli
- 2020 Spotlight: Least Squares Regression with Markovian Data: Fundamental Limits and Algorithms »
  Dheeraj Nagaraj · Xian Wu · Guy Bresler · Prateek Jain · Praneeth Netrapalli
- 2020 Poster: MOReL: Model-Based Offline Reinforcement Learning »
  Rahul Kidambi · Aravind Rajeswaran · Praneeth Netrapalli · Thorsten Joachims
- 2019 Poster: Efficient Algorithms for Smooth Minimax Optimization »
  Kiran Thekumparampil · Prateek Jain · Praneeth Netrapalli · Sewoong Oh
- 2019 Poster: Turbo Autoencoder: Deep learning based channel codes for point-to-point communication channels »
  Yihan Jiang · Hyeji Kim · Himanshu Asnani · Sreeram Kannan · Sewoong Oh · Pramod Viswanath
- 2019 Poster: The Step Decay Schedule: A Near Optimal, Geometrically Decaying Learning Rate Procedure For Least Squares »
  Rong Ge · Sham Kakade · Rahul Kidambi · Praneeth Netrapalli
- 2019 Poster: Minimax Optimal Estimation of Approximate Differential Privacy on Neighboring Databases »
  Xiyang Liu · Sewoong Oh
- 2018 Poster: Support Recovery for Orthogonal Matching Pursuit: Upper and Lower bounds »
  Raghav Somani · Chirag Gupta · Prateek Jain · Praneeth Netrapalli
- 2018 Spotlight: Support Recovery for Orthogonal Matching Pursuit: Upper and Lower bounds »
  Raghav Somani · Chirag Gupta · Prateek Jain · Praneeth Netrapalli
- 2018 Poster: Deepcode: Feedback Codes via Deep Learning »
  Hyeji Kim · Yihan Jiang · Sreeram Kannan · Sewoong Oh · Pramod Viswanath
- 2018 Poster: Robustness of conditional GANs to noisy labels »
  Kiran Thekumparampil · Ashish Khetan · Zinan Lin · Sewoong Oh
- 2018 Spotlight: Robustness of conditional GANs to noisy labels »
  Kiran Thekumparampil · Ashish Khetan · Zinan Lin · Sewoong Oh
- 2018 Poster: PacGAN: The power of two samples in generative adversarial networks »
  Zinan Lin · Ashish Khetan · Giulia Fanti · Sewoong Oh
- 2017 Poster: Optimal Sample Complexity of M-wise Data for Top-K Ranking »
  Minje Jang · Sunghyun Kim · Changho Suh · Sewoong Oh
- 2017 Poster: Estimating Mutual Information for Discrete-Continuous Mixtures »
  Weihao Gao · Sreeram Kannan · Sewoong Oh · Pramod Viswanath
- 2017 Poster: Matrix Norm Estimation from a Few Entries »
  Ashish Khetan · Sewoong Oh
- 2017 Spotlight: Estimating Mutual Information for Discrete-Continuous Mixtures »
  Weihao Gao · Sreeram Kannan · Sewoong Oh · Pramod Viswanath
- 2017 Spotlight: Matrix Norm Estimation from a Few Entries »
  Ashish Khetan · Sewoong Oh
- 2017 Poster: Discovering Potential Correlations via Hypercontractivity »
  Hyeji Kim · Weihao Gao · Sreeram Kannan · Sewoong Oh · Pramod Viswanath
- 2016 Poster: Breaking the Bandwidth Barrier: Geometrical Adaptive Entropy Estimation »
  Weihao Gao · Sewoong Oh · Pramod Viswanath
- 2016 Poster: Computational and Statistical Tradeoffs in Learning to Rank »
  Ashish Khetan · Sewoong Oh
- 2016 Poster: Achieving budget-optimality with adaptive schemes in crowdsourcing »
  Ashish Khetan · Sewoong Oh
- 2015 Workshop: Non-convex Optimization for Machine Learning: Theory and Practice »
  Anima Anandkumar · Niranjan Uma Naresh · Kamalika Chaudhuri · Percy Liang · Sewoong Oh
- 2015 Poster: Secure Multi-party Differential Privacy »
  Peter Kairouz · Sewoong Oh · Pramod Viswanath
- 2015 Poster: Collaboratively Learning Preferences from Ordinal Data »
  Sewoong Oh · Kiran Thekumparampil · Jiaming Xu
- 2014 Workshop: Analysis of Rank Data: Confluence of Social Choice, Operations Research, and Machine Learning »
  Shivani Agarwal · Hossein Azari Soufiani · Guy Bresler · Sewoong Oh · David Parkes · Arun Rajkumar · Devavrat Shah
- 2014 Poster: Provable Tensor Factorization with Missing Data »
  Prateek Jain · Sewoong Oh
- 2014 Poster: Extremal Mechanisms for Local Differential Privacy »
  Peter Kairouz · Sewoong Oh · Pramod Viswanath
- 2014 Poster: Minimax-optimal Inference from Partial Rankings »
  Bruce Hajek · Sewoong Oh · Jiaming Xu
- 2014 Poster: Learning Mixed Multinomial Logit Model from Ordinal Data »
  Sewoong Oh · Devavrat Shah
- 2012 Poster: Iterative ranking from pair-wise comparisons »
  Sahand N Negahban · Sewoong Oh · Devavrat Shah
- 2012 Spotlight: Iterative ranking from pair-wise comparisons »
  Sahand N Negahban · Sewoong Oh · Devavrat Shah
- 2011 Poster: Iterative Learning for Reliable Crowdsourcing Systems »
  David R Karger · Sewoong Oh · Devavrat Shah
- 2011 Oral: Iterative Learning for Reliable Crowdsourcing Systems »
  David R Karger · Sewoong Oh · Devavrat Shah
- 2009 Poster: Matrix Completion from Noisy Entries »
  Raghunandan Keshavan · Andrea Montanari · Sewoong Oh