

Poster in Workshop: OPT 2022: Optimization for Machine Learning

RandProx: Primal-Dual Optimization Algorithms with Randomized Proximal Updates

Laurent Condat · Peter Richtarik


Abstract:

Proximal splitting algorithms are well suited for large-scale nonsmooth optimization problems. We propose a primal-dual algorithm in which the dual update is randomized, with the proximity operator of one of the functions replaced by a stochastic oracle. A nonsmooth variance-reduction technique is implemented so that the algorithm finds an exact minimizer of the general problem. We derive linear convergence results in the presence of strong convexity. Several existing randomized algorithms, such as Point-SAGA, are recovered as particular cases. Randomness helps to obtain faster algorithms; this has long been known for stochastic-gradient-type algorithms, and our work shows that it fully applies in the more general primal-dual setting as well.
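The abstract describes the idea only at a high level, so the following is a minimal Python sketch of the general flavor of a randomized dual update, not the RandProx algorithm itself: a Chambolle-Pock-style primal-dual iteration for min_x 0.5*||x - b||^2 + lam*||Kx||_1 in which the dual proximal step fires only with probability p. The problem instance, the step sizes tau and sigma, and the probability p are all illustrative assumptions.

```python
import numpy as np

def prox_quadratic(v, tau, b):
    # Proximity operator of tau * 0.5*||. - b||^2.
    return (v + tau * b) / (1.0 + tau)

def project_linf(z, lam):
    # Projection onto the l-infinity ball of radius lam; this is the
    # proximity operator of the conjugate of lam*||.||_1.
    return np.clip(z, -lam, lam)

def randomized_primal_dual(K, b, lam, tau, sigma, p, n_iter, seed=0):
    """Toy sketch: Chambolle-Pock-style iteration for
        min_x 0.5*||x - b||^2 + lam*||K x||_1,
    with the dual proximal update applied only with probability p.
    This illustrates the idea of randomizing the dual update; it is
    NOT the RandProx algorithm from the paper, which uses a
    variance-reduction correction for general stochastic oracles."""
    rng = np.random.default_rng(seed)
    m, n = K.shape
    x = np.zeros(n)
    y = np.zeros(m)
    for _ in range(n_iter):
        x_old = x
        # Primal step: proximal update of the quadratic data term.
        x = prox_quadratic(x - tau * (K.T @ y), tau, b)
        x_bar = 2.0 * x - x_old  # standard extrapolation step
        # Dual step, fired only with probability p (randomized update).
        if rng.random() < p:
            y = project_linf(y + sigma * (K @ x_bar), lam)
    return x

# Illustrative usage with synthetic data.
rng = np.random.default_rng(1)
K = rng.standard_normal((40, 100))
b = rng.standard_normal(100)
L = np.linalg.norm(K, 2)        # operator norm of K
tau = sigma = 0.9 / L           # step sizes with tau*sigma*L^2 < 1
x_hat = randomized_primal_dual(K, b, lam=0.1, tau=tau, sigma=sigma,
                               p=0.5, n_iter=2000)
```

Whether such a naive randomization still converges, and at what rate, is precisely the kind of question the paper addresses; its nonsmooth variance-reduction technique handles general stochastic oracles while preserving convergence to an exact minimizer, linearly under strong convexity.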
