Poster
Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping
Eduard Gorbunov · Marina Danilova · Alexander Gasnikov

Wed Dec 09 09:00 AM -- 11:00 AM (PST) @ Poster Session 3 #822

In this paper, we propose a new accelerated stochastic first-order method called clipped-SSTM for smooth convex stochastic optimization with heavy-tailed noise in the stochastic gradients, and we derive the first high-probability complexity bounds for this method, closing a gap in the theory of stochastic optimization with heavy-tailed noise. Our method is based on a special variant of accelerated Stochastic Gradient Descent (SGD) combined with clipping of the stochastic gradients. We extend our method to the strongly convex case and prove new complexity bounds that outperform state-of-the-art results in this setting. Finally, we extend our proof technique and derive the first non-trivial high-probability complexity bounds for SGD with clipping without the light-tails assumption on the noise.
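To make the clipping ingredient concrete, below is a minimal Python sketch of norm-based gradient clipping applied to plain SGD. This is not the authors' clipped-SSTM (which additionally uses an acceleration scheme not shown here), and all names (grad_oracle, clip_level, stepsize) are illustrative assumptions rather than quantities from the paper.

import numpy as np

def clip(g, lam):
    # Rescale g so its Euclidean norm is at most lam; leave it unchanged otherwise.
    norm = np.linalg.norm(g)
    return g if norm <= lam else (lam / norm) * g

def clipped_sgd(grad_oracle, x0, stepsize, clip_level, n_steps):
    # Plain SGD with clipped stochastic gradients (a sketch, not clipped-SSTM).
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        g = grad_oracle(x)  # stochastic gradient, possibly heavy-tailed
        x = x - stepsize * clip(g, clip_level)
    return x

The intuition is that clipping caps the influence of any single heavy-tailed gradient sample, which is what makes high-probability bounds attainable without a light-tails assumption.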

Author Information

Eduard Gorbunov (Moscow Institute of Physics and Technology)
Marina Danilova (ICS RAS)
Alexander Gasnikov (Moscow Institute of Physics and Technology)
