
Accelerated Mini-Batch Stochastic Dual Coordinate Ascent
Shai Shalev-Shwartz · Tong Zhang

Thu Dec 05 07:00 PM -- 11:59 PM (PST) @ Harrah's Special Events Center, 2nd Floor

Stochastic dual coordinate ascent (SDCA) is an effective technique for solving regularized loss minimization problems in machine learning. This paper considers an extension of SDCA under the mini-batch setting that is often used in practice. Our main contribution is to introduce an accelerated mini-batch version of SDCA and prove a fast convergence rate for this method. We discuss an implementation of our method over a parallel computing system, and compare the results both to vanilla stochastic dual coordinate ascent and to the accelerated deterministic gradient descent method of Nesterov [2007].
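To make the baseline concrete, below is a minimal sketch of vanilla SDCA for ridge regression (squared loss with L2 regularization), one of the losses admitting a closed-form dual coordinate update. This illustrates only the plain method the paper accelerates; the mini-batching, momentum, and parallel-implementation details of the accelerated variant are in the paper itself, and the function name `sdca_ridge` and all parameter choices here are illustrative assumptions, not the authors' code.

```python
import numpy as np

def sdca_ridge(X, y, lam, epochs=50, seed=0):
    """Vanilla SDCA sketch for: min_w (1/n) sum_i 0.5*(x_i^T w - y_i)^2
    + (lam/2)*||w||^2, using the primal-dual link
    w = (1/(lam*n)) * sum_i alpha_i * x_i."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)   # one dual variable per training example
    w = np.zeros(d)       # primal iterate, maintained incrementally
    sq_norms = (X ** 2).sum(axis=1)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Exact coordinate maximization of the dual in alpha_i
            # (closed form for the squared loss).
            delta = (y[i] - X[i] @ w - alpha[i]) / (1.0 + sq_norms[i] / (lam * n))
            alpha[i] += delta
            w += (delta / (lam * n)) * X[i]
    return w

# Usage on synthetic data, compared against the exact ridge solution
# (XtX/n + lam*I) w = Xty/n.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
y = X @ rng.standard_normal(5) + 0.1 * rng.standard_normal(200)
lam = 0.1
w_sdca = sdca_ridge(X, y, lam)
n = X.shape[0]
w_exact = np.linalg.solve(X.T @ X / n + lam * np.eye(5), X.T @ y / n)
```

Because the squared loss is smooth, this update needs no line search; each coordinate step solves its one-dimensional dual subproblem exactly, which is part of what makes SDCA attractive compared to primal stochastic gradient methods.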

Author Information

Shai Shalev-Shwartz (Mobileye & HUJI)
Tong Zhang (Tencent)

More from the Same Authors