Poster
A Stochastic Gradient Method with an Exponential Convergence Rate for Finite Training Sets
Nicolas Le Roux · Mark Schmidt · Francis Bach

Tue Dec 04 07:00 PM -- 12:00 AM (PST) @ Harrah’s Special Events Center 2nd Floor

We propose a new stochastic gradient method for optimizing the sum of a finite set of smooth functions, where the sum is strongly convex. While standard stochastic gradient methods converge at sublinear rates for this problem, the proposed method incorporates a memory of previous gradient values in order to achieve a linear convergence rate. In a machine learning context, numerical experiments indicate that the new algorithm can dramatically outperform standard algorithms, both in terms of optimizing the training error and reducing the test error quickly.
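
The memory-of-gradients update described in the abstract is the stochastic average gradient (SAG) iteration. The sketch below is a minimal NumPy illustration of that idea, not the authors' reference implementation; the function name sag, the step size alpha, and the ridge-regression usage example are hypothetical choices made only to give a runnable demo.

    import numpy as np

    def sag(grad_i, x0, n, alpha, iters, seed=0):
        """Minimal stochastic average gradient (SAG) sketch.

        grad_i(x, i): gradient of the i-th smooth term f_i at x.
        Each step refreshes the stored gradient of one sampled term,
        then moves along the average of all n stored gradients.
        """
        rng = np.random.default_rng(seed)
        x = x0.astype(float)
        y = np.zeros((n, x.size))   # memory: last gradient seen for each f_i
        g_sum = np.zeros_like(x)    # running sum of the stored gradients
        for _ in range(iters):
            i = rng.integers(n)             # sample one term uniformly
            g = grad_i(x, i)
            g_sum += g - y[i]               # swap in the fresh gradient for term i
            y[i] = g
            x -= (alpha / n) * g_sum        # step along the averaged memory
        return x

    # Hypothetical usage: ridge regression,
    # f_i(x) = 0.5 * (a_i @ x - b_i)**2 + (lam / 2) * ||x||**2
    rng = np.random.default_rng(1)
    A = rng.standard_normal((100, 5))
    b = rng.standard_normal(100)
    lam = 0.1
    grad = lambda x, i: (A[i] @ x - b[i]) * A[i] + lam * x
    x_hat = sag(grad, np.zeros(5), n=100, alpha=0.05, iters=5000)

Unlike plain stochastic gradient descent, only one stored gradient changes per iteration while the step uses the average over all n of them; this reuse of past gradient values is what the abstract credits for the linear (i.e., exponential) convergence rate on strongly convex finite sums.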

Author Information

Nicolas Le Roux (Microsoft Research)
Mark Schmidt (INRIA - SIERRA Project Team)
Francis Bach (INRIA - Ecole Normale Superieure)
