Poster
Zeroth-order (Non)-Convex Stochastic Optimization via Conditional Gradient and Gradient Updates
Krishnakumar Balasubramanian · Saeed Ghadimi

Wed Dec 05 02:00 PM -- 04:00 PM (PST) @ Room 210 #27

In this paper, we propose and analyze zeroth-order stochastic approximation algorithms for nonconvex and convex optimization. Specifically, we propose generalizations of the conditional gradient algorithm achieving rates similar to the standard stochastic gradient algorithm using only zeroth-order information. Furthermore, under a structural sparsity assumption, we first illustrate an implicit regularization phenomenon where the standard stochastic gradient algorithm with zeroth-order information adapts to the sparsity of the problem at hand by just varying the step-size. Next, we propose a truncated stochastic gradient algorithm with zeroth-order information, whose rate of convergence depends only poly-logarithmically on the dimensionality.
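For intuition, the zeroth-order methods in this line of work estimate gradients from function evaluations alone, typically via a Gaussian-smoothing finite-difference estimator, and plug that estimate into a standard stochastic update. Below is a minimal sketch of that idea, not the paper's exact algorithms: the smoothing parameter mu, the step size, the iteration count, and the quadratic test function are all illustrative choices, and the paper's conditional gradient and truncated variants add structure on top of this basic update.

```python
import numpy as np

def zo_gradient_estimate(f, x, mu=1e-4, rng=None):
    """Gaussian-smoothing zeroth-order gradient estimate:
    g = (f(x + mu*u) - f(x)) / mu * u, with u ~ N(0, I).
    Uses only two function evaluations, no gradients."""
    if rng is None:
        rng = np.random.default_rng()
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x)) / mu * u

def zo_sgd(f, x0, step=0.01, mu=1e-4, iters=1000, seed=0):
    """Sketch of stochastic gradient descent driven purely by
    zeroth-order (function-value) information."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    for _ in range(iters):
        x -= step * zo_gradient_estimate(f, x, mu, rng)
    return x

# Illustrative usage: minimize a simple quadratic using only function values.
f = lambda x: 0.5 * np.dot(x, x)
x_final = zo_sgd(f, x0=np.ones(10))
print(np.linalg.norm(x_final))  # should be close to 0
```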

Author Information

Krishnakumar Balasubramanian (University of California, Davis)
Saeed Ghadimi (Princeton University)
