Spotlight
Stochastic Chebyshev Gradient Descent for Spectral Optimization
Insu Han · Haim Avron · Jinwoo Shin

Thu Dec 06 12:35 PM -- 12:40 PM (PST) @ Room 517 CD

A large class of machine learning techniques requires the solution of optimization problems involving spectral functions of parametric matrices, e.g., the log-determinant and the nuclear norm. Unfortunately, computing the gradient of a spectral function is generally of cubic complexity; as such, gradient descent methods are rather expensive for optimizing objectives involving spectral functions. Thus, one naturally turns to stochastic gradient methods in the hope that they will provide a way to reduce or altogether avoid the computation of full gradients. However, a new challenge appears here: there is no straightforward way to compute unbiased stochastic gradients for spectral functions. In this paper, we develop unbiased stochastic gradients for spectral-sums, an important subclass of spectral functions. Our unbiased stochastic gradients are based on combining randomized trace estimators with stochastic truncation of Chebyshev expansions. A careful design of the truncation distribution allows us to offer distributions that are variance-optimal, which is crucial for fast and stable convergence of stochastic gradient methods. We further leverage our proposed stochastic gradients to devise stochastic methods for objective functions involving spectral-sums, and rigorously analyze their convergence rates. The utility of our methods is demonstrated in numerical experiments.
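
To make the core idea concrete, the following minimal Python sketch (not the authors' released code) combines a Hutchinson-style randomized trace estimator with a randomly truncated Chebyshev expansion to estimate a spectral-sum tr(f(A)). The importance weights 1/P(N >= j) make the estimate unbiased for the fixed-degree Chebyshev approximation; the geometric truncation distribution, the function names, and the parameter choices below are illustrative assumptions, whereas the paper derives variance-optimal truncation distributions and extends the construction to gradients.

import numpy as np

def chebyshev_coeffs(f, degree):
    # Coefficients of the degree-`degree` Chebyshev interpolant of f on [-1, 1].
    k = np.arange(degree + 1)
    theta = np.pi * (k + 0.5) / (degree + 1)
    fx = f(np.cos(theta))                      # f evaluated at Chebyshev nodes
    c = np.array([2.0 / (degree + 1) * np.sum(fx * np.cos(j * theta))
                  for j in range(degree + 1)])
    c[0] /= 2.0
    return c

def estimate_spectral_sum(A, f, degree=30, num_probes=10, seed=None):
    # Estimate tr(f(A)) for a symmetric A whose eigenvalues lie in [-1, 1].
    # Unbiased for the degree-`degree` Chebyshev approximation of tr(f(A)).
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    c = chebyshev_coeffs(f, degree)
    q = 0.5 ** np.arange(degree + 1)           # illustrative geometric truncation distribution
    q /= q.sum()
    tail = np.cumsum(q[::-1])[::-1]            # tail[j] = P(N >= j)
    total = 0.0
    for _ in range(num_probes):
        v = rng.choice([-1.0, 1.0], size=n)    # Rademacher probe (Hutchinson estimator)
        N = rng.choice(degree + 1, p=q)        # random truncation degree
        w_prev, w_curr = v, A @ v              # T_0(A) v and T_1(A) v
        acc = c[0] * (v @ w_prev) / tail[0]
        if N >= 1:
            acc += c[1] * (v @ w_curr) / tail[1]
        for j in range(2, N + 1):
            w_prev, w_curr = w_curr, 2.0 * (A @ w_curr) - w_prev   # Chebyshev recurrence
            acc += c[j] * (v @ w_curr) / tail[j]
        total += acc
    return total / num_probes

# Example (hypothetical): estimate tr(exp(A)) for a random symmetric matrix
# whose spectrum has been scaled into [-1, 1], and compare with the exact value.
# rng = np.random.default_rng(0)
# B = rng.standard_normal((200, 200))
# A = (B + B.T) / 2
# A /= 1.01 * np.max(np.abs(np.linalg.eigvalsh(A)))
# print(estimate_spectral_sum(A, np.exp, degree=30, num_probes=100))
# print(np.sum(np.exp(np.linalg.eigvalsh(A))))

In this sketch the matrix is assumed symmetric with eigenvalues already inside [-1, 1]; in practice the spectrum is first shifted and scaled into that interval before the Chebyshev expansion is applied, as is standard for Chebyshev-based trace estimation.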

Author Information

Insu Han (KAIST)
Haim Avron (Tel Aviv University)
Jinwoo Shin (KAIST; AITRICS)
