Poster
Tight Nonparametric Convergence Rates for Stochastic Gradient Descent under the Noiseless Linear Model
Raphaël Berthier · Francis Bach · Pierre Gaillard

Thu Dec 10 09:00 AM -- 11:00 AM (PST) @ Poster Session 5 #1390
In the context of statistical supervised learning, the noiseless linear model assumes that there exists a deterministic linear relation $Y = \langle \theta_*, \Phi(U) \rangle$ between the random output $Y$ and the random feature vector $\Phi(U)$, a potentially non-linear transformation of the inputs $U$. We analyze the convergence of single-pass, fixed step-size stochastic gradient descent on the least-square risk under this model. The convergence of the iterates to the optimum $\theta_*$ and the decay of the generalization error follow polynomial convergence rates with exponents that both depend on the regularities of the optimum $\theta_*$ and of the feature vectors $\Phi(U)$. We interpret our result in the reproducing kernel Hilbert space framework. As a special case, we analyze an online algorithm for estimating a real function on the unit hypercube from the noiseless observation of its value at randomly sampled points; the convergence depends on the Sobolev smoothness of the function and of a chosen kernel. Finally, we apply our analysis beyond the supervised learning setting to obtain convergence rates for the averaging process (a.k.a. gossip algorithm) on a graph depending on its spectral dimension.
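The following is a minimal Python sketch, not the authors' implementation, of the algorithm described in the abstract: single-pass, fixed step-size stochastic gradient descent on the least-square risk under a noiseless linear model $Y = \langle \theta_*, \Phi(U) \rangle$. The feature dimension, feature distribution, step size, and sample count below are illustrative assumptions.

# Minimal sketch (not the authors' code): single-pass, fixed step-size SGD
# on the least-square risk under a noiseless linear model y = <theta_star, phi>.
# Dimension, feature distribution, step size, and sample count are assumptions.
import numpy as np

rng = np.random.default_rng(0)

d = 50                                        # feature dimension (assumption)
theta_star = rng.normal(size=d) / np.sqrt(d)  # ground-truth parameter
gamma = 0.5                                   # fixed step size (assumption; small enough for the chosen features)
n = 10_000                                    # one sample per iteration (single pass)

theta = np.zeros(d)
for _ in range(n):
    phi = rng.normal(size=d) / np.sqrt(d)     # feature vector Phi(U), E||phi||^2 = 1 (illustrative)
    y = phi @ theta_star                      # noiseless output
    # stochastic gradient step on 0.5 * (<theta, phi> - y)^2
    theta -= gamma * (phi @ theta - y) * phi

# distance to the optimum and a Monte Carlo estimate of the generalization error
test_phi = rng.normal(size=(1000, d)) / np.sqrt(d)
print("||theta - theta_*|| =", np.linalg.norm(theta - theta_star))
print("estimated generalization error =", np.mean((test_phi @ (theta - theta_star)) ** 2))

Each sample is used exactly once and the step size stays constant throughout, matching the single-pass, fixed step-size setting analyzed in the paper; the polynomial decay rates stated in the abstract concern exactly these two quantities (distance to the optimum and generalization error).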

Author Information

Raphaël Berthier (INRIA, ENS)
Francis Bach (INRIA - Ecole Normale Superieure)
Pierre Gaillard (Inria)