

Poster

Principal Component Projection and Regression in Nearly Linear Time through Asymmetric SVRG

Yujia Jin · Aaron Sidford

East Exhibition Hall B + C #162

Keywords: [ Computational Comp ] [ Algorithms -> Components Analysis (e.g., CCA, ICA, LDA, PCA); Optimization -> Convex Optimization; Theory ] [ Optimization ] [ Stochastic Optimization ]


Abstract:

Given an n-by-d data matrix A, principal component projection (PCP) and principal component regression (PCR), i.e., projection and regression restricted to the top eigenspace of A, are fundamental problems in machine learning, optimization, and numerical analysis. In this paper we provide the first algorithms that solve these problems in nearly linear time for a fixed eigenvalue distribution and large n. This improves upon previous methods, which had superlinear running times when either the number of top eigenvalues or the gap between the eigenspaces was large.
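To make the two problems concrete, the following is a minimal NumPy sketch of exact PCP and PCR computed via a full SVD. This is an illustrative baseline only, not the paper's nearly linear time algorithm: the SVD route costs on the order of nd^2 time. The threshold parameter `lam` (separating top from bottom eigenvalues of A^T A) and the function name are our own notation.

```python
import numpy as np

def pcp_pcr(A, b, v, lam):
    """Illustrative exact baseline (not the paper's method).

    PCP: project v onto the span of right singular vectors of A whose
    squared singular values exceed lam (the top eigenspace of A^T A).
    PCR: least-squares regression of b on A, restricted to that span.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    top = s**2 > lam                 # indicator of the top eigenspace
    V = Vt[top].T                    # d x k orthonormal basis
    pcp = V @ (V.T @ v)              # orthogonal projection of v
    pcr = V @ ((U[:, top].T @ b) / s[top])  # restricted LS solution
    return pcp, pcr
```

When `lam` is below the smallest squared singular value, PCP is the identity and PCR recovers the ordinary least-squares solution, which gives a simple sanity check.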

We achieve our results by applying rational polynomial approximations to reduce the problem to solving asymmetric linear systems, which we then solve with a variant of SVRG. We corroborate these findings with preliminary empirical experiments.
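For readers unfamiliar with SVRG, here is a hedged sketch of plain SVRG applied to a symmetric positive definite system (A^T A / n) x = c, as a stand-in for the asymmetric variant developed in the paper. The step size `eta`, epoch length `m`, and epoch count are illustrative choices, not the paper's parameter settings.

```python
import numpy as np

def svrg_solve(A, c, eta=0.1, epochs=50, m=None, seed=0):
    """Generic SVRG sketch (not the paper's asymmetric solver).

    Minimizes f(x) = (1/2) x^T M x - c^T x with M = A^T A / n, whose
    gradient M x - c vanishes at the solution of M x = c. Each component
    gradient uses one row a_i of A; the snapshot correction keeps the
    stochastic gradient's variance shrinking as x_tilde converges.
    """
    n, d = A.shape
    m = m or 2 * n
    rng = np.random.default_rng(seed)
    full_grad = lambda x: A.T @ (A @ x) / n - c
    x_tilde = np.zeros(d)
    for _ in range(epochs):
        mu = full_grad(x_tilde)          # full gradient at the snapshot
        x = x_tilde.copy()
        for _ in range(m):
            i = rng.integers(n)
            a = A[i]
            # variance-reduced stochastic gradient
            g = a * (a @ x) - a * (a @ x_tilde) + mu
            x -= eta * g
        x_tilde = x
    return x_tilde
```

Because the snapshot term cancels the bias of the single-row gradient, SVRG converges linearly on such well-conditioned systems without decaying the step size, which is the property the paper's asymmetric variant builds on.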
