Poster
Invariant subspaces and PCA in nearly matrix multiplication time
Aleksandros Sobczyk · Marko Mladenovic · Mathieu Luisier
West Ballroom A-D #5900
Abstract:
Approximating invariant subspaces of generalized eigenvalue problems (GEPs) is a fundamental computational problem at the core of machine learning and scientific computing. It is, for example, the root of Principal Component Analysis (PCA) for dimensionality reduction, data visualization, and noise filtering, and of Density Functional Theory (DFT), arguably the most popular method to calculate the electronic structure of materials. Given Hermitian $H, S \in \mathbb{C}^{n \times n}$, where $S$ is positive-definite, let $\Pi_k$ be the true spectral projector on the invariant subspace that is associated with the $k$ smallest (or largest) eigenvalues of the GEP $HC = SC\Lambda$, for some $k \in [n]$. We show that we can compute a matrix $\widetilde{\Pi}_k$ such that $\lVert \Pi_k - \widetilde{\Pi}_k \rVert_2 \le \epsilon$, in $O\left(n^{\omega+\eta}\,\mathrm{polylog}(n, \epsilon^{-1}, \kappa(S), \mathrm{gap}_k^{-1})\right)$ bit operations in the floating point model, for some $\eta > 0$, with probability $1 - 1/n$. Here, $\eta$ is arbitrarily small, $\omega \lesssim 2.372$ is the matrix multiplication exponent, $\kappa(S) = \lambda_{\max}(S)/\lambda_{\min}(S)$, and $\mathrm{gap}_k = |\lambda_k - \lambda_{k+1}|$ is the gap between eigenvalues $\lambda_k$ and $\lambda_{k+1}$. To achieve such provable "forward-error" guarantees, our methods rely on a new stability analysis for the Cholesky factorization, and a smoothed analysis for computing spectral gaps, which can be of independent interest. Ultimately, we obtain new matrix multiplication-type bit complexity upper bounds for PCA problems, including classical PCA and (randomized) low-rank approximation.
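The abstract's key objects are straightforward to reproduce numerically. The sketch below is not the paper's $O(n^{\omega+\eta})$ algorithm; it is a minimal NumPy/SciPy demo on a random test pencil $(H, S)$ (the pencil, $n$, and $k$ are assumptions made for illustration). It forms the Cholesky reduction $M = L^{-1} H L^{-H}$, the reduction whose finite-precision stability the paper analyzes, builds the spectral projector $\Pi_k$ onto the invariant subspace of the $k$ smallest eigenvalues, and reports $\mathrm{gap}_k$ together with a forward error $\lVert \Pi_k - \widetilde{\Pi}_k \rVert_2$ measured against SciPy's direct generalized solver.

```python
# Illustrative sketch (not the paper's algorithm): the quantities from the
# abstract, computed with dense LAPACK routines via NumPy/SciPy.
# The random pencil (H, S) and the sizes n, k are assumptions for this demo.
import numpy as np
from scipy.linalg import cholesky, eigh

rng = np.random.default_rng(0)
n, k = 100, 10

# Random Hermitian H and Hermitian positive-definite S (the pencil (H, S)).
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
H = (A + A.conj().T) / 2
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
S = B.conj().T @ B + n * np.eye(n)   # shift keeps S well-conditioned

# Reduce the GEP H C = S C Lambda to a standard Hermitian problem via the
# Cholesky factorization S = L L^H: M = L^{-1} H L^{-H} has the same
# eigenvalues as the pencil.
L = cholesky(S, lower=True)
M = np.linalg.solve(L, np.linalg.solve(L, H).conj().T).conj().T
eigvals, V = eigh(M)                 # eigenvalues in ascending order

# Spectral projector Pi_k onto the invariant subspace of the k smallest
# eigenvalues: C = L^{-H} V_k is S-orthonormal (C^H S C = I), so
# Pi_k = C C^H S is idempotent (orthogonal in the S-inner product).
C = np.linalg.solve(L.conj().T, V[:, :k])
Pi_k = C @ C.conj().T @ S

gap_k = eigvals[k] - eigvals[k - 1]  # the eigenvalue gap in the bound
print("gap_k =", gap_k)
print("idempotence ||Pi_k^2 - Pi_k||_2 =",
      np.linalg.norm(Pi_k @ Pi_k - Pi_k, 2))

# Forward error against SciPy's direct generalized eigensolver, playing the
# role of ||Pi_k - Pi_k_tilde||_2 from the abstract.
_, C_ref = eigh(H, S)                # eigenvectors with C_ref^H S C_ref = I
Pi_ref = C_ref[:, :k] @ C_ref[:, :k].conj().T @ S
print("forward error ||Pi_k - Pi_k_tilde||_2 =",
      np.linalg.norm(Pi_k - Pi_ref, 2))
```

Note that $\Pi_k = C_k C_k^H S$ is the projector that is orthogonal in the $S$-inner product rather than the Euclidean one; since the projector onto the invariant subspace is unique whenever $\mathrm{gap}_k > 0$, the comparison above is insensitive to eigenvector phase ambiguities.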