Poster
On the Power of Truncated SVD for General High-rank Matrix Estimation Problems
Simon Du · Yining Wang · Aarti Singh
Keywords: [ Theory ]
Abstract:
We show that given an estimate $\widehat{A}$ that is close to a general high-rank positive semi-definite (PSD) matrix $A$ in spectral norm (i.e., $\|\widehat{A} - A\|_2 \le \delta$), the simple truncated Singular Value Decomposition of $\widehat{A}$ produces a multiplicative approximation of $A$ in Frobenius norm. This observation leads to many interesting results on general high-rank matrix estimation problems: 1. High-rank matrix completion: we show that it is possible to recover a general high-rank matrix $A$ up to $(1+\epsilon)$ relative error in Frobenius norm from partial observations, with sample complexity independent of the spectral gap of $A$. 2. High-rank matrix denoising: we design algorithms that recover a matrix $A$ with relative error in Frobenius norm from its noise-perturbed observations, without assuming $A$ is exactly low-rank. 3. Low-dimensional estimation of high-dimensional covariance: given $N$ i.i.d. samples of dimension $n$ from $\mathcal{N}_n(0, A)$, we show that it is possible to estimate the covariance matrix $A$ with relative error in Frobenius norm with $N \approx n$, improving over classical covariance estimation results which require $N \approx n^2$.
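As a rough illustration of the core observation (a minimal sketch under assumptions, not the paper's exact algorithm: the matrix size, rank cutoff k, spectrum, and noise level below are all illustrative choices), the snippet applies a rank-k truncation to a spectrally perturbed PSD matrix $\widehat{A}$ and compares the resulting Frobenius error against the best rank-k approximation of the true matrix $A$.

```python
import numpy as np

def truncated_svd_psd(A_hat, k):
    """Return the best rank-k approximation of a symmetric matrix A_hat."""
    # For a symmetric matrix the SVD coincides (up to signs) with the
    # eigendecomposition, so we truncate by eigenvalue magnitude.
    vals, vecs = np.linalg.eigh(A_hat)
    idx = np.argsort(np.abs(vals))[::-1][:k]          # top-k by magnitude
    return (vecs[:, idx] * vals[idx]) @ vecs[:, idx].T

# --- toy denoising experiment (illustrative assumptions throughout) ---
rng = np.random.default_rng(0)
n, k = 200, 5

# A "general high-rank" PSD matrix: a few large directions plus a small tail.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
spectrum = np.concatenate([np.linspace(10, 5, k), 0.05 * rng.random(n - k)])
A = (U * spectrum) @ U.T

# Spectral-norm perturbation: A_hat = A + E with E symmetric and small in ||.||_2.
E = rng.standard_normal((n, n))
E = 0.01 * (E + E.T) / np.sqrt(n)
A_hat = A + E

A_k = truncated_svd_psd(A_hat, k)
best_rank_k_err = np.linalg.norm(A - truncated_svd_psd(A, k), "fro")
print("||A - A_k||_F     =", np.linalg.norm(A - A_k, "fro"))
print("best rank-k error =", best_rank_k_err)  # truncation of A_hat is nearly as good
```

In this toy setting the error of the truncated SVD of the perturbed matrix stays close to the unavoidable rank-k error of $A$ itself, which is the multiplicative Frobenius-norm guarantee the abstract refers to.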