We propose a new framework, Square Root Principal Component Pursuit, for recovering a low-rank matrix from observations corrupted by noise and outliers. Inspired by the square root Lasso, this new formulation does not require prior knowledge of the noise level. We show that a single, universal choice of the regularization parameter suffices to achieve reconstruction error proportional to the (a priori unknown) noise level. In comparison, previous formulations such as stable PCP rely on noise-dependent parameters to achieve similar performance, and are therefore challenging to deploy in applications where the noise level is unknown. We validate the effectiveness of our new method through experiments on simulated and real datasets. Our simulations corroborate the claim that a universal choice of the regularization parameter yields near-optimal performance across a range of noise levels, indicating that the proposed method outperforms the (somewhat loose) bound proved here.
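The key structural change relative to stable PCP is that the data-fit term is an *unsquared* Frobenius residual, mirroring how the square root Lasso replaces the squared loss of the Lasso; this is what decouples the regularization parameters from the noise level. A minimal sketch of the resulting objective, with hypothetical parameter names `lam` and `mu` (the exact parameter scalings used in the paper are not reproduced here):

```python
import numpy as np

def srpcp_objective(M, L, S, lam, mu):
    """Sketch of a square-root-PCP-style objective:
    nuclear norm of L + lam * l1 norm of S + mu * UNsquared Frobenius
    residual ||M - L - S||_F. The unsquared residual (vs. the squared
    one in stable PCP) is the square-root-Lasso-style modification."""
    nuclear = np.linalg.norm(L, ord="nuc")      # promotes low rank
    sparse = np.abs(S).sum()                     # promotes sparse outliers
    residual = np.linalg.norm(M - L - S, "fro")  # noise fit, not squared
    return nuclear + lam * sparse + mu * residual
```

This is only the objective, not a solver; in practice one would minimize it over `(L, S)` with a proximal splitting method, since both the nuclear norm and the l1 norm have inexpensive proximal operators.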
Author Information
Junhui Zhang (Columbia University)
Jingkai Yan (Columbia University)
John Wright (Columbia University)
More from the Same Authors
- 2021 Poster: Deep Networks Provably Classify Data on Curves »
  Tingran Wang · Sam Buchanan · Dar Gilboa · John Wright
- 2014 Poster: Finding a sparse vector in a subspace: Linear sparsity using alternating directions »
  Qing Qu · Ju Sun · John Wright
- 2012 Workshop: Analysis Operator Learning vs. Dictionary Learning: Fraternal Twins in Sparse Modeling »
  Martin Kleinsteuber · Francis Bach · Remi Gribonval · John Wright · Simon Hawe
- 2012 Poster: Learning with Partially Absorbing Random Walks »
  Xiao-Ming Wu · Zhenguo Li · Shih-Fu Chang · John Wright · Anthony Man-Cho So