

Poster

Decentralized sketching of low rank matrices

Rakshith Sharma Srinivasa · Kiryung Lee · Marius Junge · Justin Romberg

East Exhibition Hall B + C #157

Keywords: [ Algorithms -> Components Analysis (e.g., CCA, ICA, LDA, PCA); Algorithms -> Large Scale Learning; Theory ] [ Information Theory ] [ Optimization ] [ Convex Optimization ]


Abstract:

We address a low-rank matrix recovery problem in which each column of a rank-r matrix X of size (d1, d2) is compressed beyond the point of individual recovery, to length L with L << d1. Leveraging the joint structure among the columns, we propose a method that recovers the matrix to within an ε relative error in the Frobenius norm from a total of O(r(d1 + d2) log^6(d1 + d2)/ε^2) observations. This guarantee holds uniformly for all incoherent matrices of rank r. Our method combines a novel matrix norm, which we call the mixed-norm, with the maximum l2 norm of the columns to obtain a convex relaxation for low-rank recovery tailored to our observation model. We also show that the proposed mixed-norm, the standard nuclear norm, and the max-norm are all particular instances of convex regularization of low-rankness via tensor norms. Finally, we provide a scalable ADMM algorithm for the mixed-norm-based method and demonstrate its empirical performance via large-scale simulations.
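The observation model in the abstract (each column sketched independently to a length far below the ambient dimension) can be illustrated with a minimal NumPy sketch. This is not the authors' code; the dimensions, the Gaussian sketching matrices, and the 1/sqrt(L) scaling are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d1, d2, r, L = 100, 80, 3, 10  # hypothetical sizes with L << d1

# Rank-r matrix X = U V^T of size (d1, d2)
U = rng.standard_normal((d1, r))
V = rng.standard_normal((d2, r))
X = U @ V.T

# Decentralized sketching: each column x_i gets its own L x d1 Gaussian
# sketch A_i, so no single column can be recovered from its sketch alone.
sketches = [rng.standard_normal((L, d1)) / np.sqrt(L) for _ in range(d2)]
Y = np.column_stack([A @ X[:, i] for i, A in enumerate(sketches)])

# L * d2 scalar observations in total, versus d1 * d2 entries in X;
# recovery must exploit the shared low-rank structure across columns.
print(Y.shape)
print(L * d2 < d1 * d2)
```

Recovery itself is the paper's contribution (the mixed-norm convex relaxation solved via ADMM); the snippet only shows what the compressed observations look like.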
