
SPALS: Fast Alternating Least Squares via Implicit Leverage Scores Sampling
Dehua Cheng · Richard Peng · Yan Liu · Kimis Perros

Wed Dec 07 09:00 AM -- 12:30 PM (PST) @ Area 5+6+7+8 #126

Tensor CANDECOMP/PARAFAC (CP) decomposition is a powerful but computationally challenging tool in modern data analytics. In this paper, we show ways of sampling intermediate steps of alternating minimization algorithms for computing low rank tensor CP decompositions, leading to the sparse alternating least squares (SPALS) method. Specifically, we sample the Khatri-Rao product, which arises as an intermediate object during the iterations of alternating least squares. This product captures the interactions between different tensor modes, and forms the main computational bottleneck for solving many tensor related tasks. By exploiting the spectral structures of the matrix Khatri-Rao product, we provide efficient access to its statistical leverage scores. When applied to the tensor CP decomposition, our method leads to the first algorithm that runs in sublinear time per iteration and approximates the output of deterministic alternating least squares algorithms. Empirical evaluations of this approach show significant speedups over existing randomized and deterministic routines for performing CP decomposition. On a tensor of size 2.4m by 6.6m by 92k with over 2 billion nonzeros formed by Amazon product reviews, our routine converges in two minutes to the same error as deterministic ALS.
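The key structural fact behind SPALS is that the leverage score of row (i, j) of a Khatri-Rao product B ⊙ C is bounded by the product of the leverage scores of row i of B and row j of C, so one can sample rows proportionally to this bound without ever materializing the (huge) product. A minimal NumPy sketch of that bound, with illustrative matrix sizes and a hypothetical `leverage_scores` helper (not code from the paper):

```python
import numpy as np

def leverage_scores(A):
    # Exact leverage scores: squared row norms of the thin Q factor of A.
    Q, _ = np.linalg.qr(A)
    return np.sum(Q ** 2, axis=1)

rng = np.random.default_rng(0)
I, J, r = 50, 40, 5  # illustrative dimensions and CP rank
B = rng.standard_normal((I, r))
C = rng.standard_normal((J, r))

# Khatri-Rao product (column-wise Kronecker): row (i, j) is the
# elementwise product of row i of B and row j of C; shape (I*J, r).
KR = np.einsum('ir,jr->ijr', B, C).reshape(I * J, r)

# Product-of-scores upper bound: tau_KR(i, j) <= tau_B(i) * tau_C(j).
# Sampling rows with probability proportional to this bound only needs
# the small factor matrices, which is what makes per-iteration sublinear
# time possible.
tau_B = leverage_scores(B)
tau_C = leverage_scores(C)
bound = np.outer(tau_B, tau_C).ravel()

exact = leverage_scores(KR)
```

Here `exact <= bound` holds entrywise for full-rank factors, which is what licenses sampling from the cheap product distribution instead of the true leverage scores.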

Author Information

Dehua Cheng (Univ. of Southern California)
Richard Peng (Georgia Tech)
Yan Liu (University of Southern California)
Kimis Perros (Georgia Institute of Technology)
