Keynote in Workshop: Learning with Tensors: Why Now and How?
Orthogonalized Alternating Least Squares: A theoretically principled tensor factorization algorithm for practical use
From a theoretical perspective, low-rank tensor factorization is an algorithmic miracle, allowing for (provably correct) reconstruction and learning in a number of settings. From a practical standpoint, we still lack sufficiently robust, versatile, and efficient tensor factorization algorithms, particularly for large-scale problems. Many of the algorithms with provable guarantees either suffer from an expensive initialization step, or require the iterative removal of rank-1 factors, which destroys any sparsity present in the original tensor. On the other hand, the most commonly used algorithm in practice is "alternating least squares" (ALS), which iteratively fixes all but one mode and optimizes over the remaining mode. This algorithm is extremely efficient, but it often converges to poor local optima, particularly when the weights of the factors are non-uniform. In this work, we propose a modification of the ALS approach that retains practically viable efficiency while providing provable recovery (assuming the factors are random or have small pairwise inner products), even for highly non-uniform weights. We demonstrate that our algorithm significantly outperforms traditional ALS on both random synthetic data and on computing word embeddings from a third-order word tri-occurrence tensor.
This is based on joint work with Vatsal Sharan.
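To make the two procedures concrete, here is a minimal NumPy sketch, not the authors' implementation: plain ALS fixes two modes and solves a least-squares problem for the third, and the orthogonalized variant additionally re-orthonormalizes the factor estimates before each sweep. The names (`orth_als`, `khatri_rao`) and details (QR every sweep, pseudoinverse solves, weight extraction via column norms) are illustrative assumptions chosen for simplicity, not the exact algorithm presented in the talk.

```python
import numpy as np

def khatri_rao(X, Y):
    """Column-wise Kronecker (Khatri-Rao) product of two factor matrices."""
    r = X.shape[1]
    return np.einsum('ir,jr->ijr', X, Y).reshape(-1, r)

def orth_als(T, rank, n_iters=50, orthogonalize=True, seed=0):
    """Rank-`rank` CP decomposition of a 3rd-order tensor T via ALS.

    With orthogonalize=True, factor estimates are re-orthonormalized (QR)
    before every sweep -- a toy rendering of the orthogonalized-ALS idea.
    With orthogonalize=False, this is plain ALS.
    """
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))

    # Mode-n unfoldings: rows index the chosen mode, columns in C order.
    T1 = T.reshape(I, J * K)
    T2 = np.moveaxis(T, 1, 0).reshape(J, I * K)
    T3 = np.moveaxis(T, 2, 0).reshape(K, I * J)

    for _ in range(n_iters):
        if orthogonalize:
            # Key modification: replace each factor estimate by an
            # orthonormal basis of its columns before the ALS sweep.
            A, _ = np.linalg.qr(A)
            B, _ = np.linalg.qr(B)
            C, _ = np.linalg.qr(C)

        # Standard ALS sweep: fix two modes, least-squares solve the third.
        A = T1 @ np.linalg.pinv(khatri_rao(B, C).T)
        B = T2 @ np.linalg.pinv(khatri_rao(A, C).T)
        C = T3 @ np.linalg.pinv(khatri_rao(A, B).T)

    # Estimate component weights as products of column norms, then normalize.
    weights = (np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=0)
               * np.linalg.norm(C, axis=0))
    A /= np.linalg.norm(A, axis=0)
    B /= np.linalg.norm(B, axis=0)
    C /= np.linalg.norm(C, axis=0)
    return weights, A, B, C

# Toy usage: a random rank-3 tensor with highly non-uniform weights,
# the regime where plain ALS is said to struggle.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((20, 3)) for _ in range(3))
T = np.einsum('r,ir,jr,kr->ijk', np.array([10.0, 1.0, 0.1]), A0, B0, C0)
w, A, B, C = orth_als(T, rank=3)
```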