We present an efficient low-rank approximation algorithm for non-negative tensors. The algorithm is derived from two findings: First, we show that rank-1 approximation of a tensor can be viewed as a mean-field approximation by treating the tensor as a probability distribution. Second, we theoretically provide a sufficient condition on the distribution parameters for reducing the Tucker rank of a tensor; interestingly, this sufficient condition can be achieved by iteratively applying the mean-field approximation. Since the mean-field approximation always has a closed-form solution, our findings lead to a fast low-rank approximation algorithm that does not rely on gradient methods. We empirically demonstrate that our algorithm is faster than existing non-negative Tucker rank reduction methods and achieves competitive or better approximation of given tensors.
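To make the first finding concrete, below is a minimal NumPy sketch of the rank-1/mean-field view described in the abstract: the tensor is normalized into a joint probability distribution, the mode-wise marginals are taken as the mean-field factors, and their outer product (rescaled to the tensor's total mass) gives a closed-form rank-1 approximation. This is only an illustration of that single step under the stated assumption, not the authors' full rank reduction algorithm, and the function name rank1_mean_field is hypothetical.

```python
import numpy as np

def rank1_mean_field(T):
    """Closed-form rank-1 approximation of a non-negative tensor T,
    obtained by treating T / T.sum() as a joint probability
    distribution and taking the product of its mode-wise marginals
    (a mean-field approximation). Sketch only; assumes T.sum() > 0."""
    total = T.sum()
    P = T / total  # view the tensor as a probability distribution
    factors = []
    for mode in range(T.ndim):
        # marginal distribution of P along each mode
        other_axes = tuple(a for a in range(T.ndim) if a != mode)
        factors.append(P.sum(axis=other_axes))
    # outer product of the marginals, rescaled to the original mass
    Q = factors[0]
    for f in factors[1:]:
        Q = np.multiply.outer(Q, f)
    return total * Q

# usage: rank-1 approximation of a random 3x4x5 non-negative tensor
T = np.random.rand(3, 4, 5)
T1 = rank1_mean_field(T)
```

Because every step is a sum or an outer product, no gradient-based optimization is needed, which is the source of the speed advantage the abstract claims.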
Author Information
Kazu Ghalamkari (National Institute of Informatics / SOKENDAI)
I am engaged in research developing machine learning theory based on information geometry. My master's background is in physics, so I am intensely interested in the interdisciplinary field of intelligent informatics and physics, and I am thrilled whenever I can describe machine learning concepts in physics terms. At NeurIPS 2021, I will introduce a novel tensor decomposition method called Legendre Tucker Rank reduction (LTR). In this paper, I point out that tensor rank-1 approximation is a mean-field approximation, a methodology widely known in physics for reducing many-body problems to one-body problems. More details are on my personal page: gkazu.info
Mahito Sugiyama (National Institute of Informatics)
More from the Same Authors
- 2020: Towards Geometric Understanding of Low-Rank Approximation
  Mahito Sugiyama · Kazu Ghalamkari
- 2015 Poster: Halting in Random Walk Kernels
  Mahito Sugiyama · Karsten Borgwardt
- 2013 Poster: Rapid Distance-Based Outlier Detection via Sampling
  Mahito Sugiyama · Karsten Borgwardt