Tensor decomposition methods are widely used for model compression and fast inference in convolutional neural networks (CNNs). Although many decompositions are conceivable, only CP decomposition and a few others have been applied in practice, and no extensive comparisons have been made between available methods. Previous studies have not determined how many decompositions are available, nor which of them is optimal. In this study, we first characterize a decomposition class specific to CNNs by adopting a flexible graphical notation. The class includes such well-known CNN modules as depthwise separable convolution layers and bottleneck layers, but also previously unknown modules with nonlinear activations. We also experimentally compare the tradeoff between prediction accuracy and time/space complexity for modules found by enumerating all possible decompositions, or by using a neural architecture search. We find some nonlinear decompositions outperform existing ones.
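As a hedged illustration of the compression idea described in the abstract (not the paper's own method or decomposition class), the sketch below factorizes a convolution kernel with a rank-R truncated SVD of its matricization, splitting one layer into a smaller convolution followed by a 1x1 convolution; all shapes and names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3x3 convolution kernel: (C_out, C_in, kH, kW).
C_out, C_in, kH, kW = 64, 32, 3, 3
W = rng.standard_normal((C_out, C_in, kH, kW))

# Matricize the kernel (output channels vs. input channels x spatial),
# then truncate its SVD at rank R -- one simple member of the broader
# family of decompositions the paper enumerates.
R = 16
M = W.reshape(C_out, C_in * kH * kW)
U, s, Vt = np.linalg.svd(M, full_matrices=False)
A = U[:, :R] * s[:R]  # (C_out, R): acts as a 1x1 convolution
B = Vt[:R, :]         # (R, C_in*kH*kW): an R-channel 3x3 convolution

params_full = M.size
params_low = A.size + B.size
rel_err = np.linalg.norm(A @ B - M) / np.linalg.norm(M)
print(f"params: {params_full} -> {params_low}, rel. error {rel_err:.3f}")
```

The parameter count drops from C_out * C_in * kH * kW to R * (C_out + C_in * kH * kW), at the cost of the reconstruction error printed above; the paper's contribution is to characterize and compare many such factorizations, including nonlinear ones, rather than this single linear example.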
Author Information
Kohei Hayashi (Preferred Networks)
Taiki Yamaguchi (The University of Tokyo)
Yohei Sugawara (Preferred Networks, Inc.)
Shin-ichi Maeda (Preferred Networks)
More from the Same Authors
- 2019 Poster: Robustness to Adversarial Perturbations in Learning from Incomplete Data (Amir Najafi · Shin-ichi Maeda · Masanori Koyama · Takeru Miyato)
- 2017 Poster: Fitting Low-Rank Tensors in Constant Time (Kohei Hayashi · Yuichi Yoshida)
- 2017 Spotlight: Fitting Low-Rank Tensors in Constant Time (Kohei Hayashi · Yuichi Yoshida)
- 2017 Poster: On Tensor Train Rank Minimization: Statistical Efficiency and Scalable Algorithm (Masaaki Imaizumi · Takanori Maehara · Kohei Hayashi)
- 2016 Poster: Minimizing Quadratic Functions in Constant Time (Kohei Hayashi · Yuichi Yoshida)
- 2013 Poster: Factorized Asymptotic Bayesian Inference for Latent Feature Models (Kohei Hayashi · Ryohei Fujimaki)
- 2012 Poster: Weighted Likelihood Policy Search with Model Selection (Tsuyoshi Ueno · Yoshinobu Kawahara · Kohei Hayashi · Takashi Washio)
- 2011 Poster: Statistical Performance of Convex Tensor Decomposition (Ryota Tomioka · Taiji Suzuki · Kohei Hayashi · Hisashi Kashima)