Preventing Gradient Explosions in Gated Recurrent Units
A gated recurrent unit (GRU) is a successful recurrent neural network architecture for time-series data. The GRU is typically trained with a gradient-based method, which is subject to the exploding gradient problem, in which the gradient grows significantly. This problem is caused by an abrupt change in the dynamics of the GRU arising from a small variation in the parameters. In this paper, we find a condition under which the dynamics of the GRU change drastically and propose a learning method to address the exploding gradient problem. Our method constrains the dynamics of the GRU so that they do not change drastically. We evaluated our method in experiments on language modeling and polyphonic music modeling. The experiments showed that our method prevents the exploding gradient problem and improves modeling accuracy.
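To make the idea concrete, here is a minimal PyTorch sketch of one way to constrain recurrent dynamics: after each optimizer step, each hidden-to-hidden weight block of the GRU is rescaled so that its spectral norm stays below a threshold. The threshold `max_sigma` and the per-block rescaling are illustrative assumptions; the paper derives a precise stability condition and its own constrained training method, which this generic projection only approximates.

```python
import torch
import torch.nn as nn

def constrain_recurrent_weights(gru: nn.GRU, max_sigma: float = 1.0) -> None:
    """Rescale each hidden-to-hidden weight block of a GRU so that its
    largest singular value stays below max_sigma. Illustrative stand-in
    for the paper's constraint, not the exact published method."""
    with torch.no_grad():
        for name, param in gru.named_parameters():
            if name.startswith("weight_hh"):
                hidden = param.shape[1]
                # weight_hh_l* stacks the reset/update/candidate matrices
                # as a (3*hidden, hidden) tensor; constrain each block.
                for block in param.split(hidden, dim=0):
                    sigma = torch.linalg.matrix_norm(block, ord=2)
                    if sigma > max_sigma:
                        block.mul_(max_sigma / sigma)

# Usage: apply the constraint after every optimizer step.
gru = nn.GRU(input_size=32, hidden_size=64)
opt = torch.optim.SGD(gru.parameters(), lr=0.1)
x = torch.randn(20, 8, 32)            # (seq_len, batch, input_size)
out, _ = gru(x)
out.pow(2).mean().backward()          # dummy loss for illustration
opt.step()
constrain_recurrent_weights(gru, max_sigma=1.0)
```

Unlike plain gradient clipping, which treats the symptom (a large gradient), a constraint of this kind targets the cause identified in the abstract: an abrupt change in the network's dynamics.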
Author Information
Sekitoshi Kanai (NTT, Keio University)
Yasuhiro Fujiwara (NTT Software Innovation Center)
Sotetsu Iwamura (NTT Software Innovation Center)
More from the Same Authors
- 2021 Poster: Meta-Learning for Relative Density-Ratio Estimation
  Atsutoshi Kumagai · Tomoharu Iwata · Yasuhiro Fujiwara
- 2021 Poster: Permuton-induced Chinese Restaurant Process
  Masahiro Nakano · Yasuhiro Fujiwara · Akisato Kimura · Takeshi Yamada · Naonori Ueda
- 2018 Poster: Sigsoftmax: Reanalysis of the Softmax Bottleneck
  Sekitoshi Kanai · Yasuhiro Fujiwara · Yuki Yamanaka · Shuichi Adachi