
SBO-RNN: Reformulating Recurrent Neural Networks via Stochastic Bilevel Optimization
Ziming Zhang · Yun Yue · Guojun Wu · Yanhua Li · Haichong Zhang

Wed Dec 08 12:30 AM -- 02:00 AM (PST)

In this paper we consider the training stability of recurrent neural networks (RNNs) and propose a family of RNNs, namely SBO-RNN, that can be formulated using stochastic bilevel optimization (SBO). With the help of stochastic gradient descent (SGD), we convert the SBO problem into an RNN where the feedforward pass and backpropagation solve the lower- and upper-level optimization for learning hidden states and their hyperparameters, respectively. We prove that under mild conditions there is no vanishing or exploding gradient in training SBO-RNN. Empirically, our approach achieves superior performance on several benchmark datasets, with fewer parameters, less training data, and much faster convergence. Code is available at https://zhang-vislab.github.io.
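The core idea, as the abstract describes it, is that the feedforward pass of the RNN itself performs (stochastic) gradient descent on a lower-level objective over the hidden state, while backpropagation trains the objective's hyperparameters (the upper level). The sketch below illustrates this with a simple, hypothetical quadratic lower-level objective; the objective, step size `eta`, inner-step count `K`, and weight names `W`, `U` are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def sbo_rnn_forward(xs, W, U, eta=0.1, K=1):
    """Illustrative sketch of an SGD-unrolled recurrent cell.

    At each time step the hidden state h is updated by K gradient-descent
    steps on a toy lower-level objective
        f(h) = 0.5 * ||h - tanh(W @ x_t + U @ h_prev)||^2,
    so the feedforward pass solves the lower-level problem. The weights
    W, U (the bilevel "hyperparameters") would be learned by
    backpropagation, i.e. the upper-level problem.
    """
    h = np.zeros(W.shape[0])
    for x in xs:
        target = np.tanh(W @ x + U @ h)  # standard RNN pre-activation target
        for _ in range(K):               # K inner gradient steps on f(h)
            grad = h - target            # exact gradient of the quadratic f
            h = h - eta * grad
    return h
```

Note that with `eta=1.0` and `K=1` each inner loop lands exactly on `target`, recovering a vanilla tanh-RNN update; smaller `eta` or larger `K` interpolates between the previous hidden state and that target, which is one intuition for why such unrolled-optimization updates can damp gradient explosion.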

Author Information

Ziming Zhang (Worcester Polytechnic Institute (WPI))
Yun Yue (Worcester Polytechnic Institute)
Guojun Wu (Worcester Polytechnic Institute)
Yanhua Li (Worcester Polytechnic Institute, USA)
Haichong Zhang (Johns Hopkins University)
