Poster

Path-Normalized Optimization of Recurrent Neural Networks with ReLU Activations

Behnam Neyshabur · Yuhuai Wu · Russ Salakhutdinov · Nati Srebro

Area 5+6+7+8 #42

Keywords: [ Deep Learning or Neural Networks ]


Abstract:

We investigate the parameter-space geometry of recurrent neural networks (RNNs), and develop an adaptation of the path-SGD optimization method, attuned to this geometry, that can learn plain RNNs with ReLU activations. On several datasets that require capturing long-term dependency structure, we show that path-SGD can significantly improve the trainability of ReLU RNNs compared to RNNs trained with SGD, even with various recently suggested initialization schemes.
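To make the underlying idea concrete, below is a minimal sketch of path-SGD on a one-hidden-layer ReLU network, not the paper's RNN adaptation: each weight's gradient is rescaled by the sum, over all input-output paths through that weight, of the product of the squared values of the other weights on the path, which makes the update invariant to node-wise rescalings of a ReLU network. All names and sizes (`W1`, `W2`, `path_scales`, `d`, `h`, `k`) are illustrative assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)
d, h, k = 8, 16, 4          # input, hidden, output sizes (arbitrary)
W1 = rng.normal(size=(d, h)) * 0.1
W2 = rng.normal(size=(h, k)) * 0.1

def path_scales(W1, W2):
    """Per-weight scaling kappa: for each edge, the sum over all paths
    through that edge of the product of the *squared* other weights
    on the path (paths here have length 2: input -> hidden -> output)."""
    k1 = np.ones((d, 1)) @ np.sum(W2**2, axis=1, keepdims=True).T  # shape (d, h)
    k2 = np.sum(W1**2, axis=0, keepdims=True).T @ np.ones((1, k))  # shape (h, k)
    return k1, k2

def path_sgd_step(W1, W2, g1, g2, lr=0.1, eps=1e-8):
    """Path-SGD update: rescale ordinary gradients g1, g2 elementwise
    by the path-based factors before taking the step."""
    k1, k2 = path_scales(W1, W2)
    return W1 - lr * g1 / (k1 + eps), W2 - lr * g2 / (k2 + eps)

# One illustrative step on a random least-squares problem.
x = rng.normal(size=(32, d))
y = rng.normal(size=(32, k))
hdn = np.maximum(x @ W1, 0.0)                    # ReLU hidden layer
err = hdn @ W2 - y
g2 = hdn.T @ err / len(x)                        # gradient w.r.t. W2
g1 = x.T @ ((err @ W2.T) * (hdn > 0)) / len(x)   # gradient w.r.t. W1
W1, W2 = path_sgd_step(W1, W2, g1, g2)
```

The paper's contribution is an analogous rescaling for plain RNNs, where the recurrent weight matrix is shared across time steps, so the path sums must account for paths through the unrolled network.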
