

Poster

Memory-Efficient Backpropagation Through Time

Audrunas Gruslys · Remi Munos · Ivo Danihelka · Marc Lanctot · Alex Graves

Area 5+6+7+8 #64

Keywords: [ Deep Learning or Neural Networks ]


Abstract:

We propose a novel approach to reducing the memory consumption of the backpropagation through time (BPTT) algorithm when training recurrent neural networks (RNNs). Our approach uses dynamic programming to balance the trade-off between caching intermediate results and recomputing them. The algorithm fits tightly within almost any user-set memory budget while finding an execution policy that minimizes computational cost. Computational devices have limited memory capacity, and maximizing computational performance under a fixed memory budget is a practical use case. We provide asymptotic upper bounds on computational cost for various regimes. The algorithm is particularly effective for long sequences: for sequences of length 1000, it saves 95% of memory usage while taking only one third more time per iteration than standard BPTT.
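To make the caching-versus-recomputation trade-off concrete, here is a minimal sketch of the kind of dynamic-programming recurrence the abstract describes, assuming unit-cost forward steps and a budget of m hidden states cached at once. The function name min_forward_cost and the exact recurrence are illustrative assumptions, not the paper's precise formulation.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def min_forward_cost(t: int, m: int) -> int:
    """Minimal number of forward steps needed to backpropagate through
    a sequence of length t when at most m hidden states may be cached
    at once (unit-cost forward steps; illustrative sketch only)."""
    if t <= 1:
        return t
    if m == 1:
        # A single slot forces recomputation from the start for every
        # position: t + (t - 1) + ... + 1 forward steps.
        return t * (t + 1) // 2
    # Run y steps forward and cache the state reached. That slot is
    # held while the right segment (length t - y) is processed with
    # m - 1 slots; it is then freed, and the left segment (length y)
    # is processed with the full budget of m slots again.
    return min(
        y + min_forward_cost(t - y, m - 1) + min_forward_cost(y, m)
        for y in range(1, t)
    )

# Example: a length-100 sequence with a budget of 10 cached states.
# Standard BPTT would store all 100 states and do 100 forward steps;
# the DP policy trades the saved memory for extra recomputation.
cost = min_forward_cost(100, 10)
print(f"{cost} forward steps vs. 100 for standard BPTT "
      f"({cost / 100:.2f}x forward computation)")
```

The memoized table indexed by (sequence length, memory budget) is what lets the method adapt to an arbitrary user-set budget: the optimal split points y recovered from this recurrence define the execution policy.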
