Poster
The Reversible Residual Network: Backpropagation Without Storing Activations
Aidan Gomez · Mengye Ren · Raquel Urtasun · Roger Grosse

Mon Dec 04 06:30 PM -- 10:30 PM (PST) @ Pacific Ballroom #120

Residual Networks (ResNets) have demonstrated significant improvement over traditional Convolutional Neural Networks (CNNs) on image classification, increasing in performance as networks grow both deeper and wider. However, memory consumption becomes a bottleneck, as one needs to store all the intermediate activations for calculating gradients using backpropagation. In this work, we present the Reversible Residual Network (RevNet), a variant of ResNets where each layer's activations can be reconstructed exactly from the next layer's. Therefore, the activations for most layers need not be stored in memory during backpropagation. We demonstrate the effectiveness of RevNets on CIFAR and ImageNet, establishing nearly identical performance to equally-sized ResNets, with activation storage requirements independent of depth.
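To make the reconstruction idea concrete, below is a minimal PyTorch sketch of a reversible block built on additive coupling, in the style the abstract describes: the input channels are split into two halves, and the outputs determine the inputs exactly, so forward activations need not be stored. The residual sub-networks F and G here are placeholders chosen for illustration, not the paper's exact architecture.

```python
import torch
import torch.nn as nn


class ReversibleBlock(nn.Module):
    """Additive-coupling reversible block: outputs exactly determine inputs."""

    def __init__(self, f: nn.Module, g: nn.Module):
        super().__init__()
        self.f = f  # residual function F (placeholder sub-network)
        self.g = g  # residual function G (placeholder sub-network)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Split channels into two halves (x1, x2).
        x1, x2 = torch.chunk(x, 2, dim=1)
        y1 = x1 + self.f(x2)
        y2 = x2 + self.g(y1)
        return torch.cat([y1, y2], dim=1)

    def inverse(self, y: torch.Tensor) -> torch.Tensor:
        # Reconstruct the inputs from the outputs by inverting the
        # coupling: x2 = y2 - G(y1), then x1 = y1 - F(x2).
        y1, y2 = torch.chunk(y, 2, dim=1)
        x2 = y2 - self.g(y1)
        x1 = y1 - self.f(x2)
        return torch.cat([x1, x2], dim=1)


# Usage sketch: verify that the inverse recovers the input.
if __name__ == "__main__":
    f = nn.Sequential(nn.Conv2d(8, 8, 3, padding=1), nn.ReLU())
    g = nn.Sequential(nn.Conv2d(8, 8, 3, padding=1), nn.ReLU())
    block = ReversibleBlock(f, g)
    x = torch.randn(2, 16, 32, 32)
    with torch.no_grad():
        assert torch.allclose(block.inverse(block(x)), x, atol=1e-5)
```

In a memory-efficient training loop, one would discard the block's activations after the forward pass and recompute them during the backward pass via inverse(), trading extra computation for activation storage that stays constant in depth.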

Author Information

Aidan Gomez (University of Toronto)
Mengye Ren (University of Toronto)
Raquel Urtasun (University of Toronto)
Roger Grosse (University of Toronto)
