

Spotlight in Workshop: The Symbiosis of Deep Learning and Differential Equations -- III

Adaptive Resolution Residual Networks

Léa Demeule · Mahtab Sandhu · Glen Berseth

Keywords: [ Residual Network ] [ Adaptive resolution ] [ Rediscretization ] [ Convolutional Network ] [ Laplacian residual ] [ Laplacian dropout ] [ Laplacian pyramid ] [ Neural operator ]

Sat 16 Dec 7:45 a.m. PST — 8 a.m. PST

Abstract:

We introduce Adaptive Resolution Residual Networks (ARRNs), a form of neural operator for signal-based tasks whose networks can be rediscretized to suit any signal resolution. ARRNs are composed of a chain of Laplacian residuals, each containing ordinary layers that do not need to be rediscretizable for the whole network to be rediscretizable. ARRNs require fewer Laplacian residuals for exact evaluation on lower-resolution signals, which greatly reduces computational cost. ARRNs also implement Laplacian dropout, which encourages networks to become robust to low-bandwidth signals. ARRNs can thus be trained once at high resolution and then rediscretized on the fly to a suitable resolution with great robustness.
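
The Laplacian-residual structure described in the abstract lends itself to a short illustration. The sketch below is a hypothetical rendering of one such block in PyTorch, assuming a formulation in which the residual branch wraps ordinary convolutional layers, its output is band-limited by a downsample/upsample pair (a Laplacian-pyramid-style low-pass), and Laplacian dropout randomly skips the branch during training. The names LaplacianResidual, inner_layers, and drop_p are illustrative choices, not the authors' implementation, and the exact band-splitting used in ARRNs may differ.

import torch
import torch.nn as nn
import torch.nn.functional as F


class LaplacianResidual(nn.Module):
    """Sketch: y = x + band_limit(f(x)), with the residual branch randomly
    dropped during training (Laplacian dropout) to mimic low-bandwidth inputs.
    Assumed formulation for illustration, not the paper's exact construction."""

    def __init__(self, channels: int, drop_p: float = 0.1):
        super().__init__()
        # Ordinary layers inside the residual; they need not be rediscretizable.
        self.inner_layers = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )
        self.drop_p = drop_p

    def band_limit(self, z: torch.Tensor) -> torch.Tensor:
        # A downsample/upsample pair acts as a crude low-pass filter, so the
        # residual only contributes frequencies below its band.
        h, w = z.shape[-2:]
        low = F.interpolate(z, scale_factor=0.5, mode="bilinear", align_corners=False)
        return F.interpolate(low, size=(h, w), mode="bilinear", align_corners=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training and torch.rand(()) < self.drop_p:
            # Laplacian dropout: skip the residual entirely, exposing the rest
            # of the network to a signal without this band's contribution.
            return x
        return x + self.band_limit(self.inner_layers(x))


if __name__ == "__main__":
    # Toy usage: the same weights evaluate signals at several resolutions.
    block = LaplacianResidual(channels=8)
    for size in (32, 64, 128):
        x = torch.randn(1, 8, size, size)
        print(size, block(x).shape)

In this reading, evaluating a lower-resolution input would simply omit the residuals whose bands lie above that resolution, which is consistent with the abstract's claim that fewer Laplacian residuals are needed for exact evaluation on lower-resolution signals.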
