

Poster

Normalization-Equivariant Neural Networks with Application to Image Denoising

Sébastien Herbreteau · Emmanuel Moebel · Charles Kervrann

Great Hall & Hall B1+B2 (level 1) #719
[ Project Page ] [ Paper ] [ Slides ] [ Poster ] [ OpenReview ]
Wed 13 Dec 8:45 a.m. PST — 10:45 a.m. PST

Abstract:

In many information processing systems, it may be desirable to ensure that any change of the input, whether by shifting or scaling, results in a corresponding change in the system response. While deep neural networks are gradually replacing all traditional automatic processing methods, they surprisingly do not guarantee such a normalization-equivariance (scale + shift) property, which can be detrimental in many applications. To address this issue, we propose a methodology for adapting existing neural networks so that normalization-equivariance holds by design. Our main claim is that not only ordinary convolutional layers, but also all activation functions, including the ReLU (rectified linear unit), which are applied element-wise to the pre-activated neurons, should be completely removed from neural networks and replaced by better-conditioned alternatives. To this end, we introduce affine-constrained convolutions and channel-wise sort pooling layers as surrogates and show that these two architectural modifications do preserve normalization-equivariance without loss of performance. Experimental results in image denoising show that normalization-equivariant neural networks, in addition to their better conditioning, also provide much better generalization across noise levels.
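The two surrogate layers named in the abstract are simple to prototype. Below is a minimal PyTorch sketch, under our own simplifying assumptions, of an affine-constrained convolution (kernel weights re-normalized to sum to one, no bias) combined with a channel-wise sort pooling layer (element-wise sorting of consecutive channel pairs, in place of ReLU). The class names AffineConv2d and SortPool and the on-the-fly weight normalization are illustrative choices, not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AffineConv2d(nn.Conv2d):
    """Convolution with kernel weights re-normalized to sum to one per
    output channel and no bias (illustrative sketch). Sum-to-one weights
    make the layer commute with input shifts; linearity and the absence
    of a bias make it commute with input scalings."""
    def __init__(self, in_channels, out_channels, kernel_size, **kwargs):
        kwargs["bias"] = False  # a bias term would break shift-equivariance
        super().__init__(in_channels, out_channels, kernel_size, **kwargs)

    def forward(self, x):
        # Project the weights onto the sum-to-one constraint at every call.
        w = self.weight
        w = w / w.sum(dim=(1, 2, 3), keepdim=True)
        return F.conv2d(x, w, None, self.stride, self.padding,
                        self.dilation, self.groups)

class SortPool(nn.Module):
    """Channel-wise sort pooling: each consecutive pair of channels is
    sorted element-wise (max first, min second). Unlike ReLU, sorting
    commutes with x -> a*x + b for any a > 0, so equivariance survives
    the nonlinearity. Assumes an even number of channels."""
    def forward(self, x):
        n, c, h, w = x.shape
        pairs = x.view(n, c // 2, 2, h, w)
        hi = pairs.max(dim=2).values
        lo = pairs.min(dim=2).values
        return torch.stack((hi, lo), dim=2).view(n, c, h, w)

# Numerical check of normalization-equivariance: f(a*x + b) == a*f(x) + b.
net = nn.Sequential(
    AffineConv2d(1, 8, 3, padding=1),
    SortPool(),
    AffineConv2d(8, 1, 3, padding=1),
)
x = torch.randn(1, 1, 16, 16)
a, b = 2.5, -0.7
with torch.no_grad():
    assert torch.allclose(net(a * x + b), a * net(x) + b, atol=1e-5)
```

Because every layer in the stack commutes with the map x -> a*x + b (for a > 0), the composed network does too, which is the property the final assertion verifies numerically.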
