

Poster

Relaxing Equivariance Constraints with Non-stationary Continuous Filters

Tycho van der Ouderaa · David W. Romero · Mark van der Wilk

Hall J (level 1) #306

Keywords: [ Equivariance ] [ filters ] [ constraints ] [ non-stationary ] [ convolution ] [ soft equivariance ] [ partial equivariance ] [ generalized ] [ relaxed ] [ kernel ]


Abstract:

Equivariances provide useful inductive biases in neural network modeling, with the translation equivariance of convolutional neural networks being a canonical example. Equivariances can be embedded in architectures through weight-sharing and place symmetry constraints on the functions a neural network can represent. The type of symmetry is typically fixed and has to be chosen in advance. Although some tasks are inherently equivariant, many tasks do not strictly follow such symmetries. In such cases, equivariance constraints can be overly restrictive. In this work, we propose a parameter-efficient relaxation of equivariance that can effectively interpolate between (i) a non-equivariant linear product, (ii) a strictly equivariant convolution, and (iii) a strictly invariant mapping. The proposed parameterisation can be thought of as a building block that allows adjustable symmetry structure in neural networks. In addition, we demonstrate that the amount of equivariance can be learned from the training data using backpropagation. Gradient-based learning of equivariance achieves similar or improved performance compared to the best value found by cross-validation and outperforms baselines with partial or strict equivariance on CIFAR-10 and CIFAR-100 image classification tasks.
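To make the interpolation idea concrete, here is a minimal PyTorch sketch, not the authors' exact parameterisation: a continuous filter generated by a small MLP that receives the relative position (x - y) together with the absolute position scaled by a learnable scalar. When the scalar is zero the filter is stationary, k(x - y), recovering a translation-equivariant convolution (up to boundary effects on a finite grid); a nonzero scalar makes the filter non-stationary and relaxes the layer toward a general linear map. The names `RelaxedConv1d`, `filter_net`, and `alpha` are illustrative assumptions, not from the paper.

```python
import torch
import torch.nn as nn

class RelaxedConv1d(nn.Module):
    """Sketch of a relaxed-equivariance layer with a continuous filter.

    alpha = 0  -> filter depends only on x - y: strict translation
                  equivariance (an ordinary convolution).
    alpha != 0 -> filter also depends on absolute position: a
                  non-stationary, partially equivariant linear map.
    A filter that ignored position entirely would give a global
    averaging, i.e. an invariant mapping.
    """

    def __init__(self, in_ch: int, out_ch: int, hidden: int = 32):
        super().__init__()
        # MLP from (relative position, scaled absolute position)
        # to one filter value per (out_ch, in_ch) pair.
        self.filter_net = nn.Sequential(
            nn.Linear(2, hidden), nn.GELU(),
            nn.Linear(hidden, out_ch * in_ch),
        )
        self.in_ch, self.out_ch = in_ch, out_ch
        # Learnable degree of non-stationarity, trained by backprop.
        self.alpha = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_ch, length)
        b, c, n = x.shape
        pos = torch.linspace(-1.0, 1.0, n, device=x.device)
        rel = pos[:, None] - pos[None, :]        # (n, n): x - y
        absol = pos[:, None].expand(n, n)        # (n, n): output position x
        feats = torch.stack([rel, self.alpha * absol], dim=-1)  # (n, n, 2)
        k = self.filter_net(feats)               # (n, n, out_ch * in_ch)
        k = k.view(n, n, self.out_ch, self.in_ch)
        # Full linear map: y[b,o,i] = sum_{j,c} k[i,j,o,c] * x[b,c,j]
        return torch.einsum('ijoc,bcj->boi', k, x) / n

# Usage: alpha is optimised jointly with the filter network weights.
layer = RelaxedConv1d(in_ch=3, out_ch=8)
y = layer(torch.randn(2, 3, 16))                 # -> (2, 8, 16)
```

Because `alpha` sits inside the computation graph, gradient descent can choose how much equivariance to keep, which is the gist of learning the amount of equivariance from data described in the abstract.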
