

Poster

Equivariant Machine Learning on Graphs with Nonlinear Spectral Filters

Ya-Wei Eileen Lin · Ronen Talmon · Ron Levie

East Exhibit Hall A-C #3102
Thu 12 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Equivariant machine learning is an approach for designing deep learning models that respect the symmetries of the problem, with the aim of reducing model complexity and improving generalization. In this paper, we focus on an extension of shift equivariance, which is the basis of convolutional networks on images, to general graphs. Unlike images, graphs do not have a natural notion of domain translation. Therefore, we consider the graph functional shifts as the symmetry group: the unitary operators that commute with the graph shift operator. Notably, such symmetries operate in the signal space rather than directly in the spatial space. We remark that each linear filter layer of a standard spectral graph neural network (GNN) commutes with graph functional shifts, but the activation function breaks this symmetry. Instead, we propose nonlinear spectral filters (NLSFs) that are fully equivariant to graph functional shifts and show that they have universal approximation properties. The proposed NLSFs are based on a new form of spectral domain that is transferable between graphs. We demonstrate the superior performance of NLSFs over existing spectral GNNs in node and graph classification benchmarks.
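The following is a minimal NumPy sketch (not the authors' NLSF construction) illustrating the commutation claim in the abstract: a linear spectral filter h(S) commutes with any unitary operator that shares the eigenvectors of the graph shift operator S, while a pointwise activation such as ReLU breaks this equivariance. The specific choice of shift operator, filter response, and functional shift below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8

# Symmetric graph shift operator S (e.g., a weighted adjacency matrix).
A = rng.random((n, n))
S = (A + A.T) / 2

# Eigendecomposition S = V diag(lam) V^T; V is orthogonal because S is symmetric.
lam, V = np.linalg.eigh(S)

# A graph functional shift: a unitary (here real orthogonal) operator that
# commutes with S, obtained by flipping the sign of some spectral coordinates.
s = rng.choice([-1.0, 1.0], size=n)
s[0] = -1.0                      # ensure U is not the identity
U = V @ np.diag(s) @ V.T

# A linear spectral filter h(S) = V diag(h(lam)) V^T (here a low-pass response).
h = np.exp(-lam ** 2)
H = V @ np.diag(h) @ V.T

x = rng.standard_normal(n)

# The linear spectral filter commutes with the functional shift ...
print(np.allclose(H @ (U @ x), U @ (H @ x)))   # True

# ... but a pointwise ReLU applied in the vertex domain breaks the equivariance.
relu = lambda z: np.maximum(z, 0.0)
print(np.allclose(relu(U @ x), U @ relu(x)))   # False in general
```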
