

Spotlight Poster

Non-convolutional graph neural networks.

Yuanqing Wang · Kyunghyun Cho

East Exhibit Hall A-C #4603
Fri 13 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Rethink convolution-based graph neural networks (GNNs): they characteristically suffer from limited expressiveness, over-smoothing, and over-squashing, and require specialized sparse kernels for efficient computation. Here, we design a simple graph learning module entirely free of convolution operators, coined random walk with unifying memory (RUM) neural network, where an RNN merges the topological and semantic graph features along the random walks terminating at each node. Relating the rich literature on RNN behavior and graph topology, we theoretically show and experimentally verify that RUM attenuates the aforementioned symptoms and is more expressive than the Weisfeiler-Lehman (WL) isomorphism test. On a variety of node- and graph-level classification and regression tasks, RUM not only achieves competitive performance, but is also robust, memory-efficient, scalable, and faster than the simplest convolutional GNNs.
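To make the core idea concrete, the following is a minimal, hedged sketch of a convolution-free node encoder in the spirit described above: sample random walks that terminate at each node and let an RNN (here a GRU) merge the node features encountered along each walk into a node embedding. All names (RUMSketch, sample_walks_to, walk_length, n_walks), the backward walk sampler, and the averaging over walks are illustrative assumptions, not the authors' reference implementation; in particular, the paper's "unifying memory" also merges topological walk features, which are omitted here for brevity.

import torch
import torch.nn as nn


def sample_walks_to(node, adj_list, walk_length, n_walks):
    """Sample n_walks random walks of length walk_length that terminate at node.

    Walks are sampled backwards from the terminal node over an undirected graph
    given as an adjacency list, so every returned walk ends at node.
    """
    walks = []
    for _ in range(n_walks):
        walk = [node]
        current = node
        for _ in range(walk_length - 1):
            neighbors = adj_list[current]
            current = neighbors[torch.randint(len(neighbors), (1,)).item()]
            walk.append(current)
        walks.append(list(reversed(walk)))  # reverse so the walk ends at node
    return walks


class RUMSketch(nn.Module):
    """Convolution-free node encoder: an RNN over random-walk feature sequences."""

    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.rnn = nn.GRU(in_dim, hidden_dim, batch_first=True)

    def forward(self, x, adj_list, walk_length=4, n_walks=8):
        # x: (n_nodes, in_dim) node feature matrix
        embeddings = []
        for node in range(x.size(0)):
            walks = sample_walks_to(node, adj_list, walk_length, n_walks)
            seqs = torch.stack([x[torch.tensor(w)] for w in walks])  # (n_walks, L, in_dim)
            _, h = self.rnn(seqs)                                    # (1, n_walks, hidden_dim)
            embeddings.append(h.squeeze(0).mean(dim=0))              # average over walks
        return torch.stack(embeddings)                               # (n_nodes, hidden_dim)


# Toy usage on a 4-node cycle graph.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
model = RUMSketch(in_dim=5, hidden_dim=16)
node_embeddings = model(torch.randn(4, 5), adj)
print(node_embeddings.shape)  # torch.Size([4, 16])

Because every update is a plain RNN step over sampled walks rather than a sparse matrix product over the full adjacency, no specialized sparse kernels are needed, which is the efficiency point the abstract makes.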
