

Poster

Almost Surely Asymptotically Constant Graph Neural Networks

Sam Adam-Day · Michael Benedikt · Ismail Ceylan · Ben Finkelshtein

TBD Poster Room (East or West)
Fri 13 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

We present a new angle on the expressive power of graph neural networks (GNNs) by studying how the predictions of a GNN probabilistic classifier evolve as we apply it to larger graphs drawn from some random graph model. We show that the output converges to a constant function, which upper-bounds what these classifiers can uniformly express. This strong convergence phenomenon applies to a very wide class of GNNs, including state-of-the-art models, with aggregation functions including the mean and the attention-based mechanism of graph transformers. Our results apply to a broad class of random graph models, including the (sparse) Erdős–Rényi model, the stochastic block model, and the Barabási–Albert model. We empirically validate these findings, observing that the convergence phenomenon appears not only on random graphs but also on some real-world graphs of modest size.
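As a rough illustration of the claimed phenomenon (not the authors' code), the sketch below applies a fixed random-weight GNN with mean aggregation and a sigmoid readout to Erdős–Rényi graphs G(n, p) of growing size n; the output's spread across samples shrinks as n grows. The layer sizes, the choice p = 0.1, and the i.i.d. Gaussian node features are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random parameters for a 2-layer mean-aggregation GNN with a
# mean-pooling readout and a sigmoid classifier head (illustrative sizes).
d = 8
W1 = rng.normal(size=(d, d))
W2 = rng.normal(size=(d, d))
w_out = rng.normal(size=d)

def gnn_output(A, X):
    """Two rounds of mean aggregation over neighbors, then a sigmoid readout."""
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1)  # avoid division by zero
    H = np.tanh((A @ X / deg) @ W1)
    H = np.tanh((A @ H / deg) @ W2)
    graph_emb = H.mean(axis=0)  # mean readout over nodes
    return 1.0 / (1.0 + np.exp(-graph_emb @ w_out))

p = 0.1  # Erdős–Rényi edge probability (illustrative choice)
for n in [50, 100, 500, 1000, 2000]:
    outs = []
    for _ in range(20):
        A = (rng.random((n, n)) < p).astype(float)
        A = np.triu(A, 1); A = A + A.T          # symmetric, no self-loops
        X = rng.normal(size=(n, d))             # i.i.d. random node features
        outs.append(gnn_output(A, X))
    print(f"n={n:5d}  mean={np.mean(outs):.4f}  std={np.std(outs):.4f}")
```

Under these assumptions the standard deviation of the classifier output shrinks with n, consistent with the paper's claim that the output converges to a constant almost surely.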
