

Poster

Generalization Analysis of Message Passing Neural Networks on Large Random Graphs

Sohir Maskey · Ron Levie · Yunseok Lee · Gitta Kutyniok

Hall J (level 1) #407

Keywords: [ generalization ] [ graph neural networks ] [ convergence ] [ large random graphs ] [ message passing ]


Abstract:

Message passing neural networks (MPNNs) have seen a steep rise in popularity since their introduction as generalizations of convolutional neural networks to graph-structured data, and are now considered state-of-the-art tools for solving a large variety of graph-focused problems. We study the generalization error of MPNNs in graph classification and regression. We assume that graphs of different classes are sampled from different random graph models. We show that, when training an MPNN on a dataset sampled from such a distribution, the generalization gap increases with the complexity of the MPNN and decreases not only with the number of training samples but also with the average number of nodes in the graphs. This shows how an MPNN with high complexity can generalize from a small dataset of graphs, as long as the graphs are large. The generalization bound is derived from a uniform convergence result, which shows that any MPNN applied to a graph approximates the MPNN applied to the geometric model that the graph discretizes.
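As a rough illustration of the convergence phenomenon the abstract describes, the sketch below samples graphs of increasing size from a fixed graphon-type random graph model and applies one mean-aggregation message-passing layer followed by average pooling. The graph-level readout stabilizes as the graphs grow, mirroring the idea that an MPNN on a large sampled graph approximates the MPNN on the underlying continuous model. The kernel, the layer, and all names here are illustrative assumptions, not the paper's construction or bound.

```python
# Minimal sketch, assuming a smooth symmetric kernel as the random graph
# model; none of this code comes from the paper.
import numpy as np

rng = np.random.default_rng(0)

def graphon(x, y):
    # Illustrative kernel W(x, y) standing in for a geometric random graph model.
    return np.exp(-np.abs(x[:, None] - y[None, :]))

def sample_graph(n):
    # Latent node positions, edge probabilities W(x_i, x_j), Bernoulli edges.
    x = rng.uniform(0, 1, n)
    p = graphon(x, x)
    a = (rng.uniform(size=(n, n)) < p).astype(float)
    a = np.triu(a, 1)
    a = a + a.T                              # symmetric adjacency, no self-loops
    return x, a

def mpnn_layer(a, h):
    # One message-passing step: mean aggregation over neighbors,
    # then a pointwise nonlinearity (ReLU).
    deg = a.sum(axis=1, keepdims=True).clip(min=1.0)
    return np.maximum(a @ h / deg, 0.0)

for n in [100, 400, 1600]:
    x, a = sample_graph(n)
    h = np.sin(2 * np.pi * x)[:, None]       # node features from latent positions
    out = mpnn_layer(a, h)
    # Graph-level readout via average pooling, as in graph classification;
    # the printed value settles down as n grows.
    print(n, float(out.mean()))
```

Running this shows the pooled output fluctuating less across graph sizes, a discrete stand-in for the uniform convergence statement; the actual theorem concerns general MPNNs and quantifies the approximation error.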
