

Poster

On the Expressivity and Sample Complexity of Node-Individualized Graph Neural Networks

Paolo Pellizzoni · Till Hendrik Schulz · Dexiong Chen · Karsten Borgwardt

East Exhibit Hall A-C #3011
Fri 13 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Graph neural networks (GNNs) employing message passing for graph classification are inherently limited by the expressive power of the Weisfeiler-Leman (WL) test for graph isomorphism. Node individualization schemes, which assign unique identifiers to nodes (e.g., by adding random noise to features), are a common approach for achieving universal expressiveness. However, the ability of GNNs endowed with individualization schemes to generalize beyond the training data is still an open question. To address this question, this paper presents a theoretical analysis of the sample complexity of such GNNs from a statistical learning perspective, employing Vapnik–Chervonenkis (VC) dimension and covering number bounds. We demonstrate that node individualization schemes that are permutation-equivariant result in lower sample complexity, and design novel individualization schemes that exploit these results. As an application of this analysis, we also develop a novel architecture that can perform substructure identification (i.e., subgraph isomorphism) while having a lower VC dimension compared to competing methods. Finally, our theoretical findings are validated experimentally on both synthetic and real-world datasets.
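
To make the idea of node individualization concrete, here is a minimal sketch of one common scheme the abstract alludes to ("adding random noise to features"): appending i.i.d. random features so that nodes become distinguishable with high probability. This is a generic illustration in plain PyTorch, not necessarily the individualization schemes proposed in the paper; the function name, noise dimension, and usage are assumptions for exposition.

```python
import torch

def individualize_nodes(x: torch.Tensor, noise_dim: int = 8) -> torch.Tensor:
    """Append i.i.d. uniform random features to each node's feature vector.

    A generic random-node-individualization sketch of the idea referenced in
    the abstract; the paper's own schemes may differ.
    """
    num_nodes = x.size(0)
    noise = torch.rand(num_nodes, noise_dim)  # distinct identifiers with high probability
    return torch.cat([x, noise], dim=-1)

# Usage: x has shape [num_nodes, feature_dim]; the augmented features are then
# fed to any standard message-passing GNN for graph classification.
x = torch.ones(5, 3)              # toy graph with 5 nodes and constant features
x_aug = individualize_nodes(x)    # shape [5, 3 + 8]
```

Because the appended noise is drawn independently of the graph structure, such a scheme breaks the symmetry that limits message passing to WL expressivity, which is what motivates the paper's question of how it affects sample complexity.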
