Uncertainty-Aware Message Passing Neural Networks
Alesia Chernikova · Moritz Laber · Narayan Sabhahit · Tina Eliassi-Rad
Abstract
Existing theoretical guarantees for message passing neural networks (MPNNs) assume deterministic node features. In this work, we address the more realistic setting in which inherent noise or finite measurement precision leads to uncertainty about node features. First, we quantify uncertainty in MPNNs by propagating Gaussian node feature distributions through the architecture using polynomial chaos expansion (PCE), and leverage the resulting approximate node embedding distributions for analytic, probabilistic robustness certificates against $L_2$-bounded node feature perturbations. Second, we model node features as multivariate random variables and propose a Wasserstein-based pseudometric that matches the discriminative power of node-level MPNNs. We show that MPNNs are globally Lipschitz continuous with respect to the introduced pseudometric. Our framework subsumes the deterministic case via Dirac measures and offers a foundation for reasoning about generalization in MPNNs with uncertainty on node features.
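The paper's method propagates feature distributions via polynomial chaos expansion; as a rough illustration of the setting only, the minimal sketch below instead uses Monte Carlo sampling to push a Gaussian node-feature distribution through one GCN-style message-passing layer and to estimate an empirical analogue of a probabilistic robustness statement. Everything here is an assumption for illustration: the graph, the layer form, and the names (A_hat, W, sigma, eps) are hypothetical and do not reproduce the paper's PCE-based certificates.

    import numpy as np

    # Hypothetical toy graph with self-loops and symmetric normalization,
    # as in a standard GCN-style message pass (an assumption, not the
    # paper's architecture).
    rng = np.random.default_rng(0)
    n_nodes, d_in, d_out = 5, 4, 3
    A = np.eye(n_nodes) + (rng.random((n_nodes, n_nodes)) < 0.3)
    A = np.minimum(A + A.T, 1.0)                 # symmetrize, clip to {0, 1}
    deg = A.sum(axis=1)
    A_hat = A / np.sqrt(np.outer(deg, deg))      # D^{-1/2} A D^{-1/2}

    W = rng.normal(size=(d_in, d_out))           # fixed layer weights

    # Gaussian uncertainty on node features: mean mu, covariance sigma^2 I.
    mu = rng.normal(size=(n_nodes, d_in))
    sigma = 0.1

    def layer(X):
        """One message-passing layer: aggregate, transform, ReLU."""
        return np.maximum(A_hat @ X @ W, 0.0)

    # Monte Carlo propagation of the feature distribution (a sampling
    # stand-in for the paper's PCE): draw feature samples, push each
    # through the layer, and summarize the embedding distribution.
    n_samples = 2000
    samples = mu[None] + sigma * rng.normal(size=(n_samples, n_nodes, d_in))
    emb = np.stack([layer(X) for X in samples])  # (n_samples, n_nodes, d_out)

    emb_mean = emb.mean(axis=0)

    # Empirical analogue of a probabilistic robustness statement: the
    # fraction of sampled embeddings within an L2 ball of radius eps
    # around the mean embedding, per node.
    eps = 0.5
    dist = np.linalg.norm(emb - emb_mean[None], axis=-1)  # (n_samples, n_nodes)
    print("P(||h - E[h]||_2 <= eps) per node:", (dist <= eps).mean(axis=0))

Unlike this sampling sketch, the PCE approach described in the abstract yields analytic embedding distributions, so the resulting robustness certificates do not depend on a sample budget.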