

Poster in Workshop: AI for Science: from Theory to Practice

Predictive Uncertainty Quantification for Graph Neural Network Driven Relaxed Energy Calculations

Joseph Musielewicz · Janice Lan · Matt Uyttendaele


Abstract:

Graph neural networks (GNNs) have been shown to be astonishingly capable models for molecular property prediction, particularly as surrogates for expensive density functional theory calculations of relaxed energies in novel materials discovery. However, one limitation of GNNs in this context is the lack of useful uncertainty prediction methods, which are critical to the materials discovery pipeline. In this work, we show that uncertainty quantification for relaxed energy calculations is more complex than for other kinds of molecular property prediction, due to the effect that structure optimizations have on the error distribution. We propose that distribution-free techniques are more useful tools for assessing calibration, recalibrating, and developing uncertainty prediction methods for GNNs performing relaxed energy calculations. We also develop a relaxed energy task for evaluating uncertainty methods for equivariant GNNs, based on distribution-free recalibration and the Open Catalyst Project dataset. We benchmark a set of popular uncertainty prediction methods on this task and show that latent distance methods, with our novel improvements, are the best-calibrated and most economical approach for relaxed energy calculations. Finally, we challenge the community to develop improved uncertainty prediction methods for GNN-driven relaxed energy calculations and to benchmark them on this task.
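To make the "latent distance with distribution-free recalibration" idea concrete, the sketch below illustrates one common variant of this family of methods: score each test structure by its distance to the training set in the GNN's latent space, then rescale those scores with an empirical quantile computed on a calibration split so that they upper-bound the absolute energy error at a chosen coverage level. This is a minimal, hypothetical illustration, not the authors' actual method; the function names, shapes, and the random stand-in data are assumptions for the sake of a runnable example.

```python
# Hypothetical sketch: latent-distance uncertainty scores with a
# distribution-free (conformal-style) recalibration step. Assumes access
# to latent embeddings from a trained GNN and to absolute relaxed-energy
# errors on a held-out calibration split.
import numpy as np


def latent_distance_score(z_query, z_train, k=10):
    """Mean Euclidean distance from each query latent to its k nearest
    training latents; larger distances suggest less reliable predictions."""
    # z_query: (n_query, d), z_train: (n_train, d)
    dists = np.linalg.norm(z_query[:, None, :] - z_train[None, :, :], axis=-1)
    knn = np.sort(dists, axis=1)[:, :k]
    return knn.mean(axis=1)


def conformal_scale(scores_cal, abs_errors_cal, coverage=0.9):
    """Distribution-free scaling factor: the empirical quantile of
    |error| / score on the calibration set, so that scaled scores bound
    the error at roughly the requested coverage level."""
    ratios = abs_errors_cal / np.maximum(scores_cal, 1e-12)
    n = len(ratios)
    q = min(np.ceil((n + 1) * coverage) / n, 1.0)
    return np.quantile(ratios, q)


# Toy usage with random stand-in data (in practice these would be GNN
# latents and |DFT energy - GNN energy| values).
rng = np.random.default_rng(0)
z_train = rng.normal(size=(500, 16))    # latents from the training set
z_cal = rng.normal(size=(100, 16))      # latents from a calibration split
z_test = rng.normal(size=(20, 16))      # latents for new relaxations
err_cal = np.abs(rng.normal(size=100))  # calibration-set energy errors

scale = conformal_scale(latent_distance_score(z_cal, z_train), err_cal)
uncertainty = scale * latent_distance_score(z_test, z_train)
print(uncertainty[:5])  # calibrated per-structure uncertainty estimates
```

Because the recalibration step only uses empirical quantiles of calibration-set ratios, it makes no distributional assumptions about the errors, which is the sense in which such techniques are "distribution-free."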
