

Poster in Workshop: AI for Accelerated Materials Design (AI4Mat)

Hyperparameter Optimization of Graph Neural Networks for the OpenCatalyst Dataset: A Case Study

Carmelo Gonzales · Eric Lee · Kin Long Kelvin Lee · Joyce Tang · Santiago Miret

Keywords: [ Graph Neural Networks ] [ Hyperparameter Optimization ] [ Multi-Fidelity Optimization ] [ OpenCatalyst Dataset ]


Abstract:

The proliferation of deep learning (DL) techniques in recent years has resulted in progressively larger datasets and model architectures. As the expressive power of DL models has grown, so has the compute capacity needed to train them effectively. One such example is the OpenCatalyst dataset in the emerging field of scientific machine learning, which has elevated the compute requirements for training graph neural networks (GNNs) on complex scientific data. The substantial computational cost of training GNNs on the OpenCatalyst dataset makes it very costly to perform hyperparameter optimization (HPO) using traditional methods, such as grid search or even Bayesian optimization-based approaches. Given this challenge, we propose a novel methodology for effective, cost-aware HPO of GNN training on OpenCatalyst that leverages a multi-fidelity approach combining experiments on reduced datasets, hyperparameter importance analysis, and computational budget considerations. We demonstrate speedups of over 50 percent when performing hyperparameter optimization of the E(n)-GNN model on the OpenCatalyst dataset.
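To make the multi-fidelity idea concrete, below is a minimal sketch of a successive-halving search over reduced-dataset fidelities: candidate configurations are first evaluated cheaply on small subsets of the training data, and only the most promising survivors are promoted to larger, more expensive fidelities. Everything here is an illustrative assumption rather than the authors' implementation: the `train_and_eval` placeholder, the hyperparameter names and ranges, and the fidelity schedule are all hypothetical.

```python
# Sketch of multi-fidelity HPO via successive halving, under assumed
# hyperparameters and fidelities. Not the authors' implementation.
import random


def train_and_eval(config: dict, data_fraction: float) -> float:
    """Hypothetical stand-in: train a GNN on a `data_fraction` subset
    of the training data and return validation loss (lower is better)."""
    # Placeholder score so the sketch runs end to end.
    return random.random() / (config["lr"] * data_fraction + 1e-9)


def successive_halving(configs: list, fractions: list, keep: float = 0.5) -> dict:
    """Evaluate all configs at increasing data fractions (fidelities),
    keeping the top `keep` share of configs after each rung."""
    survivors = configs
    for frac in fractions:
        scored = sorted(survivors, key=lambda c: train_and_eval(c, frac))
        n_keep = max(1, int(len(scored) * keep))
        survivors = scored[:n_keep]  # promote the cheap-fidelity winners
    return survivors[0]


if __name__ == "__main__":
    random.seed(0)
    # Random search over assumed hyperparameters of a GNN trainer.
    candidates = [
        {"lr": 10 ** random.uniform(-5, -2),
         "hidden_dim": random.choice([64, 128, 256]),
         "num_layers": random.choice([3, 4, 5])}
        for _ in range(16)
    ]
    # Assumed fidelity schedule: 10% -> 25% -> 100% of the training data.
    best = successive_halving(candidates, fractions=[0.10, 0.25, 1.00])
    print("Best config found:", best)
```

The cost savings in such a scheme come from the fidelity schedule: most configurations are eliminated at the cheap low-data rungs, so only a small fraction ever incur the full cost of training on the complete dataset.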
