
Poster in Workshop: NeurIPS 2023 Workshop: Machine Learning and the Physical Sciences

Unleashing the Potential of Fractional Calculus in Graph Neural Networks

Qiyu Kang · Kai Zhao · Qinxu Ding · Feng Ji · Xuhao Li · Wenfei Liang · Yang Song · Wee Peng Tay


Abstract:

We introduce the FRactional-Order graph Neural Dynamical network (FROND), a learning framework that augments traditional graph neural ordinary differential equation (ODE) models by integrating the time-fractional Caputo derivative. Owing to its non-local nature, fractional calculus enables our framework to retain long-term memory during the feature-updating process, diverging from the Markovian updates inherent in conventional graph neural ODE models, and thereby enhances graph representation learning. Analytically, we show that over-smoothing is mitigated when feature updating is governed by a diffusion process. Additionally, our framework offers a fresh dynamical-systems perspective on the various skip and dense connections between GNN layers found in the existing literature.
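The memory effect described above comes from the Caputo derivative's convolution over the entire trajectory: each update depends on all past states, not just the most recent one. The sketch below illustrates this with a standard explicit fractional Euler scheme applied to a toy graph diffusion; it is a minimal illustration of time-fractional dynamics, not the authors' FROND implementation, and the graph, step size, and fractional order are made up for the example.

```python
import numpy as np
from math import gamma

def fractional_euler(f, x0, alpha, h, n_steps):
    """Explicit fractional Euler scheme for the Caputo IVP
        D^alpha x(t) = f(x(t)),  x(0) = x0,  0 < alpha <= 1.
    The weighted sum over ALL past evaluations of f is what makes the
    update non-Markovian: history never drops out of the state."""
    xs = [np.asarray(x0, dtype=float)]
    fs = []  # history of f evaluations (the "memory")
    c = h ** alpha / gamma(alpha + 1.0)
    for n in range(n_steps):
        fs.append(f(xs[-1]))
        # Convolution weights w_j = (n+1-j)^alpha - (n-j)^alpha;
        # for alpha = 1 they all equal 1 and the scheme reduces to plain Euler.
        j = np.arange(n + 1)
        w = (n + 1 - j) ** alpha - (n - j) ** alpha
        x_next = xs[0] + c * sum(wj * fj for wj, fj in zip(w, fs))
        xs.append(x_next)
    return xs[-1]

# Toy feature diffusion on a 3-node path graph: f(X) = -L X,
# with L the symmetrically normalized graph Laplacian.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
d = A.sum(axis=1)
L = np.eye(3) - A / np.sqrt(np.outer(d, d))
X0 = np.array([[1.0], [0.0], [-1.0]])  # one scalar feature per node

X_frac = fractional_euler(lambda X: -L @ X, X0, alpha=0.7, h=0.1, n_steps=50)
```

Running the same solver with `alpha=1.0` recovers the standard (memoryless) graph neural ODE update, which makes the contrast with the fractional, history-dependent case easy to see.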
