Convolutional layers within graph neural networks operate by aggregating information about local neighbourhood structures; one common way to encode such substructures is through random walks. The distribution of these random walks evolves according to a diffusion equation defined using the graph Laplacian. We extend this approach by leveraging classic mathematical results about hypo-elliptic diffusions. This results in a novel tensor-valued graph operator, which we call the hypo-elliptic graph Laplacian. We provide theoretical guarantees and efficient low-rank approximation algorithms. In particular, this gives a structured approach to capturing long-range dependencies on graphs that is robust to pooling. Besides these attractive theoretical properties, our experiments show that this method competes with graph transformers on datasets requiring long-range reasoning, while scaling only linearly in the number of edges rather than quadratically in the number of nodes.
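To make the starting point concrete, here is a minimal sketch of the standard random-walk diffusion the abstract builds on: the distribution of a random walk evolves under the row-stochastic transition matrix derived from the graph Laplacian. This illustrates only the classical (elliptic) setting, not the paper's tensor-valued hypo-elliptic operator; the toy path graph and variable names are purely illustrative.

```python
import numpy as np

# Toy undirected path graph on 4 nodes: adjacency matrix.
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

deg = A.sum(axis=1)
D = np.diag(deg)

# Combinatorial graph Laplacian L = D - A, and the random-walk
# transition matrix P = D^{-1} A (row-stochastic).
L = D - A
P = A / deg[:, None]

# Distribution of a random walk started at node 0, after 3 steps:
# mu_{k+1} = mu_k P.
mu = np.zeros(4)
mu[0] = 1.0
for _ in range(3):
    mu = mu @ P

print(mu)  # → [0.   0.75 0.   0.25]
```

The continuous-time analogue replaces the discrete iteration with the heat semigroup exp(-t L_rw), where L_rw = I - P is the random-walk Laplacian; the paper's contribution lifts this scalar diffusion to a tensor-valued one that retains information about the walk's path, not just its endpoint.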
Author Information
Csaba Toth (University of Oxford)
Darrick Lee (University of Oxford)
Celia Hacker (Swiss Federal Institute of Technology Lausanne)
Harald Oberhauser (University of Oxford)
More from the Same Authors
- 2022 Poster: Positively Weighted Kernel Quadrature via Subsampling »
  Satoshi Hayakawa · Harald Oberhauser · Terry Lyons
- 2022 Poster: Fast Bayesian Inference with Batch Bayesian Quadrature via Kernel Recombination »
  Masaki Adachi · Satoshi Hayakawa · Martin Jørgensen · Harald Oberhauser · Michael A Osborne
- 2020 Poster: A Randomized Algorithm to Reduce the Support of Discrete Measures »
  Francesco Cosentino · Harald Oberhauser · Alessandro Abate
- 2020 Spotlight: A Randomized Algorithm to Reduce the Support of Discrete Measures »
  Francesco Cosentino · Harald Oberhauser · Alessandro Abate