Poster
in
Workshop: Symmetry and Geometry in Neural Representations (NeurReps)

Hyperbolic and Mixed Geometry Graph Neural Networks

Rishi Sonthalia · Xinyue Cui


Abstract:

Hyperbolic Graph Neural Networks (GNNs) have shown great promise for modeling hierarchical and graph-structured data in hyperbolic space, which reduces embedding distortion compared to Euclidean space. However, existing hyperbolic GNNs implement most operations through logarithmic and exponential maps in the tangent space, which is a Euclidean subspace. To avoid these repeated transformations between hyperbolic and Euclidean space, recent advances in hyperbolic learning have formalized hyperbolic neural networks based on the Lorentz model that realize their operations entirely in hyperbolic space via Lorentz transformations \cite{chen-etal-2022-fully}. Here, we adopt the hyperbolic framework of \cite{chen-etal-2022-fully} and propose a family of hyperbolic GNNs with greater modeling capability than existing hyperbolic GNNs. We also show that this framework allows neural networks with both hyperbolic and Euclidean layers to be trained jointly. Our experiments demonstrate that our fully hyperbolic GNNs yield substantial improvements over their Euclidean counterparts.
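To make the tangent-space round trip concrete, the following is a minimal NumPy sketch (not the authors' code) of the exponential and logarithmic maps at the origin of the Lorentz model, the operations that existing hyperbolic GNNs interleave with Euclidean layers. Points x lie on the hyperboloid with Lorentzian inner product ⟨x, y⟩_L = -x₀y₀ + Σᵢ xᵢyᵢ and ⟨x, x⟩_L = -1; the origin is o = (1, 0, …, 0).

```python
import numpy as np

def exp_map_origin(v):
    """Map a tangent vector v = (0, v_1, ..., v_n) at the origin o
    onto the hyperboloid. Since v_0 = 0, v's Lorentz norm equals the
    Euclidean norm of its spatial part."""
    vs = v[1:]
    n = np.linalg.norm(vs)
    if n < 1e-12:  # exp of the zero vector is the origin itself
        return np.concatenate(([1.0], np.zeros_like(vs)))
    return np.concatenate(([np.cosh(n)], np.sinh(n) * vs / n))

def log_map_origin(x):
    """Inverse of exp_map_origin: map a hyperboloid point x back into
    the (Euclidean) tangent space at the origin."""
    xs = x[1:]
    n = np.linalg.norm(xs)
    if n < 1e-12:
        return np.zeros_like(x)
    d = np.arccosh(np.clip(x[0], 1.0, None))  # geodesic distance from o
    return np.concatenate(([0.0], d * xs / n))

v = np.array([0.0, 0.3, 0.4])   # a tangent vector at the origin
x = exp_map_origin(v)           # lies on the hyperboloid: <x, x>_L = -1
v_back = log_map_origin(x)      # recovers v
```

A fully hyperbolic (Lorentz-transformation-based) layer avoids this exp/log round trip per operation, which is the motivation the abstract cites for building on \cite{chen-etal-2022-fully}.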
