Poster in Workshop: Memory in Artificial and Real Intelligence (MemARI)

Cache-memory gated graph neural networks

Guixiang Ma · Vy Vo · Nesreen K. Ahmed · Theodore Willke


Abstract:

While graph neural networks (GNNs) provide a powerful way to learn structured representations, it remains challenging to learn long-range dependencies between graph nodes. Recurrent gated GNNs only partly address this problem. We introduce a memory augmentation to a gated GNN that simply stores the previous hidden states in a cache. We show that the cache-memory gated GNN outperforms other models on a synthetic task that requires long-range information, as well as on tasks over real-world datasets.
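The abstract only states that previous hidden states are stored in a cache; it does not specify how the cache is read or how it interacts with the gated update. Below is a minimal, hedged sketch of one plausible realization in plain PyTorch: a GRU-style gated GNN layer whose message at each propagation step is augmented with an attention-based read over the last few cached hidden states. The cache size, the attention read, and all layer names are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch: one plausible cache-memory gated GNN layer.
# The cache read (dot-product attention) and cache size are assumptions;
# the paper's actual mechanism is not described in the abstract.
import torch
import torch.nn as nn


class CacheGatedGNN(nn.Module):
    def __init__(self, hidden_dim: int, num_steps: int, cache_size: int = 4):
        super().__init__()
        self.num_steps = num_steps
        self.cache_size = cache_size
        self.msg = nn.Linear(hidden_dim, hidden_dim)   # neighbor message transform
        self.gru = nn.GRUCell(hidden_dim, hidden_dim)  # gated (GRU) node update
        self.read = nn.Linear(hidden_dim, hidden_dim)  # query for reading the cache

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: [num_nodes, hidden_dim] node states, adj: [num_nodes, num_nodes] adjacency
        cache = []  # stores previous hidden states (the "cache memory")
        for _ in range(self.num_steps):
            cache.append(h)
            cache = cache[-self.cache_size:]            # keep only the last k states
            m = adj @ self.msg(h)                       # aggregate neighbor messages
            if len(cache) > 1:
                mem = torch.stack(cache, dim=1)         # [N, k, d] cached states
                q = self.read(h).unsqueeze(1)           # [N, 1, d] read query
                attn = torch.softmax((q * mem).sum(-1), dim=-1)  # [N, k] weights
                m = m + (attn.unsqueeze(-1) * mem).sum(dim=1)    # add cache read-out
            h = self.gru(m, h)                          # gated recurrent state update
        return h


# Example usage on a small random graph.
layer = CacheGatedGNN(hidden_dim=16, num_steps=6)
h = torch.randn(10, 16)
adj = (torch.rand(10, 10) > 0.7).float()
out = layer(h, adj)  # [10, 16]
```

The intent of the cache in this sketch is that each node can draw on its earlier hidden states directly, rather than relying on the gated recurrence alone to carry long-range information forward.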
