

Poster in Workshop: Temporal Graph Learning Workshop @ NeurIPS 2023

Large-scale Graph Representation Learning of Dynamic Brain Connectome with Transformers

Byung-Hoon Kim · Jungwon Choi · EungGu Yun · Kyungsang Kim · Xiang Li · Juho Lee


Abstract:

Graph Transformers have recently been successful in various graph representation learning tasks, providing a number of advantages over message-passing Graph Neural Networks. Utilizing Graph Transformers to learn representations of the brain functional connectivity network is also gaining interest. However, studies to date have overlooked the temporal dynamics of functional connectivity, which fluctuates over time. Here, we propose a method for learning the representation of dynamic functional connectivity with Graph Transformers. Specifically, we define the connectome embedding, which holds the position, structure, and time information of the functional connectivity graph, and use Transformers to learn its representation across time. We perform experiments with over 50,000 resting-state fMRI samples obtained from three datasets, the largest amount of fMRI data used in such studies to date. The experimental results show that our proposed method outperforms other competitive baselines on gender classification and age regression tasks based on the functional connectivity extracted from the resting-state fMRI data.
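The abstract describes learning from dynamic functional connectivity, i.e., connectivity that fluctuates over the fMRI scan. A common way to obtain such a sequence of connectivity graphs is sliding-window correlation over the ROI time series; the sketch below illustrates that preprocessing step only. It is not the authors' pipeline: the function name, window/stride values, and synthetic data are all illustrative assumptions.

```python
import numpy as np

def dynamic_connectivity(ts, window, stride):
    """Sliding-window functional connectivity.

    ts: (T, R) array of T timepoints for R brain regions (ROIs).
    Returns a (W, R, R) stack of per-window correlation matrices,
    one graph per window.
    """
    T, R = ts.shape
    mats = []
    for start in range(0, T - window + 1, stride):
        w = ts[start:start + window]          # (window, R) slice
        mats.append(np.corrcoef(w, rowvar=False))
    return np.stack(mats)

# Synthetic stand-in for a resting-state scan: 200 timepoints, 10 ROIs.
rng = np.random.default_rng(0)
ts = rng.standard_normal((200, 10))
fc = dynamic_connectivity(ts, window=50, stride=25)
print(fc.shape)  # (7, 10, 10): 7 connectivity graphs across time
```

In the paper's framing, each per-window graph would then be mapped to a connectome embedding (encoding position, structure, and time) and the resulting sequence fed to a Transformer; the details of that embedding are specified in the paper, not here.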
