Brain Network Transformer
Xuan Kan · Wei Dai · Hejie Cui · Zilong Zhang · Ying Guo · Carl Yang

Tue Nov 29 09:00 AM -- 11:00 AM (PST) @ Hall J #904

Human brains are commonly modeled as networks of Regions of Interest (ROIs) and their connections for the understanding of brain functions and mental disorders. Recently, Transformer-based models have been applied to many data types, including graphs, and have been shown to deliver broad performance gains. In this work, we study Transformer-based models for brain network analysis. Driven by the unique properties of the data, we model brain networks as graphs with nodes of fixed size and order, which allows us to (1) use connection profiles as node features to provide natural and low-cost positional information and (2) learn pair-wise connection strengths among ROIs with efficient attention weights across individuals that are predictive for downstream analysis tasks. Moreover, we propose an Orthonormal Clustering Readout operation based on self-supervised soft clustering and orthonormal projection. This design accounts for the underlying functional modules that determine similar behaviors among groups of ROIs, leading to distinguishable cluster-aware node embeddings and informative graph embeddings. Finally, we re-standardize the evaluation pipeline on ABIDE, the only publicly available large-scale brain network dataset, to enable meaningful comparison of different models. Experiment results show clear improvements of our proposed Brain Network Transformer on both the public ABIDE and our restricted ABCD datasets. The implementation is available at https://github.com/Wayfear/BrainNetworkTransformer.
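The readout idea in the abstract can be illustrated with a minimal sketch: softly assign each ROI embedding to a set of orthonormalized cluster centers, pool per cluster, and concatenate the pooled vectors into a graph embedding. This is a simplified NumPy illustration under stated assumptions (random centers, QR orthonormalization, softmax assignment), not the paper's actual implementation; the function name, shapes, and `temperature` parameter are hypothetical — see the linked repository for the real code.

```python
import numpy as np

def orthonormal_clustering_readout(x, centers, temperature=1.0):
    """Sketch of an orthonormal-clustering-style readout (illustrative only).

    x:       (n_roi, d) node embeddings, one row per ROI
    centers: (k, d) cluster centers (learnable in practice; random here)
    Returns a (k * d,) graph embedding.
    """
    # Orthonormalize the cluster centers with a QR decomposition so the
    # projection directions are mutually orthogonal unit vectors (needs k <= d).
    q, _ = np.linalg.qr(centers.T)          # (d, k), columns orthonormal
    centers_orth = q.T                      # (k, d)

    # Soft cluster assignment of each ROI: softmax over similarities
    # between node embeddings and the orthonormalized centers.
    logits = x @ centers_orth.T / temperature   # (n_roi, k)
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)            # each row sums to 1

    # Cluster-aware pooling: aggregate node embeddings per soft cluster,
    # then concatenate the k pooled vectors into one graph embedding.
    pooled = p.T @ x                             # (k, d)
    return pooled.reshape(-1)

rng = np.random.default_rng(0)
n_roi, d, k = 200, 8, 4     # e.g. 200 ROIs, 8-dim embeddings, 4 clusters
x = rng.standard_normal((n_roi, d))
centers = rng.standard_normal((k, d))
emb = orthonormal_clustering_readout(x, centers)
print(emb.shape)   # (32,)
```

The orthonormality constraint keeps the k projection directions from collapsing onto each other, so the pooled cluster vectors stay distinguishable, which is the intuition the abstract gives for cluster-aware embeddings.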

Author Information

Xuan Kan
Wei Dai (Stanford University)
Hejie Cui (Emory University)
Zilong Zhang (University of International Business and Economics)
Ying Guo
Carl Yang (Emory University)
