Poster
in
Workshop: Medical Imaging meets NeurIPS

Decentralized Sparse Federated Learning for Efficient Training on Distributed NeuroImaging Data

Bishal Thapaliya · Riyasat Ohib · Eloy Geenjaar · Jingyu Liu · Vince Calhoun · Sergey Plis


Abstract:

Neuroimaging advancements have increased data sharing among researchers. Yet, institutions often retain control of their data due to research culture, privacy, and accountability concerns. There is therefore a need for tools that can analyze combined datasets without transmitting the actual data. We introduce a decentralized sparse federated learning (FL) approach that locally trains sparse models for efficient communication in such settings. By leveraging sparsity and transmitting only a subset of model parameters among client sites throughout training, we reduce communication costs, especially with larger models and varied site-specific resources. We validate our method on the ABCD dataset.
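The communication-saving idea described above can be illustrated with a minimal sketch. The snippet below is not the authors' implementation; it assumes a simple top-k magnitude sparsification of each client's update (a common choice in sparse FL) and an unweighted server-side average. Function names (`topk_sparsify`, `federated_round`) and the 5% transmission budget are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def topk_sparsify(update, k):
    # Keep only the k largest-magnitude entries of the update; zero the rest.
    # Only these k values (plus their indices) would be transmitted.
    mask = np.zeros_like(update, dtype=bool)
    idx = np.argpartition(np.abs(update), -k)[-k:]
    mask[idx] = True
    return update * mask

def federated_round(global_w, client_grads, k):
    # Each client sends a sparse (top-k) update; the server averages them
    # and applies the result to the shared model parameters.
    sparse_updates = [topk_sparsify(g, k) for g in client_grads]
    return global_w - np.mean(sparse_updates, axis=0)

d, n_clients, k = 1000, 4, 50   # k/d = 5% of parameters communicated per client
w = rng.normal(size=d)
grads = [rng.normal(size=d) for _ in range(n_clients)]
w_new = federated_round(w, grads, k)
```

Each client here communicates 50 of 1000 values per round instead of the full gradient, which is the source of the bandwidth savings the abstract refers to; the trade-off is that the discarded coordinates are not corrected until a later round selects them.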
