Poster
Towards a "Universal Translator" for Neural Dynamics at Single-Cell, Single-Spike Resolution
Yizi Zhang · Yanchen Wang · Donato Jiménez-Benetó · Zixuan Wang · Mehdi Azabou · Blake Richards · Renee Tung · Olivier Winter · International Brain Laboratory · Eva Dyer · Liam Paninski · Cole Hurwitz
East Exhibit Hall A-C #3800
Neuroscience research has made immense progress over the last decade, but our understanding of the brain remains fragmented and piecemeal: the dream of probing an arbitrary brain region and automatically reading out the information encoded in its neural activity remains out of reach. In this work, we build towards a first foundation model for neural spiking data that can solve a diverse set of tasks across multiple brain areas. We introduce a novel self-supervised modeling approach for population activity in which the model alternates between masking out and reconstructing neural activity across different time steps, neurons, and brain regions. To evaluate our approach, we design unsupervised and supervised prediction tasks using the International Brain Laboratory repeated site dataset, which comprises Neuropixels recordings targeting the same brain locations across 48 animals and experimental sessions. The prediction tasks include single-neuron and region-level activity prediction, forward prediction, and behavior decoding. We demonstrate that our multi-task-masking (MtM) approach significantly improves the performance of current state-of-the-art population models and enables multi-task learning. We also show that by training on multiple animals, we can improve the generalization ability of the model to unseen animals, paving the way for a foundation model of the brain at single-cell, single-spike resolution.
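As a rough illustration of the alternating masking schemes described in the abstract, the sketch below (not the authors' released code) shows how time-step, neuron, and region masks might be generated for a binned spike-count matrix; the array shapes, region labels, and masking probability are illustrative assumptions.

    # Minimal sketch of alternating multi-task masking, assuming a binned
    # spike-count array of shape (neurons, time_bins) and a region label
    # per neuron. Shapes, labels, and probabilities are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)

    def make_mask(spikes, regions, scheme, mask_prob=0.3):
        """Return a boolean mask (True = entries held out for reconstruction)."""
        n_neurons, n_bins = spikes.shape
        mask = np.zeros_like(spikes, dtype=bool)
        if scheme == "time":        # hide entire time steps for all neurons
            cols = rng.random(n_bins) < mask_prob
            mask[:, cols] = True
        elif scheme == "neuron":    # hide entire neurons across all time steps
            rows = rng.random(n_neurons) < mask_prob
            mask[rows, :] = True
        elif scheme == "region":    # hide every neuron in one randomly chosen region
            target = rng.choice(np.unique(regions))
            mask[regions == target, :] = True
        return mask

    # Toy example: 50 neurons, 100 time bins, 3 hypothetical region labels.
    spikes = rng.poisson(1.0, size=(50, 100)).astype(float)
    regions = rng.choice(["PO", "LP", "CA1"], size=50)

    for step in range(3):
        scheme = ["time", "neuron", "region"][step % 3]  # alternate masking tasks
        mask = make_mask(spikes, regions, scheme)
        masked_input = np.where(mask, 0.0, spikes)       # model sees zeros at masked entries
        # a population model would be trained to reconstruct spikes[mask] from masked_input

Alternating over the three schemes gives the model complementary objectives (temporal forward prediction, co-smoothing across neurons, and cross-region prediction) within a single training loop.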