Latent dynamics models have emerged as powerful tools for modeling and interpreting neural population activity. Recently, there has been a focus on incorporating simultaneously measured behaviour into these models to further disentangle sources of neural variability in their latent space. These approaches, however, are limited in the neural dynamics they can capture (e.g. they assume linear dynamics) and in how they relate the learned dynamics back to the observed behaviour (e.g. they allow no time lag between the two). To this end, we introduce Targeted Neural Dynamical Modeling (TNDM), a nonlinear state-space model that jointly models neural activity and external behavioural variables. TNDM decomposes neural dynamics into behaviourally relevant and behaviourally irrelevant components; the relevant dynamics are used to reconstruct behaviour through a flexible linear decoder, and both sets of dynamics are used to reconstruct the neural activity through a linear decoder with no time lag. We implement TNDM as a sequential variational autoencoder and validate it on simulated recordings and on recordings from the premotor and motor cortex of a monkey performing a center-out reaching task. We show that TNDM learns low-dimensional latent dynamics that are highly predictive of behaviour without sacrificing the fit to the neural data.
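The decoding structure the abstract describes can be illustrated with a minimal numpy sketch. This is not the paper's implementation (the actual model is a sequential VAE with learned nonlinear dynamics); all dimensions, weight names, and the random-walk latents below are illustrative assumptions, used only to show how behaviour depends on the relevant latents alone while neural activity depends on both sets.

```python
import numpy as np

rng = np.random.default_rng(0)

T, N = 100, 50          # time steps, neurons (illustrative sizes)
d_rel, d_irr = 4, 4     # relevant / irrelevant latent dimensions

# Latent trajectories. In TNDM these come from learned nonlinear dynamics;
# smooth random walks stand in for them here.
z_rel = np.cumsum(rng.normal(size=(T, d_rel)) * 0.1, axis=0)
z_irr = np.cumsum(rng.normal(size=(T, d_irr)) * 0.1, axis=0)

# Behaviour is decoded from the behaviourally relevant latents only,
# via a linear readout (e.g. onto 2D hand kinematics).
W_beh = rng.normal(size=(d_rel, 2))
behaviour = z_rel @ W_beh

# Neural activity is reconstructed from BOTH sets of latents through a
# linear readout with no time lag, with Poisson spike observations.
W_neu = rng.normal(size=(d_rel + d_irr, N)) * 0.1
log_rates = np.concatenate([z_rel, z_irr], axis=1) @ W_neu
spikes = rng.poisson(np.exp(log_rates))

print(behaviour.shape, spikes.shape)  # (100, 2) (100, 50)
```

The asymmetry is the point of the decomposition: zeroing out `z_irr` leaves `behaviour` unchanged but degrades the neural reconstruction, which is how the relevant/irrelevant split disentangles sources of neural variability.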
Author Information
Cole Hurwitz (University of Edinburgh)
Akash Srivastava (MIT–IBM Watson AI Lab)
Kai Xu (University of Edinburgh)
Justin Jude (University of Edinburgh)
Matthew Perich (Icahn School of Medicine at Mount Sinai)
Lee Miller (Northwestern University)
Matthias Hennig (University of Edinburgh)
More from the Same Authors
- 2021: Neural Latents Benchmark '21: Evaluating latent variable models of neural population activity
  Felix Pei · Joel Ye · David Zoltowski · Anqi Wu · Raeed Chowdhury · Hansem Sohn · Joseph O'Doherty · Krishna V Shenoy · Matthew Kaufman · Mark Churchland · Mehrdad Jazayeri · Lee Miller · Jonathan Pillow · Il Memming Park · Eva Dyer · Chethan Pandarinath
- 2021 Poster: A Bayesian-Symbolic Approach to Reasoning and Learning in Intuitive Physics
  Kai Xu · Akash Srivastava · Dan Gutfreund · Felix Sosa · Tomer Ullman · Josh Tenenbaum · Charles Sutton
- 2020 Poster: Telescoping Density-Ratio Estimation
  Benjamin Rhodes · Kai Xu · Michael Gutmann
- 2020 Spotlight: Telescoping Density-Ratio Estimation
  Benjamin Rhodes · Kai Xu · Michael Gutmann
- 2019 Poster: Scalable Spike Source Localization in Extracellular Recordings using Amortized Variational Inference
  Cole Hurwitz · Kai Xu · Akash Srivastava · Alessio Buccino · Matthias Hennig
- 2018: Poster Session
  Lorenzo Masoero · Tammo Rukat · Runjing Liu · Sayak Ray Chowdhury · Daniel Coelho de Castro · Claudia Wehrhahn · Feras Saad · Archit Verma · Kelvin Hsu · Irineo Cabreros · Sandhya Prabhakaran · Yiming Sun · Maxime Rischard · Linfeng Liu · Adam Farooq · Jeremiah Liu · Melanie F. Pradier · Diego Romeres · Neill Campbell · Kai Xu · Mehmet M Dundar · Tucker Keuter · Prashnna Gyawali · Eli Sennesh · Alessandro De Palma · Daniel Flam-Shepherd · Takatomi Kubo
- 2018 Poster: HOUDINI: Lifelong Learning as Program Synthesis
  Lazar Valkov · Dipak Chaudhari · Akash Srivastava · Charles Sutton · Swarat Chaudhuri
- 2017 Poster: VEEGAN: Reducing Mode Collapse in GANs using Implicit Variational Learning
  Akash Srivastava · Lazar Valkov · Chris Russell · Michael Gutmann · Charles Sutton