Poster
Dependent Dirichlet Process Spike Sorting
Jan Gasthaus · Frank Wood · Dilan Gorur · Yee Whye Teh

Tue Dec 09 07:30 PM -- 12:00 AM (PST)

In this paper we propose a new incremental spike sorting model that automatically eliminates refractory period violations, accounts for action potential waveform drift, and can handle "appearance" and "disappearance" of neurons. Our approach is to augment a known time-varying Dirichlet process that ties together a sequence of infinite Gaussian mixture models, one per action potential waveform observation, with an interspike-interval-dependent likelihood that prohibits refractory period violations. We demonstrate this model by showing results from sorting two publicly available neural data recordings for which a partial ground-truth labeling is known.
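To make the key idea concrete, the sketch below illustrates how an interspike-interval-dependent likelihood can forbid refractory period violations during sequential spike-by-spike assignment. It is not the authors' implementation: it substitutes a plain Chinese restaurant process prior for the paper's time-varying dependent Dirichlet process, uses a simple isotropic Gaussian waveform likelihood, and all constants (ALPHA, SIGMA, REFRACTORY_MS) and function names are hypothetical choices for illustration only.

import numpy as np

ALPHA = 1.0           # CRP concentration (hypothetical value)
SIGMA = 1.0           # isotropic waveform noise std (hypothetical value)
REFRACTORY_MS = 2.0   # spikes closer than this cannot share a neuron (hypothetical value)

def log_gaussian(x, mu, sigma):
    # log density of an isotropic Gaussian evaluated at waveform feature vector x
    return -0.5 * np.sum((x - mu) ** 2) / sigma ** 2 - x.size * np.log(sigma * np.sqrt(2.0 * np.pi))

def sort_spikes(waveforms, times_ms, seed=0):
    """Greedy sequential assignment of spikes to neurons.

    waveforms: (N, D) array of waveform features, in time order
    times_ms:  (N,) array of spike times in milliseconds
    Returns one neuron label per spike; a new label means a neuron "appeared".
    """
    rng = np.random.default_rng(seed)
    labels, means, counts, last_time = [], [], [], []

    for x, t in zip(waveforms, times_ms):
        log_p = []
        for k in range(len(means)):
            if t - last_time[k] < REFRACTORY_MS:
                # interspike-interval-dependent likelihood: assigning this spike to
                # neuron k would violate its refractory period, so the probability is zero
                log_p.append(-np.inf)
            else:
                log_p.append(np.log(counts[k]) + log_gaussian(x, means[k], SIGMA))
        # probability of opening a new neuron (handles "appearance")
        log_p.append(np.log(ALPHA) + log_gaussian(x, np.zeros_like(x), np.sqrt(SIGMA ** 2 + 1.0)))

        log_p = np.asarray(log_p)
        p = np.exp(log_p - log_p.max())
        p /= p.sum()
        k = rng.choice(len(p), p=p)

        if k == len(means):                       # start a new neuron
            means.append(np.array(x, dtype=float))
            counts.append(1)
            last_time.append(t)
        else:                                     # update the chosen neuron's statistics
            counts[k] += 1
            means[k] = means[k] + (x - means[k]) / counts[k]
            last_time[k] = t
        labels.append(k)
    return labels

In the paper itself, assignments are inferred under the time-varying dependent Dirichlet process rather than by the greedy one-pass sampler shown here; the sketch is only meant to show how zeroing out assignments that would violate a neuron's refractory period fits into sequential nonparametric clustering of waveforms.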

Author Information

Jan Gasthaus (Amazon)
Frank Wood (University of British Columbia)

Dr. Wood is an associate professor in the Department of Engineering Science at the University of Oxford. Before that, he was an assistant professor of Statistics at Columbia University and a research scientist at the Columbia Center for Computational Learning Systems. He was formerly a postdoctoral fellow at the Gatsby Computational Neuroscience Unit, University College London. He holds a PhD from Brown University (’07) and a BS from Cornell University (’96), both in computer science. Dr. Wood is the original architect of both the Anglican and Probabilistic-C probabilistic programming systems. He conducts AI-driven research at the boundary of probabilistic programming, Bayesian modeling, and Monte Carlo methods. Dr. Wood holds 6 patents, has authored over 50 papers, received the AISTATS best paper award in 2009, and has been awarded faculty research awards from Xerox, Google, and Amazon. Prior to his academic career, he was a successful entrepreneur, having run and sold the content-based image retrieval company ToFish! to AOL/Time Warner and served as CEO of Interfolio.

Dilan Gorur (DeepMind)
Yee Whye Teh (University of Oxford, DeepMind)

I am a Professor of Statistical Machine Learning at the Department of Statistics, University of Oxford, and a Research Scientist at DeepMind. I am also an Alan Turing Institute Fellow and a European Research Council Consolidator Fellow. I obtained my Ph.D. at the University of Toronto (working with Geoffrey Hinton), and did postdoctoral work at the University of California, Berkeley (with Michael Jordan) and the National University of Singapore (as a Lee Kuan Yew Postdoctoral Fellow). I was a Lecturer and then a Reader at the Gatsby Computational Neuroscience Unit, UCL, and a tutorial fellow at University College, Oxford, prior to my current appointment. I am interested in the statistical and computational foundations of intelligence, and work on scalable machine learning, probabilistic models, Bayesian nonparametrics, and deep learning. I was programme co-chair of ICML 2017 and AISTATS 2010.
