Poster
Bayesian entropy estimation for binary spike train data using parametric prior knowledge
Evan Archer · Il Memming Park · Jonathan W Pillow

Fri Dec 06 07:00 PM -- 11:59 PM (PST) @ Harrah's Special Events Center, 2nd Floor

Shannon's entropy is a basic quantity in information theory, and a fundamental building block for the analysis of neural codes. Estimating the entropy of a discrete distribution from samples is an important and difficult problem that has received considerable attention in statistics and theoretical neuroscience. However, neural responses have characteristic statistical structure that generic entropy estimators fail to exploit. For example, existing Bayesian entropy estimators make the naive assumption that all spike words are equally likely a priori, which makes for an inefficient allocation of prior probability mass in cases where spikes are sparse. Here we develop Bayesian estimators for the entropy of binary spike trains using priors designed to flexibly exploit the statistical structure of simultaneously recorded spike responses. We define two prior distributions over spike words using mixtures of Dirichlet distributions centered on simple parametric models. The parametric model captures high-level statistical features of the data, such as the average spike count in a spike word, which allows the posterior over entropy to concentrate more rapidly than with standard estimators (e.g., in cases where the probability of spiking differs strongly from 0.5). Conversely, the Dirichlet distributions assign prior mass to distributions far from the parametric model, ensuring consistent estimates for arbitrary distributions. We devise a compact representation of the data and prior that allows for computationally efficient implementations of Bayesian least squares and empirical Bayes entropy estimators with large numbers of neurons. We apply these estimators to simulated and real neural data and show that they substantially outperform traditional methods.
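The core computation behind estimators of this family can be illustrated with a short sketch. The Python snippet below is a simplified, hypothetical illustration, not the authors' implementation: it fixes the Dirichlet concentration parameter rather than mixing over it as the paper's priors do, and it enumerates all 2^n spike words explicitly, which is tractable only for small populations (the paper's compact representation is what makes large populations feasible). It centers a Dirichlet prior on an independent-Bernoulli spike-word model and returns the posterior mean of the entropy, using the standard closed-form expectation of entropy under a Dirichlet distribution. All function names are illustrative.

```python
import numpy as np
from scipy.special import psi  # digamma function

def dirichlet_mean_entropy(alpha):
    """Posterior mean of Shannon entropy (in nats) under Dirichlet(alpha):
    E[H] = psi(A+1) - sum_k (alpha_k / A) * psi(alpha_k + 1), A = sum_k alpha_k."""
    A = alpha.sum()
    return psi(A + 1.0) - np.sum((alpha / A) * psi(alpha + 1.0))

def bernoulli_base_measure(n_neurons, p):
    """Probability of each of the 2^n binary words under independent
    Bernoulli(p) spiking -- the parametric model the prior is centered on."""
    words = (np.arange(2 ** n_neurons)[:, None] >> np.arange(n_neurons)) & 1
    spikes = words.sum(axis=1)  # number of spikes in each word
    return p ** spikes * (1.0 - p) ** (n_neurons - spikes)

def entropy_estimate(spike_words, concentration=1.0):
    """Fixed-concentration sketch of a Dirichlet prior centered on a
    Bernoulli model: the posterior is Dirichlet(a * G + counts)."""
    n_neurons = spike_words.shape[1]
    # empirical per-bin spike probability sets the model's parameter
    p_hat = spike_words.mean()
    G = bernoulli_base_measure(n_neurons, p_hat)
    # convert each binary word to an integer index and tally word counts
    idx = spike_words.dot(1 << np.arange(n_neurons))
    counts = np.bincount(idx, minlength=2 ** n_neurons)
    return dirichlet_mean_entropy(concentration * G + counts)

# toy example: 3 neurons, sparse spiking (p ~ 0.1), 500 time bins
rng = np.random.default_rng(0)
data = (rng.random((500, 3)) < 0.1).astype(int)
print(entropy_estimate(data))  # estimated entropy in nats
```

Centering the prior's base measure on the fitted Bernoulli model is what concentrates prior mass on sparse spike words; replacing `G` with a uniform vector recovers the naive equally-likely-words prior criticized in the abstract.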

Author Information

Evan Archer (Sony AI)
Il Memming Park (Stony Brook University)
Jonathan W Pillow (UT Austin)

Jonathan Pillow is an assistant professor in Psychology and Neurobiology at the University of Texas at Austin. He graduated from the University of Arizona in 1997 with a degree in mathematics and philosophy, and was a U.S. Fulbright fellow in Morocco in 1998. He received his Ph.D. in neuroscience from NYU in 2005, and was a Royal Society postdoctoral research fellow at the Gatsby Computational Neuroscience Unit, UCL from 2005 to 2008. His recent work involves statistical methods for understanding the neural code in single neurons and neural populations, and his lab conducts psychophysical experiments designed to test Bayesian models of human sensory perception.
