Adaptation Accelerating Sampling-based Bayesian Inference in Attractor Neural Networks
Xingsi Dong · Zilong Ji · Tianhao Chu · Tiejun Huang · Wenhao Zhang · Si Wu


The brain performs probabilistic Bayesian inference to interpret the external world. The sampling-based view assumes that the brain represents the stimulus posterior distribution via samples of stochastic neuronal responses. Although the idea of sampling-based inference is appealing, it faces the critical challenge of whether stochastic sampling is fast enough to match the rapid computation of the brain. In this study, we explore how latent stimulus sampling can be accelerated in neural circuits. Specifically, we consider a canonical neural circuit model called continuous attractor neural networks (CANNs) and investigate how sampling-based inference of latent continuous variables is accelerated in CANNs. Intriguingly, we find that by including noisy adaptation in the neuronal dynamics, the CANN is able to speed up the sampling process significantly. We theoretically derive that the CANN with noisy adaptation implements the efficient sampling method known as Hamiltonian dynamics with friction, where noisy adaptation effectively plays the role of momentum. We theoretically analyze the sampling performance of the network and derive the condition under which the acceleration effect is maximal. Simulation results confirm our theoretical analyses. We further extend the model to coupled CANNs and demonstrate that noisy adaptation accelerates the sampling of the posterior distribution of multivariate stimuli. We hope that this study enhances our understanding of how Bayesian inference is realized in the brain.
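The sampling scheme the abstract refers to, Hamiltonian dynamics with friction, is also known as underdamped Langevin dynamics: an auxiliary momentum variable (here, the role attributed to noisy adaptation) lets the sampler traverse the posterior faster than plain gradient-driven diffusion. Below is a minimal illustrative sketch of this generic sampler applied to a standard normal target; it is not the authors' CANN model, and the step size, friction coefficient, and step count are arbitrary choices for the example.

```python
import numpy as np

def underdamped_langevin(grad_log_p, n_steps=20000, dt=0.05, gamma=1.0, seed=0):
    """Sample via Hamiltonian dynamics with friction (underdamped Langevin).

    dx/dt = p
    dp/dt = grad log p(x) - gamma * p + sqrt(2 * gamma) * white noise

    The momentum p accumulates gradient information, which accelerates
    exploration relative to overdamped (momentum-free) Langevin sampling.
    """
    rng = np.random.default_rng(seed)
    x, p = 0.0, 0.0
    samples = np.empty(n_steps)
    for t in range(n_steps):
        # Euler-Maruyama update of the momentum, with friction and noise
        p += dt * (grad_log_p(x) - gamma * p) \
             + np.sqrt(2.0 * gamma * dt) * rng.standard_normal()
        x += dt * p  # position follows the momentum
        samples[t] = x
    return samples

# Target: standard normal, so grad log p(x) = -x
samples = underdamped_langevin(lambda x: -x)
```

Discarding an initial burn-in, the empirical mean and standard deviation of `samples` should approximate the target's 0 and 1 (up to discretization bias of order `dt`).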

Author Information

Xingsi Dong (Peking University)
Zilong Ji (Institute of Cognitive Neuroscience, University College London)
Tianhao Chu (Peking University)
Tiejun Huang (Peking University)
Wenhao Zhang (UT Southwestern Medical Center)
Si Wu (Peking University)
