

Poster

Synaptic Sampling: A Bayesian Approach to Neural Network Plasticity and Rewiring

David Kappel · Stefan Habenschuss · Robert Legenstein · Wolfgang Maass

210 C #8

Abstract:

We reexamine in this article the conceptual and mathematical framework for understanding the organization of plasticity in spiking neural networks. We propose that inherent stochasticity enables synaptic plasticity to carry out probabilistic inference by sampling from a posterior distribution of synaptic parameters. This view provides a viable alternative to existing models that propose convergence of synaptic weights to maximum likelihood parameters. It explains how priors on weight distributions and connection probabilities can be merged optimally with learned experience. In simulations, we show that our model for synaptic plasticity allows spiking neural networks to compensate continuously for unforeseen disturbances. Furthermore, it provides a normative mathematical framework for better understanding the permanent variability and rewiring observed in brain networks.
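To illustrate the core idea of sampling synaptic parameters from a posterior rather than converging to a point estimate, the sketch below runs noisy gradient (Langevin-style) dynamics on synaptic parameters of a toy linear-Gaussian model. It is only a minimal illustration under simplifying assumptions, not the spiking-network model of the paper: the linear readout, the Gaussian prior, and all names (sigma_lik, theta0, b, T, grad_log_posterior, etc.) are illustrative choices. The exponential parameter-to-weight mapping with retraction below zero mimics the rewiring aspect: retracted synapses drift under the prior and noise alone and can later reappear.

import numpy as np

rng = np.random.default_rng(0)

# Toy setup: one readout with n_syn potential synapses and a linear-Gaussian
# likelihood (an illustrative stand-in for the network likelihood).
n_syn, n_obs = 5, 200
x = rng.normal(size=(n_obs, n_syn))            # presynaptic activity
w_true = np.array([1.5, 0.0, 0.8, 0.0, 0.4])   # sparse ground-truth weights
y = x @ w_true + 0.5 * rng.normal(size=n_obs)  # noisy postsynaptic target

sigma_lik = 0.5                    # observation noise assumed by the model
mu_prior, sigma_prior = 0.0, 1.0   # Gaussian prior on synaptic parameters theta
theta0 = 3.0                       # offset of the exponential parameter-to-weight mapping
b, T = 1e-4, 1.0                   # step size b and temperature T (T=1 samples the posterior)

def weights(theta):
    # Active synapses (theta > 0) have weight exp(theta - theta0); others are retracted.
    return np.where(theta > 0.0, np.exp(theta - theta0), 0.0)

def grad_log_posterior(theta):
    w = weights(theta)
    resid = y - x @ w
    dlik_dw = x.T @ resid / sigma_lik**2                        # d log-likelihood / d w
    dw_dtheta = np.where(theta > 0.0, np.exp(theta - theta0), 0.0)
    dprior = -(theta - mu_prior) / sigma_prior**2               # d log-prior / d theta
    return dprior + dlik_dw * dw_dtheta

# "Synaptic sampling" dynamics: drift along the log-posterior gradient plus
# Gaussian noise, so theta keeps wandering through high-posterior regions
# instead of converging to a single maximum-likelihood solution.
theta = rng.normal(size=n_syn)
samples = []
for step in range(50_000):
    noise = np.sqrt(2.0 * T * b) * rng.normal(size=n_syn)
    theta = theta + b * grad_log_posterior(theta) + noise
    if step > 10_000 and step % 10 == 0:
        samples.append(weights(theta))

samples = np.array(samples)
print("posterior mean weights:        ", samples.mean(axis=0).round(2))
print("fraction of time synapse active:", (samples > 0).mean(axis=0).round(2))

In this sketch the temperature T scales the noise: with T = 1 the dynamics sample from the posterior, while letting T go to zero would recover deterministic ascent to a point estimate, which is the contrast with maximum-likelihood models drawn in the abstract. Synapses whose true weight is zero spend much of their time retracted and occasionally reappear with small weights, loosely mirroring the permanent rewiring the paper discusses.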
