
Workshop: Associative Memory & Hopfield Networks in 2023

Error-correcting columnar networks: high-capacity memory under sparse connectivity

Haozhe Shan · Ludovica Bachschmid-Romano · Haim Sompolinsky

Abstract: Neurons with recurrent connectivity can store memory patterns as attractor states in their dynamics, forming a plausible basis for associative memory in the brain. Classic theoretical results on fully connected recurrent neural networks (RNNs) with binary neurons and Hebbian learning rules state that they can store at most $O\left(N\right)$ memories, where $N$ is the number of neurons. However, under the physiological constraint that neurons are sparsely connected, this capacity is dramatically reduced to $O(K)$, where $K$ is the average degree of connectivity (estimated to be $O(10^{3}\sim10^{4})$ in the mammalian neocortex). This reduced capacity is orders of magnitude smaller than experimental estimates of human memory capacity. In this work, we propose the error-correcting columnar network (ECCN) as a plausible model of how the brain realizes high-capacity memory storage despite sparse connectivity. In the ECCN, neurons are organized into "columns": in each memory, neurons from the same column encode the same feature(s), similar to columns in primary sensory areas. A column-synchronizing mechanism utilizes the redundancy of columnar codes to perform error correction. We analytically computed the memory capacity of the ECCN via a dynamical mean-field theory. The results show that for a fixed column size $M$, the capacity grows linearly with network size $N$ until it saturates at $\propto MK$. For the optimal choice of $M$ at each $N$, the capacity is $\propto \sqrt{NK}$.
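The classic setting the abstract builds on — a fully connected binary network with a Hebbian weight matrix, storing a number of patterns well below the $O(N)$ capacity limit — can be sketched in a few lines. This is a minimal illustration of the standard Hopfield model, not the authors' ECCN; the network size, pattern count, and noise level below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200  # number of binary (+/-1) neurons
P = 10   # stored patterns; load P/N = 0.05, well below the ~0.14 N limit

# Random +/-1 memory patterns, one per row
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian rule: W_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, no self-coupling
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

def retrieve(state, sweeps=20):
    """Asynchronous updates; guaranteed to reach a fixed point of the dynamics."""
    state = state.copy()
    for _ in range(sweeps):
        changed = False
        for i in range(N):
            s = 1 if W[i] @ state >= 0 else -1
            if s != state[i]:
                state[i] = s
                changed = True
        if not changed:  # fixed point (attractor) reached
            break
    return state

# Corrupt a stored pattern by flipping 10% of its bits, then let dynamics recall it
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
recalled = retrieve(cue)
overlap = recalled @ patterns[0] / N  # overlap of 1.0 means perfect recall
print(overlap)
```

At this low memory load, the corrupted cue falls inside the basin of attraction of the stored pattern and the dynamics restore it almost perfectly. The sparse-connectivity regime discussed in the abstract would correspond to zeroing out most entries of `W`, which is what collapses the capacity to $O(K)$.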
