
Workshop: Associative Memory & Hopfield Networks in 2023

A Different Route to Exponential Storage Capacity

Elvis Dohmatob


Recent developments have sought to overcome the inherent limitations of traditional associative memory models, like Hopfield networks, whose storage capacity scales linearly with input dimension. In this paper, we present a new extension of Hopfield networks that grants precise control over inter-neuron interactions while also allowing control of the network's level of connectivity. This versatile framework encompasses a variety of designs as particular cases, including classical Hopfield networks, models with polynomial activation functions, and simplicial Hopfield networks. Remarkably, a specific instance of our construction yields a new self-attention mechanism characterized by quasi-exponential storage capacity and a sparse network structure, aligning with biological plausibility.
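As background for the models the abstract mentions, here is a minimal NumPy sketch of a dense associative memory with a polynomial separation function, in the spirit of the "polynomial activation" special case. This is an illustrative toy, not the paper's actual construction: the dimension `d`, pattern count `P`, interaction order `n`, and the one-step synchronous update rule are all assumptions chosen for demonstration. Raising the odd order `n` sharpens the dominance of the best-matching pattern, which is the mechanism behind super-linear storage capacity in such models.

```python
import numpy as np

rng = np.random.default_rng(0)
d, P, n = 50, 10, 3  # dimension, number of patterns, interaction order (illustrative)

# Stored patterns: P random ±1 vectors of dimension d.
patterns = rng.choice([-1, 1], size=(P, d))

def retrieve(state, patterns, n=3):
    """One synchronous update of a dense associative memory.

    Uses the polynomial separation function F(x) = x**n with n odd,
    so the sign of each pattern's contribution is preserved and the
    pattern most correlated with `state` dominates the update.
    """
    overlaps = patterns @ state          # similarity of state to each pattern, shape (P,)
    field = (overlaps ** n) @ patterns   # superposition weighted by F(overlap)
    return np.sign(field).astype(int)

# Corrupt 10 of the 50 bits of the first pattern, then try to retrieve it.
probe = patterns[0].copy()
flip = rng.choice(d, size=10, replace=False)
probe[flip] *= -1

recovered = retrieve(probe, patterns, n)
```

With `n = 1` this reduces to one step of classical Hopfield dynamics; for `n > 1` the cubed (or higher-power) overlap suppresses crosstalk from non-target patterns, so many more patterns can be stored before retrieval fails.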
