Workshop: Associative Memory & Hopfield Networks in 2023

Memorization and consolidation in associative memory networks

Danil Tyulmankov · Kimberly Stachenfeld · Dmitry Krotov · L F Abbott


Humans, animals, and machines can store and retrieve long-term memories of individual items, while at the same time consolidating and learning general representations of categories that discard the individual examples from which the representations were constructed. Classical neural networks model only one or the other of these two regimes. In this work, we propose a biologically motivated model that can not only consolidate representations of common items but also memorize exceptional ones. Critically, we consider the unsupervised learning regime where exceptional items are not labeled as such a priori, so the signal to either memorize or consolidate items must be generated by the network itself. We propose a number of metrics for this control signal and compare them for two different algorithms inspired by traditional imbalanced data learning approaches -- loss reweighting and importance sampling. Overall, our model serves not only as a framework for concurrent memorization and consolidation processes in biological systems, but also as a simple illustration of related phenomena in large-scale machine learning models, as well as a potential method for debiasing artificial intelligence algorithms.
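As a concrete illustration of the loss-reweighting idea mentioned above, the sketch below builds a classical Hopfield network with a per-pattern-weighted Hebbian rule: one "exceptional" pattern is stored alongside a cluster of common patterns, and upweighting its contribution lets it survive as its own attractor rather than being absorbed into the consolidated prototype. All names, sizes, and the weighting factor are illustrative assumptions, not the paper's actual model (which, in particular, generates the memorize-vs-consolidate signal from the network itself rather than from a hand-set weight).

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64  # number of binary (+/-1) neurons

# A cluster of "common" patterns: noisy copies of one prototype (10% flips),
# plus a single unrelated "exceptional" pattern.
prototype = rng.choice([-1, 1], size=N)
common = np.array([np.where(rng.random(N) < 0.1, -prototype, prototype)
                   for _ in range(20)])
exceptional = rng.choice([-1, 1], size=N)

def hopfield_weights(patterns, pattern_weights):
    """Hebbian outer-product rule with a per-pattern weight.

    Upweighting a pattern here plays the role of loss reweighting
    (the factor below is a hypothetical, hand-chosen value).
    """
    W = sum(w * np.outer(p, p) for p, w in zip(patterns, pattern_weights))
    np.fill_diagonal(W, 0)        # no self-connections
    return W / len(patterns)

patterns = np.vstack([common, exceptional])
weights = np.ones(len(patterns))
weights[-1] = 15.0                # upweight the exceptional item

W = hopfield_weights(patterns, weights)

def recall(W, x, steps=20):
    """Synchronous sign-update dynamics from cue x."""
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1
    return x

# Retrieval from a noisy cue of the exceptional pattern (5 bits flipped).
cue = exceptional.astype(float).copy()
cue[rng.choice(N, size=5, replace=False)] *= -1
out = recall(W, cue)
overlap = (out @ exceptional) / N  # 1.0 means perfect retrieval
```

With uniform weights, the single exceptional pattern competes against twenty mutually reinforcing copies of the prototype and tends to be pulled into the consolidated attractor; the reweighting restores it as a retrievable memory. The importance-sampling variant discussed in the abstract would instead present the exceptional item more often during learning rather than scaling its Hebbian contribution.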
