

Poster in Workshop: Information-Theoretic Principles in Cognitive Systems (InfoCog)

Information theoretic study of the neural geometry induced by category learning

Laurent BONNASSE-GAHOT · Jean-Pierre Nadal

[ Project Page ]
Fri 15 Dec 12:40 p.m. PST — 1:30 p.m. PST
 
presentation: Information-Theoretic Principles in Cognitive Systems (InfoCog)
Fri 15 Dec 6:15 a.m. PST — 3:30 p.m. PST

Abstract:

Categorization is an important topic for both biological and artificial neural networks. Here, we take an information-theoretic approach to assess the efficiency of the representations induced by category learning. We show that the relevant Bayesian cost can be decomposed into two components, one for the coding part and one for the decoding part. Minimizing this cost implies maximizing the mutual information between the set of categories and the neural activities. We show analytically that this mutual information can be written as the sum of two terms, which can be interpreted as (i) finding an appropriate representation space and (ii) building a representation with the appropriate metric on this space, based on the neural Fisher information. One main consequence is that category learning induces an expansion of neural space near decision boundaries. Finally, we provide numerical illustrations showing how the Fisher information of the coding neural population aligns with the boundaries between categories.
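As a rough intuition for the last point (the alignment of Fisher information with category boundaries), here is a minimal numerical sketch, not taken from the paper: it assumes an independent Poisson population with Gaussian tuning curves on a one-dimensional stimulus, and caricatures category learning as a reallocation of tuning-curve centers toward a boundary at x = 0. All names and parameter values are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only (not the paper's code): Fisher information of a
# Poisson population with Gaussian tuning curves over a 1D stimulus, before
# and after a caricature of category learning that reallocates tuning curves
# towards the category boundary at x = 0.

x = np.linspace(-3.0, 3.0, 601)   # stimulus axis
width, gain = 0.5, 10.0           # assumed tuning width and peak firing rate

def population_fisher(centers, x, width, gain):
    """I_F(x) = sum_i f_i'(x)^2 / f_i(x) for independent Poisson neurons with
    Gaussian tuning curves f_i(x) = gain * exp(-(x - c_i)^2 / (2 * width^2))."""
    f = gain * np.exp(-(x[None, :] - centers[:, None]) ** 2 / (2 * width ** 2))
    df = f * (centers[:, None] - x[None, :]) / width ** 2
    return np.sum(df ** 2 / (f + 1e-12), axis=0)

u = np.linspace(-3.0, 3.0, 20)
centers_before = u                          # uniform coverage of the stimulus range
centers_after = np.sign(u) * u ** 2 / 3.0   # same range, but packed near x = 0

F_before = population_fisher(centers_before, x, width, gain)
F_after = population_fisher(centers_after, x, width, gain)

# After the reallocation, Fisher information is enhanced in a band around the
# category boundary and reduced far from it: the neural metric is "expanded"
# where the categories must be discriminated.
near, far = np.abs(x) < 1.0, np.abs(x) > 2.0
print("mean Fisher info near boundary: before %.0f, after %.0f"
      % (F_before[near].mean(), F_after[near].mean()))
print("mean Fisher info far from boundary: before %.0f, after %.0f"
      % (F_before[far].mean(), F_after[far].mean()))
```

Under these assumptions, running the sketch shows the population Fisher information increasing near the boundary and decreasing far from it after the reallocation, which is the qualitative signature of the boundary expansion described in the abstract.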
