

Spotlight in Workshop: All Things Attention: Bridging Different Perspectives on Attention

Fine-tuning hierarchical circuits through learned stochastic co-modulation

Caroline Haimerl · Eero Simoncelli · Cristina Savin

Keywords: [ biological vision ] [ neural covariability ] [ gain modulation ] [ hierarchical coding ]


Abstract:

Attentional gating is a core mechanism supporting behavioral flexibility, but its biological implementation remains uncertain. Gain modulation of neural responses is likely to play a key role, but simply boosting relevant neural responses can be insufficient for improving behavioral outputs, especially in hierarchical circuits. Here we propose a variation of attentional gating that relies on stochastic gain modulation as a dedicated indicator of task relevance. We show that targeted stochastic modulation can be effectively learned and used to fine-tune hierarchical architectures, without reorganization of the underlying circuits. Simulations of such networks demonstrate improvements in learning efficiency and performance in novel tasks, relative to traditional attentional mechanisms based on deterministic gain increases. The effectiveness of this approach relies on the availability of representational bottlenecks in which the task-relevant information is localized in small subpopulations of neurons. Overall, this work provides a new mechanism for constructing intelligent systems that can flexibly and robustly adapt to changes in task structure.
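The core mechanism can be illustrated with a small simulation. The sketch below is a minimal NumPy toy, not the authors' implementation; the network sizes, modulator distribution, and variable names (`W`, `relevant`, `m`, etc.) are all illustrative assumptions. It applies a shared stochastic gain to a task-relevant subpopulation of a fixed encoding stage, then shows how a downstream stage that observes the modulator can recover which units were tagged, without any rewiring of the feedforward weights.

```python
import numpy as np

# Illustrative sketch of stochastic gain co-modulation as a task-relevance
# tag; all sizes and parameters are assumptions, not values from the paper.
rng = np.random.default_rng(0)
n_in, n_hidden, n_trials = 4, 50, 2000

# Fixed feedforward weights of a pretrained encoding stage (no rewiring).
W = rng.normal(size=(n_hidden, n_in))

# Task-relevant information: input feature 0. Only a small subpopulation
# of hidden units carries it strongly (a representational bottleneck).
relevant = np.argsort(-np.abs(W[:, 0]))[:5]

# Stimuli and rectified (nonnegative) hidden firing rates.
x = rng.normal(size=(n_trials, n_in))
rates = np.maximum(x @ W.T + 1.0, 0.0)

# Shared stochastic modulator: multiplicative gain fluctuations (mean ~1)
# applied only to the task-relevant subpopulation, tagging it.
m = rng.gamma(shape=4.0, scale=0.25, size=n_trials)
gain = np.ones((n_trials, n_hidden))
gain[:, relevant] *= m[:, None]
r = gain * rates + 0.5 * rng.normal(size=(n_trials, n_hidden))

# A downstream stage with access to the modulator can identify the tagged
# units from their trial-to-trial co-modulation with m, and then focus
# readout learning on that subpopulation.
corr = np.array([np.corrcoef(m, r[:, j])[0, 1] for j in range(n_hidden)])
recovered = np.argsort(-np.abs(corr))[:5]
print("task-relevant units:", sorted(relevant.tolist()))
print("recovered via co-modulation:", sorted(recovered.tolist()))
```

In this toy setting the stochastic label is what makes the tag detectable: a fixed (deterministic) gain boost would scale mean responses but leave no trial-to-trial signature for a downstream observer to correlate against, which is one way to read the abstract's contrast with deterministic attentional gain increases.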
