We present a recurrent neuronal network, modeled as a continuous-time dynamical system, that can solve constraint satisfaction problems. Discrete variables are represented by coupled Winner-Take-All (WTA) networks, and their values are encoded in localized patterns of oscillations that are learned by the recurrent weights in these networks. Constraints over the variables are encoded in the network connectivity. Although there are no sources of noise, the network can escape from local optima in its search for solutions that satisfy all constraints by modifying the effective network connectivity through oscillations. If there is no solution that satisfies all constraints, the network state changes in a pseudo-random manner and its trajectory approximates a sampling procedure that selects a variable assignment with a probability that increases with the fraction of constraints satisfied by this assignment. External evidence, or input to the network, can force variables to specific values. When new inputs are applied, the network re-evaluates the entire set of variables in its search for the states that satisfy the maximum number of constraints, while being consistent with the external input. Our results demonstrate that the proposed network architecture can perform a deterministic search for the optimal solution to problems with non-convex cost functions. The network is inspired by canonical microcircuit models of the cortex and suggests possible dynamical mechanisms for solving constraint satisfaction problems that could be present in biological networks or implemented in neuromorphic electronic circuits.
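To make the general idea above concrete, the following is a minimal, hypothetical sketch, not the authors' actual model: one rate-based WTA network per discrete variable, not-equal constraints implemented as inhibitory coupling between WTAs, and a per-variable oscillation that modulates the effective constraint weights so that the deterministic dynamics can move away from locally optimal assignments. The constraint graph, parameter values, and update rule below are all illustrative assumptions.

```python
# Minimal sketch (not the paper's spiking model): coupled rate-based WTA
# networks solving a small 3-colouring CSP, with an oscillatory modulation of
# constraint weights standing in for oscillation-driven changes in effective
# connectivity. All parameters and the graph are illustrative assumptions.
import numpy as np

K = 3                                              # possible values (colours) per variable
N = 4                                              # number of discrete variables
edges = [(0, 1), (1, 2), (2, 0), (0, 3), (3, 2)]   # hypothetical not-equal constraints

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 0.1, size=(N, K))             # unit activations, one WTA per variable
phase = rng.uniform(0.0, 2 * np.pi, N)             # per-variable oscillation phase
dt, tau = 0.1, 1.0

def wta_drive(xi, self_exc=1.2, inh=1.0):
    """Recurrent drive inside one WTA: self-excitation minus shared inhibition."""
    r = np.maximum(xi, 0.0)                        # rectified rates
    return self_exc * r - inh * r.sum()

for t in range(4000):
    osc = 0.5 * (1.0 + np.sin(0.05 * t + phase))   # slow oscillation per variable
    dx = np.zeros_like(x)
    for i in range(N):
        drive = wta_drive(x[i]) + 0.2              # constant excitatory bias
        for (a, b) in edges:                       # not-equal constraints: a neighbour's
            if a == i:                             # active value inhibits the same value
                drive = drive - 2.0 * osc[i] * np.maximum(x[b], 0.0)
            elif b == i:                           # here, gated by the oscillation
                drive = drive - 2.0 * osc[i] * np.maximum(x[a], 0.0)
        dx[i] = (-x[i] + drive) / tau              # leaky continuous-time dynamics
    x += dt * dx                                   # Euler integration step

assignment = x.argmax(axis=1)                      # winning unit = value of each variable
violated = sum(int(assignment[a] == assignment[b]) for a, b in edges)
print("assignment:", assignment, "violated constraints:", violated)
```

Running the script prints the assignment read out from the winning unit of each WTA together with the number of violated constraints; in the paper's architecture the analogous roles are played by spiking neurons and learned oscillatory patterns rather than this simplified rate dynamics.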
Author Information
Hesham Mostafa (ETH Zurich)
Lorenz K Muller (ETH Zurich)
Giacomo Indiveri (ETH Zurich)
More from the Same Authors
- 2016 Demonstration: Nullhop: Flexibly efficient FPGA CNN accelerator driven by DAVIS neuromorphic vision sensor
  Alessandro Aimar · Enrico Calabrese · Hesham Mostafa · Antonio Rios-Navarro · Ricardo Tapiador · Iulia-Alexandra Lungu · Angel F. Jimenez-Fernandez · Federico Corradi · Shih-Chii Liu · Alejandro Linares-Barranco · Tobi Delbruck
- 2007 Oral: Learning to classify complex patterns using a VLSI network of spiking neurons
  Srinjoy Mitra · Giacomo Indiveri · Stefano Fusi
- 2007 Spotlight: Contraction Properties of VLSI Cooperative Competitive Neural Networks of Spiking Neurons
  Emre Neftci · Elisabetta Chicca · Giacomo Indiveri · Jean-Jacques Slotine · Rodney J Douglas
- 2007 Poster: Contraction Properties of VLSI Cooperative Competitive Neural Networks of Spiking Neurons
  Emre Neftci · Elisabetta Chicca · Giacomo Indiveri · Jean-Jacques Slotine · Rodney J Douglas
- 2007 Poster: Learning to classify complex patterns using a VLSI network of spiking neurons
  Srinjoy Mitra · Giacomo Indiveri · Stefano Fusi
- 2007 Demonstration: Contraction of VLSI Spiking Neurons
  Emre Neftci · Elisabetta Chicca · Giacomo Indiveri · Jean-Jacques Slotine · Rodney J Douglas
- 2006 Poster: Context dependent amplification of both rate and event-correlation in a VLSI network of spiking neurons
  Elisabetta Chicca · Giacomo Indiveri · Rodney J Douglas
- 2006 Spotlight: Context dependent amplification of both rate and event-correlation in a VLSI network of spiking neurons
  Elisabetta Chicca · Giacomo Indiveri · Rodney J Douglas
- 2006 Poster: A selective attention multi-chip system with dynamic synapses and spiking neurons
  Chiara Bartolozzi · Giacomo Indiveri