Unfolding recurrence by Green’s functions for optimized reservoir computing
Sandra Nestler · Christian Keup · David Dahmen · Matthieu Gilson · Holger Rauhut · Moritz Helias

Wed Dec 09 09:00 AM -- 11:00 AM (PST) @ Poster Session 3 #990

Cortical networks are strongly recurrent, and neurons have intrinsic temporal dynamics. This sets them apart from deep feed-forward networks. Despite the tremendous progress in the application of deep feed-forward networks and in their theoretical understanding, it remains unclear how the interplay of recurrence and non-linearities in recurrent cortical networks contributes to their function. The purpose of this work is to present a solvable recurrent network model that links to feed-forward networks. Using perturbative methods, we transform the time-continuous, recurrent dynamics into an effective feed-forward structure of linear and non-linear temporal kernels. The resulting analytical expressions allow us to build optimal time-series classifiers from random reservoir networks. Firstly, this allows us to optimize not only the readout vectors but also the input projection, demonstrating a strong potential performance gain. Secondly, the analysis exposes how the second-order stimulus statistics are a crucial element that interacts with the non-linearity of the dynamics and boosts performance.
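The reservoir-computing setting described above can be illustrated with a minimal sketch: a random recurrent network driven by time series from two stimulus classes that differ in their temporal (second-order) statistics, with a linear readout trained on the final reservoir state. All model details here (leaky-rate dynamics, `tanh` non-linearity, ridge-regression readout, the toy stimulus classes) are illustrative assumptions, not the paper's exact model or derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical leaky-rate reservoir:  tau * dx/dt = -x + W @ phi(x) + V * s(t)
N, T, dt, tau = 100, 200, 0.1, 1.0
W = rng.normal(0.0, 0.9 / np.sqrt(N), (N, N))  # random recurrent weights
V = rng.normal(0.0, 1.0, N)                    # random input projection

def run(signal):
    """Integrate the reservoir with Euler steps; return the final state."""
    x = np.zeros(N)
    for s in signal:
        x += dt / tau * (-x + W @ np.tanh(x) + V * s)
    return x

def sample(cls, n):
    """Toy stimuli: class 0 is white noise, class 1 is temporally correlated."""
    base = rng.normal(0.0, 1.0, (n, T))
    if cls == 1:
        base = np.cumsum(base, axis=1) / np.sqrt(np.arange(1, T + 1))
    return base

# Collect final reservoir states for 50 trials per class
X = np.vstack([run(s) for c in (0, 1) for s in sample(c, 50)])
y = np.repeat([0, 1], 50)

# Linear readout trained by ridge regression on +-1 targets
lam = 1e-2
w = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ (2 * y - 1))
acc = np.mean((X @ w > 0) == (y == 1))
```

In this sketch only the readout `w` is trained; the paper's point is that the analytical kernel expansion additionally allows optimizing the input projection `V`, which is held random here.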

Author Information

Sandra Nestler (Jülich Research Centre)
Christian Keup (Jülich Research Centre)
David Dahmen (Jülich Research Centre)
Matthieu Gilson (Jülich Research Centre)
Holger Rauhut (RWTH Aachen University)
Moritz Helias (Jülich Research Centre)