Workshop: The Symbiosis of Deep Learning and Differential Equations

Expressive Power of Randomized Signature

Lukas Gonon · Josef Teichmann


We consider the question of whether the time evolution of controlled differential equations on general state spaces can be approximated arbitrarily well by (regularized) regressions on features that are themselves generated by randomly chosen dynamical systems of moderately high dimension. This is motivated on the one hand by paradigms of reservoir computing, and on the other hand by ideas from rough path theory and compressed sensing. Appropriately interpreted, this yields provable approximation and generalization results for generic dynamical systems, which are usually approximated by recurrent or LSTM networks, via regressions on the states of random, otherwise untrained dynamical systems. The results have important implications for transfer learning and for the energy efficiency of training. To prove the approximation results we apply methods from rough path theory, convenient analysis, non-commutative algebra and the Johnson-Lindenstrauss Lemma.
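To make the setup concrete, the following is a minimal illustrative sketch (not the authors' construction) of the randomized-signature/reservoir idea: a moderately high-dimensional random dynamical system is driven by a control path, its states are left untrained, and only a regularized linear readout is fitted. All dimensions, constants, and the choice of tanh as activation are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy control path u and a target y solving a simple controlled ODE
# dy = (-y + u) dt, discretized with an Euler scheme (illustrative choice).
T, dt = 200, 0.05
u = np.cumsum(rng.normal(scale=np.sqrt(dt), size=T))  # driving path

y = np.zeros(T)
for t in range(1, T):
    y[t] = y[t - 1] + (-y[t - 1] + u[t - 1]) * dt

# Randomized signature / reservoir: evolve a random, untrained system
#   dR = tanh(A0 R + b0) dt + tanh(A1 R + b1) du
# with fixed randomly drawn matrices A0, A1 and biases b0, b1.
k = 64  # reservoir dimension ("moderately high"), an illustrative value
A0 = rng.normal(size=(k, k)) / np.sqrt(k)
A1 = rng.normal(size=(k, k)) / np.sqrt(k)
b0, b1 = rng.normal(size=k), rng.normal(size=k)

R = np.zeros((T, k))
R[0] = rng.normal(size=k)
for t in range(1, T):
    du = u[t] - u[t - 1]
    R[t] = (R[t - 1]
            + np.tanh(A0 @ R[t - 1] + b0) * dt
            + np.tanh(A1 @ R[t - 1] + b1) * du)

# Only this step is trained: ridge (regularized) regression of the
# target trajectory on the reservoir states.
lam = 1e-6
W = np.linalg.solve(R.T @ R + lam * np.eye(k), R.T @ y)
y_hat = R @ W
mse = np.mean((y - y_hat) ** 2)
print(f"train MSE: {mse:.2e}")
```

The point of the sketch is that the dynamical system generating the features is never optimized; all trainable parameters sit in the final linear readout, which is what makes training cheap compared to fitting a recurrent or LSTM network end to end.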
