Framing RNN as a kernel method: A neural ODE approach
Adeline Fermanian · Pierre Marion · Jean-Philippe Vert · Gérard Biau

Tue Dec 07 12:00 AM -- 12:15 AM (PST)

Building on the interpretation of a recurrent neural network (RNN) as a continuous-time neural differential equation, we show, under appropriate conditions, that the solution of an RNN can be viewed as a linear function of a specific feature set of the input sequence, known as the signature. This connection allows us to frame an RNN as a kernel method in a suitable reproducing kernel Hilbert space. As a consequence, we obtain theoretical guarantees on generalization and stability for a large class of recurrent networks. Our results are illustrated on simulated datasets.
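The linear-in-signature view can be sketched numerically. Below is a minimal illustration, not the paper's construction: it computes the depth-2 truncated signature of a sampled path (treated as piecewise linear) and applies a linear readout to the resulting features; the toy path and placeholder weights are assumptions for demonstration only.

```python
import numpy as np

def signature_depth2(path):
    """Depth-2 truncated signature of a sampled path.

    path: (T, d) array of points; the path is interpolated piecewise linearly.
    Returns (S1, S2): level-1 increments, shape (d,), and level-2 iterated
    integrals, shape (d, d).
    """
    dX = np.diff(path, axis=0)            # segment increments, shape (T-1, d)
    S1 = dX.sum(axis=0)                   # level 1: total increment of the path
    prefix = np.cumsum(dX, axis=0) - dX   # increment accumulated before each segment
    # Level 2 for a piecewise-linear path: cross terms plus half the
    # within-segment outer products.
    S2 = prefix.T @ dX + 0.5 * dX.T @ dX
    return S1, S2

# Toy usage: a linear readout on signature features, mirroring the
# "linear function of the signature" viewpoint. Weights are placeholders.
t = np.linspace(0.0, 1.0, 50)
path = np.stack([t, np.sin(2 * np.pi * t)], axis=1)  # a 2-d input sequence
S1, S2 = signature_depth2(path)
features = np.concatenate([S1, S2.ravel()])          # flattened feature set
w = np.ones_like(features)                           # hypothetical linear weights
output = features @ w
```

A quick sanity check on such an implementation is Chen's shuffle identity at depth 2, `S2 + S2.T == outer(S1, S1)`, which holds exactly for piecewise-linear paths.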

Author Information

Adeline Fermanian (Mines ParisTech)
Pierre Marion (Sorbonne Université)
Jean-Philippe Vert (Google)
Gérard Biau (University Paris VI)