

Poster

Kernel-Based Approaches for Sequence Modeling: Connections to Neural Methods

Kevin J Liang · Guoyin Wang · Yitong Li · Ricardo Henao · Lawrence Carin

East Exhibition Hall B + C #153

Keywords: [ Recurrent Networks ] [ Deep Learning ] [ Neuroscience and cognitive science ]


Abstract: We investigate time-dependent data analysis from the perspective of recurrent kernel machines, from which models with hidden units and gated memory cells arise naturally. By considering dynamic gating of the memory cell, a model closely related to the long short-term memory (LSTM) recurrent neural network is derived. Extending this setup to $n$-gram filters, the convolutional neural network (CNN), Gated CNN, and recurrent additive network (RAN) are also recovered as special cases. Our analysis provides a new perspective on the LSTM, while also extending it to $n$-gram convolutional filters. Experiments are performed on natural language processing tasks and on the analysis of local field potentials (neuroscience). We demonstrate that the variants we derive from kernels perform on par with, or better than, traditional neural methods. For the neuroscience application, the new models demonstrate significant improvements relative to the prior state of the art.
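The derivation itself is not reproduced on this page, but as a rough illustration of the gated memory-cell recursions the abstract refers to, below is a minimal NumPy sketch of a RAN-style additive cell, one of the special cases mentioned. All names, shapes, and the exact gate parameterization are illustrative assumptions rather than the paper's notation; the key structure is that the cell accumulates a gated weighted sum of linearly transformed inputs, $c_t = i_t \odot (W_c x_t) + f_t \odot c_{t-1}$.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_params(d_in, d_hid):
    # Hypothetical initialization; only the shapes matter for the sketch.
    s = 1.0 / np.sqrt(d_in)
    return {
        "W_c": rng.uniform(-s, s, (d_hid, d_in)),   # content projection
        "W_i": rng.uniform(-s, s, (d_hid, d_in)),   # input gate, input part
        "U_i": rng.uniform(-s, s, (d_hid, d_hid)),  # input gate, recurrent part
        "b_i": np.zeros(d_hid),
        "W_f": rng.uniform(-s, s, (d_hid, d_in)),   # forget gate, input part
        "U_f": rng.uniform(-s, s, (d_hid, d_hid)),  # forget gate, recurrent part
        "b_f": np.zeros(d_hid),
    }

def ran_step(x_t, h_prev, c_prev, p):
    # The content term is purely linear in the current input; the only
    # nonlinearities are the two gates and the output squashing.
    c_tilde = p["W_c"] @ x_t
    i_t = sigmoid(p["W_i"] @ x_t + p["U_i"] @ h_prev + p["b_i"])  # input gate
    f_t = sigmoid(p["W_f"] @ x_t + p["U_f"] @ h_prev + p["b_f"])  # forget gate
    c_t = i_t * c_tilde + f_t * c_prev  # additive (weighted-sum) memory update
    h_t = np.tanh(c_t)
    return h_t, c_t

# Run the cell over a toy sequence.
d_in, d_hid, T = 4, 8, 5
p = init_params(d_in, d_hid)
h, c = np.zeros(d_hid), np.zeros(d_hid)
for x_t in rng.normal(size=(T, d_in)):
    h, c = ran_step(x_t, h, c, p)
print(h.shape)  # (8,)

Because the content term $W_c x_t$ is linear in the input, unrolling the recursion expresses $c_t$ as a weighted sum of past inputs, which is the kind of structure a recurrent kernel perspective can expose; an LSTM differs mainly in using a nonlinear candidate that also depends on $h_{t-1}$ and in adding an output gate.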
