Poster in Workshop: AI for Science: Mind the Gaps

Multiple Sequential Learning Tasks Represented in Recurrent Neural Networks

Shaonan Wang


Abstract:

Our brain can flexibly perform a variety of sequential learning tasks, including music, language, and mathematics, but the underlying mechanism has not been elucidated by traditional experimental and modeling studies, which are designed around a single task at a time. From a computational perspective, we hypothesize that the working mechanism of a multitask model can suggest how the brain achieves this flexibility. We therefore trained a single recurrent neural network to perform 8 sequential learning tasks that depend on working memory, structure extraction, categorization, and other cognitive processes. After training, the model learns sophisticated strategies for holding and erasing information in order to perform multiple tasks simultaneously. More interestingly, the model learns to reuse neurons to encode similar task features. We hope this work provides a computational platform for investigating the neural representations underlying the ability to learn sequential cognitive tasks.
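The setup described above, one recurrent network trained jointly on several sequence tasks, can be sketched roughly as follows. This is a minimal illustration under assumptions, not the authors' implementation: the stimulus dimensions, network size, training objective, and the choice of a one-hot task cue concatenated to the input at each time step are placeholders for the unspecified details of the paper.

```python
# Minimal sketch (not the authors' code): a single GRU shared across several
# sequential tasks, with the task identity appended to every input step as a
# one-hot cue. Task definitions, sizes, and the objective are assumptions.
import torch
import torch.nn as nn

NUM_TASKS = 8        # the abstract reports 8 sequential learning tasks
INPUT_DIM = 4        # assumed per-step stimulus dimensionality
HIDDEN_DIM = 128     # assumed recurrent state size
OUTPUT_DIM = 4       # assumed per-step response dimensionality
SEQ_LEN = 20

class MultitaskRNN(nn.Module):
    """One recurrent network whose input is [stimulus ; one-hot task cue]."""
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(INPUT_DIM + NUM_TASKS, HIDDEN_DIM, batch_first=True)
        self.readout = nn.Linear(HIDDEN_DIM, OUTPUT_DIM)

    def forward(self, stimulus, task_id):
        # stimulus: (batch, seq_len, INPUT_DIM); task_id: (batch,)
        cue = torch.nn.functional.one_hot(task_id, NUM_TASKS).float()
        cue = cue.unsqueeze(1).expand(-1, stimulus.size(1), -1)
        states, _ = self.rnn(torch.cat([stimulus, cue], dim=-1))
        # The hidden states can later be probed for neuron reuse across tasks.
        return self.readout(states), states

model = MultitaskRNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Interleaved training on randomly sampled tasks; a toy target stands in for
# the actual working-memory / categorization tasks, which are not given here.
for step in range(200):
    task_id = torch.randint(0, NUM_TASKS, (32,))
    stimulus = torch.randn(32, SEQ_LEN, INPUT_DIM)
    target = torch.roll(stimulus, shifts=1, dims=1)   # placeholder objective
    output, _ = model(stimulus, task_id)
    loss = loss_fn(output, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In a sketch like this, analyses of shared representations would inspect the returned hidden states across tasks, for example by comparing per-unit task variance, which is one common way to quantify whether the same neurons are reused for similar task features.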