Poster
Non-normal Recurrent Neural Network (nnRNN): learning long time dependencies while improving expressivity with transient dynamics
Giancarlo Kerg · Kyle Goyette · Maximilian Puelma Touzel · Gauthier Gidel · Eugene Vorontsov · Yoshua Bengio · Guillaume Lajoie

Tue Dec 10 10:45 AM -- 12:45 PM (PST) @ East Exhibition Hall B + C #155

A recent strategy to circumvent the exploding and vanishing gradient problem in RNNs, and to allow the stable propagation of signals over long time scales, is to constrain recurrent connectivity matrices to be orthogonal or unitary. This ensures eigenvalues with unit norm and thus stable dynamics and training. However, this comes at the cost of reduced expressivity due to the limited variety of orthogonal transformations. We propose a novel connectivity structure based on the Schur decomposition and a splitting of the Schur form into normal and non-normal parts. This allows us to parametrize matrices with unit-norm eigenspectra without imposing orthogonality constraints on the eigenbasis. The resulting architecture ensures access to a larger space of spectrally constrained matrices, of which orthogonal matrices are a subset. This crucial difference retains the stability advantages and training speed of orthogonal RNNs while enhancing expressivity, especially on tasks that require computations over ongoing input sequences.
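As a rough illustration of the idea (a minimal NumPy sketch, not the authors' implementation; the variable names Q, D, T, W are illustrative), one can build a matrix with a unit-norm spectrum from a Schur-like form: an orthogonal basis Q, a block-diagonal part D of 2x2 rotations carrying the unit-norm eigenvalues, and an upper-triangular part T supplying the non-normal, transient dynamics. The result W = Q (D + T) Q^T keeps all eigenvalue magnitudes at 1 without itself being orthogonal.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6  # hidden dimension (even, so it splits into 2x2 rotation blocks)

# Orthogonal change of basis Q: a free eigenbasis, not otherwise constrained.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Normal part D: block-diagonal 2x2 rotations with eigenvalues
# exp(+/- i*theta), all of norm exactly 1.
D = np.zeros((n, n))
for k in range(0, n, 2):
    theta = rng.uniform(0.0, 2.0 * np.pi)
    c, s = np.cos(theta), np.sin(theta)
    D[k:k + 2, k:k + 2] = [[c, -s], [s, c]]

# Non-normal part T: entries strictly above the 2x2 blocks (keeping only the
# second superdiagonal and up is a conservative way to stay above them).
T = np.triu(rng.standard_normal((n, n)), k=2)

# D + T is block upper triangular, so its spectrum is that of the rotation
# blocks in D; conjugating by orthogonal Q preserves the spectrum.
W = Q @ (D + T) @ Q.T

print(np.abs(np.linalg.eigvals(W)))     # all entries ~ 1.0
print(np.allclose(W @ W.T, np.eye(n)))  # False: W is not orthogonal
```

Setting T to zero recovers a normal (here orthogonal) matrix, which is how orthogonal RNNs arise as a subset of this parametrization.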

Author Information

Giancarlo Kerg (Mila)
Kyle Goyette (University of Montreal)
Maximilian Puelma Touzel (Mila)
Gauthier Gidel (Mila)

I am a Ph.D. student at Mila, supervised by Simon Lacoste-Julien. I graduated from ENS Ulm and Université Paris-Saclay, and I was a visiting Ph.D. student at Sierra. I also worked for six months as a freelance data scientist for Monsieur Drive (acquired by Criteo), and I recently co-founded a startup called Krypto. My work focuses on optimization applied to machine learning: I develop new optimization algorithms and study the role of optimization in the learning procedure, in short, how to learn faster and better. More details can be found in my resume. I identify with the fields of machine learning (NIPS, ICML, AISTATS, and ICLR) and optimization (SIAM OP).

Eugene Vorontsov (Polytechnique Montreal)
Yoshua Bengio (Mila)

Yoshua Bengio is a Full Professor in the Department of Computer Science and Operations Research at Université de Montréal, scientific director and founder of Mila and of IVADO, recipient of the 2018 Turing Award, Canada Research Chair in Statistical Learning Algorithms, and a Canada CIFAR AI Chair. He pioneered deep learning and, in 2018, received the most citations per day among all computer scientists worldwide. He is an Officer of the Order of Canada and a member of the Royal Society of Canada; he was awarded the Killam Prize, the Marie-Victorin Prize, and Radio-Canada's Scientist of the Year in 2017. He is a member of the NeurIPS advisory board, a co-founder of the ICLR conference, and program director of the CIFAR program on Learning in Machines and Brains. His goal is to contribute to uncovering the principles that give rise to intelligence through learning, and to favour the development of AI for the benefit of all.

Guillaume Lajoie (Université de Montréal / Mila)
