
STEER : Simple Temporal Regularization For Neural ODE
Arnab Ghosh · Harkirat Singh Behl · Emilien Dupont · Philip Torr · Vinay Namboodiri

Thu Dec 10 09:00 PM -- 11:00 PM (PST) @ Poster Session 6 #1741

Training Neural Ordinary Differential Equations (ODEs) is often computationally expensive. Indeed, computing the forward pass of such models involves solving an ODE which can become arbitrarily complex during training. Recent works have shown that regularizing the dynamics of the ODE can partially alleviate this. In this paper, we propose a new regularization technique: randomly sampling the end time of the ODE during training. The proposed regularization is simple to implement, has negligible overhead, and is effective across a wide variety of tasks. Further, the technique is orthogonal to several other methods proposed to regularize the dynamics of ODEs, and as such can be used in conjunction with them. We show through experiments on normalizing flows, time series models, and image recognition that the proposed regularization can significantly decrease training time and even improve performance over baseline models.
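The core idea of the regularization can be sketched in a few lines: instead of always integrating the ODE up to a fixed end time t1, each training step integrates up to an end time drawn uniformly from an interval around t1. The sketch below is a minimal illustration using a fixed-step Euler solver; the function names, the uniform-sampling interval half-width `b`, and the solver itself are illustrative assumptions, not the paper's actual implementation.

```python
import random


def steer_end_time(t1, b, rng=random):
    # STEER-style regularization (illustrative): sample the integration
    # end time uniformly from (t1 - b, t1 + b) at each training step.
    # `b` is a hypothetical half-width hyperparameter.
    return rng.uniform(t1 - b, t1 + b)


def euler_odeint(f, y0, t0, t1, steps=100):
    # Minimal fixed-step Euler solver for dy/dt = f(t, y),
    # standing in for the adaptive solver a Neural ODE would use.
    h = (t1 - t0) / steps
    y, t = y0, t0
    for _ in range(steps):
        y = y + h * f(t, y)
        t = t + h
    return y


# During training, each forward pass would integrate to a freshly
# sampled end time instead of the fixed t1:
t_end = steer_end_time(t1=1.0, b=0.5)
y_final = euler_odeint(lambda t, y: y, y0=1.0, t0=0.0, t1=t_end)
```

At evaluation time the fixed end time t1 would be used as usual; only training randomizes it.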

Author Information

Arnab Ghosh (University of Oxford)
Harkirat Singh Behl (University of Oxford)

I am a DPhil student at the University of Oxford. I am a member of the Torr Vision Group and the Optimization for Vision and Learning group. My supervisors are Prof. Philip Torr and Prof. Pawan Kumar. I completed my undergraduate degree at the Indian Institute of Technology (IIT) Kanpur in May 2018. I did a summer research internship at MSR Redmond in 2019 with Dr. Vibhav Vineet. My research interests lie in designing optimization algorithms for problems of practical interest in Computer Vision and Machine Learning.

Emilien Dupont (University of Oxford)
Philip Torr (University of Oxford)
Vinay Namboodiri (University of Bath)
