Poster
Improving Neural Ordinary Differential Equations with Nesterov's Accelerated Gradient Method
Ho Huu Nghia Nguyen · Tan Nguyen · Huyen Vo · Stanley Osher · Thieu Vo

Wed Nov 30 09:00 AM -- 11:00 AM (PST) @ Hall J #618
We propose the Nesterov neural ordinary differential equations (NesterovNODEs), whose layers solve the second-order ordinary differential equation (ODE) limit of Nesterov's accelerated gradient (NAG) method, and a generalization called GNesterovNODEs. Taking advantage of the $\mathcal{O}(1/k^{2})$ convergence rate of the NAG scheme, GNesterovNODEs speed up training and inference by reducing the number of function evaluations (NFEs) needed to solve the ODEs. We also prove that the adjoint state of a GNesterovNODE itself satisfies a GNesterovNODE, thus accelerating both the forward and backward ODE solvers and allowing the model to be scaled up to large-scale tasks. We empirically corroborate the advantage of GNesterovNODEs on a wide range of practical applications, including point cloud separation, image classification, and sequence modeling. Compared to NODEs, GNesterovNODEs require a significantly smaller number of NFEs while achieving better accuracy across our experiments.
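As background for the abstract above (not part of the paper page itself), the widely cited continuous-time limit of NAG is the second-order ODE $\ddot{x}(t) + \tfrac{3}{t}\dot{x}(t) + \nabla f(x(t)) = 0$ of Su, Boyd, and Candès. Below is a minimal sketch of how a NODE-style layer might parameterize a second-order dynamic of this kind with a learned vector field, rewritten as a first-order system so a standard solver can integrate it. The network `f_theta`, the shifted damping term, the zero initial velocity, and the solver tolerances are illustrative assumptions, not the paper's actual GNesterovNODE formulation; `torchdiffeq` is used only as a convenient off-the-shelf ODE solver.

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint  # pip install torchdiffeq


class SecondOrderODEFunc(nn.Module):
    """Rewrites h''(t) + damping(t) * h'(t) = f_theta(h, t) as a first-order
    system in the augmented state (h, v), where v = h'."""

    def __init__(self, dim, hidden=64):
        super().__init__()
        # Learned vector field playing the role of the (negative) gradient in NAG.
        self.f_theta = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )

    def forward(self, t, state):
        h, v = state
        t_feat = t * torch.ones_like(h[:, :1])   # broadcast scalar time to the batch
        damping = 3.0 / (t + 1.0)                # shifted to avoid the 1/t singularity at t=0 (assumption)
        dh = v
        dv = self.f_theta(torch.cat([h, t_feat], dim=-1)) - damping * v
        return dh, dv


# Usage sketch: one "layer" integrates the augmented state from t=0 to t=1.
dim = 8
func = SecondOrderODEFunc(dim)
h0 = torch.randn(16, dim)                        # input features
v0 = torch.zeros_like(h0)                        # zero initial velocity (assumption)
t_span = torch.linspace(0.0, 1.0, 2)
h_traj, v_traj = odeint(func, (h0, v0), t_span, rtol=1e-3, atol=1e-4)
output = h_traj[-1]                              # layer output at the final time
```

The number of steps the adaptive solver takes (and hence the NFE count the abstract refers to) depends on how stiff the learned dynamics are; the paper's claim is that the Nesterov-style formulation keeps this count low.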

Author Information

Ho Huu Nghia Nguyen (FPT Software Company Limited, FPT Cau Giay Building, Duy Tan Street, Cau Giay District, Ha Noi City)

I recently finished my undergraduate degree in Computer Science at Ho Chi Minh City University of Science, one of the top two universities for Computer Science in Vietnam. I am currently an AI Research Resident in the FPT Software AI Residency Program, hosted by the FPT Software AI Center. My interests are in Neural ODEs, equivariant models, and generalization. I am looking for a PhD position.

Tan Nguyen (University of California, Los Angeles)
Huyen Vo (Hanoi University of Science and Technology)
Stanley Osher (UCLA)
Thieu Vo (Johannes Kepler University Linz)