

Poster in Workshop: Heavy Tails in ML: Structure, Stability, Dynamics

Semi-Implicit Neural Ordinary Differential Equations for Learning Chaotic Systems

Hong Zhang · Ying Liu · Romit Maulik

Keywords: [ Neural ODEs ] [ Stability ] [ Chaotic Systems ]


Abstract:

Classical neural ordinary differential equations (ODEs) trained with explicit methods are intrinsically constrained by stability, which severely affects their efficiency and robustness in learning complex spatiotemporal dynamics, particularly those displaying chaotic behavior. In this work we propose a semi-implicit neural ODE approach that capitalizes on the partitionable structure of the underlying dynamics. In our method the neural ODE is partitioned into a linear part treated implicitly for enhanced stability and a nonlinear part treated explicitly. We apply this approach to learn chaotic trajectories of the Kuramoto--Sivashinsky equation. Our results demonstrate that our approach significantly outperforms existing approaches for coarse-resolution data and remains efficient for fine-resolution data where existing techniques become intractable.
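To illustrate the semi-implicit (implicit-explicit, IMEX) treatment described in the abstract, the following is a minimal sketch of a single IMEX Euler step for a partitioned ODE du/dt = L u + N(u), with the linear operator treated implicitly and the nonlinearity explicitly. The specific operator `L`, the stand-in `nonlinear_part` (taking the place of a learned neural network term), the time integrator, and all parameter values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Sketch of one semi-implicit (IMEX) Euler step for a partitioned ODE
#   du/dt = L u + N(u),
# where the stiff linear part L is treated implicitly and the nonlinear
# part N (a stand-in for a learned neural-network term) explicitly:
#   (I - h L) u_{n+1} = u_n + h N(u_n).
# All operators and parameters below are illustrative assumptions.

n = 64                              # number of spatial grid points
dx = 2 * np.pi / n                  # grid spacing on a periodic domain
h = 1e-2                            # time step

# Linear part: periodic second-difference (diffusion-like) operator.
L = (-2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
     + np.eye(n, k=n - 1) + np.eye(n, k=-(n - 1))) / dx**2

def nonlinear_part(u):
    """Stand-in for the learned nonlinear term (e.g., a neural network)."""
    return -0.5 * np.gradient(u * u, dx)   # Burgers-like convection term

def imex_euler_step(u, h, L):
    """Advance u one step: implicit in L, explicit in the nonlinearity."""
    rhs = u + h * nonlinear_part(u)
    return np.linalg.solve(np.eye(len(u)) - h * L, rhs)

# Usage: roll out a short trajectory from a smooth initial condition.
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
u = np.sin(x)
for _ in range(100):
    u = imex_euler_step(u, h, L)
print(u[:5])
```

Treating only the linear operator implicitly keeps each step cheap (a single linear solve) while removing the stiff stability restriction that a fully explicit integrator would impose.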
