Poster

On Learning Physical Neural Representations of Dynamical Data

Jeongjin Park · Nicole Yang · Nisha Chandramoorthy

East Exhibit Hall A-C #2300
Fri 13 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Conventional notions of generalization often fail to describe the ability of learned models to capture meaningful information from dynamical data. A neural network that learns complex dynamics with a small test error may still fail to reproduce the system's physical behavior, including its statistical moments and Lyapunov exponents. To address this gap, we propose an ergodic-theoretic approach to the generalization of complex dynamical models learned from time-series data. Our main contribution is to define and analyze generalization for a broad suite of neural representations of classes of ergodic systems, including chaotic systems, in a way that captures whether the model emulates the underlying invariant (physical) measure. Our results provide theoretical justification for why regression methods for generators of dynamical systems (Neural ODEs) fail to generalize, and why their statistical accuracy improves when Jacobian information is added during training. We verify our results on a number of ergodic chaotic systems and neural network parameterizations, including MLPs, ResNets, Fourier Neural layers, and RNNs.
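The quantities the abstract refers to, such as Lyapunov exponents, are ergodic (long-time) averages along trajectories, which is why pointwise test error alone need not control them. As a minimal illustration not taken from the paper, the sketch below estimates the largest Lyapunov exponent of the logistic map as a time average of log |f'(x)|; for parameter r = 4 the exact value is ln 2, so the estimate can be checked against it. All names and parameter choices here are illustrative.

```python
import numpy as np

def lyapunov_logistic(r=4.0, x0=0.2, n_burn=1000, n_steps=100_000):
    """Estimate the Lyapunov exponent of x -> r*x*(1-x) as an
    ergodic time average of log|f'(x)| along one trajectory."""
    x = x0
    for _ in range(n_burn):              # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n_steps):
        acc += np.log(abs(r * (1.0 - 2.0 * x)))  # log |f'(x)|
        x = r * x * (1.0 - x)
    return acc / n_steps

lam = lyapunov_logistic()
print(lam)  # should be close to ln 2 ≈ 0.6931 for r = 4
```

A learned surrogate of the map would be evaluated the same way: iterate the surrogate, average the log of its Jacobian norm, and compare against the true exponent — a statistic that a small one-step test error does not by itself guarantee.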
