

Poster in NeurIPS 2023 Workshop: Machine Learning and the Physical Sciences

Information bottleneck learns dominant transfer operator eigenfunctions in dynamical systems

Matthew S. Schmitt · Maciej Koch-Janusz · Michel Fruchart · Daniel Seara · Vincenzo Vitelli


Abstract:

A common task across the physical sciences is model reduction: given a high-dimensional and complex description of a full system, how does one reduce it to a small number of important collective variables? Here we investigate model reduction for dynamical systems using the information bottleneck framework. We show that the optimal compression of a system's state is achieved by encoding the spectral properties of its transfer operator. After demonstrating this in analytically tractable examples, we show that our findings also hold in variational compression schemes applied to experimental fluid-flow data. These results shed light on the latent variables learned by certain neural network architectures and demonstrate the practical utility of information-based loss functions.
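To make the central object concrete: for a finite-state Markov model, the transfer operator reduces to the transition matrix, and its dominant eigenfunctions are simply the leading eigenvectors, with eigenvalues near 1 corresponding to slow collective modes. The sketch below is a minimal illustration of this idea, not the authors' implementation; the toy transition matrix and function names are assumptions for the example.

```python
# Minimal sketch (not the authors' code): dominant eigenfunctions of a
# finite-dimensional stand-in for the transfer operator. For a row-stochastic
# transition matrix P, the leading eigenvalue is 1 and the next eigenvalues,
# ordered by magnitude, encode the system's slowest relaxing modes.
import numpy as np

def dominant_eigenfunctions(P: np.ndarray, k: int = 2):
    """Return the k leading eigenvalues and right eigenvectors of P,
    sorted by eigenvalue magnitude."""
    eigvals, eigvecs = np.linalg.eig(P)
    order = np.argsort(-np.abs(eigvals))[:k]  # largest |eigenvalue| first
    return eigvals[order], eigvecs[:, order]

# Toy example (hypothetical): a 3-state chain with slow mixing between states,
# so the second eigenfunction captures the dominant collective variable.
P = np.array([[0.95, 0.04, 0.01],
              [0.03, 0.95, 0.02],
              [0.01, 0.04, 0.95]])
vals, funcs = dominant_eigenfunctions(P, k=2)
print(vals)   # first eigenvalue is 1; the second is close to 1 (slow mode)
print(funcs)  # columns are the corresponding eigenfunctions over states
```

In this picture, an optimal bottleneck encoding of the state would retain the information carried by these leading eigenfunctions while discarding fast-decaying modes.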
