

Poster in Workshop: The Symbiosis of Deep Learning and Differential Equations -- III

Does In-Context Operator Learning Generalize to Domain-Shifted Settings?

Jerry Liu · N. Benjamin Erichson · Kush Bhatia · Michael Mahoney · Christopher Ré

Keywords: [ Operator Learning ] [ in-context learning ] [ differential equation ] [ Meta-Learning ] [ generalization ]


Abstract:

Neural network-based approaches for learning differential equations (DEs) have demonstrated generalization capabilities within a DE solution or operator instance. However, because standard techniques can only represent the solution function or operator for a single system at a time, the broader notion of generalization across classes of DEs has so far gone unexplored. In this work, we investigate whether commonalities across DE classes can be leveraged to transfer knowledge about solving one DE towards solving another, without updating any model parameters. To this end, we leverage the recently proposed in-context operator learning (ICOL) framework, which trains a model to identify in-distribution operators given a small number of input-output pairs as examples. Our implementation is motivated by pseudospectral methods, a class of numerical solvers that can be systematically applied to a range of DEs. For a natural distribution of 1D linear ordinary differential equations (ODEs), we identify a connection between operator learning and in-context linear regression. Combined with recent results demonstrating that Transformers can learn linear functions in-context, our reduction to least squares helps explain why Transformers can be expected to solve ODEs in-context. Empirically, we demonstrate that ICOL is robust to a range of distribution shifts, including observational noise, domain-shifted inputs, varying boundary conditions, and surprisingly, even operators from functional forms unseen during training.
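As a concrete illustration of the reduction the abstract describes, the following is a minimal sketch, not the authors' implementation: assuming a family of 1D linear ODEs u'' + a u' + b u = f with zero Dirichlet boundary conditions and a Chebyshev collocation (pseudospectral) discretization, identifying the unknown operator from a handful of in-context (f, u) example pairs becomes an ordinary least-squares problem, after which the recovered operator can be applied to a new query forcing function. The specific ODE family, coefficient values, and grid size are illustrative assumptions, not taken from the paper.

```python
# Sketch: pseudospectral discretization turns in-context operator identification
# into linear regression. The ODE family and coefficients below are assumptions.
import numpy as np

def cheb(N):
    """Chebyshev points and differentiation matrix on [-1, 1] (Trefethen's construction)."""
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    D = np.outer(c, 1.0 / c) / (X - X.T + np.eye(N + 1))   # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                             # diagonal entries
    return D, x

N = 32
D, x = cheb(N)
D2 = D @ D
inner = slice(1, N)                      # interior points (zero Dirichlet boundary conditions)

def solve(a, b, f_vals):
    """Solve u'' + a u' + b u = f with u(-1) = u(1) = 0 by collocation."""
    A = (D2 + a * D + b * np.eye(N + 1))[inner, inner]
    u = np.zeros(N + 1)
    u[inner] = np.linalg.solve(A, f_vals[inner])
    return u

# The unknown operator ("task"): coefficients the in-context examples must reveal.
a_true, b_true = 1.5, -2.0

# A few in-context example pairs (f_k, u_k) generated by the unknown operator.
rng = np.random.default_rng(0)
examples = []
for _ in range(3):
    f = np.polynomial.chebyshev.chebval(x, rng.normal(size=5))  # random smooth forcing
    examples.append((f, solve(a_true, b_true, f)))

# The residual u'' + a u' + b u - f is linear in (a, b), so stacking the interior
# residual equations across examples yields a least-squares problem.
rows, rhs = [], []
for f, u in examples:
    rows.append(np.column_stack([(D @ u)[inner], u[inner]]))
    rhs.append((f - D2 @ u)[inner])
theta, *_ = np.linalg.lstsq(np.vstack(rows), np.concatenate(rhs), rcond=None)
print("recovered (a, b):", theta)        # approximately (1.5, -2.0)

# The identified operator can then be applied to a new query forcing function.
f_query = np.sin(np.pi * x)
u_query = solve(theta[0], theta[1], f_query)
```

In this toy setting the in-context examples determine the operator exactly, which is why a least-squares view, together with results on Transformers learning linear functions in-context, offers one explanation for the in-context behavior studied in the paper.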
