Poster
in
Workshop: NeurIPS 2023 Workshop: Machine Learning and the Physical Sciences

Incremental learning for physics-informed neural networks

Aleksandr Dekhovich · Marcel Sluiter · David M.J. Tax · Miguel Bessa


Abstract:

This work proposes an incremental learning algorithm for physics-informed neural networks (PINNs), which have recently become a powerful tool for solving partial differential equations (PDEs). As demonstrated herein, by developing incremental PINNs (iPINNs) we can effectively mitigate the training challenges associated with optimizing the PINN loss landscape and learn multiple tasks (equations) sequentially without adding parameters for new tasks. Interestingly, we show that this also improves performance for every equation in the sequence. The approach assigns each PDE its own subnetwork and allows each subnetwork to overlap with previously learned subnetworks. We also show that iPINNs achieve lower prediction error than regular PINNs in two different scenarios: (1) learning a family of equations (e.g., the 1-D convection PDE); and (2) learning PDEs resulting from a combination of processes (e.g., the 1-D reaction-diffusion PDE).
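The core mechanism described above — a fixed parameter pool in which each new PDE receives its own subnetwork that may overlap with previously learned ones — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the mask-growing strategy, sizes, and function names here are hypothetical, and real iPINNs would train the masked weights against each PDE's residual loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared parameter pool: one fixed-size weight matrix used by all tasks (PDEs).
# No new parameters are ever allocated for later tasks.
W = rng.standard_normal((8, 8))

def grow_mask(prev_mask, n_new, rng):
    """Build a new task's subnetwork mask: keep (overlap with) all
    previously learned weights and claim n_new currently-free ones.
    (Hypothetical allocation rule for illustration only.)"""
    mask = prev_mask.copy()
    free = np.flatnonzero(~mask.ravel())
    chosen = rng.choice(free, size=n_new, replace=False)
    mask.ravel()[chosen] = True
    return mask

# Task 1 (e.g., one convection PDE): start from an empty mask.
mask1 = grow_mask(np.zeros_like(W, dtype=bool), n_new=16, rng=rng)
# Task 2 (e.g., another PDE in the family): overlaps fully with task 1's
# subnetwork and adds a few previously unused weights.
mask2 = grow_mask(mask1, n_new=8, rng=rng)

def forward(x, mask):
    """Each task's forward pass uses only its own subnetwork of W."""
    return x @ (W * mask)

x = rng.standard_normal((1, 8))
y1 = forward(x, mask1)  # prediction with task 1's subnetwork
y2 = forward(x, mask2)  # prediction with task 2's (overlapping) subnetwork
```

Because the masks overlap, knowledge encoded in the shared weights can benefit every equation in the sequence, while the total parameter count of `W` stays fixed.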