

Poster in NeurIPS 2023 Workshop: Machine Learning and the Physical Sciences

Reduced-order modeling for parameterized PDEs via implicit neural representations

Tianshu Wen · Kookjin Lee · Youngsoo Choi


Abstract:

We present a new data-driven reduced-order modeling approach to efficiently solve parameterized partial differential equations (PDEs) for many-query problems. This work is inspired by the concept of implicit neural representation (INR), which models physics signals in a continuous manner, independent of spatial/temporal discretization. The proposed framework encodes the PDE and utilizes a parameterized neural ODE (PNODE) to learn latent dynamics characterized by multiple PDE parameters. The PNODE can be inferred by a hypernetwork to reduce the potential difficulty of learning the PNODE with a complex multilayer perceptron (MLP). The framework uses an INR to decode the latent dynamics and reconstruct accurate PDE solutions. Further, a physics-informed loss is introduced to correct the predictions for unseen parameter instances. A numerical experiment is performed on a two-dimensional Burgers equation with a large variation of PDE parameters. We evaluate the proposed method at a large Reynolds number and obtain speedups of up to ~50x with ~3% relative ℓ2 error.
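The sketch below illustrates the pipeline described in the abstract: a hypernetwork conditioned on the PDE parameters produces the weights of a parameterized latent ODE, and an INR decoder maps mesh-free query coordinates plus the latent state to the solution field. This is a minimal illustration, not the authors' implementation; the module names, layer sizes, forward-Euler integrator, and example parameters are all assumptions.

```python
# Minimal sketch (not the authors' code) of a hypernetwork-inferred PNODE
# with an INR decoder, assuming PyTorch. All names and sizes are illustrative.
import torch
import torch.nn as nn


class HyperPNODE(nn.Module):
    """Latent dynamics dz/dt = f(z; W(mu)), with the weights W of a small MLP
    produced by a hypernetwork conditioned on the PDE parameters mu."""

    def __init__(self, latent_dim=8, param_dim=2, hidden=64):
        super().__init__()
        self.latent_dim, self.hidden = latent_dim, hidden
        # Hypernetwork: PDE parameters -> flattened weights/biases of a 2-layer MLP.
        n_weights = latent_dim * hidden + hidden + hidden * latent_dim + latent_dim
        self.hyper = nn.Sequential(nn.Linear(param_dim, 128), nn.Tanh(),
                                   nn.Linear(128, n_weights))

    def forward(self, z, mu):
        w = self.hyper(mu)
        d, h = self.latent_dim, self.hidden
        i = 0
        W1 = w[i:i + d * h].view(h, d); i += d * h
        b1 = w[i:i + h]; i += h
        W2 = w[i:i + h * d].view(d, h); i += h * d
        b2 = w[i:i + d]
        return torch.tanh(z @ W1.T + b1) @ W2.T + b2  # dz/dt at latent state z


class INRDecoder(nn.Module):
    """Implicit neural representation: maps a spatial coordinate and the latent
    state to the PDE solution at that point, independent of any fixed mesh."""

    def __init__(self, coord_dim=2, latent_dim=8, hidden=128, out_dim=1):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(coord_dim + latent_dim, hidden), nn.SiLU(),
                                 nn.Linear(hidden, hidden), nn.SiLU(),
                                 nn.Linear(hidden, out_dim))

    def forward(self, xy, z):
        # xy: (N, coord_dim) query points; z: (latent_dim,) latent state at one time.
        return self.net(torch.cat([xy, z.expand(xy.shape[0], -1)], dim=-1))


def rollout(pnode, z0, mu, dt=1e-2, steps=100):
    """Integrate the parameterized latent ODE with forward Euler (illustrative choice)."""
    z, traj = z0, [z0]
    for _ in range(steps):
        z = z + dt * pnode(z, mu)
        traj.append(z)
    return torch.stack(traj)


# Example query: predict the field for one parameter instance on arbitrary points.
pnode, decoder = HyperPNODE(), INRDecoder()
mu = torch.tensor([100.0, 0.5])   # hypothetical PDE parameters (e.g., Reynolds number)
z0 = torch.zeros(8)               # latent initial condition (from an encoder in practice)
xy = torch.rand(1024, 2)          # mesh-free query coordinates in [0, 1]^2
z_traj = rollout(pnode, z0, mu)
u_pred = decoder(xy, z_traj[-1])  # predicted solution values at the final time
```

In practice the latent initial condition would come from an encoder fit to snapshot data, the forward-Euler loop would be replaced by an ODE solver, and training would combine a data-fitting loss with the physics-informed loss mentioned in the abstract.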
