Multi-fidelity modeling and learning are important in physical-simulation applications: training can leverage both low-fidelity and high-fidelity examples, reducing the cost of data generation while still achieving good performance. Existing approaches model only a finite number of discrete fidelities, but in practice the feasible fidelity choices are often infinite and continuous, corresponding, for example, to a continuous mesh spacing or finite-element length. In this paper, we propose Infinite Fidelity Coregionalization (IFC). Given the data, our method extracts and exploits the rich information within infinite, continuous fidelities to improve prediction accuracy, and it can interpolate and/or extrapolate predictions to novel fidelities not covered by the training data. Specifically, we introduce a low-dimensional latent output as a continuous function of the fidelity and input, and multiply it with a basis matrix to predict the high-dimensional solution output. We model the latent output with a neural Ordinary Differential Equation (ODE) to capture the complex relationships within, and to integrate information across, the continuous fidelities. We then use Gaussian processes or another ODE to estimate the fidelity-varying bases. For efficient inference, we organize the bases as a tensor and use a tensor-Gaussian variational posterior approximation to develop a scalable inference algorithm for massive outputs. We show the advantage of our method on several benchmark tasks in computational physics.
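To make the architecture described above concrete, here is a minimal PyTorch sketch of the forward pass: a latent state h(m, x) is integrated over the continuous fidelity m by an ODE whose dynamics are an MLP, then projected through a basis matrix to the high-dimensional output. All class and parameter names (IFCSketch, dynamics, basis, n_steps) are illustrative, a fixed forward-Euler loop stands in for a proper ODE solver, and the fidelity-varying bases and tensor-Gaussian inference from the paper are omitted; this is not the authors' implementation.

```python
import torch
import torch.nn as nn

class IFCSketch(nn.Module):
    """Minimal sketch of the Infinite Fidelity Coregionalization idea:
    a low-dimensional latent output h(m, x) evolves with the continuous
    fidelity m via a neural ODE, and a basis matrix B maps it to the
    high-dimensional solution output y(m, x) = B h(m, x)."""

    def __init__(self, input_dim, latent_dim, output_dim):
        super().__init__()
        # ODE dynamics dh/dm = f(m, x, h), parameterized by an MLP
        self.dynamics = nn.Sequential(
            nn.Linear(1 + input_dim + latent_dim, 64),
            nn.Tanh(),
            nn.Linear(64, latent_dim),
        )
        # initial latent state h(0, x) computed from the input x
        self.encoder = nn.Linear(input_dim, latent_dim)
        # a single fixed basis here; the paper estimates fidelity-varying
        # bases (via a GP or a second ODE), which this sketch omits
        self.basis = nn.Parameter(torch.randn(output_dim, latent_dim))

    def forward(self, x, fidelity, n_steps=50):
        # integrate dh/dm from m = 0 to m = fidelity with forward Euler;
        # a real implementation would use an adaptive ODE solver
        h = self.encoder(x)
        dm = fidelity / n_steps
        for k in range(n_steps):
            m = torch.full((x.shape[0], 1), k * dm)
            h = h + dm * self.dynamics(torch.cat([m, x, h], dim=-1))
        # project the latent output through the basis to obtain the
        # high-dimensional prediction
        return h @ self.basis.T

# usage: predict at an arbitrary (possibly unseen) fidelity
model = IFCSketch(input_dim=5, latent_dim=8, output_dim=1024)
x = torch.randn(16, 5)
y = model(x, fidelity=0.7)   # shape (16, 1024)
```

Because the fidelity enters only as the integration endpoint, the same trained dynamics can be queried at any fidelity, which is what lets the model interpolate or extrapolate to fidelities outside the training set.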
Author Information
Shibo Li (University of Utah)
Zheng Wang (University of Utah)
Robert Kirby (University of Utah)
Shandian Zhe (University of Utah)
More from the Same Authors
- 2022 Spotlight: Recall Distortion in Neural Network Pruning and the Undecayed Pruning Algorithm »
  Aidan Good · Jiaqi Lin · Xin Yu · Hannah Sieg · Mikey Fergurson · Shandian Zhe · Jerzy Wieczorek · Thiago Serra
- 2022 Poster: Recall Distortion in Neural Network Pruning and the Undecayed Pruning Algorithm »
  Aidan Good · Jiaqi Lin · Xin Yu · Hannah Sieg · Mikey Fergurson · Shandian Zhe · Jerzy Wieczorek · Thiago Serra
- 2022 Poster: Batch Multi-Fidelity Active Learning with Budget Constraints »
  Shibo Li · Jeff M Phillips · Xin Yu · Robert Kirby · Shandian Zhe
- 2021 Poster: Self-Adaptable Point Processes with Nonparametric Time Decays »
  Zhimeng Pan · Zheng Wang · Jeff M Phillips · Shandian Zhe
- 2021 Poster: Characterizing possible failure modes in physics-informed neural networks »
  Aditi Krishnapriyan · Amir Gholami · Shandian Zhe · Robert Kirby · Michael Mahoney
- 2021 Poster: Batch Multi-Fidelity Bayesian Optimization with Deep Auto-Regressive Networks »
  Shibo Li · Robert Kirby · Shandian Zhe
- 2020 Poster: Multi-Fidelity Bayesian Optimization via Deep Neural Networks »
  Shibo Li · Wei Xing · Robert Kirby · Shandian Zhe
- 2018 Poster: Stochastic Nonparametric Event-Tensor Decomposition »
  Shandian Zhe · Yishuai Du
- 2018 Spotlight: Stochastic Nonparametric Event-Tensor Decomposition »
  Shandian Zhe · Yishuai Du