Poster
Batch Multi-Fidelity Active Learning with Budget Constraints
Shibo Li · Jeff M Phillips · Xin Yu · Robert Kirby · Shandian Zhe
Learning functions with high-dimensional outputs is critical in many applications, such as physical simulation and engineering design. However, collecting training examples for these applications is often costly, e.g., by running numerical solvers. The recent work (Li et al., 2022) proposes the first multi-fidelity active learning approach for high-dimensional outputs, which can acquire examples at different fidelities to reduce the cost while improving the learning performance. However, this method queries only one (fidelity, input) pair at a time, and hence risks acquiring strongly correlated examples, which reduces learning efficiency. In this paper, we propose Batch Multi-Fidelity Active Learning with Budget Constraints (BMFAL-BC), which promotes the diversity of training examples to improve the benefit-cost ratio, while respecting a given budget constraint for batch queries. Hence, our method can be more practically useful. Specifically, we propose a novel batch acquisition function that measures the mutual information between a batch of multi-fidelity queries and the target function, so as to penalize highly correlated queries and encourage diversity. The optimization of the batch acquisition function is challenging in that it involves a combinatorial search over many fidelities while remaining subject to the budget constraint. To address this challenge, we develop a weighted greedy algorithm that sequentially identifies each (fidelity, input) pair, while achieving a near $(1 - 1/e)$-approximation of the optimum. We show the advantage of our method in several computational physics and engineering applications.
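The weighted greedy selection described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the acquisition criterion is abstracted into a hypothetical `marginal_gain` callable standing in for the mutual-information measure, and the candidates, costs, and budget are assumed inputs. At each step, the affordable candidate with the best marginal gain per unit cost is added to the batch.

```python
# Minimal sketch of cost-weighted greedy batch selection under a budget.
# `marginal_gain(batch, candidate)` is a hypothetical placeholder for the
# paper's mutual-information acquisition; it returns the gain of adding
# `candidate` to the current `batch`.

def weighted_greedy_batch(candidates, costs, marginal_gain, budget):
    """Greedily build a batch of (fidelity, input) queries.

    candidates: list of (fidelity, input) pairs
    costs: dict mapping each candidate to its query cost
    marginal_gain: callable(batch, candidate) -> float
    budget: total cost allowed for the batch
    """
    batch, spent = [], 0.0
    remaining = set(range(len(candidates)))
    while remaining:
        # Score each still-affordable candidate by gain per unit cost.
        best, best_ratio = None, float("-inf")
        for i in remaining:
            cost = costs[candidates[i]]
            if spent + cost > budget:
                continue  # would exceed the budget
            ratio = marginal_gain(batch, candidates[i]) / cost
            if ratio > best_ratio:
                best, best_ratio = i, ratio
        if best is None:
            break  # no affordable candidate remains
        batch.append(candidates[best])
        spent += costs[candidates[best]]
        remaining.remove(best)
    return batch
```

The per-cost weighting is what lets the greedy loop trade off a cheap low-fidelity query against an expensive high-fidelity one; with a submodular acquisition function, this style of greedy selection is what underlies near $(1 - 1/e)$-approximation guarantees.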
Author Information
Shibo Li (University of Utah)
Jeff M Phillips (University of Utah)
Xin Yu (University of Utah)
Robert Kirby (University of Utah)
Shandian Zhe (University of Utah)
More from the Same Authors
- 2022 Spotlight: Recall Distortion in Neural Network Pruning and the Undecayed Pruning Algorithm »
  Aidan Good · Jiaqi Lin · Xin Yu · Hannah Sieg · Mikey Fergurson · Shandian Zhe · Jerzy Wieczorek · Thiago Serra
- 2022 Poster: Recall Distortion in Neural Network Pruning and the Undecayed Pruning Algorithm »
  Aidan Good · Jiaqi Lin · Xin Yu · Hannah Sieg · Mikey Fergurson · Shandian Zhe · Jerzy Wieczorek · Thiago Serra
- 2022 Poster: Infinite-Fidelity Coregionalization for Physical Simulation »
  Shibo Li · Zheng Wang · Robert Kirby · Shandian Zhe
- 2021: An Interactive Visual Demo of Bias Mitigation Techniques for Word Representations »
  Archit Rathore · Sunipa Dev · Vivek Srikumar · Jeff M Phillips · Yan Zheng · Michael Yeh · Junpeng Wang · Wei Zhang · Bei Wang
- 2021 Poster: Self-Adaptable Point Processes with Nonparametric Time Decays »
  Zhimeng Pan · Zheng Wang · Jeff M Phillips · Shandian Zhe
- 2021 Poster: Characterizing possible failure modes in physics-informed neural networks »
  Aditi Krishnapriyan · Amir Gholami · Shandian Zhe · Robert Kirby · Michael Mahoney
- 2021 Poster: Batch Multi-Fidelity Bayesian Optimization with Deep Auto-Regressive Networks »
  Shibo Li · Robert Kirby · Shandian Zhe
- 2020 Poster: Multi-Fidelity Bayesian Optimization via Deep Neural Networks »
  Shibo Li · Wei Xing · Robert Kirby · Shandian Zhe
- 2018 Poster: Stochastic Nonparametric Event-Tensor Decomposition »
  Shandian Zhe · Yishuai Du
- 2018 Spotlight: Stochastic Nonparametric Event-Tensor Decomposition »
  Shandian Zhe · Yishuai Du
- 2016 Poster: The Robustness of Estimator Composition »
  Pingfan Tang · Jeff M Phillips