Bayesian optimization (BO) is a sample-efficient approach to optimizing costly-to-evaluate black-box functions. Most BO methods ignore how evaluation costs may vary over the optimization domain. However, these costs can be highly heterogeneous and are often unknown in advance in many practical settings, such as hyperparameter tuning of machine learning algorithms or physics-based simulation optimization. Moreover, the few existing methods that acknowledge cost heterogeneity do not naturally accommodate a budget constraint on the total evaluation cost. This combination of unknown costs and a budget constraint introduces a new dimension to the exploration-exploitation trade-off, where learning about the cost incurs a cost itself. Existing methods do not reason about the various trade-offs of this problem in a principled way, often leading to poor performance. We formalize this claim by proving that expected improvement and expected improvement per unit of cost, arguably the two most widely used acquisition functions in practice, can be arbitrarily worse than the optimal non-myopic policy. To overcome the shortcomings of existing approaches, we propose the budgeted multi-step expected improvement, a non-myopic acquisition function that generalizes classical expected improvement to the setting of heterogeneous and unknown evaluation costs. We show that our acquisition function outperforms existing methods on a variety of synthetic and real problems.
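For readers less familiar with the two myopic baselines critiqued in the abstract, both have simple closed forms under a Gaussian process posterior: expected improvement (EI) and EI per unit of (predicted) evaluation cost. The sketch below is a minimal NumPy/SciPy illustration of those baselines only, using hypothetical posterior means, standard deviations, and costs; it is not the budgeted multi-step expected improvement proposed in the paper.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best_f):
    """Closed-form EI for a Gaussian posterior (maximization).

    mu, sigma: posterior mean and standard deviation at candidate points.
    best_f: best objective value observed so far.
    """
    sigma = np.maximum(sigma, 1e-12)  # guard against zero posterior variance
    z = (mu - best_f) / sigma
    return sigma * (z * norm.cdf(z) + norm.pdf(z))

def ei_per_unit_cost(mu, sigma, best_f, cost):
    """EI divided by the (predicted) evaluation cost at each candidate."""
    return expected_improvement(mu, sigma, best_f) / np.maximum(cost, 1e-12)

# Toy example with hypothetical values: three candidates with heterogeneous costs.
mu = np.array([0.2, 0.5, 0.9])
sigma = np.array([0.3, 0.2, 0.1])
cost = np.array([1.0, 2.0, 10.0])
best_f = 0.4

print(np.argmax(expected_improvement(mu, sigma, best_f)))       # picks the expensive high-mean point
print(np.argmax(ei_per_unit_cost(mu, sigma, best_f, cost)))     # picks a cheaper point
```

In this toy example the two rules already disagree: plain EI selects the expensive high-mean candidate, while EI per unit cost prefers a cheaper one. The paper proves that, under a budget constraint with unknown costs, both myopic rules can be arbitrarily worse than the optimal non-myopic policy.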
Author Information
Raul Astudillo (Cornell University)
I am a Ph.D. candidate in the School of Operations Research and Information Engineering at Cornell University, where I am fortunate to be advised by Professor Peter Frazier. Prior to coming to Cornell, I completed the undergraduate program in Mathematics offered jointly by the University of Guanajuato and the Center for Research in Mathematics. My current research focuses on the design and analysis of Bayesian optimization algorithms for problems with nested objective functions.
Daniel Jiang (Facebook)
Maximilian Balandat (Meta)
Eytan Bakshy (Meta)
Peter Frazier (Cornell / Uber)
Peter Frazier is an Associate Professor in the School of Operations Research and Information Engineering at Cornell University, and a Staff Data Scientist at Uber. He received a Ph.D. in Operations Research and Financial Engineering from Princeton University in 2009. His research is at the intersection of machine learning and operations research, focusing on Bayesian optimization, multi-armed bandits, active learning, and Bayesian nonparametric statistics. He is an associate editor for Operations Research, ACM TOMACS, and IISE Transactions, and is the recipient of an AFOSR Young Investigator Award and an NSF CAREER Award.
More from the Same Authors
- 2021 : Practical Policy Optimization with Personalized Experimentation »
  Mia Garrard · Hanson Wang · Ben Letham · Zehui Wang · Yin Huang · Yichun Hu · Chad Zhou · Norm Zhou · Eytan Bakshy
- 2022 : Sparse Bayesian Optimization »
  Sulin Liu · Qing Feng · David Eriksson · Ben Letham · Eytan Bakshy
- 2022 : One-Shot Optimal Design for Gaussian Process Analysis of Randomized Experiments »
  Jelena Markovic · Qing Feng · Eytan Bakshy
- 2022 : Panel »
  Roman Garnett · José Miguel Hernández-Lobato · Eytan Bakshy · Syrine Belakaria · Stefanie Jegelka
- 2022 Poster: Bayesian Optimization over Discrete and Mixed Spaces via Probabilistic Reparameterization »
  Samuel Daulton · Xingchen Wan · David Eriksson · Maximilian Balandat · Michael A Osborne · Eytan Bakshy
- 2021 Poster: Constrained Two-step Look-Ahead Bayesian Optimization »
  Yunxiang Zhang · Xiangyu Zhang · Peter Frazier
- 2021 Poster: Bayesian Optimization of Function Networks »
  Raul Astudillo · Peter Frazier
- 2021 Poster: Parallel Bayesian Optimization of Multiple Noisy Objectives with Expected Hypervolume Improvement »
  Samuel Daulton · Maximilian Balandat · Eytan Bakshy
- 2021 Poster: Bayesian Optimization with High-Dimensional Outputs »
  Wesley Maddox · Maximilian Balandat · Andrew Wilson · Eytan Bakshy
- 2020 Poster: Bayesian Optimization of Risk Measures »
  Sait Cakmak · Raul Astudillo · Peter Frazier · Enlu Zhou
- 2020 Poster: Differentiable Expected Hypervolume Improvement for Parallel Multi-Objective Bayesian Optimization »
  Samuel Daulton · Maximilian Balandat · Eytan Bakshy
- 2020 Poster: BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization »
  Maximilian Balandat · Brian Karrer · Daniel Jiang · Samuel Daulton · Ben Letham · Andrew Wilson · Eytan Bakshy
- 2020 Poster: Re-Examining Linear Embeddings for High-Dimensional Bayesian Optimization »
  Ben Letham · Roberto Calandra · Akshara Rai · Eytan Bakshy
- 2020 Poster: Efficient Nonmyopic Bayesian Optimization via One-Shot Multi-Step Trees »
  Shali Jiang · Daniel Jiang · Maximilian Balandat · Brian Karrer · Jacob Gardner · Roman Garnett
- 2020 Poster: High-Dimensional Contextual Policy Search with Unknown Context Rewards using Bayesian Optimization »
  Qing Feng · Ben Letham · Hongzi Mao · Eytan Bakshy
- 2020 Spotlight: High-Dimensional Contextual Policy Search with Unknown Context Rewards using Bayesian Optimization »
  Qing Feng · Ben Letham · Hongzi Mao · Eytan Bakshy
- 2019 : Invited Speaker: Eytan Bakshy »
  Eytan Bakshy
- 2019 Poster: Practical Two-Step Lookahead Bayesian Optimization »
  Jian Wu · Peter Frazier
- 2017 : Multi-Attribute Bayesian Optimization under Utility Uncertainty »
  Raul Astudillo
- 2017 : Invited talk: Knowledge Gradient Methods for Bayesian Optimization »
  Peter Frazier
- 2017 Poster: Multi-Information Source Optimization »
  Matthias Poloczek · Jialei Wang · Peter Frazier
- 2017 Spotlight: Multi-Information Source Optimization »
  Matthias Poloczek · Jialei Wang · Peter Frazier
- 2017 Poster: Bayesian Optimization with Gradients »
  Jian Wu · Matthias Poloczek · Andrew Wilson · Peter Frazier
- 2017 Oral: Bayesian Optimization with Gradients »
  Jian Wu · Matthias Poloczek · Andrew Wilson · Peter Frazier
- 2016 Poster: Minimizing Regret on Reflexive Banach Spaces and Nash Equilibria in Continuous Zero-Sum Games »
  Maximilian Balandat · Walid Krichene · Claire Tomlin · Alexandre Bayen