Poster
Parallel Bayesian Optimization of Multiple Noisy Objectives with Expected Hypervolume Improvement
Sam Daulton · Maximilian Balandat · Eytan Bakshy

Tue Dec 07 08:30 AM -- 10:00 AM (PST)

Optimizing multiple competing black-box objectives is a challenging problem in many fields, including science, engineering, and machine learning. Multi-objective Bayesian optimization (MOBO) is a sample-efficient approach for identifying the optimal trade-offs between the objectives. However, many existing methods perform poorly when the observations are corrupted by noise. We propose a novel acquisition function, NEHVI, that overcomes this important practical limitation by applying a Bayesian treatment to the popular expected hypervolume improvement (EHVI) criterion and integrating over the resulting uncertainty in the Pareto frontier. We argue that, even in the noiseless setting, generating multiple candidates in parallel is an incarnation of EHVI with uncertainty in the Pareto frontier and therefore can be addressed using the same underlying technique. Through this lens, we derive a natural parallel variant, qNEHVI, that reduces the computational complexity of parallel EHVI from exponential to polynomial with respect to the batch size. qNEHVI is one-step Bayes-optimal for hypervolume maximization in both noisy and noiseless environments, and we show that it can be optimized effectively with gradient-based methods via sample average approximation. Empirically, we demonstrate not only that qNEHVI is substantially more robust to observation noise than existing MOBO approaches, but also that it achieves state-of-the-art optimization performance and competitive wall times in large-batch environments.
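The core idea above — scoring candidates by the expected gain in dominated hypervolume, averaged over posterior samples of the (uncertain) Pareto frontier — can be illustrated with a minimal Monte Carlo sketch. This is not the authors' implementation (the paper's method is available in BoTorch); it is a toy estimator for two maximization objectives, where `hypervolume_2d` and `qnehvi_mc` are hypothetical helper names, posterior samples are assumed to be given as arrays, and points are assumed to dominate the reference point.

```python
import numpy as np

def hypervolume_2d(points, ref_point):
    """Hypervolume dominated by 2-objective points (maximization) w.r.t. ref_point.

    Assumes every point strictly dominates ref_point; others are dropped.
    """
    pts = np.asarray([p for p in points if np.all(np.asarray(p) > ref_point)])
    if len(pts) == 0:
        return 0.0
    # Keep only non-dominated points.
    nd = [p for p in pts
          if not any(np.all(q >= p) and np.any(q > p) for q in pts)]
    # Sort by the first objective; sweep left to right, adding the
    # vertical strip each point contributes above the reference line.
    nd = sorted(nd, key=lambda p: p[0])
    hv, prev_x = 0.0, ref_point[0]
    for x, y in nd:
        hv += (x - prev_x) * (y - ref_point[1])
        prev_x = x
    return float(hv)

def qnehvi_mc(baseline_samples, cand_samples, ref_point):
    """Monte Carlo estimate of noisy expected hypervolume improvement.

    baseline_samples: (S, n, 2) posterior samples at the n observed points,
        each sample inducing one realization of the Pareto frontier.
    cand_samples: (S, q, 2) joint posterior samples at the q candidates.
    Returns the average hypervolume gained by adding the candidates.
    """
    improvements = []
    for base, cand in zip(baseline_samples, cand_samples):
        hv_base = hypervolume_2d(base, ref_point)
        hv_joint = hypervolume_2d(np.vstack([base, cand]), ref_point)
        improvements.append(hv_joint - hv_base)
    return float(np.mean(improvements))
```

Note that each joint sample scores all q candidates together, which is what keeps the batch computation polynomial here: the exponential inclusion-exclusion over candidate subsets in analytic parallel EHVI is replaced by a single hypervolume evaluation per posterior sample.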

Author Information

Sam Daulton (Meta, University of Oxford)

Research Scientist at Meta, PhD Candidate at Oxford. My research focuses on Bayesian optimization.

Maximilian Balandat (University of California, Berkeley)
Eytan Bakshy (Meta)