Bayesian optimization (BO) is a powerful approach to sample-efficient optimization of black-box objective functions. However, the application of BO to areas such as recommendation systems often requires taking the interpretability and simplicity of the configurations into consideration, a setting that has not been previously studied in the BO literature. To make BO applicable in this setting, we present several regularization-based approaches that allow us to discover sparse and more interpretable configurations. We propose a novel differentiable relaxation based on homotopy continuation that makes it possible to target sparsity by working directly with regularization. We identify failure modes for regularized BO and develop a hyperparameter-free method, sparsity exploring Bayesian optimization (SEBO), that seeks to simultaneously maximize a target objective and sparsity. SEBO and methods based on fixed regularization are evaluated on synthetic and real-world problems, and we show that we are able to efficiently optimize for sparsity.
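The fixed-regularization idea in the abstract can be illustrated with a minimal sketch: penalize the objective by the number of parameters that deviate from a reference (e.g., default) configuration, and replace the non-differentiable count with a smooth surrogate whose homotopy parameter is annealed toward the exact count. All function names and the particular Gaussian-bump relaxation below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def l0_norm(x, ref):
    # Exact sparsity of a configuration: the number of parameters
    # that differ from the reference (default) values.
    return int(np.sum(~np.isclose(x, ref)))

def smooth_l0(x, ref, a):
    # Smooth surrogate for the L0 count (illustrative choice): each
    # term 1 - exp(-(x_i - ref_i)^2 / a) tends to the 0/1 indicator
    # as the homotopy parameter `a` shrinks toward 0.
    return float(np.sum(1.0 - np.exp(-((x - ref) ** 2) / a)))

def regularized_objective(f, x, ref, lam, a):
    # Fixed-penalty formulation: trade off objective value against
    # (relaxed) sparsity with regularization weight `lam`.
    return f(x) - lam * smooth_l0(x, ref, a)

if __name__ == "__main__":
    # Toy black-box objective and a default configuration.
    f = lambda x: -np.sum((x - 0.3) ** 2)
    ref = np.zeros(3)
    x = np.array([0.3, 0.0, 0.0])  # one parameter moved off default

    # As `a` decreases, the relaxation approaches the true L0 count.
    for a in [1.0, 0.1, 0.001]:
        print(a, smooth_l0(x, ref, a))
    print(l0_norm(x, ref))  # exact count: 1
    print(regularized_objective(f, x, ref, lam=0.5, a=0.001))
```

In a BO loop, the penalized value would be what the surrogate model is fit to (or what the acquisition function targets), with `a` annealed across iterations so the continuation path ends at the exact sparsity penalty.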
Author Information
Sulin Liu (Princeton University)
Qing Feng (Facebook)
David Eriksson (Meta)
Ben Letham (Facebook)
Eytan Bakshy (Meta)
More from the Same Authors
- 2021: ProBF: Probabilistic Safety Certificates with Barrier Functions
  Sulin Liu · Athindran Ramesh Kumar · Jaime Fisac · Ryan Adams · Peter J. Ramadge
- 2021: Practical Policy Optimization with Personalized Experimentation
  Mia Garrard · Hanson Wang · Ben Letham · Zehui Wang · Yin Huang · Yichun Hu · Chad Zhou · Norm Zhou · Eytan Bakshy
- 2021: Semiparametric approaches for decision making in high-dimensional sensory discrimination tasks
  Stephen Keeley · Ben Letham · Chase Tymms · Michael Shvartsman
- 2022: One-Shot Optimal Design for Gaussian Process Analysis of Randomized Experiments
  Jelena Markovic · Qing Feng · Eytan Bakshy
- 2022: Panel
  Roman Garnett · José Miguel Hernández-Lobato · Eytan Bakshy · Syrine Belakaria · Stefanie Jegelka
- 2022: Sparse Bayesian Optimization
  Sulin Liu
- 2022 Poster: Bayesian Optimization over Discrete and Mixed Spaces via Probabilistic Reparameterization
  Samuel Daulton · Xingchen Wan · David Eriksson · Maximilian Balandat · Michael A Osborne · Eytan Bakshy
- 2021 Poster: Multi-Step Budgeted Bayesian Optimization with Unknown Evaluation Costs
  Raul Astudillo · Daniel Jiang · Maximilian Balandat · Eytan Bakshy · Peter Frazier
- 2021 Poster: Parallel Bayesian Optimization of Multiple Noisy Objectives with Expected Hypervolume Improvement
  Samuel Daulton · Maximilian Balandat · Eytan Bakshy
- 2021 Poster: Bayesian Optimization with High-Dimensional Outputs
  Wesley Maddox · Maximilian Balandat · Andrew Wilson · Eytan Bakshy
- 2020 Poster: Differentiable Expected Hypervolume Improvement for Parallel Multi-Objective Bayesian Optimization
  Samuel Daulton · Maximilian Balandat · Eytan Bakshy
- 2020 Poster: Fast Matrix Square Roots with Applications to Gaussian Processes and Bayesian Optimization
  Geoff Pleiss · Martin Jankowiak · David Eriksson · Anil Damle · Jacob Gardner
- 2020 Poster: Task-Agnostic Amortized Inference of Gaussian Process Hyperparameters
  Sulin Liu · Xingyuan Sun · Peter J. Ramadge · Ryan Adams
- 2020 Poster: BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization
  Maximilian Balandat · Brian Karrer · Daniel Jiang · Samuel Daulton · Ben Letham · Andrew Wilson · Eytan Bakshy
- 2020 Poster: Re-Examining Linear Embeddings for High-Dimensional Bayesian Optimization
  Ben Letham · Roberto Calandra · Akshara Rai · Eytan Bakshy
- 2020 Poster: High-Dimensional Contextual Policy Search with Unknown Context Rewards using Bayesian Optimization
  Qing Feng · Ben Letham · Hongzi Mao · Eytan Bakshy
- 2020 Spotlight: High-Dimensional Contextual Policy Search with Unknown Context Rewards using Bayesian Optimization
  Qing Feng · Ben Letham · Hongzi Mao · Eytan Bakshy
- 2019: Invited Speaker: Eytan Bakshy
  Eytan Bakshy
- 2018: Software Panel
  Ben Letham · David Duvenaud · Dustin Tran · Aki Vehtari