Bayesian optimization (BO) is a successful methodology to optimize black-box functions that are expensive to evaluate. While traditional methods optimize each black-box function in isolation, there has been recent interest in speeding up BO by transferring knowledge across multiple related black-box functions. In this work, we introduce a method to automatically design the BO search space by relying on evaluations of previous black-box functions. We depart from the common practice of defining a set of arbitrary search ranges a priori by considering search space geometries that are learnt from historical data. This simple, yet effective strategy can be used to endow many existing BO methods with transfer learning properties. Despite its simplicity, we show that our approach considerably boosts BO by reducing the size of the search space, thus accelerating the optimization of a variety of black-box optimization problems. In particular, the proposed approach combined with random search results in a parameter-free, easy-to-implement, robust hyperparameter optimization strategy. We hope it will constitute a natural baseline for further research attempting to warm-start BO.
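To make the recipe in the abstract concrete, below is a minimal, hedged sketch (not the authors' implementation) of one plausible instantiation: learn a low-volume search space as an axis-aligned bounding box around the best configurations found on previous, related tasks, then run plain random search inside that learned box. The function names (learn_bounding_box, random_search, objective) and the toy data are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: learn a reduced search space from historical best
# configurations, then optimize a new black-box inside it with random search.
# All names and the toy data below are illustrative assumptions.

import numpy as np

def learn_bounding_box(best_configs, margin=0.1):
    """Fit an axis-aligned box around the per-task best configurations.

    best_configs: array of shape (n_tasks, n_dims), the best hyperparameter
                  vector found on each previous (related) task.
    margin:       fractional padding added on each side of the box.
    """
    best_configs = np.asarray(best_configs, dtype=float)
    lo, hi = best_configs.min(axis=0), best_configs.max(axis=0)
    pad = margin * (hi - lo + 1e-12)
    return lo - pad, hi + pad

def random_search(objective, lower, upper, n_iter=50, seed=0):
    """Plain random search restricted to the learned box [lower, upper]."""
    rng = np.random.default_rng(seed)
    best_x, best_y = None, np.inf
    for _ in range(n_iter):
        x = rng.uniform(lower, upper)   # sample only inside the learned box
        y = objective(x)
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y

if __name__ == "__main__":
    # Toy stand-in for "evaluations of previous black-box functions":
    # the best 2-d hyperparameters found on three related tasks.
    historical_best = [[0.10, 2.0], [0.15, 2.5], [0.12, 1.8]]
    lower, upper = learn_bounding_box(historical_best)

    # Toy target black-box whose optimum lies near the historical optima.
    def objective(x):
        return (x[0] - 0.12) ** 2 + (x[1] - 2.1) ** 2

    x_best, y_best = random_search(objective, lower, upper, n_iter=100)
    print("learned box:", lower, upper)
    print("best found:", x_best, y_best)
```

The same learned box could equally be handed to any BO method as its search space, which is how the abstract's claim of endowing existing BO methods with transfer learning properties would play out in practice; the random-search variant shown here corresponds to the parameter-free baseline mentioned at the end of the abstract.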
Author Information
Valerio Perrone (Amazon)
Huibin Shen (Amazon)
Matthias Seeger (Amazon)
Cedric Archambeau (Amazon)
Rodolphe Jenatton (Google Brain)
More from the Same Authors
- 2021 : Gradient-matching coresets for continual learning »
  Lukas Balles · Giovanni Zappella · Cedric Archambeau
- 2021 : Uncertainty Baselines: Benchmarks for Uncertainty & Robustness in Deep Learning »
  Zachary Nado · Neil Band · Mark Collier · Josip Djolonga · Mike Dusenberry · Sebastian Farquhar · Qixuan Feng · Angelos Filos · Marton Havasi · Rodolphe Jenatton · Ghassen Jerfel · Jeremiah Liu · Zelda Mariet · Jeremy Nixon · Shreyas Padhy · Jie Ren · Tim G. J. Rudner · Yeming Wen · Florian Wenzel · Kevin Murphy · D. Sculley · Balaji Lakshminarayanan · Jasper Snoek · Yarin Gal · Dustin Tran
- 2021 : Deep Classifiers with Label Noise Modeling and Distance Awareness »
  Vincent Fortuin · Mark Collier · Florian Wenzel · James Allingham · Jeremiah Liu · Dustin Tran · Balaji Lakshminarayanan · Jesse Berent · Rodolphe Jenatton · Effrosyni Kokiopoulou
- 2022 Poster: On the Adversarial Robustness of Mixture of Experts »
  Joan Puigcerver · Rodolphe Jenatton · Carlos Riquelme · Pranjal Awasthi · Srinadh Bhojanapalli
- 2022 Poster: Multimodal Contrastive Learning with LIMoE: the Language-Image Mixture of Experts »
  Basil Mustafa · Carlos Riquelme · Joan Puigcerver · Rodolphe Jenatton · Neil Houlsby
- 2021 Poster: Scaling Vision with Sparse Mixture of Experts »
  Carlos Riquelme · Joan Puigcerver · Basil Mustafa · Maxim Neumann · Rodolphe Jenatton · André Susano Pinto · Daniel Keysers · Neil Houlsby
- 2020 : Bayesian optimization by density ratio estimation »
  Louis Tiao · Aaron Klein · Cedric Archambeau · Edwin Bonilla · Matthias W Seeger · Fabio Ramos
- 2020 Poster: Hyperparameter Ensembles for Robustness and Uncertainty Quantification »
  Florian Wenzel · Jasper Snoek · Dustin Tran · Rodolphe Jenatton
- 2018 Poster: Scalable Hyperparameter Transfer Learning »
  Valerio Perrone · Rodolphe Jenatton · Matthias W Seeger · Cedric Archambeau
- 2013 Poster: Convex Relaxations for Permutation Problems »
  Fajwel Fogel · Rodolphe Jenatton · Francis Bach · Alexandre d'Aspremont
- 2012 Poster: A latent factor model for highly multi-relational data »
  Rodolphe Jenatton · Nicolas Le Roux · Antoine Bordes · Guillaume R Obozinski
- 2010 Workshop: Numerical Mathematics Challenges in Machine Learning »
  Matthias Seeger · Suvrit Sra
- 2010 Session: Oral Session 6 »
  Matthias Seeger
- 2010 Poster: Network Flow Algorithms for Structured Sparsity »
  Julien Mairal · Rodolphe Jenatton · Guillaume R Obozinski · Francis Bach
- 2009 Poster: Speeding up Magnetic Resonance Image Acquisition by Bayesian Multi-Slice Adaptive Compressed Sensing »
  Matthias Seeger
- 2008 Poster: Bayesian Experimental Design of Magnetic Resonance Imaging Sequences »
  Matthias Seeger · Hannes Nickisch · Rolf Pohmann · Bernhard Schölkopf
- 2008 Spotlight: Bayesian Experimental Design of Magnetic Resonance Imaging Sequences »
  Matthias Seeger · Hannes Nickisch · Rolf Pohmann · Bernhard Schölkopf
- 2008 Poster: Local Gaussian Process Regression for Real Time Online Model Learning »
  Duy Nguyen-Tuong · Matthias Seeger · Jan Peters
- 2007 Workshop: Approximate Bayesian Inference in Continuous/Hybrid Models »
  Matthias Seeger · David Barber · Neil D Lawrence · Onno Zoeter
- 2007 Oral: Bayesian Inference for Spiking Neuron Models with a Sparsity Prior »
  Sebastian Gerwinn · Jakob H Macke · Matthias Seeger · Matthias Bethge
- 2007 Poster: Bayesian Inference for Spiking Neuron Models with a Sparsity Prior »
  Sebastian Gerwinn · Jakob H Macke · Matthias Seeger · Matthias Bethge
- 2006 Poster: Cross-Validation Optimization for Large Scale Hierarchical Classification Kernel Methods »
  Matthias Seeger