Poster
Provably Efficient Online Hyperparameter Optimization with Population-Based Bandits
Jack Parker-Holder · Vu Nguyen · Stephen J Roberts

Tue Dec 08 09:00 AM -- 11:00 AM (PST) @ Poster Session 1 #554

Many of the recent triumphs in machine learning are dependent on well-tuned hyperparameters. This is particularly prominent in reinforcement learning (RL), where a small change in the configuration can lead to failure. Despite the importance of tuning hyperparameters, it remains expensive and is often done in a naive and laborious way. A recent solution to this problem is Population Based Training (PBT), which updates both weights and hyperparameters in a single training run of a population of agents. PBT has been shown to be particularly effective in RL, leading to widespread use in the field. However, PBT lacks theoretical guarantees since it relies on random heuristics to explore the hyperparameter space. This inefficiency means it typically requires vast computational resources, which is prohibitive for many small and medium-sized labs. In this work, we introduce the first provably efficient PBT-style algorithm, Population-Based Bandits (PB2). PB2 uses a probabilistic model to guide the search in an efficient way, making it possible to discover high-performing hyperparameter configurations with far fewer agents than typically required by PBT. We show in a series of RL experiments that PB2 is able to achieve high performance with a modest computational budget.
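The PBT-style loop the abstract describes (train a population, let the worst agents copy the best, then pick new hyperparameters) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: PB2's actual model is a time-varying Gaussian process bandit, whereas here a stationary GP with a UCB acquisition stands in for the probabilistic model, the "training" is a toy one-dimensional objective, and all function names, kernel settings, and constants are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def reward(h):
    # Toy objective standing in for one round of RL training: the
    # hyperparameter h (think: a learning rate in [0, 1]) is best near 0.3.
    return np.exp(-((h - 0.3) ** 2) / 0.02)

def rbf(a, b, ls=0.2):
    # Squared-exponential kernel between two 1-D sets of points.
    d = np.subtract.outer(a, b)
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_ucb(cand, x_obs, y_obs, beta=2.0, noise=1e-3):
    # GP posterior mean and variance on the candidates, combined into an
    # upper-confidence-bound score (mean + beta * std).
    K = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf(cand, x_obs)
    mu = Ks @ np.linalg.solve(K, y_obs)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mu + beta * np.sqrt(np.clip(var, 0.0, None))

pop = rng.uniform(0, 1, size=4)      # hyperparameters of a 4-agent population
init_best = reward(pop).max()
hist_h, hist_r = [], []              # all observed (hyperparam, reward) pairs

for step in range(15):
    r = reward(pop)
    hist_h.extend(pop.tolist())
    hist_r.extend(r.tolist())
    order = np.argsort(r)
    # Exploit: the two worst agents would copy the best agent's weights.
    # Explore: choose their next hyperparameters by maximizing GP-UCB over
    # random candidates, instead of PBT's random perturbation.
    cand = rng.uniform(0, 1, size=200)
    score = gp_ucb(cand, np.asarray(hist_h), np.asarray(hist_r))
    top = np.argsort(score)
    for i, worst in enumerate(order[:2]):
        pop[worst] = cand[top[-(i + 1)]]

best = pop[np.argmax(reward(pop))]
print(f"best hyperparameter found: {best:.2f}")
```

Because the two best agents keep their hyperparameters at every step, the population's best reward never decreases; the GP-UCB step is what lets the replaced agents concentrate on promising regions rather than drifting randomly, which is the source of PB2's sample efficiency over PBT.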

Author Information

Jack Parker-Holder (University of Oxford)
Vu Nguyen (Amazon Research Adelaide)
Stephen J Roberts (University of Oxford)
