Spotlight
Speedy Performance Estimation for Neural Architecture Search
Robin Ru · Clare Lyle · Lisa Schut · Miroslav Fil · Mark van der Wilk · Yarin Gal


Reliable yet efficient evaluation of the generalisation performance of a proposed architecture is crucial to the success of neural architecture search (NAS). Traditional approaches face a variety of limitations: training each architecture to completion is prohibitively expensive, early-stopped validation accuracy may correlate poorly with fully trained performance, and model-based estimators require large training sets. We instead propose to estimate the final test performance based on a simple measure of training speed. Our estimator is theoretically motivated by the connection between generalisation and training speed, and is also inspired by the reformulation of a PAC-Bayes bound under the Bayesian setting. Our model-free estimator is simple, efficient, and cheap to implement, and does not require hyperparameter tuning or surrogate training before deployment. We demonstrate on various NAS search spaces that our estimator consistently outperforms other alternatives in achieving better correlation with the true test performance rankings. We further show that our estimator can be easily incorporated into both query-based and one-shot NAS methods to improve the speed or quality of the search.
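The abstract does not spell out the estimator's exact form, but a minimal sketch of the general idea — ranking candidate architectures by how quickly their training loss falls, using the accumulated training loss over a short run as a proxy score — could look as follows. The function name, the toy loss curves, and the specific choice of "sum of recorded losses" as the speed measure are illustrative assumptions, not the paper's definitive formulation:

```python
# Hypothetical sketch: rank candidate architectures by a training-speed score.
# Assumption (not stated in the abstract): the score is the sum of training
# losses recorded over a short training run; a smaller sum means the loss
# dropped faster, which is taken to predict better final test performance.

def training_speed_estimate(recorded_losses):
    """Return a ranking score for one candidate architecture (lower is better).

    recorded_losses: per-step training losses from a short, cheap training
    run of the candidate. No held-out validation data is needed, so the
    estimator is model-free and requires no surrogate training.
    """
    return sum(recorded_losses)

# Two hypothetical candidates observed for the same small number of steps:
loss_curves = {
    "fast_learner": [2.3, 1.1, 0.6, 0.4],  # loss drops quickly
    "slow_learner": [2.3, 2.0, 1.8, 1.6],  # loss drops slowly
}

# Rank candidates: the fast learner gets the better (smaller) score.
ranking = sorted(loss_curves, key=lambda name: training_speed_estimate(loss_curves[name]))
```

Because the score only needs losses that are already computed during training, it adds essentially no overhead on top of a truncated training run, which is what makes this style of estimator cheap to plug into query-based or one-shot NAS loops.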

Author Information

Robin Ru (Oxford University)
Clare Lyle (University of Oxford)
Lisa Schut (University of Oxford)
Miroslav Fil (University of Oxford)
Mark van der Wilk (PROWLER.io)
Yarin Gal (University of Oxford)
