

Poster in Workshop: OPT 2021: Optimization for Machine Learning

Simulated Annealing for Neural Architecture Search

Shentong Mo · Jingfei Xia · Pinxu Ren


Abstract:

Gradient-based Neural Architecture Search (NAS) approaches have achieved remarkable progress in the automated machine learning community. However, previous methods incur long search times and large computational costs when seeking an optimal network structure in a big search space. In this work, we propose a novel Simulated Annealing algorithm for NAS, namely SA-NAS, which adds perturbations to the gradient-descent updates to reduce search cost and boost the predictive performance of the searched architecture. Our proposed algorithm is easy to adapt to current state-of-the-art methods in the literature. We conduct extensive experiments on various benchmarks, and the results demonstrate the effectiveness and efficiency of SA-NAS in reducing search time and saving computational resources. Compared to previous differentiable methods, our SA-NAS achieves comparable or better predictive performance under the same settings.
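The abstract describes perturbing gradient-descent updates with a simulated-annealing acceptance rule. The sketch below is only an illustration of that general idea on a toy objective, not the authors' implementation: the function names, the Gaussian perturbation, the geometric cooling schedule, and all hyperparameter values are assumptions for demonstration.

```python
import numpy as np

def sa_perturbed_gradient_descent(objective, grad, alpha0, steps=200,
                                  lr=0.05, init_temp=1.0, cooling=0.97,
                                  noise_scale=0.1, seed=0):
    """Toy sketch of simulated-annealing-perturbed gradient descent.

    Each step takes a plain gradient step on the architecture parameters,
    then proposes a random perturbation and accepts it with the Metropolis
    criterion at the current temperature. The temperature decays
    geometrically, so late in the search the update reduces to ordinary
    gradient descent. (Illustrative only; not the SA-NAS code.)
    """
    rng = np.random.default_rng(seed)
    alpha = np.asarray(alpha0, dtype=float)
    temp = init_temp
    for _ in range(steps):
        # Standard gradient-descent step on the (toy) search objective.
        alpha = alpha - lr * grad(alpha)
        # Propose a perturbed candidate and measure the change in loss.
        candidate = alpha + rng.normal(scale=noise_scale, size=alpha.shape)
        delta = objective(candidate) - objective(alpha)
        # Accept a worse candidate with probability exp(-delta / temp).
        if delta < 0 or rng.random() < np.exp(-delta / max(temp, 1e-12)):
            alpha = candidate
        temp *= cooling  # geometric cooling schedule (assumed form)
    return alpha

if __name__ == "__main__":
    # Toy non-convex objective standing in for the validation loss of a
    # differentiable NAS supernet over architecture parameters alpha.
    objective = lambda a: np.sum(a ** 2) + 0.5 * np.sum(np.sin(5 * a))
    grad = lambda a: 2 * a + 2.5 * np.cos(5 * a)
    print(sa_perturbed_gradient_descent(objective, grad, alpha0=[2.0, -1.5]))
```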
