
Towards Robust and Automatic Hyper-Parameter Tuning
Mathieu Tuli · Mahdi Hosseini · Konstantinos N Plataniotis

The task of hyper-parameter optimization (HPO) carries heavy computational costs because a model's weights and its hyper-parameters cannot tractably be optimized simultaneously. In this work, we introduce a new class of HPO methods and explore how the low-rank factorization of the convolutional weights of a convolutional neural network's intermediate layers can define an analytical response surface for optimizing hyper-parameters, using only training data. We quantify how this surface behaves as a surrogate for model performance and show that it can be solved using a trust-region search algorithm, which we call \AlgName. The algorithm outperforms state-of-the-art methods such as Bayesian Optimization and generalizes across choices of model, optimizer, and dataset.
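To make the core idea concrete, the sketch below shows one way a low-rank factorization of intermediate convolutional weights could yield a scalar, training-data-only surrogate signal. This is an illustrative assumption, not the paper's actual metric or algorithm: the unfolding convention, the use of the SVD, and the stable-rank summary are all choices made here for illustration.

```python
import numpy as np

def unfold_conv_weight(w):
    """Unfold a 4-D conv weight (out_ch, in_ch, kH, kW) into a 2-D matrix.

    Each output channel becomes one row; this is one common (assumed)
    unfolding convention for analyzing conv layers as linear maps.
    """
    out_channels = w.shape[0]
    return w.reshape(out_channels, -1)

def low_rank_surrogate(w):
    """Return a scalar summary of how low-rank the unfolded weight is.

    Here we use the stable rank, sum(sigma_i^2) / sigma_max^2, purely as
    a hypothetical example of an analytical quantity computable from the
    weights alone (no validation data needed).
    """
    m = unfold_conv_weight(w)
    # Singular values in descending order; no singular vectors needed.
    s = np.linalg.svd(m, compute_uv=False)
    stable_rank = float((s ** 2).sum() / (s[0] ** 2))
    return stable_rank

# Toy example: a random 64x32x3x3 conv kernel.
rng = np.random.default_rng(0)
w = rng.standard_normal((64, 32, 3, 3))
sr = low_rank_surrogate(w)
# sr lies in [1, min(64, 288)]; values near 1 indicate a nearly rank-1 map.
```

In an HPO loop, a quantity like this could be evaluated per layer after a short training run and fed to a search procedure (the paper uses a trust-region algorithm) instead of a costly validation-accuracy measurement.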

Author Information

Mathieu Tuli (University of Toronto and Vector Institute)
Mahdi Hosseini (University of New Brunswick)
Konstantinos N Plataniotis (University of Toronto)

More from the Same Authors

  • 2021 : NoFADE: Analyzing Diminishing Returns on CO2 Investment
    Andre Fu · Justin Tran · Andy Xie · Jonathan Spraggett · Elisa Ding · Chang-Won Lee · Kanav Singla · Mahdi Hosseini · Konstantinos N Plataniotis
  • 2021 : Fairness: P4AI: Approaching AI Ethics through Principlism
    Mahdi Hosseini · Konstantinos N Plataniotis
  • 2021 : Poster Session 2 (gather.town)
    Wenjie Li · Akhilesh Soni · Jinwuk Seok · Jianhao Ma · Jeffery Kline · Mathieu Tuli · Miaolan Xie · Robert Gower · Quanqi Hu · Matteo Cacciola · Yuanlu Bai · Boyue Li · Wenhao Zhan · Shentong Mo · Junhyung Lyle Kim · Sajad Fathi Hafshejani · Chris Junchi Li · Zhishuai Guo · Harshvardhan Harshvardhan · Neha Wadia · Tatjana Chavdarova · Difan Zou · Zixiang Chen · Aman Gupta · Jacques Chen · Betty Shea · Benoit Dherin · Aleksandr Beznosikov