Poster
Single Loop Gaussian Homotopy Method for Non-convex Optimization
Hidenori Iwakiri · Yuhang Wang · Shinji Ito · Akiko Takeda
The Gaussian homotopy (GH) method is a popular approach to finding better stationary points of non-convex optimization problems: it gradually reduces a parameter value $t$, which transforms the problem to be solved from an almost convex one into the original target one. Existing GH-based methods repeatedly call an iterative optimization solver to find a stationary point every time $t$ is updated, which incurs high computational costs. We propose a novel single loop framework for GH methods (SLGH) that updates the parameter $t$ and the optimization decision variables at the same time. Computational complexity analysis is performed on the SLGH algorithm in various situations: either a gradient or a gradient-free oracle of a GH function is available, in both deterministic and stochastic settings. The convergence rate of SLGH with a tuned hyperparameter matches that of gradient descent, even though the problem to be solved gradually changes with $t$. In numerical experiments, our SLGH algorithms converge faster than an existing double loop GH method while outperforming gradient descent-based methods in terms of finding a better solution.
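To make the single-loop idea concrete, here is a minimal Python sketch, not the paper's exact algorithm or constants: each iteration takes one zeroth-order gradient step on the Gaussian-smoothed objective $F(x,t)=\mathbb{E}_{u\sim\mathcal{N}(0,I)}[f(x+tu)]$ and, in the same loop, shrinks the smoothing level $t$ multiplicatively. The function name `slgh_sketch` and the hyperparameters `gamma`, `lr`, and `samples` are illustrative assumptions.

```python
import numpy as np

def slgh_sketch(f, x0, t0=1.0, gamma=0.999, lr=0.05, iters=2000, samples=20, seed=0):
    """Illustrative single-loop Gaussian homotopy (assumed variant, not the paper's).

    Each iteration (i) estimates grad_x F(x, t) for the smoothed objective
    F(x, t) = E_{u ~ N(0, I)}[f(x + t * u)] with a two-point Monte Carlo
    estimator, (ii) takes one gradient step, and (iii) decays t, so the
    smoothed problem morphs into the original one within a single loop.
    """
    rng = np.random.default_rng(seed)
    x, t = np.asarray(x0, dtype=float).copy(), t0
    d = x.size
    for _ in range(iters):
        # Two-point zeroth-order estimate of the smoothed gradient.
        u = rng.standard_normal((samples, d))
        fd = np.array([(f(x + t * ui) - f(x - t * ui)) / (2.0 * t) for ui in u])
        grad = (fd[:, None] * u).mean(axis=0)
        x -= lr * grad             # gradient step on the smoothed objective
        t = max(gamma * t, 1e-8)   # single-loop homotopy: decay t every step
    return x

# Toy usage on a 1-D non-convex objective with many spurious minima.
f = lambda x: float(np.sin(3.0 * x[0]) + 0.1 * x[0] ** 2)
print(slgh_sketch(f, x0=np.array([4.0])))
```

The contrast with a double loop GH method is that no inner solver is run to a stationary point before $t$ is updated; variable update and homotopy update happen together, one step per iteration.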
Author Information
Hidenori Iwakiri (The University of Tokyo / RIKEN AIP)
Yuhang Wang (The University of Tokyo)
Shinji Ito (NEC Corporation)
Akiko Takeda (The University of Tokyo / RIKEN)
More from the Same Authors
- 2022 Poster: Average Sensitivity of Euclidean k-Clustering
  Yuichi Yoshida · Shinji Ito
- 2022 Poster: Nearly Optimal Best-of-Both-Worlds Algorithms for Online Learning with Feedback Graphs
  Shinji Ito · Taira Tsuchiya · Junya Honda
- 2021 Poster: A Gradient Method for Multilevel Optimization
  Ryo Sato · Mirai Tanaka · Akiko Takeda
- 2021 Poster: On Optimal Robustness to Adversarial Corruption in Online Decision Problems
  Shinji Ito
- 2021 Poster: Hybrid Regret Bounds for Combinatorial Semi-Bandits and Adversarial Linear Bandits
  Shinji Ito
- 2020 Poster: Tight First- and Second-Order Regret Bounds for Adversarial Linear Bandits
  Shinji Ito · Shuichi Hirahara · Tasuku Soma · Yuichi Yoshida
- 2020 Poster: A Tight Lower Bound and Efficient Reduction for Swap Regret
  Shinji Ito
- 2020 Poster: Delay and Cooperation in Nonstochastic Linear Bandits
  Shinji Ito · Daisuke Hatano · Hanna Sumita · Kei Takemura · Takuro Fukunaga · Naonori Kakimura · Ken-Ichi Kawarabayashi
- 2020 Spotlight: A Tight Lower Bound and Efficient Reduction for Swap Regret
  Shinji Ito
- 2020 Spotlight: Delay and Cooperation in Nonstochastic Linear Bandits
  Shinji Ito · Daisuke Hatano · Hanna Sumita · Kei Takemura · Takuro Fukunaga · Naonori Kakimura · Ken-Ichi Kawarabayashi
- 2020 Spotlight: Tight First- and Second-Order Regret Bounds for Adversarial Linear Bandits
  Shinji Ito · Shuichi Hirahara · Tasuku Soma · Yuichi Yoshida
- 2019 Poster: Improved Regret Bounds for Bandit Combinatorial Optimization
  Shinji Ito · Daisuke Hatano · Hanna Sumita · Kei Takemura · Takuro Fukunaga · Naonori Kakimura · Ken-Ichi Kawarabayashi
- 2019 Poster: Submodular Function Minimization with Noisy Evaluation Oracle
  Shinji Ito
- 2019 Poster: Oracle-Efficient Algorithms for Online Linear Optimization with Bandit Feedback
  Shinji Ito · Daisuke Hatano · Hanna Sumita · Kei Takemura · Takuro Fukunaga · Naonori Kakimura · Ken-Ichi Kawarabayashi
- 2013 Poster: Global Solver and Its Efficient Approximation for Variational Bayesian Low-rank Subspace Clustering
  Shinichi Nakajima · Akiko Takeda · S. Derin Babacan · Masashi Sugiyama · Ichiro Takeuchi