Poster
A Gradient Method for Multilevel Optimization
Ryo Sato · Mirai Tanaka · Akiko Takeda
Although applications of multilevel optimization have been discussed since the 1990s, the development of solution methods was largely limited to the bilevel case because of the difficulty of the problem. Recently, in machine learning, Franceschi et al. proposed a method for solving bilevel optimization problems that replaces the lower-level problem with $T$ steepest-descent updates for a prechosen iteration number $T$. In this paper, we develop a gradient-based algorithm for multilevel optimization with $n$ levels based on their idea and prove that our reformulation asymptotically converges to the original multilevel problem. To the best of our knowledge, this is one of the first algorithms with a theoretical guarantee for multilevel optimization. Numerical experiments show that a trilevel hyperparameter learning model that accounts for data poisoning produces more stable predictions than an existing bilevel hyperparameter learning model in noisy data settings.
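To illustrate the core construction (replacing the lower-level problem with $T$ differentiable steepest-descent updates and differentiating through them), here is a minimal bilevel sketch in JAX. It is not the authors' code: the objectives inner_obj and outer_obj, the step size lr, and the iteration count T are hypothetical placeholders, and the paper applies the same construction recursively to all $n$ levels.

import jax
import jax.numpy as jnp

def inner_obj(x, y):
    # Hypothetical lower-level objective g(x, y).
    return jnp.sum((y - x) ** 2) + 0.1 * jnp.sum(y ** 2)

def outer_obj(x, y):
    # Hypothetical upper-level objective f(x, y).
    return jnp.sum(y ** 2) + jnp.sum((x - 1.0) ** 2)

def unrolled_lower(x, y0, T=20, lr=0.1):
    # Replace argmin_y g(x, y) with T steepest-descent updates,
    # keeping the whole trajectory differentiable with respect to x.
    y = y0
    for _ in range(T):
        y = y - lr * jax.grad(inner_obj, argnums=1)(x, y)
    return y

def surrogate(x):
    # Surrogate upper-level objective: evaluate f at the T-step
    # approximation of the lower-level solution.
    yT = unrolled_lower(x, jnp.zeros(2))
    return outer_obj(x, yT)

# Hypergradient of the unrolled surrogate; as T grows, the surrogate
# approaches the original bilevel problem (the paper proves the
# analogous asymptotic statement for the n-level reformulation).
x = jnp.array([0.5, -0.5])
hypergrad = jax.grad(surrogate)(x)
print(hypergrad)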
Author Information
Ryo Sato (The University of Tokyo)
Mirai Tanaka (The Institute of Statistical Mathematics / RIKEN)
Akiko Takeda (The University of Tokyo / RIKEN)
More from the Same Authors
- 2022 Poster: Single Loop Gaussian Homotopy Method for Non-convex Optimization
  Hidenori Iwakiri · Yuhang Wang · Shinji Ito · Akiko Takeda
- 2019 Poster: Semi-flat minima and saddle points by embedding neural networks to overparameterization
  Kenji Fukumizu · Shoichiro Yamaguchi · Yoh-ichi Mototake · Mirai Tanaka
- 2013 Poster: Global Solver and Its Efficient Approximation for Variational Bayesian Low-rank Subspace Clustering
  Shinichi Nakajima · Akiko Takeda · S. Derin Babacan · Masashi Sugiyama · Ichiro Takeuchi