Poster
An Accelerated Gradient Method for Convex Smooth Simple Bilevel Optimization
Jincheng Cao · Ruichen Jiang · Erfan Yazdandoost Hamedani · Aryan Mokhtari
West Ballroom A-D #5909
Abstract:
In this paper, we focus on simple bilevel optimization problems, where we minimize a convex smooth objective function over the optimal solution set of another convex smooth constrained optimization problem. We present a novel bilevel optimization method that locally approximates the solution set of the lower-level problem using a cutting-plane approach and employs an accelerated gradient-based update to reduce the upper-level objective function over the approximated solution set. We measure the performance of our method in terms of suboptimality and infeasibility errors and provide non-asymptotic convergence guarantees for both error criteria. Specifically, when the feasible set is compact, we show that our method requires at most $\mathcal{O}(\max\{1/\sqrt{\epsilon_f}, 1/\sqrt{\epsilon_g}\})$ iterations to find a solution that is $\epsilon_f$-suboptimal and $\epsilon_g$-infeasible. Moreover, under the additional assumption that the lower-level objective satisfies the $r$-th Hölderian error bound, we show that our method achieves an iteration complexity of $\mathcal{O}(\max\{\epsilon_f^{-\frac{2r-1}{2r}}, \epsilon_g^{-\frac{2r-1}{2r}}\})$, which matches the optimal complexity of single-level convex constrained optimization when $r = 1$.
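The cutting-plane idea in the abstract can be sketched on a toy instance: by convexity, linearizing the lower-level objective at the current iterate yields a halfspace that provably contains the entire lower-level solution set, and the upper-level gradient step is then projected onto that halfspace. The code below is a minimal illustration of this idea, not the authors' algorithm: the problem data, the diminishing step-size schedule, and the plain (non-accelerated) projected-gradient update are all simplifying assumptions.

```python
# Toy "simple bilevel" instance (hypothetical data, for illustration only).
#
# Lower level: min_x g(x) = 0.5 * (x1 + x2 - 1)^2
#   -> the solution set is the whole line x1 + x2 = 1, with g* = 0.
# Upper level: min f(x) = 0.5 * ||x - b||^2 with b = (2, 0),
#   minimized over the lower-level solution set.
# Exact answer: the projection of b onto that line, i.e. (1.5, -0.5).

b = (2.0, 0.0)

def g(x):
    return 0.5 * (x[0] + x[1] - 1.0) ** 2

def grad_g(x):
    s = x[0] + x[1] - 1.0
    return (s, s)

def grad_f(x):
    return (x[0] - b[0], x[1] - b[1])

def project_halfspace(z, a, rhs):
    """Euclidean projection of z onto the halfspace {w : <a, w> <= rhs}."""
    nrm2 = a[0] * a[0] + a[1] * a[1]
    viol = a[0] * z[0] + a[1] * z[1] - rhs
    if nrm2 == 0.0 or viol <= 0.0:
        return z  # already inside (or degenerate cut): nothing to do
    t = viol / nrm2
    return (z[0] - t * a[0], z[1] - t * a[1])

x = (0.0, 0.0)
for k in range(1, 301):
    # Cutting plane at x: by convexity, every lower-level minimizer w obeys
    #   g(x) + <grad_g(x), w - x> <= g* = 0,
    # so the halfspace {w : <grad_g(x), w> <= <grad_g(x), x> - g(x)}
    # contains the entire lower-level solution set.
    a = grad_g(x)
    rhs = a[0] * x[0] + a[1] * x[1] - g(x)
    # Upper-level gradient step (diminishing step size), then project back
    # onto the cutting-plane halfspace.
    eta = 1.0 / (k + 1)
    gf = grad_f(x)
    z = (x[0] - eta * gf[0], x[1] - eta * gf[1])
    x = project_halfspace(z, a, rhs)

print(x)  # close to (1.5, -0.5)
```

Each projection halves the infeasibility of the iterate with respect to the lower-level constraint, while the gradient steps drive the upper-level objective down along the approximated solution set; the paper's method replaces the plain gradient step with an accelerated update to obtain the rates stated in the abstract.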