Poster

Optimization of Smooth Functions with Noisy Observations: Local Minimax Rates

Yining Wang · Sivaraman Balakrishnan · Aarti Singh

Room 210 #65

Keywords: [ Frequentist Statistics ] [ Learning Theory ] [ Information Theory ]


Abstract:

We consider the problem of global optimization of an unknown non-convex smooth function given noisy zeroth-order feedback. We propose a local minimax framework to study the fundamental difficulty of optimizing smooth functions with adaptive function evaluations. We show that for functions with fast growth around their global minima, carefully designed optimization algorithms can identify a near-global minimizer with many fewer queries than worst-case global minimax theory predicts. For the special case of strongly convex and smooth functions, our implied convergence rates match those established for zeroth-order convex optimization problems. On the other hand, we show that in the worst case no algorithm can converge faster than the minimax rate of estimating an unknown function in the ℓ∞-norm. Finally, we show that non-adaptive algorithms, although optimal in a global minimax sense, do not attain the optimal local minimax rate.
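To make the problem setting concrete, here is a minimal Python sketch of a non-adaptive zeroth-order baseline: query a uniform grid, average repeated noisy evaluations, and return the empirically best point. This is not the paper's algorithm; the function name, the domain [0, 1], the noise model, and the test objective are all illustrative assumptions. Per the abstract, such non-adaptive schemes can be globally minimax optimal yet still miss the faster local rates on functions that grow quickly around their minima.

```python
import numpy as np

def noisy_zeroth_order_minimize(f, n_queries, grid_size, noise_std=1.0, seed=0):
    """Non-adaptive baseline (hypothetical helper, not the paper's method):
    spend the query budget uniformly over a grid on [0, 1], average the
    noisy evaluations at each point, and return the grid point with the
    smallest empirical mean."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(0.0, 1.0, grid_size)
    reps = max(1, n_queries // grid_size)  # noisy evaluations per grid point
    means = np.array([
        np.mean(f(x) + noise_std * rng.standard_normal(reps))
        for x in grid
    ])
    return grid[np.argmin(means)]  # near-minimizer estimate

# Example: a smooth non-convex objective observed through Gaussian noise.
x_hat = noisy_zeroth_order_minimize(
    lambda x: (x - 0.3) ** 2 + 0.05 * np.sin(12 * x),
    n_queries=20_000, grid_size=200)
print(f"estimated minimizer: {x_hat:.3f}")
```

An adaptive algorithm, in contrast, would concentrate later queries near the empirically promising region instead of spreading them uniformly, which is what allows the faster local minimax rates the paper studies.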
