

Spotlight

Parameter-Free Online Learning via Model Selection

Dylan J Foster · Satyen Kale · Mehryar Mohri · Karthik Sridharan

Abstract: We introduce a new framework for deriving efficient algorithms that obtain model selection oracle inequalities in the adversarial online learning setting, also known as parameter-free online learning. While prior work in this area has focused on specific, highly structured function classes, such as nested balls in a Hilbert space, we eschew this approach and propose a generic meta-algorithm framework that achieves oracle inequalities under minimal structural assumptions. This allows us to derive new computationally efficient algorithms with oracle bounds for a wide range of settings where such results were previously unavailable. We give the first computationally efficient algorithms that work in arbitrary Banach spaces under mild smoothness assumptions; previous results applied only to the Hilbert case. We further derive new oracle inequalities for various matrix classes, non-nested convex sets, and $\mathbb{R}^{d}$ with generic regularizers. Finally, we generalize further by providing oracle inequalities for arbitrary non-linear classes in the contextual learning model; in particular, we give new algorithms for learning with multiple kernels. These results are all derived through a unified meta-algorithm scheme built around a novel "multi-scale" algorithm for prediction with expert advice based on random playout, which may be of independent interest.
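To make the setting concrete: a model selection oracle inequality asks the learner to compete with a nested family of classes $\mathcal{F}_1 \subseteq \mathcal{F}_2 \subseteq \cdots$ simultaneously, guaranteeing $\sum_{t=1}^{T}\ell_t(\hat{y}_t) \le \inf_{f \in \mathcal{F}_k}\sum_{t=1}^{T}\ell_t(f) + \mathrm{pen}_T(k)$ for all $k$ at once (notation here is illustrative, not the paper's exact statement). The central obstacle is that experts corresponding to larger classes suffer losses at larger scales, so a standard experts algorithm tuned to the worst-case scale pays too much against small classes. The sketch below is a minimal illustration of the multi-scale aggregation idea using exponential weights with per-expert learning rates $\eta_i \propto 1/c_i$, where $c_i$ bounds the magnitude of expert $i$'s losses; the function names and the learning-rate choice are assumptions for illustration, and the paper's actual meta-algorithm instead relies on a random-playout-based multi-scale experts subroutine with stronger guarantees.

    import numpy as np

    def multiscale_weights(cum_losses, scales, prior, T):
        """Round-t distribution over experts with per-scale learning rates.

        cum_losses[i]: expert i's cumulative loss so far.
        scales[i]:     known bound c_i on |loss| of expert i per round.
        prior[i]:      prior weight on expert i.
        Hypothetical choice eta_i = 1 / (c_i * sqrt(T)), so the penalty
        against expert i degrades gracefully with its scale c_i.
        """
        etas = 1.0 / (np.asarray(scales, dtype=float) * np.sqrt(T))
        logits = np.log(prior) - etas * np.asarray(cum_losses, dtype=float)
        logits -= logits.max()            # numerical stability
        w = np.exp(logits)
        return w / w.sum()

    def run(losses, scales, prior=None):
        """Aggregate n experts over T rounds; losses has shape (T, n)."""
        T, n = losses.shape
        prior = np.full(n, 1.0 / n) if prior is None else np.asarray(prior)
        cum = np.zeros(n)
        total = 0.0
        for t in range(T):
            w = multiscale_weights(cum, scales, prior, T)
            total += float(w @ losses[t])   # learner plays the mixture w
            cum += losses[t]
        return total

    # Toy usage: expert 0 operates at scale 1, expert 1 at scale 10.
    rng = np.random.default_rng(0)
    L = np.column_stack([rng.uniform(-1, 1, 500), rng.uniform(-10, 10, 500)])
    print(run(L, scales=[1.0, 10.0]))

In the paper's framework, a multi-scale experts subroutine of this kind forms the outer layer of the meta-algorithm: each sub-algorithm (e.g., one per ball radius, matrix class, or kernel) is treated as an expert operating at its own loss scale, and multi-scale aggregation converts the sub-algorithms' individual regret bounds into an oracle inequality. The per-expert learning-rate tuning above is only one standard route to scale sensitivity; the random-playout construction in the paper obtains regret against each expert $i$ that scales with its own $c_i$ (up to logarithmic and prior-dependent factors), which naive tuning does not deliver.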
