

Poster

The Many Faces of Optimal Weak-to-Strong Learning

Mikael Møller Høgsgaard · Kasper Green Larsen · Markus Engelund Mathiasen

Wed 11 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Boosting is an extremely successful idea, allowing one to combine multiple low-accuracy classifiers into a much more accurate voting classifier. In this work, we present a new and surprisingly simple Boosting algorithm that obtains a provably optimal sample complexity. Sample-optimal Boosting algorithms have only recently been developed, and our new algorithm has the fastest runtime among all such algorithms and is the simplest to describe: Partition your training data into 29 disjoint pieces of equal size, run AdaBoost on each, and combine the resulting classifiers via a majority vote. In addition to this theoretical contribution, we also perform the first empirical comparison of the proposed sample-optimal Boosting algorithms. Our experiments suggest that our new algorithm has the best accuracy on large data sets.
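To make the recipe in the abstract concrete, here is a minimal sketch in Python (not the authors' implementation): split the training set into k disjoint pieces of equal size, run AdaBoost on each piece, and predict by majority vote over the resulting classifiers. It assumes scikit-learn's AdaBoostClassifier as the boosting routine, and the split count k and hyperparameters shown are illustrative placeholders.

```python
# Sketch of "partition, run AdaBoost on each piece, majority-vote" (assumptions noted above).
import numpy as np
from sklearn.ensemble import AdaBoostClassifier


def majority_of_adaboosts(X, y, k, n_estimators=100, random_state=0):
    """Train k AdaBoost classifiers on k disjoint, equal-size pieces of (X, y)
    and return a predictor that takes a per-sample majority vote."""
    rng = np.random.default_rng(random_state)
    perm = rng.permutation(len(X))          # shuffle indices before splitting
    pieces = np.array_split(perm, k)        # k disjoint index sets of (near-)equal size

    models = []
    for idx in pieces:
        clf = AdaBoostClassifier(n_estimators=n_estimators, random_state=random_state)
        clf.fit(X[idx], y[idx])             # AdaBoost on one piece only
        models.append(clf)

    def predict(X_new):
        # Collect the k predictions for each sample and return the most common label.
        votes = np.stack([m.predict(X_new) for m in models], axis=0)  # shape (k, n_samples)
        majority = []
        for column in votes.T:
            labels, counts = np.unique(column, return_counts=True)
            majority.append(labels[np.argmax(counts)])
        return np.array(majority)

    return predict
```

Usage would look like `predict = majority_of_adaboosts(X_train, y_train, k=29); y_hat = predict(X_test)`; choosing k odd avoids ties in the binary-label case.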
