

Spotlight

Boosting Algorithms for Maximizing the Soft Margin

Manfred K. Warmuth · Karen Glocer · Gunnar Rätsch


Abstract: We present a novel boosting algorithm, called SoftBoost, designed for sets of binary labeled examples that are not necessarily separable by convex combinations of base hypotheses. Our algorithm achieves robustness by \emph{capping} the distributions on the examples. Our update of the distribution is motivated by minimizing a relative entropy subject to the capping constraints and constraints on the edges of the obtained base hypotheses. The capping constraints imply a soft margin in the dual optimization problem, and our algorithm produces a convex combination of hypotheses whose \emph{soft margin} is within $\delta$ of the optimum. We employ relative entropy projection methods to prove an $O(\frac{\ln N}{\delta^2})$ iteration bound for our algorithm, where $N$ is the number of examples.
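The abstract's distribution update is a relative entropy projection onto a capped, edge-constrained set. Below is a minimal sketch of such a projection, not the authors' code, using cvxpy as a generic convex solver; the names `nu`, `U`, `gamma_hat`, and `delta`, and the choice of a per-example cap of $1/(\nu N)$, are illustrative assumptions based on the constraints the abstract describes.

```python
# Hedged sketch of a capped relative-entropy projection (not the paper's implementation).
import numpy as np
import cvxpy as cp

def capped_entropy_projection(d0, U, gamma_hat, delta, nu):
    """Project the reference distribution d0 onto the set of capped
    distributions whose edges on past base hypotheses are small.

    d0:        reference distribution over the N examples (e.g. uniform)
    U:         N x T matrix with U[i, t] = y_i * h_t(x_i) for past hypotheses
    gamma_hat: target edge value (assumed: e.g. smallest edge seen so far)
    delta:     accuracy parameter from the iteration bound
    nu:        capping parameter; each weight is capped at 1/(nu*N)
    """
    N = d0.shape[0]
    d = cp.Variable(N, nonneg=True)
    # Minimize the relative entropy between d and the reference d0.
    objective = cp.Minimize(cp.sum(cp.rel_entr(d, d0)))
    constraints = [
        cp.sum(d) == 1,                 # d is a probability distribution
        d <= 1.0 / (nu * N),            # capping constraint (soft margin in the dual)
        U.T @ d <= gamma_hat - delta,   # edge constraints on past hypotheses
    ]
    cp.Problem(objective, constraints).solve()
    return d.value
```

In a boosting loop, one would call this projection after each new base hypothesis is obtained and then pass the updated distribution to the base learner; the specific schedule for tightening the edge constraints is described in the paper, not reproduced here.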
