
Boosting Algorithms for Maximizing the Soft Margin
Manfred K. Warmuth · Karen Glocer · Gunnar Rätsch

Tue Dec 04 03:20 PM -- 03:30 PM (PST)
We present a novel boosting algorithm, called SoftBoost, designed for sets of binary labeled examples that are not necessarily separable by convex combinations of base hypotheses. Our algorithm achieves robustness by \emph{capping} the distributions on the examples. Our update of the distribution is motivated by minimizing a relative entropy subject to the capping constraints and constraints on the edges of the obtained base hypotheses. The capping constraints imply a soft margin in the dual optimization problem, and our algorithm produces a convex combination of hypotheses whose \emph{soft margin} is within $\delta$ of the optimum. We employ relative entropy projection methods to prove an $O(\frac{\ln N}{\delta^2})$ iteration bound for our algorithm, where $N$ is the number of examples.
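The capping constraint bounds each example's weight from above, and the distribution update is a relative-entropy projection onto the resulting constraint set. The sketch below illustrates only the capping part of such a projection (it omits the edge constraints on the base hypotheses); the function name, the cap parameter, and the closed form $d_i = \min(\text{cap},\, d^0_i / Z)$ are our illustration under standard KL-projection assumptions, not code from the paper.

```python
import numpy as np

def project_capped_simplex(d0, cap):
    """Relative-entropy (KL) projection of a distribution d0 onto
    the capped simplex {d : sum(d) = 1, 0 <= d_i <= cap}.

    The minimizer of KL(d || d0) under these constraints has the
    form d_i = min(cap, d0_i / Z), where Z normalizes the uncapped
    coordinates; we find the capped set by iterative renormalization.
    """
    d0 = np.asarray(d0, dtype=float)
    n = len(d0)
    assert cap * n >= 1.0, "cap too small: constraint set is empty"
    capped = np.zeros(n, dtype=bool)
    while True:
        # Mass left over for the coordinates that are not at the cap.
        free_mass = 1.0 - cap * capped.sum()
        Z = d0[~capped].sum() / free_mass
        d = np.where(capped, cap, d0 / Z)
        newly_capped = (~capped) & (d > cap)
        if not newly_capped.any():
            return np.minimum(d, cap)
        capped |= newly_capped
```

For example, projecting `[0.9, 0.05, 0.05]` with `cap = 0.5` caps the first coordinate and redistributes the remaining mass proportionally, yielding `[0.5, 0.25, 0.25]`. In the boosting setting the cap would prevent any single (possibly noisy) example from dominating the distribution.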

Author Information

Manfred K. Warmuth (Google Brain)
Karen Glocer (UC Santa Cruz)
Gunnar Rätsch (ETHZ)
