
Multiclass Boosting: Theory and Algorithms
Mohammad J Saberian · Nuno Vasconcelos

Mon Dec 12 10:00 AM -- 02:59 PM (PST)

The problem of multiclass boosting is considered. A new framework, based on multi-dimensional codewords and predictors, is introduced. The optimal set of codewords is derived, and a margin-enforcing loss is proposed. The resulting risk is minimized by gradient descent on a multidimensional functional space. Two algorithms are proposed: 1) CD-MCBoost, based on coordinate descent, which updates one predictor component at a time, and 2) GD-MCBoost, based on gradient descent, which updates all components jointly. The algorithms differ in the weak learners they support, but both are shown to be 1) Bayes consistent, 2) margin enforcing, and 3) convergent to the global minimum of the risk. Both also reduce to AdaBoost when there are only two classes. Experiments show that both methods outperform previous multiclass boosting approaches on a number of datasets.
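The pipeline described above (maximally separated codewords in R^{M-1}, a multi-dimensional predictor, and joint gradient-descent updates of all components) can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the simplex codeword construction, the exponential margin risk, the regression-stump weak learner, and all names (`simplex_codewords`, `gd_mcboost`, ...) are simplifying assumptions for the sketch.

```python
import numpy as np

def simplex_codewords(M):
    """M maximally separated unit codewords in R^{M-1} (regular-simplex vertices).
    Assumed stand-in for the paper's optimal codeword set."""
    E = np.eye(M) - np.ones((M, M)) / M            # centered one-hot labels
    U, _, _ = np.linalg.svd(E)
    Y = E @ U[:, :M - 1]                           # coordinates in the (M-1)-dim span
    return Y / np.linalg.norm(Y, axis=1, keepdims=True)

def fit_stump(x, r):
    """Least-squares regression stump on a 1-D feature for target r."""
    order = np.argsort(x)
    xs, rs = x[order], r[order]
    best = (np.inf, xs[0], rs.mean(), rs.mean())
    for t in range(1, len(xs)):
        L, R = rs[:t], rs[t:]
        err = ((L - L.mean()) ** 2).sum() + ((R - R.mean()) ** 2).sum()
        if err < best[0]:
            best = (err, (xs[t - 1] + xs[t]) / 2, L.mean(), R.mean())
    return best[1:]                                # (threshold, left value, right value)

def gd_mcboost(x, c, M, iters=50, lr=0.5):
    """GD-MCBoost-style loop: all components of the vector predictor are
    updated jointly at each iteration (a sketch, not the paper's algorithm)."""
    Y = simplex_codewords(M)
    F = np.zeros((len(x), M - 1))                  # multi-dimensional predictor values
    rounds = []
    for _ in range(iters):
        # negative functional gradient of an exponential margin risk of the form
        # R[f] = sum_i sum_{j != c_i} exp(0.5 * <f(x_i), y_j - y_{c_i}>)
        G = np.zeros_like(F)
        for i in range(len(x)):
            yc = Y[c[i]]
            for j in range(M):
                if j != c[i]:
                    w = np.exp(0.5 * F[i] @ (Y[j] - yc))
                    G[i] += 0.5 * w * (yc - Y[j])
        # fit one stump per predictor component to the gradient and step jointly
        step = []
        for d in range(M - 1):
            thr, lo, hi = fit_stump(x, G[:, d])
            F[:, d] += lr * np.where(x <= thr, lo, hi)
            step.append((thr, lo, hi))
        rounds.append(step)
    return Y, rounds

def predict(x, Y, rounds, lr=0.5):
    """Decode by the codeword with the largest inner product with f(x)."""
    F = np.zeros((len(x), Y.shape[1]))
    for step in rounds:
        for d, (thr, lo, hi) in enumerate(step):
            F[:, d] += lr * np.where(x <= thr, lo, hi)
    return np.argmax(F @ Y.T, axis=1)

# toy 3-class problem on a single feature
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 150)
c = np.digitize(x, [1 / 3, 2 / 3])                 # class = interval of x
Y, rounds = gd_mcboost(x, c, M=3)
acc = (predict(x, Y, rounds) == c).mean()
```

With M = 2 the codewords reduce to ±1 in R^1 and the inner-product margin to the usual binary boosting margin, matching the reduction-to-AdaBoost claim; a CD-MCBoost-style variant would instead pick and update a single component d per round.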

Author Information

Mohammad J Saberian (UC San Diego)
Nuno Vasconcelos (UC San Diego)
