Variance Penalizing AdaBoost
Pannagadatta K Shivaswamy · Tony Jebara

Wed Dec 14 08:45 AM -- 02:59 PM (PST)

This paper proposes a novel boosting algorithm, VadaBoost, motivated by recent empirical Bernstein bounds. VadaBoost iteratively minimizes a cost function that balances the sample mean and the sample variance of the exponential loss. Each step minimizes this cost efficiently by providing weighted data to a weak learner, rather than requiring a brute-force evaluation of all possible weak learners. The algorithm thus overcomes a key limitation of previous empirical Bernstein boosting methods, which required brute-force enumeration of all possible weak learners. Experimental results confirm that the new algorithm matches the performance improvements of EBBoost yet goes beyond decision stumps to handle any weak learner. Significant performance gains over AdaBoost are obtained with arbitrary weak learners, including decision trees (CART).
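To make the abstract's setup concrete, here is a minimal boosting sketch in the style the paper builds on: an AdaBoost-style loop over decision stumps (a weak learner fed weighted data) that also tracks a cost balancing the sample mean and sample variance of the exponential loss — the quantity the empirical-Bernstein view motivates. This is an illustration only: the reweighting shown is standard AdaBoost, not VadaBoost's actual update, and the trade-off parameter `lam` is a hypothetical stand-in for the paper's balancing term.

```python
import numpy as np

def stump_predict(X, feat, thresh, sign):
    # A decision stump: threshold one feature, with a polarity sign.
    return sign * np.where(X[:, feat] > thresh, 1.0, -1.0)

def best_stump(X, y, w):
    # The weak learner: exhaustively pick the stump minimizing weighted
    # error on the (weighted) data it is handed.
    best, best_err = None, np.inf
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            for sign in (1.0, -1.0):
                pred = stump_predict(X, feat, thresh, sign)
                err = w[pred != y].sum()
                if err < best_err:
                    best_err, best = err, (feat, thresh, sign)
    return best, best_err

def boost(X, y, rounds=10, lam=0.5):
    # AdaBoost-style loop; `lam` is a hypothetical mean/variance trade-off
    # used here only to report the cost VadaBoost targets, not its update.
    n = len(y)
    F = np.zeros(n)            # running ensemble score
    w = np.ones(n) / n         # example weights passed to the weak learner
    ensemble, cost = [], np.inf
    for _ in range(rounds):
        (feat, thresh, sign), err = best_stump(X, y, w)
        err = min(max(err, 1e-12), 1 - 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)
        F += alpha * stump_predict(X, feat, thresh, sign)
        ensemble.append((alpha, feat, thresh, sign))
        u = np.exp(-y * F)     # per-example exponential loss
        # Cost balancing sample mean and sample variance of the loss:
        cost = (1 - lam) * u.mean() + lam * u.var()
        # Standard AdaBoost reweighting; VadaBoost instead derives the
        # weak learner's weights from the mean/variance cost above.
        w = u / u.sum()
    return ensemble, F, cost
```

On a linearly separable toy set a few rounds drive both the training error and this mean-plus-variance cost toward zero; the paper's contribution is an update that minimizes such a cost directly while still only requiring weighted data, so any weak learner (e.g. CART) can be plugged in.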

Author Information

Pannagadatta K Shivaswamy (Cornell University)
Tony Jebara (Spotify)
