Poster

Cost efficient gradient boosting

Sven Peter · Ferran Diego · Fred Hamprecht · Boaz Nadler

Pacific Ballroom #30

Keywords: [ Classification ] [ Regression ] [ Boosting and Ensemble Methods ] [ Algorithms ]


Abstract:

Many applications require learning classifiers or regressors that are both accurate and cheap to evaluate. Prediction cost can be drastically reduced if the learned predictor is constructed such that, on the majority of inputs, it relies on cheap features and fast evaluations. The main challenge is to do so with little loss in accuracy. In this work we propose a budget-aware strategy based on deep boosted regression trees. In contrast to previous approaches to learning with cost penalties, our method can grow very deep trees that are nonetheless cheap to compute on average. We evaluate our method on a number of datasets and find that it outperforms the current state of the art by a large margin. Our algorithm is easy to implement, and its learning time is comparable to that of the original gradient boosting. Source code is made available at http://github.com/svenpeter42/LightGBM-CEGB.
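A minimal sketch of the idea behind cost-penalized boosting (not the authors' implementation; the function and parameter names here, including the tradeoff constant `lam`, are hypothetical): when scoring candidate splits, the usual gain is reduced by a penalty proportional to the feature's acquisition cost and the number of examples that would have to evaluate it. Splits deep in the tree reach few examples, so deep trees can still be cheap on average.

```python
# Illustrative sketch of cost-penalized split selection.
# `lam` trades off accuracy against prediction cost (hypothetical name).

def penalized_gain(gain, feature_cost, n_examples_at_node, lam=0.1):
    """Split gain minus a cost penalty.

    The penalty scales with how many examples must evaluate the
    feature, so splits affecting few examples are penalized less.
    """
    return gain - lam * feature_cost * n_examples_at_node


def best_split(candidates, lam=0.1):
    """Pick the (gain, feature_cost, n_examples) candidate with the
    highest penalized gain."""
    return max(candidates,
               key=lambda c: penalized_gain(c[0], c[1], c[2], lam))


# A split with lower raw gain but a much cheaper feature can win:
candidates = [
    (10.0, 5.0, 100),  # high gain, expensive feature
    (6.0, 0.1, 100),   # lower gain, cheap feature
]
# best_split(candidates) returns (6.0, 0.1, 100):
# penalized gains are 10 - 0.1*5*100 = -40 vs 6 - 0.1*0.1*100 = 5.
```

In the linked LightGBM-based source code, this accuracy/cost tradeoff is exposed as a training-time setting rather than a post-hoc pruning step, which is what lets learning time stay close to that of plain gradient boosting.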
