Cost efficient gradient boosting
Sven Peter · Ferran Diego · Fred Hamprecht · Boaz Nadler

Mon Dec 04 06:30 PM -- 10:30 PM (PST) @ Pacific Ballroom #30

Many applications require learning classifiers or regressors that are both accurate and cheap to evaluate. Prediction cost can be drastically reduced if the learned predictor is constructed such that on the majority of the inputs, it uses cheap features and fast evaluations. The main challenge is to do so with little loss in accuracy. In this work we propose a budget-aware strategy based on deep boosted regression trees. In contrast to previous approaches to learning with cost penalties, our method can grow very deep trees that on average are nonetheless cheap to compute. We evaluate our method on a number of datasets and find that it outperforms the current state of the art by a large margin. Our algorithm is easy to implement and its learning time is comparable to that of the original gradient boosting. Source code is made available at http://github.com/svenpeter42/LightGBM-CEGB.
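The core trade-off described above — accepting a slightly smaller accuracy gain in exchange for using cheaper features — can be illustrated with a minimal, hypothetical sketch of cost-penalized split selection for a regression tree. This is not the authors' implementation (their code lives in the LightGBM fork linked above); the function and parameter names (`cost_aware_best_split`, `feature_costs`, `tradeoff`) are illustrative only. The sketch scores each candidate split by its impurity reduction minus a penalty proportional to the cost of the feature it reads.

```python
# Illustrative sketch (NOT the authors' implementation): cost-aware split
# selection for a regression tree, where each split's gain is penalized
# by the evaluation cost of the feature it uses.

def variance(ys):
    """Variance of a list of targets (impurity measure for regression)."""
    if not ys:
        return 0.0
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys) / len(ys)

def cost_aware_best_split(X, y, feature_costs, tradeoff):
    """Return the (feature, threshold) pair maximizing
    impurity reduction - tradeoff * feature cost,
    or None if no split has a positive penalized score.
    `feature_costs` and `tradeoff` are illustrative names, not the paper's API."""
    n = len(y)
    base = variance(y) * n  # total impurity before splitting
    best, best_score = None, 0.0
    for f, cost in enumerate(feature_costs):
        values = sorted({row[f] for row in X})
        for t in values[:-1]:  # candidate thresholds between distinct values
            left = [y[i] for i in range(n) if X[i][f] <= t]
            right = [y[i] for i in range(n) if X[i][f] > t]
            gain = base - variance(left) * len(left) - variance(right) * len(right)
            score = gain - tradeoff * cost  # penalize expensive features
            if score > best_score:
                best_score, best = score, (f, t)
    return best
```

With two equally informative features of different cost, the penalized score makes the greedy search prefer the cheap one; with a large enough `tradeoff`, no split is worth its cost and the node stays a leaf — which is how a budget-aware learner keeps the average prediction cost low.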

Author Information

Sven Peter (Heidelberg University)
Ferran Diego (Bosch)
Fred Hamprecht (Heidelberg University)
Boaz Nadler (Weizmann Institute of Science)
