
Multi-Layered Gradient Boosting Decision Trees
Ji Feng · Yang Yu · Zhi-Hua Zhou

Thu Dec 06 07:45 AM -- 09:45 AM (PST) @ Room 517 AB #144

Multi-layered distributed representation is believed to be the key ingredient of deep neural networks, especially in cognitive tasks like computer vision. While non-differentiable models such as gradient boosting decision trees (GBDTs) are still the dominant methods for modeling discrete or tabular data, it is hard to endow them with such representation learning ability. In this work, we propose the multi-layered GBDT forest (mGBDT), with an explicit emphasis on exploring the ability to learn hierarchical distributed representations by stacking several layers of regression GBDTs as its building blocks. The model can be jointly trained by a variant of target propagation across layers, without requiring backpropagation or differentiability. Experiments confirmed the effectiveness of the model in terms of both performance and representation learning ability.
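The idea of stacking regression GBDT layers and training them with target propagation can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a two-layer model, uses scikit-learn's `GradientBoostingRegressor` (wrapped in `MultiOutputRegressor` for the multi-dimensional hidden layer) as the layer learner, initializes the hidden representation randomly, and learns a single inverse mapping to propagate pseudo-targets backward; the hyperparameters (`hidden_dim`, estimator counts, number of rounds) are illustrative choices.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.RandomState(0)
X, y = make_regression(n_samples=200, n_features=10, noise=0.1, random_state=0)

hidden_dim = 5  # dimensionality of the learned hidden representation (assumption)

def fit_multi(inputs, targets):
    """Fit a multi-output regression GBDT: one boosted ensemble per output dim."""
    model = MultiOutputRegressor(
        GradientBoostingRegressor(n_estimators=50, max_depth=3, random_state=0))
    model.fit(inputs, targets)
    return model

# Initialize the hidden layer F1 against a random target representation.
F1 = fit_multi(X, rng.randn(X.shape[0], hidden_dim))
Z1 = F1.predict(X)                      # forward pass through layer 1

# Output layer F2 maps the hidden representation to y.
F2 = GradientBoostingRegressor(n_estimators=100, random_state=0).fit(Z1, y)

# Inverse mapping G2 learns to map the output-layer target back into
# hidden space, playing the role of the inverse in target propagation.
G2 = fit_multi(y.reshape(-1, 1), Z1)

for _ in range(2):                      # a couple of target-propagation rounds
    t1 = G2.predict(y.reshape(-1, 1))   # pseudo-target for the hidden layer
    F1 = fit_multi(X, t1)               # refit layer 1 toward its pseudo-target
    Z1 = F1.predict(X)
    F2 = GradientBoostingRegressor(
        n_estimators=100, random_state=0).fit(Z1, y)   # refit output layer
    G2 = fit_multi(y.reshape(-1, 1), Z1)               # refresh the inverse

pred = F2.predict(F1.predict(X))        # end-to-end prediction
```

No gradient ever flows between layers: each refit is an ordinary supervised GBDT fit against a pseudo-target, which is what lets the whole stack be trained without differentiability.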

Author Information

Ji Feng (Nanjing University & Sinovation Ventures AI Institute)
Yang Yu (Nanjing University)
Zhi-Hua Zhou (Nanjing University)
