

Poster

Fast and Flexible Monotonic Functions with Ensembles of Lattices

Mahdi Milani Fard · Kevin Canini · Andrew Cotter · Jan Pfeifer · Maya Gupta

Area 5+6+7+8 #74

Keywords: [ Ensemble Methods and Boosting ] [ (Other) Cognitive Science ] [ (Other) Machine Learning Topics ] [ (Other) Regression ] [ (Other) Classification ]


Abstract:

For many machine learning problems, some inputs are known to be positively (or negatively) related to the output, and in such cases training the model to respect that monotonic relationship can provide regularization and make the model more interpretable. However, flexible monotonic functions are computationally challenging to learn beyond a few features. We break through this barrier by learning ensembles of monotonic calibrated interpolated look-up tables (lattices). A key contribution is an automated algorithm for selecting feature subsets for the ensemble base models. We demonstrate that, compared to random forests, these ensembles produce similar or better accuracy, while providing guaranteed monotonicity consistent with prior knowledge, smaller model size, and faster evaluation.
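The sketch below is a minimal, illustrative interpretation of the core building block described in the abstract: a per-feature piecewise-linear calibrator feeding a multilinear-interpolated look-up table (lattice) whose vertex values are non-decreasing along each monotonic dimension. The function names (`calibrate`, `lattice_interpolate`), keypoints, and toy parameter values are assumptions chosen for illustration, not the authors' implementation; in particular, the ensemble and the automated feature-subset selection algorithm are only described in comments.

```python
import numpy as np

def calibrate(x, keypoints_in, keypoints_out):
    """Piecewise-linear calibration of a 1-D feature.

    If keypoints_out is non-decreasing, the calibrator preserves
    monotonicity of the downstream lattice in this feature.
    """
    return np.interp(x, keypoints_in, keypoints_out)

def lattice_interpolate(x, params):
    """Multilinear interpolation over a D-dimensional 2^D lattice.

    x      : array of shape (D,), each coordinate in [0, 1].
    params : array of shape (2,) * D holding look-up table values at the
             lattice vertices. If params is non-decreasing along a
             dimension, the interpolated function is monotonic in that
             input by construction.
    """
    D = len(x)
    out = 0.0
    # Sum over all 2^D vertices, weighting each vertex value by the
    # product of per-dimension interpolation weights.
    for vertex in np.ndindex(*(2,) * D):
        weight = 1.0
        for d, v in enumerate(vertex):
            weight *= x[d] if v == 1 else (1.0 - x[d])
        out += weight * params[vertex]
    return out

# Toy 2-D lattice, monotonically increasing in both inputs
# (vertex values are non-decreasing along each axis).
params = np.array([[0.0, 0.4],
                   [0.5, 1.0]])

# Calibrate raw features into [0, 1], then interpolate on the lattice.
x0 = calibrate(3.2, keypoints_in=[0.0, 5.0, 10.0], keypoints_out=[0.0, 0.7, 1.0])
x1 = calibrate(0.8, keypoints_in=[0.0, 1.0], keypoints_out=[0.0, 1.0])
print(lattice_interpolate(np.array([x0, x1]), params))

# An ensemble averages several such calibrated lattices, each defined on a
# small feature subset (chosen automatically in the paper), so each base
# model's 2^D table stays small and evaluation stays fast.
```

Because each base lattice uses only a handful of features, the exponential cost of the look-up table is contained, while averaging many monotonic base models keeps the overall ensemble monotonic in the flagged inputs.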
