

Greedy Feature Construction

Dino Oglic · Thomas Gärtner

Area 5+6+7+8 #108

Keywords: [ Sparsity and Feature Selection ] [ Large Scale Learning and Big Data ]


We present an effective method for supervised feature construction. The main goal of the approach is to construct a feature representation for which a set of linear hypotheses has sufficient capacity: large enough to contain a satisfactory solution to the considered problem and small enough to allow good generalization from a small number of training examples. We achieve this goal with a greedy procedure that constructs features by empirically fitting squared-error residuals. The proposed constructive procedure is consistent and can output a rich set of features. We evaluate the approach empirically by fitting a linear ridge regression model in the constructed feature space; the results indicate that it outperforms competing methods.
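The abstract's greedy loop (fit a new feature to the current squared-error residual, refit a linear model in the enlarged feature space, repeat) can be sketched as follows. This is a minimal illustration, not the paper's method: the feature class here (random-parameter cosine ridge features `cos(w·x + b)`), the candidate-sampling selection rule, and all parameter names are assumptions made for the sketch.

```python
import numpy as np

def greedy_feature_construction(X, y, n_features=20, n_candidates=50,
                                reg=1e-3, rng=None):
    """Greedily build features by fitting squared-error residuals.

    Each step samples candidate parametric features (illustrative class:
    cos(w.x + b)), keeps the one best aligned with the current residual,
    refits ridge regression in the constructed feature space, and updates
    the residual.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    params = []                       # (w, b) of each selected feature
    residual = y.copy()
    Phi = np.empty((n, 0))            # constructed feature matrix

    for _ in range(n_features):
        # Sample candidate features (hypothetical feature class).
        W = rng.normal(size=(n_candidates, d))
        b = rng.uniform(0, 2 * np.pi, size=n_candidates)
        C = np.cos(X @ W.T + b)       # n x n_candidates candidate columns
        # Greedy step: pick the candidate most correlated with the residual.
        scores = np.abs(C.T @ residual) / (np.linalg.norm(C, axis=0) + 1e-12)
        k = int(np.argmax(scores))
        params.append((W[k], b[k]))
        Phi = np.column_stack([Phi, C[:, k]])
        # Refit ridge regression in the constructed feature space.
        A = Phi.T @ Phi + reg * np.eye(Phi.shape[1])
        coef = np.linalg.solve(A, Phi.T @ y)
        residual = y - Phi @ coef
    return params, coef

# Toy usage: approximate a smooth 1-D target.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=200)
params, coef = greedy_feature_construction(X, y, n_features=15, rng=1)
```

The loop mirrors the abstract's structure only: each added feature targets what the current linear model in the constructed space fails to explain.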
