Abstract:
We propose a simple extension of {\sl top-down decision tree learning heuristics} such as ID3, C4.5, and CART. Our algorithm achieves provable guarantees for all target functions $f$ with respect to the uniform distribution, circumventing impossibility results showing that existing heuristics fare poorly even for simple target functions. The crux of our extension is a new splitting criterion that takes into account the correlations between $f$ and {\sl small subsets} of its attributes. The splitting criteria of existing heuristics (e.g.~Gini impurity and information gain), in contrast, are based solely on the correlations between $f$ and its {\sl individual} attributes.
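As an illustration (the notation here is ours and not necessarily the exact criterion analyzed in the paper): over the uniform distribution, the score a standard impurity-based criterion assigns to a single attribute $x_i$ is governed by the correlation $\mathbf{E}_x[f(x)\,x_i]$, whereas a higher-order criterion may also weigh the correlations of $f$ with {\sl small subsets} $S$ of attributes,
\[
\hat{f}(S) \;=\; \mathop{\mathbf{E}}_{x \sim \{\pm 1\}^n}\Big[f(x)\prod_{i \in S} x_i\Big], \qquad |S| \le k,
\]
so that an attribute can be credited even when its predictive power only emerges jointly with a few others.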
Our algorithm satisfies the following guarantee: for all target functions $f$, sizes $s \in \mathbb{N}$, and error parameters $\varepsilon$, it constructs a decision tree of size $s^{\tilde{O}((\log s)^2/\varepsilon^2)}$ that achieves error $\le O(\mathrm{opt}_s) + \varepsilon$, where $\mathrm{opt}_s$ denotes the error of the optimal size-$s$ decision tree for $f$. A key technical notion that drives our analysis is the {\sl noise stability} of $f$, a well-studied smoothness measure of $f$.
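For context, the noise stability of $f$ at noise rate $\rho \in [0,1]$ is standardly defined as (we state the textbook definition; the paper's parameterization may differ slightly):
\[
\mathrm{Stab}_{\rho}[f] \;=\; \mathop{\mathbf{E}}_{x,\,y}\big[f(x)\,f(y)\big],
\]
where $x$ is uniform over $\{\pm 1\}^n$ and each coordinate $y_i$ independently equals $x_i$ with probability $\tfrac{1+\rho}{2}$ and $-x_i$ otherwise; it quantifies how robust $f$ is to small random perturbations of its input.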