
Universal guarantees for decision tree induction via a higher-order splitting criterion

Part of Advances in Neural Information Processing Systems 33 (NeurIPS 2020)


Authors

Guy Blanc, Neha Gupta, Jane Lange, Li-Yang Tan

Abstract

We propose a simple extension of top-down decision tree learning heuristics such as ID3, C4.5, and CART. Our algorithm achieves provable guarantees for all target functions $f : \{-1,1\}^n \to \{-1,1\}$ with respect to the uniform distribution, circumventing impossibility results showing that existing heuristics fare poorly even for simple target functions. The crux of our extension is a new splitting criterion that takes into account the correlations between $f$ and small subsets of its attributes. The splitting criteria of existing heuristics (e.g. Gini impurity and information gain), in contrast, are based solely on the correlations between $f$ and its individual attributes. Our algorithm satisfies the following guarantee: for all target functions $f : \{-1,1\}^n \to \{-1,1\}$, sizes $s \in \mathbb{N}$, and error parameters $\varepsilon$, it constructs a decision tree of size $s^{\tilde{O}((\log s)^2/\varepsilon^2)}$ that achieves error $O(\mathsf{opt}_s) + \varepsilon$, where $\mathsf{opt}_s$ denotes the error of the optimal size-$s$ decision tree for $f$. A key technical notion that drives our analysis is the noise stability of $f$, a well-studied smoothness measure of $f$.
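To make the contrast between first-order and higher-order splitting criteria concrete, here is a minimal sketch, not the paper's exact algorithm: it estimates correlations of $f$ with individual attributes and with small subsets of attributes (here of size at most $k = 2$, an assumption made for this example) from uniform samples over $\{-1,1\}^n$, and compares the two scores on a target where single-attribute criteria have no signal. All function names below are hypothetical.

```python
# Illustrative sketch (not the paper's exact criterion): compare a first-order
# splitting score, based on the correlation of f with a single attribute, with
# a higher-order score that also accounts for correlations of f with small
# subsets of attributes containing that attribute. Correlations are estimated
# from uniform random samples over {-1, 1}^n.
import itertools
import random

def sample_uniform(f, n, m, seed=0):
    """Draw m uniform samples x in {-1,1}^n together with labels f(x)."""
    rng = random.Random(seed)
    xs = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(m)]
    return [(x, f(x)) for x in xs]

def correlation(samples, subset):
    """Estimate E[f(x) * prod_{i in subset} x_i] from samples."""
    total = 0.0
    for x, y in samples:
        prod = 1
        for i in subset:
            prod *= x[i]
        total += y * prod
    return total / len(samples)

def first_order_score(samples, i):
    """Score in the spirit of classic heuristics: correlation with attribute i alone."""
    return correlation(samples, (i,)) ** 2

def higher_order_score(samples, i, n, k=2):
    """Sum of squared correlations with all subsets of size <= k containing i."""
    score = 0.0
    for size in range(1, k + 1):
        others = [j for j in range(n) if j != i]
        for rest in itertools.combinations(others, size - 1):
            score += correlation(samples, (i,) + rest) ** 2
    return score

if __name__ == "__main__":
    n = 4
    # Target where no single attribute is correlated with f, but a pair is:
    # f(x) = x_0 * x_1, the parity of two attributes.
    f = lambda x: x[0] * x[1]
    samples = sample_uniform(f, n, m=20000)
    for i in range(n):
        print(i,
              round(first_order_score(samples, i), 4),
              round(higher_order_score(samples, i, n), 4))
```

On this target the first-order scores are near zero for every attribute, so a criterion that only looks at individual attributes has nothing to split on, whereas the higher-order score singles out $x_0$ and $x_1$. This is only meant to illustrate why accounting for small subsets of attributes can help; the splitting criterion analyzed in the paper is specified there.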