Part of Advances in Neural Information Processing Systems 8 (NIPS 1995)
Trevor Hastie, Robert Tibshirani
Nearest neighbor classification expects the class conditional probabilities to be locally constant, and suffers from bias in high dimensions. We propose a locally adaptive form of nearest neighbor classification to try to finesse this curse of dimensionality. We use a local linear discriminant analysis to estimate an effective metric for computing neighborhoods. We determine the local decision boundaries from centroid information, and then shrink neighborhoods in directions orthogonal to these local decision boundaries, and elongate them parallel to the boundaries. Thereafter, any neighborhood-based classifier can be employed, using the modified neighborhoods. We also propose a method for global dimension reduction, that combines local dimension information. We indicate how these techniques can be extended to the regression problem.
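The core idea — estimate a local metric from discriminant information, then measure distances under that metric — can be sketched as follows. This is an illustrative reconstruction, not the paper's exact algorithm: the function name, neighborhood size, and the regularizer `eps` are assumptions, and the metric form (sphere by the local within-class scatter W, inflate by the between-class scatter B, with `eps * I` bounding the elongation parallel to the boundary) is one standard way to realize the abstract's "shrink orthogonal, elongate parallel" prescription.

```python
import numpy as np

def adaptive_metric_distances(x0, X, y, n_neighbors=50, eps=1.0):
    """Sketch: locally adaptive metric for nearest-neighbor classification.

    Within a Euclidean neighborhood of the query x0, estimate local
    within-class (W) and between-class (B) scatter, then build a metric
    that stretches distances in directions where local class centroids
    separate (orthogonal to the decision boundary) and leaves directions
    parallel to the boundary comparatively cheap.
    """
    # 1. Initial Euclidean neighborhood around the query point.
    d = np.linalg.norm(X - x0, axis=1)
    idx = np.argsort(d)[:n_neighbors]
    Xn, yn = X[idx], y[idx]

    # 2. Local within-class (W) and between-class (B) scatter matrices.
    p = X.shape[1]
    mean = Xn.mean(axis=0)
    W = np.zeros((p, p))
    B = np.zeros((p, p))
    for c in np.unique(yn):
        Xc = Xn[yn == c]
        mc = Xc.mean(axis=0)
        W += (Xc - mc).T @ (Xc - mc)
        B += len(Xc) * np.outer(mc - mean, mc - mean)
    W /= len(Xn)
    B /= len(Xn)

    # 3. Sphere the space with W^{-1/2}, add eps*I so the metric stays
    #    positive definite (this caps how far neighborhoods elongate),
    #    and map back: Sigma = W^{-1/2} (B* + eps*I) W^{-1/2}.
    evals, evecs = np.linalg.eigh(W)
    W_m12 = evecs @ np.diag(1.0 / np.sqrt(np.maximum(evals, 1e-12))) @ evecs.T
    B_star = W_m12 @ B @ W_m12
    Sigma = W_m12 @ (B_star + eps * np.eye(p)) @ W_m12

    # 4. Squared distances to every training point under the adapted
    #    metric; any neighborhood-based classifier can consume these.
    diff = X - x0
    return np.einsum('ij,jk,ik->i', diff, Sigma, diff)
```

A k-NN classifier would then vote among the training points with the smallest adapted distances, and the metric estimation could be iterated, recomputing the neighborhood under the new metric.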