Speculate-Correct Error Bounds for K-Nearest Neighbor Classifiers
Abstract
We introduce the speculate-correct method to derive error bounds for local classifiers. Using it, we show that k-nearest neighbor classifiers, in spite of their famously fractured decision boundaries, have exponential error bounds with O(sqrt((k + ln n) / n)) error bound range for n in-sample examples.
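To get a feel for the stated rate, the following sketch evaluates sqrt((k + ln n) / n), the growth term inside the O(...) bound range, for a few sample sizes. This is only illustrative: the hidden constant in the O-notation is not given by the abstract, so the function below (a hypothetical helper named `bound_range_rate`) reports the rate alone, not an actual error bound.

```python
import math

def bound_range_rate(k: int, n: int) -> float:
    """Rate sqrt((k + ln n) / n) from the abstract, without the hidden constant."""
    return math.sqrt((k + math.log(n)) / n)

# For fixed k, the bound range shrinks roughly like 1/sqrt(n) as n grows:
for n in (1_000, 10_000, 100_000):
    print(f"n={n:>7}: rate ~ {bound_range_rate(5, n):.4f}")
```

For k = 5, the rate drops by roughly a factor of sqrt(10) with each tenfold increase in n, which is the familiar 1/sqrt(n) concentration behavior, mildly slowed by the ln n term.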
Paper
Machine Learning (Journal)