Speculate-Correct Error Bounds for K-Nearest Neighbor Classifiers

January 1, 2019
Abstract

We introduce the speculate-correct method to derive error bounds for local classifiers. Using it, we show that k-nearest neighbor classifiers, in spite of their famously fractured decision boundaries, have exponential error bounds with O(sqrt((k + ln n) / n)) error bound range for n in-sample examples.
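The abstract's scaling claim can be made concrete with a small numerical sketch. The snippet below evaluates sqrt((k + ln n) / n), the growth rate of the bound range, for a few sample sizes; the leading constant `c` is a placeholder assumption, not a value from the paper.

```python
import math

def bound_range(n, k, c=1.0):
    """Illustrative O(sqrt((k + ln n) / n)) scaling of the error
    bound range; the constant c is an assumption, not from the paper."""
    return c * math.sqrt((k + math.log(n)) / n)

# The range shrinks as the number of in-sample examples n grows.
for n in (1_000, 10_000, 100_000):
    print(f"n={n:>7}, k=5: bound range ~ {bound_range(n, 5):.4f}")
```

As expected from the formula, the bound range decreases roughly as 1/sqrt(n) for fixed k, with only a mild logarithmic penalty in n.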

Publication Type
Paper
Conference / Journal Name
Machine Learning (Journal)

BibTeX


@article{
    author = {},
    title = {Speculate-Correct Error Bounds for K-Nearest Neighbor Classifiers},
    journal = {Machine Learning},
    year = {2019}
}