
Rates of Convergence for Nearest Neighbor Classification
Kamalika Chaudhuri · Sanjoy Dasgupta

Wed Dec 10 04:00 PM -- 08:59 PM (PST) @ Level 2, room 210D

We analyze the behavior of nearest neighbor classification in metric spaces and provide finite-sample, distribution-dependent rates of convergence under minimal assumptions. These are more general than existing bounds, and enable us, as a by-product, to establish the universal consistency of nearest neighbor in a broader range of data spaces than was previously known. We illustrate our upper and lower bounds by introducing a new smoothness class customized for nearest neighbor classification. We find, for instance, that under the Tsybakov margin condition the convergence rate of nearest neighbor matches recently established lower bounds for nonparametric classification.
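The analysis above concerns the nearest neighbor rule: classify a query point with the label of its closest training point in the metric space. As a minimal illustration (not code from the paper), here is a sketch of the 1-NN rule in Python using Euclidean distance on a toy 2-D dataset; the dataset and function name are invented for this example.

```python
import math

def nn_classify(train, query):
    """1-nearest-neighbor rule: return the label of the training
    point closest to `query` under Euclidean distance."""
    nearest = min(train, key=lambda item: math.dist(item[0], query))
    return nearest[1]

# Toy training set: (point, label) pairs forming two small clusters.
train = [
    ((0.0, 0.0), 0), ((0.1, 0.2), 0),
    ((1.0, 1.0), 1), ((0.9, 1.1), 1),
]

print(nn_classify(train, (0.2, 0.1)))  # near the label-0 cluster -> 0
print(nn_classify(train, (0.8, 0.9)))  # near the label-1 cluster -> 1
```

The convergence rates in the paper describe how quickly this rule's error approaches the Bayes-optimal error as the number of training points grows, under smoothness and margin assumptions on the underlying distribution.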

Author Information

Kamalika Chaudhuri (UC San Diego)
Sanjoy Dasgupta (UC San Diego)
