Spotlight
Wed Dec 05 12:30 PM -- 12:35 PM (PST) @ Room 517 CD
The Nearest Neighbor Information Estimator is Adaptively Near Minimax Rate-Optimal
Jiantao Jiao · Weihao Gao · Yanjun Han
We analyze the Kozachenko–Leonenko (KL) fixed $k$-nearest neighbor estimator of differential entropy. We obtain the first uniform upper bound on its performance for any fixed $k$ over Hölder balls on a torus, without assuming any condition on how close the density may come to zero. Combined with a recent minimax lower bound over the Hölder ball, this shows that the KL estimator, for any fixed $k$, achieves the minimax rates up to logarithmic factors without knowledge of the smoothness parameter $s$ of the Hölder ball for $s \in (0,2]$ and arbitrary dimension $d$, making it the first estimator that provably satisfies this property.
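
For concreteness, below is a minimal sketch of the fixed-$k$ KL estimator in its commonly cited form $\hat H = \psi(n) - \psi(k) + \log V_d + \frac{d}{n}\sum_i \log \rho_{k,i}$, where $\rho_{k,i}$ is the distance from sample $i$ to its $k$-th nearest neighbor, $V_d$ is the volume of the $d$-dimensional unit ball, and $\psi$ is the digamma function. This is an illustration only, not the paper's code; the exact additive normalization (e.g. $\psi(n)$ versus $\log n$) varies across references and is an assumption here.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(samples, k=1):
    """Fixed-k Kozachenko-Leonenko estimate of differential entropy (in nats).

    samples: array of shape (n, d); k: the fixed number of nearest neighbors.
    Uses psi(n) - psi(k) + log V_d + (d/n) * sum_i log rho_{k,i}, where
    rho_{k,i} is the Euclidean distance from sample i to its k-th nearest
    neighbor and V_d is the volume of the d-dimensional unit ball.
    """
    x = np.asarray(samples, dtype=float)
    n, d = x.shape
    # query k + 1 neighbors: the nearest neighbor of each point is itself
    dists, _ = cKDTree(x).query(x, k=k + 1)
    rho = dists[:, -1]  # distance to the k-th true neighbor
    log_unit_ball_volume = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_unit_ball_volume + d * np.mean(np.log(rho))

# Sanity check: a standard Gaussian in d dimensions has entropy (d/2) * log(2*pi*e).
rng = np.random.default_rng(0)
x = rng.standard_normal((20000, 2))
print(kl_entropy(x, k=1), np.log(2 * np.pi * np.e))  # both close to 2.84
```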