
Gradient Weights help Nonparametric Regressors
Samory Kpotufe · Abdeslam Boularias

Thu Dec 06 11:40 AM -- 12:00 PM (PST) @ Harveys Convention Center Floor, CC
In regression problems over $\mathbb{R}^d$, the unknown function $f$ often varies more in some coordinates than in others. We show that weighting each coordinate $i$ with the estimated norm of the $i$th derivative of $f$ is an efficient way to significantly improve the performance of distance-based regressors, e.g. kernel and $k$-NN regressors. We propose a simple estimator of these derivative norms and prove its consistency. Moreover, the proposed estimator is efficiently learned online.
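To illustrate the idea, here is a minimal sketch in NumPy: estimate each coordinate's derivative norm $W_i \approx \mathbb{E}\,|\partial f/\partial x_i|$ by symmetric differences of a pilot $k$-NN estimate, then rescale the data by these weights before running a distance-based regressor. The function names, the step size `t`, and the synthetic data are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

def knn_predict(X_train, y_train, X_query, k):
    """Plain k-NN regression: average the k nearest training responses."""
    dists = np.linalg.norm(X_query[:, None, :] - X_train[None, :, :], axis=2)
    idx = np.argsort(dists, axis=1)[:, :k]
    return y_train[idx].mean(axis=1)

def gradient_weights(X, y, k=10, t=0.2):
    """Estimate W_i ~ average |d f / d x_i| via symmetric differences
    of a pilot k-NN estimate (illustrative choice of pilot and step size)."""
    n, dim = X.shape
    W = np.empty(dim)
    for i in range(dim):
        e = np.zeros(dim)
        e[i] = t
        f_plus = knn_predict(X, y, X + e, k)   # pilot estimate at x + t e_i
        f_minus = knn_predict(X, y, X - e, k)  # pilot estimate at x - t e_i
        W[i] = np.mean(np.abs(f_plus - f_minus)) / (2 * t)
    return W

# Synthetic data: f varies only in the first of three coordinates.
X = rng.uniform(size=(300, 3))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.standard_normal(300)

W = gradient_weights(X, y)
# The relevant coordinate should receive the largest weight; rescaling
# by W stretches distances along it, so nearest neighbors become
# neighbors mainly in the coordinates where f actually varies.
X_weighted = X * W
```

Any distance-based regressor (kernel or $k$-NN) can then be run on `X_weighted` in place of `X`; coordinates in which $f$ is nearly flat get small weights and contribute little to the distance.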

Author Information

Samory Kpotufe (Princeton University)
Abdeslam Boularias (Max Planck Institute for Intelligent Systems)
