Poster
Sample complexity and effective dimension for regression on manifolds
Andrew McRae · Justin Romberg · Mark Davenport

Wed Dec 09 09:00 AM -- 11:00 AM (PST) @ Poster Session 3 #1000

We consider the theory of regression on a manifold using reproducing kernel Hilbert space methods. Manifold models arise in a wide variety of modern machine learning problems, and our goal is to help understand the effectiveness of various implicit and explicit dimensionality-reduction methods that exploit manifold structure. Our first key contribution is a novel nonasymptotic version of the Weyl law from differential geometry, from which we show that certain spaces of smooth functions on a manifold are effectively finite-dimensional, with a complexity that scales according to the manifold dimension rather than any ambient data dimension. Finally, we show that, given (potentially noisy) function values taken uniformly at random over a manifold, a kernel regression estimator (derived from the spectral decomposition of the manifold) achieves minimax-optimal error bounds controlled by the effective dimension.
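For orientation, here is a hedged sketch of the classical quantities the abstract refers to; the notation below is standard background, not taken from this page. The classical (asymptotic) Weyl law counts the Laplace--Beltrami eigenvalues $\lambda_1 \le \lambda_2 \le \cdots$ of a compact $d$-dimensional manifold $M$:

\[
N(\lambda) \;=\; \#\{\, j : \lambda_j \le \lambda \,\} \;\sim\; \frac{\omega_d \,\operatorname{vol}(M)}{(2\pi)^d}\,\lambda^{d/2},
\qquad \lambda \to \infty,
\]

where $\omega_d$ is the volume of the unit ball in $\mathbb{R}^d$. The count thus grows with the manifold dimension $d$, not the ambient dimension, which is the mechanism behind the "effectively finite-dimensional" function spaces above. In the usual nonparametric framing, regression from $n$ samples over an $s$-smooth class on such a manifold has minimax rate

\[
\inf_{\hat f}\,\sup_{f}\; \mathbb{E}\,\bigl\lVert \hat f - f \bigr\rVert_{L^2(M)}^2 \;\asymp\; n^{-2s/(2s+d)},
\]

so the error is governed by $d$ in the sense the abstract describes; the paper's contribution is a nonasymptotic version of these statements.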

Author Information

Andrew McRae (Georgia Institute of Technology)
Justin Romberg (Georgia Institute of Technology)
Mark Davenport (Georgia Institute of Technology)
