Distributional Convergence of the Sliced Wasserstein Process
Jiaqi Xi · Jonathan Niles-Weed

Thu Dec 01 02:00 PM -- 04:00 PM (PST) @ Hall J #814

Motivated by the statistical and computational challenges of computing Wasserstein distances in high-dimensional contexts, machine learning researchers have defined modified Wasserstein distances based on computing distances between one-dimensional projections of the measures. Different choices of how to aggregate these projected distances (averaging, random sampling, maximizing) give rise to different distances, requiring different statistical analyses. We define the \emph{Sliced Wasserstein Process}, a stochastic process defined by the empirical Wasserstein distance between projections of empirical probability measures to all one-dimensional subspaces, and prove a uniform distributional limit theorem for this process. As a result, we obtain a unified framework in which to prove sample complexity and distributional limit results for all Wasserstein distances based on one-dimensional projections. We illustrate these results on a number of examples where no distributional limits were previously known.
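As an illustration of the objects studied here, the following is a minimal sketch (not the authors' code) of the Monte Carlo estimator of the sliced Wasserstein distance: it averages 1-D Wasserstein-$p$ distances between projections of two empirical measures onto random directions on the unit sphere. The function name `sliced_wasserstein` and its parameters are hypothetical; the 1-D distances use the standard sort-and-compare formula, which assumes equal sample sizes.

```python
import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, p=2, rng=None):
    """Monte Carlo estimate of the sliced Wasserstein-p distance between
    the empirical measures of the rows of X and Y (equal sample sizes).

    Illustrative sketch only: averages the 1-D Wasserstein-p distance
    over random one-dimensional projections (the 'averaging' aggregation
    mentioned in the abstract; 'maximizing' would take a max instead).
    """
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    # Draw directions uniformly on the unit sphere via normalized Gaussians.
    thetas = rng.normal(size=(n_projections, d))
    thetas /= np.linalg.norm(thetas, axis=1, keepdims=True)
    total = 0.0
    for theta in thetas:
        # 1-D Wasserstein-p between empirical measures with equal weights:
        # sort both projected samples and compare order statistics.
        x_proj = np.sort(X @ theta)
        y_proj = np.sort(Y @ theta)
        total += np.mean(np.abs(x_proj - y_proj) ** p)
    return (total / n_projections) ** (1.0 / p)
```

Replacing the average over `thetas` with a maximum yields the max-sliced variant; the paper's Sliced Wasserstein Process tracks the projected empirical distance as a function of the direction `theta`, so all such aggregations can be analyzed in one framework.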

Author Information

Jiaqi Xi (Courant Institute of Mathematical Sciences)
Jonathan Niles-Weed (NYU)