Poster
Sinkhorn Barycenter via Functional Gradient Descent
Zebang Shen · Zhenfu Wang · Alejandro Ribeiro · Hamed Hassani

Thu Dec 10 09:00 PM -- 11:00 PM (PST) @ Poster Session 6 #1860
In this paper, we consider the problem of computing the barycenter of a set of probability distributions under the Sinkhorn divergence. This problem has recently found applications across various domains, including graphics, learning, and vision, as it provides a meaningful mechanism to aggregate knowledge. Unlike previous approaches, which directly operate in the space of probability measures, we recast the Sinkhorn barycenter problem as an instance of unconstrained functional optimization and develop a novel functional gradient descent method named \texttt{Sinkhorn Descent} (\texttt{SD}). We prove that \texttt{SD} converges to a stationary point at a sublinear rate, and under reasonable assumptions, we further show that it asymptotically finds a global minimizer of the Sinkhorn barycenter problem. Moreover, by providing a mean-field analysis, we show that \texttt{SD} preserves the weak convergence of empirical measures. Importantly, the computational complexity of \texttt{SD} scales linearly in the dimension $d$, and we demonstrate its scalability by solving a $100$-dimensional Sinkhorn barycenter problem.
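The abstract describes the general recipe (minimize a weighted sum of Sinkhorn divergences by gradient descent over the support of an empirical measure) without reproducing the algorithm. Below is a minimal JAX sketch of that recipe, not the authors' \texttt{SD} method: the function names, regularization parameter, iteration counts, and step size are all illustrative assumptions, and gradients are obtained by differentiating through unrolled Sinkhorn iterations rather than via the paper's functional-gradient derivation.

```python
# Minimal sketch: Sinkhorn-divergence barycenter by gradient descent on the
# support points of an empirical measure. NOT the paper's SD algorithm; all
# names and hyperparameters (eps, iters, step size) are illustrative.
import jax
import jax.numpy as jnp


def ot_eps(x, y, a, b, eps=1.0, iters=50):
    """Entropic OT cost between weighted point clouds (x, a) and (y, b)."""
    C = jnp.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)  # squared-Euclidean cost
    K = jnp.exp(-C / eps)  # Gibbs kernel (a log-domain version is safer for small eps)
    u = jnp.ones_like(a)
    for _ in range(iters):  # Sinkhorn fixed-point iterations
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]  # transport plan
    return jnp.sum(P * C)


def sinkhorn_div(x, y, a, b, eps=1.0):
    """Debiased Sinkhorn divergence S(mu, nu)."""
    return (ot_eps(x, y, a, b, eps)
            - 0.5 * ot_eps(x, x, a, a, eps)
            - 0.5 * ot_eps(y, y, b, b, eps))


def barycenter_loss(x, targets, weights, eps=1.0):
    """Weighted sum of Sinkhorn divergences to the target measures."""
    a = jnp.full(x.shape[0], 1.0 / x.shape[0])  # uniform weights on particles
    return sum(w * sinkhorn_div(x, y, a, b, eps)
               for (y, b), w in zip(targets, weights))


# Toy problem in d = 2: two unit Gaussians shifted to +/- 2 on the first axis.
key = jax.random.PRNGKey(0)
y1 = jax.random.normal(key, (30, 2)) + jnp.array([2.0, 0.0])
y2 = jax.random.normal(key, (30, 2)) - jnp.array([2.0, 0.0])
b = jnp.full(30, 1.0 / 30)
targets, weights = [(y1, b), (y2, b)], [0.5, 0.5]

x = jax.random.normal(key, (30, 2))  # initial barycenter particles
grad_fn = jax.grad(barycenter_loss)  # gradient w.r.t. particle positions
for _ in range(100):
    x = x - 0.5 * grad_fn(x, targets, weights)  # plain gradient step
```

Note that in this sketch the only operation whose cost depends on the dimension is the pairwise cost computation, which is consistent with the linear-in-$d$ scaling claimed in the abstract.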

Author Information

Zebang Shen (University of Pennsylvania)
Zhenfu Wang (Peking University)
Alejandro Ribeiro (University of Pennsylvania)
Hamed Hassani (University of Pennsylvania)
