

Poster

Non-geodesically-convex optimization in the Wasserstein space

Hoang Phuc Hau Luu · Hanlin Yu · Bernardo Williams · Petrus Mikkola · Marcelo Hartmann · Kai Puolamäki · Arto Klami

Fri 13 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract:

We study a class of optimization problems in the Wasserstein space (the space of probability measures) where the objective function is nonconvex along generalized geodesics. When the regularization term is the negative entropy, the optimization problem becomes a sampling problem: minimizing the Kullback-Leibler divergence between a probability measure (the optimization variable) and a target probability measure whose log-density is a nonconvex function. We derive several convergence insights for a novel semi Forward-Backward Euler scheme under various nonconvex (and possibly nonsmooth) regimes. Notably, the semi Forward-Backward Euler scheme is only a slight modification of the Forward-Backward Euler scheme, whose convergence is, to our knowledge, still unknown in our very general non-geodesically-convex setting.
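
As a concrete illustration (a sketch using assumed notation $\mu$ for the optimization variable, $V$ for the potential, and $\pi \propto e^{-V}$ for the target, none of which are fixed by the abstract), the entropy-regularized objective and the KL divergence to the target agree up to an additive constant:

\[
F(\mu) = \int V \, d\mu + \int \log\frac{d\mu}{dx}\, d\mu
       = \mathrm{KL}(\mu \,\|\, \pi) - \log Z,
\qquad
\pi(x) = \frac{e^{-V(x)}}{Z}, \quad Z = \int e^{-V(x)}\, dx.
\]

Hence, when the log-density $\log\pi = -V$ is nonconvex, the functional $F$ is in general not convex along generalized geodesics, which is exactly the regime considered here.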
