

Keynote talk in Workshop: Optimal Transport and Machine Learning

Benefits of using optimal transport in computational learning and inversion

Yunan Yang


Abstract:

Understanding generalization capacity has been a central topic in mathematical machine learning. In this talk, I will present a generalized weighted least-squares optimization method for computational learning and inversion with noisy data. In particular, using the Wasserstein metric as the objective function and implementing the Wasserstein gradient flow (or Wasserstein natural gradient descent method) both fall within this framework. The weighting scheme encodes both a priori knowledge of the object to be learned and a strategy for weighting the contribution of different data points in the loss function. We will see that appropriate weighting derived from prior knowledge can greatly improve the generalization capability of the learned model.
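
As a concrete illustration of the two ingredients mentioned in the abstract, the minimal NumPy sketch below compares an unweighted least-squares loss with a generalized weighted least-squares loss ||W(F(m) - d)||^2, and also evaluates a quadratic Wasserstein (W2) objective between two sets of 1-D samples using its sorting-based closed form. The forward map, noise model, and weight choice are hypothetical placeholders chosen only for illustration; they are not taken from the talk.

import numpy as np

def weighted_ls_loss(residual, W):
    # Generalized weighted least-squares loss ||W r||_2^2, with r = F(m) - d.
    Wr = W @ residual
    return float(Wr @ Wr)

def w2_squared_1d(x, y):
    # Squared 2-Wasserstein distance between two equally sized sets of
    # equally weighted 1-D samples (closed form: match sorted samples).
    return float(np.mean((np.sort(x) - np.sort(y)) ** 2))

rng = np.random.default_rng(0)

# Weighted least squares with a prior-informed diagonal weight
# (the noise levels stand in for a priori knowledge about the data).
n = 100
predicted = np.sin(np.linspace(0.0, np.pi, n))      # stands in for F(m)
noise_std = np.linspace(0.05, 0.5, n)               # assumed per-point noise level
observed = predicted + rng.normal(0.0, noise_std)   # noisy data d
residual = predicted - observed

print("unweighted LS loss :", weighted_ls_loss(residual, np.eye(n)))
print("prior-weighted loss:", weighted_ls_loss(residual, np.diag(1.0 / noise_std)))

# Quadratic Wasserstein objective between two 1-D sample sets.
samples_p = rng.normal(0.0, 1.0, 5000)
samples_q = rng.normal(0.5, 1.2, 5000)
print("W2^2 estimate      :", w2_squared_1d(samples_p, samples_q))

The diagonal weight 1/noise_std is just one simple choice of W; the talk's framework allows more general weighting operators that encode prior knowledge about the object to be learned and about the relative trustworthiness of different data points.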