
Workshop: Optimal Transport and Machine Learning

On Schrödinger Bridge Matching and Expectation Maximization

Rob Brekelmans · Kirill Neklyudov


In this work, we analyze methods for solving the Schrödinger Bridge problem from the perspective of alternating KL divergence minimization. While existing methods such as Iterative Proportional Fitting and Iterative Markovian Fitting require exact updates, since each iteration optimizes the same argument of the KL divergence, we justify joint optimization of a single KL divergence objective from the perspective of information geometry. As in the variational EM algorithm, this allows partial, stochastic gradient updates to decrease a unified objective. We highlight connections with related bridge-matching, flow-matching, and few-step generative modeling approaches, where various parameterizations of the coupling distributions are contextualized from the perspective of marginal-preserving inference.
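To make the "exact update" regime concrete, the sketch below (not from the paper) shows Iterative Proportional Fitting for the static, discrete Schrödinger Bridge, i.e. entropically regularized optimal transport solved by Sinkhorn scaling. Each sweep is an exact KL projection onto one marginal constraint, which is the alternating structure the abstract contrasts with partial, EM-style gradient updates on a single joint objective. All variable names and parameter values here are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (assumed setup, not the authors' method): IPF/Sinkhorn for
# the static, discrete Schrodinger Bridge with a Gibbs reference kernel.
rng = np.random.default_rng(0)
n = 5
mu = rng.random(n); mu /= mu.sum()   # source marginal
nu = rng.random(n); nu /= nu.sum()   # target marginal
C = rng.random((n, n))               # illustrative cost matrix
eps = 0.1                            # entropic regularization strength
K = np.exp(-C / eps)                 # Gibbs (reference) kernel

u = np.ones(n)
v = np.ones(n)
for _ in range(500):
    u = mu / (K @ v)                 # exact KL projection onto the row-marginal constraint
    v = nu / (K.T @ u)               # exact KL projection onto the column-marginal constraint

P = u[:, None] * K * v[None, :]      # resulting entropic coupling
print(np.allclose(P.sum(axis=1), mu, atol=1e-8))
print(np.allclose(P.sum(axis=0), nu, atol=1e-8))
```

Each half-step here fully solves its KL projection in closed form; the abstract's point is that a single joint KL objective instead admits inexact, stochastic-gradient half-steps, as in variational EM.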
