Targeted Separation and Convergence with Kernel Discrepancies
Alessandro Barp · Carl-Johann Simon-Gabriel · Mark Girolami · Lester Mackey
Event URL: https://openreview.net/forum?id=M5xlT_iMmoq
Kernel Stein discrepancies (KSDs) are maximum mean discrepancies (MMDs) that leverage the score information of distributions, and have grown central to a wide range of applications. In most settings, these MMDs are required to $(i)$ separate a target $\mathrm{P}$ from other probability measures or even $(ii)$ control weak convergence to $\mathrm{P}$. In this article we derive new sufficient and necessary conditions that substantially broaden the known conditions for KSD separation and convergence control, and we develop the first KSDs known to metrize weak convergence to $\mathrm{P}$. Along the way, we highlight the implications of our results for hypothesis testing, measuring and improving sample quality, and sampling with Stein variational gradient descent.
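To illustrate how score information enters these discrepancies, here is a minimal NumPy sketch (not the authors' code) of a KSD built from the Langevin Stein operator with an inverse multiquadric (IMQ) base kernel, a standard construction in the KSD literature; the kernel parameters, the Gaussian target, and all function names are illustrative assumptions.

```python
import numpy as np

def imq_stein_kernel(x, y, score_x, score_y, c=1.0, beta=-0.5):
    """Langevin Stein kernel k_P(x, y) built from the IMQ base kernel
    k(x, y) = (c^2 + ||x - y||^2)^beta with beta in (-1, 0).
    k_P(x, y) = tr(grad_x grad_y^T k) + grad_x k . s(y)
                + grad_y k . s(x) + k * s(x) . s(y),
    where s = grad log p is the score of the target P."""
    d = x - y
    r2 = float(np.dot(d, d))
    u = c**2 + r2
    k = u**beta
    gx = 2 * beta * u**(beta - 1) * d   # grad_x k
    gy = -gx                            # grad_y k
    # trace of the cross-derivative matrix grad_x grad_y^T k for the IMQ kernel
    tr = -2 * beta * (len(d) * u**(beta - 1) + 2 * (beta - 1) * u**(beta - 2) * r2)
    return (tr + np.dot(gx, score_y) + np.dot(gy, score_x)
            + k * np.dot(score_x, score_y))

def ksd(sample, score_fn, c=1.0, beta=-0.5):
    """V-statistic estimate of the KSD between a sample and the target
    whose score function is score_fn: KSD^2 = (1/n^2) sum_ij k_P(x_i, x_j)."""
    scores = [score_fn(x) for x in sample]
    n = len(sample)
    total = sum(imq_stein_kernel(sample[i], sample[j], scores[i], scores[j], c, beta)
                for i in range(n) for j in range(n))
    return np.sqrt(max(total, 0.0)) / n

# Illustrative target: standard Gaussian P, whose score is s(x) = -x.
rng = np.random.default_rng(0)
good = rng.standard_normal((200, 2))   # draws from P
bad = good + 2.0                       # mean-shifted draws
score = lambda x: -x
print(ksd(good, score), ksd(bad, score))  # shifted sample yields the larger KSD
```

Note that the KSD depends on $\mathrm{P}$ only through its score, so it can be evaluated even when the target density is unnormalized; the separation and convergence-control questions studied in the paper ask when a small value of this quantity actually certifies that the sample is close to $\mathrm{P}$.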

Author Information

Alessandro Barp (University of Cambridge)
Carl-Johann Simon-Gabriel (Amazon Web Services)
Mark Girolami (University of Cambridge)
Lester Mackey (Microsoft Research)
