Kernel Bayesian Inference with Posterior Regularization
Yang Song · Jun Zhu · Yong Ren

Wed Dec 07 09:00 AM -- 12:30 PM (PST) @ Area 5+6+7+8 #132

We propose a vector-valued regression problem whose solution is equivalent to the reproducing kernel Hilbert space (RKHS) embedding of the Bayesian posterior distribution. This equivalence provides a new understanding of kernel Bayesian inference. Moreover, the optimization problem induces a new regularization for the posterior embedding estimator, which is faster to compute than, and performs comparably to, the squared regularization in kernel Bayes' rule. This regularization coincides with an earlier thresholding approach used in kernel POMDPs, whose consistency had remained an open problem. Our theoretical work settles this question and provides a consistency analysis in the regression setting. Based on our optimization formulation, we propose a flexible Bayesian posterior regularization framework that, for the first time, allows regularization to be imposed at the level of distributions. We apply this method to nonparametric state-space filtering tasks with extremely nonlinear dynamics and show performance gains over all other baselines.
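The contrast the abstract draws between the squared (Tikhonov) regularization of kernel Bayes' rule and a thresholding alternative can be sketched on a toy conditional mean embedding problem. This is a minimal illustration, not the paper's estimator: the kernel, bandwidth, data, and the exact thresholding rule used here (clipping negative weights to zero and renormalizing) are all assumptions for demonstration purposes.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between row-sample arrays A and B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

rng = np.random.default_rng(0)
n = 50
X = rng.normal(size=(n, 1))            # samples of the hidden variable
Y = X + 0.1 * rng.normal(size=(n, 1))  # noisy observations of X

G = rbf_kernel(Y, Y)                   # Gram matrix on the observations
y_star = np.array([[0.3]])             # new observation to condition on
g_star = rbf_kernel(Y, y_star)         # kernel evaluations at y_star

# Squared (Tikhonov) regularization: weights of the empirical
# conditional embedding, beta = (G + n*lam*I)^{-1} g(y_star).
lam = 1e-3
beta = np.linalg.solve(G + n * lam * np.eye(n), g_star).ravel()

# Illustrative thresholding alternative: zero out negative weights
# and renormalize so the weights form a convex combination.
beta_pos = np.clip(beta, 0.0, None)
beta_thr = beta_pos / beta_pos.sum()

# Posterior mean estimates of X given y_star under each weighting.
post_mean_tik = float(X.ravel() @ (beta / beta.sum()))
post_mean_thr = float(X.ravel() @ beta_thr)
```

Both weightings approximate the posterior embedding as a weighted sum of feature maps of the samples; thresholding avoids the negative weights that Tikhonov-regularized solutions can produce, which is what makes it cheap to apply downstream.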

Author Information

Yang Song (Stanford University)
Jun Zhu (Tsinghua University)
Yong Ren (Tsinghua University)