
Loss function based second-order Jensen inequality and its application to particle variational inference
Futoshi Futami · Tomoharu Iwata · Naonori Ueda · Issei Sato · Masashi Sugiyama

Wed Dec 08 12:30 AM -- 02:00 AM (PST)

Bayesian model averaging, obtained as the expectation of a likelihood function under a posterior distribution, has been widely used for prediction, uncertainty evaluation, and model selection. Various approaches have been developed to efficiently capture the information in the posterior distribution; one such approach optimizes a set of models simultaneously, with interactions that ensure the diversity of the individual models in the same way as ensemble learning. A representative approach is particle variational inference (PVI), which uses an ensemble of models as an empirical approximation of the posterior distribution. PVI iteratively updates each model with a repulsion force to ensure the diversity of the optimized models. However, despite its promising performance, a theoretical understanding of this repulsion and its association with generalization ability remains unclear. In this paper, we tackle this problem in light of PAC-Bayesian analysis. First, we provide a new second-order Jensen inequality that includes a repulsion term based on the loss function; thanks to this term, it is tighter than the standard Jensen inequality. Then, we derive a novel generalization error bound and show that it can be reduced by enhancing the diversity of the models. Finally, we derive a new PVI that directly optimizes the generalization error bound. Numerical experiments demonstrate that the performance of the proposed PVI compares favorably with existing methods.
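As a rough illustration of the "repulsion force" the abstract refers to, the following is a minimal sketch of one particle-update step in the style of Stein variational gradient descent, a representative PVI method (the RBF kernel, fixed bandwidth `h`, and step size here are illustrative choices, not the paper's proposed update):

```python
import numpy as np

def pvi_step(particles, grad_logp, step=0.1, h=1.0):
    """One SVGD-style particle update (illustrative sketch).

    Each particle is pulled toward high posterior density by a
    kernel-smoothed gradient (attraction) and pushed away from
    nearby particles by the kernel's gradient (repulsion), which
    keeps the ensemble diverse.
    """
    n = particles.shape[0]
    diffs = particles[:, None, :] - particles[None, :, :]   # (n, n, d) pairwise x_i - x_j
    sq = (diffs ** 2).sum(-1)                               # squared pairwise distances
    k = np.exp(-sq / (2 * h ** 2))                          # RBF kernel matrix (n, n)
    grads = grad_logp(particles)                            # log-posterior gradients (n, d)
    attraction = k @ grads                                  # smoothed gradient term
    repulsion = (k[:, :, None] * diffs).sum(1) / h ** 2     # pushes nearby particles apart
    return particles + step * (attraction + repulsion) / n
```

With a flat target (`grad_logp` returning zeros), only the repulsion term acts, so two nearby particles drift apart; with a log-concave target, attraction and repulsion balance so the particle ensemble spreads over the posterior instead of collapsing onto its mode.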

Author Information

Futoshi Futami (NTT)
Tomoharu Iwata (NTT)
Naonori Ueda (NTT Communication Science Labs. / RIKEN AIP)
Issei Sato (The University of Tokyo/RIKEN)
Masashi Sugiyama (RIKEN / University of Tokyo)
