
Stochastic Multiple Target Sampling Gradient Descent
Hoang Phan · Ngoc Tran · Trung Le · Toan Tran · Nhat Ho · Dinh Phung

Wed Nov 30 02:00 PM -- 04:00 PM (PST) @ Hall J #710

Sampling from an unnormalized target distribution is an essential problem with many applications in probabilistic inference. Stein Variational Gradient Descent (SVGD) has been shown to be a powerful method that iteratively updates a set of particles to approximate the distribution of interest. Furthermore, its asymptotic analysis shows that SVGD reduces exactly to a single-objective optimization problem, so SVGD can be viewed as a probabilistic version of single-objective optimization. A natural question then arises: "Can we derive a probabilistic version of multi-objective optimization?" To answer this question, we propose Stochastic Multiple Target Sampling Gradient Descent (MT-SGD), which enables us to sample from multiple unnormalized target distributions. Specifically, MT-SGD conducts a flow of intermediate distributions that gradually orient toward the multiple target distributions, allowing the sampled particles to move to the joint high-likelihood region of those targets. Interestingly, the asymptotic analysis shows that our approach reduces exactly to the multiple-gradient descent algorithm for multi-objective optimization, as expected. Finally, we conduct comprehensive experiments to demonstrate the merit of our approach to multi-task learning.
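
To make the mechanism concrete, below is a minimal sketch (not the authors' released code) of the idea the abstract describes: each unnormalized target contributes an SVGD-style update direction computed through a kernel, and the particles take a step along a combination of these directions. The RBF kernel, bandwidth, step size, and the uniform combination weights are illustrative assumptions only; MT-SGD itself derives the combination from a multi-objective criterion rather than using a fixed average.

    # Sketch of multi-target, SVGD-style particle updates (assumptions noted above).
    import numpy as np

    def rbf_kernel(X, h=1.0):
        # Pairwise RBF kernel matrix K and its gradient w.r.t. the first argument.
        diffs = X[:, None, :] - X[None, :, :]      # (n, n, d)
        sq = np.sum(diffs ** 2, axis=-1)           # (n, n)
        K = np.exp(-sq / (2 * h ** 2))
        grad_K = -diffs / h ** 2 * K[..., None]    # d/dx_i K(x_i, x_j)
        return K, grad_K

    def svgd_direction(X, grad_log_p, h=1.0):
        # Standard SVGD update direction for one unnormalized target.
        n = X.shape[0]
        K, grad_K = rbf_kernel(X, h)
        return (K @ grad_log_p(X) + grad_K.sum(axis=0)) / n

    def mt_sgd_step(X, grad_log_ps, step=1e-2):
        # Combine per-target SVGD directions; uniform weights stand in for the
        # weighting MT-SGD obtains from its multi-objective formulation.
        directions = [svgd_direction(X, g) for g in grad_log_ps]
        return X + step * np.mean(directions, axis=0)

    # Example: two Gaussian targets; particles drift toward the joint
    # high-likelihood region between their means.
    grad1 = lambda X: -(X - 2.0)   # grad log N(2, I)
    grad2 = lambda X: -(X + 2.0)   # grad log N(-2, I)
    X = np.random.randn(50, 2)
    for _ in range(500):
        X = mt_sgd_step(X, [grad1, grad2])
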

Author Information

Hoang Phan (School of Information and Communication Technology, Hanoi University of Science and Technology)
Ngoc Tran (VinAI Research)

Master of Science from Rensselaer Polytechnic Institute, Research Resident at VinAI Research.

Trung Le (Monash University)
Toan Tran (VinAI Artificial Intelligence Application and Research JSC)
Nhat Ho (University of Texas at Austin)
Dinh Phung (Monash University)
