

Poster

Gradient-based Discrete Sampling with Automatic Cyclical Scheduling

Patrick Pynadath · Riddhiman Bhattacharya · ARUN HARIHARAN · Ruqi Zhang


Abstract:

Discrete distributions are often highly multimodal due to discontinuities inherent in their spaces, especially in the high-dimensional distributions encountered in deep models. While gradient-based discrete sampling has proven effective for sampling in discrete spaces, it is susceptible to becoming trapped in local modes due to its reliance on gradient information. To tackle this challenge, we propose an automatic cyclical scheduling method designed for efficient and accurate sampling in multimodal discrete distributions. Our method contains three key components: (1) a cyclical step size schedule, where large steps discover new modes and small steps exploit each mode; (2) a cyclical balancing schedule, ensuring "balanced" proposals for a given step size; and (3) an automatic tuning scheme for adjusting the hyperparameters in the cyclical schedules, allowing adaptability across diverse datasets without manual tuning. We prove the non-asymptotic convergence of our method for general discrete distributions. Various experiments demonstrate the superiority of our method in learning complex multimodal discrete distributions.
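To make the cyclical-schedule idea concrete, below is a minimal sketch of how a per-iteration step size and balancing parameter might be computed from a cosine cycle, in the spirit of cyclical schedules used for stochastic-gradient MCMC. The cosine form, the parameter names (alpha_max, alpha_min, beta_max, beta_min, num_cycles), and the specific way the balancing parameter is paired with the step size are illustrative assumptions, not the formulas from the paper, which also tunes these hyperparameters automatically.

```python
import math

def cyclical_schedules(iteration, total_iters, num_cycles,
                       alpha_max=0.5, alpha_min=0.05,
                       beta_max=1.0, beta_min=0.5):
    """Illustrative cosine-based cyclical schedules for a gradient-based
    discrete sampler (assumed form, not the paper's exact schedule).

    Early in each cycle the step size (alpha) is large, encouraging jumps to
    new modes; it then decays so the sampler exploits the current mode. The
    balancing parameter (beta) is varied over the same cycle so the proposal
    remains roughly "balanced" for the current step size.
    """
    iters_per_cycle = max(1, total_iters // num_cycles)
    # Position within the current cycle, in [0, 1).
    r = (iteration % iters_per_cycle) / iters_per_cycle
    # Cosine decay from 1 (start of cycle) to 0 (end of cycle).
    decay = 0.5 * (1.0 + math.cos(math.pi * r))
    alpha = alpha_min + (alpha_max - alpha_min) * decay
    beta = beta_min + (beta_max - beta_min) * (1.0 - decay)
    return alpha, beta

# Example: at the start of a cycle the step size is at its maximum
# (exploration); near the end it approaches its minimum (exploitation).
print(cyclical_schedules(0, 1000, 5))    # large alpha
print(cyclical_schedules(199, 1000, 5))  # small alpha
```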
