

Oral Poster

Bridging Discrete and Backpropagation: Straight-Through and Beyond

Liyuan Liu · Chengyu Dong · Xiaodong Liu · Bin Yu · Jianfeng Gao

Great Hall & Hall B1+B2 (level 1) #503
[ Project Page ] [ Paper ] [ Slides ] [ Poster ] [ OpenReview ]
Tue 12 Dec 3:15 p.m. PST — 5:15 p.m. PST
 
Oral presentation: Oral 2A Efficient Learning
Tue 12 Dec 1:40 p.m. PST — 2:40 p.m. PST

Abstract:

Backpropagation, the cornerstone of deep learning, is limited to computing gradients for continuous variables. This limitation poses challenges for problems involving discrete latent variables. To address this issue, we propose a novel approach to approximate the gradient of parameters involved in generating discrete latent variables. First, we examine the widely used Straight-Through (ST) heuristic and demonstrate that it works as a first-order approximation of the gradient. Guided by our findings, we propose ReinMax, which achieves second-order accuracy by integrating Heun's method, a second-order numerical method for solving ODEs. ReinMax does not require Hessian or other second-order derivatives, and thus incurs negligible computational overhead. Extensive experimental results on various tasks demonstrate the superiority of ReinMax over the state of the art.
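The abstract analyzes the Straight-Through (ST) heuristic as a first-order approximation of the gradient through a discrete sample. The PyTorch sketch below illustrates only that standard ST estimator for a one-hot categorical latent variable; the function name `straight_through_sample` and the toy shapes are assumptions for illustration, and the exact ReinMax update derived from Heun's method is given in the paper itself, not here.

```python
import torch

def straight_through_sample(logits: torch.Tensor) -> torch.Tensor:
    """Draw a one-hot categorical sample with Straight-Through gradients.

    Forward pass returns the discrete one-hot sample; backward pass routes
    gradients through the softmax probabilities, which the paper shows acts
    as a first-order approximation of the true gradient. (Name and interface
    are illustrative, not the authors' API.)
    """
    probs = torch.softmax(logits, dim=-1)
    index = torch.multinomial(probs, num_samples=1)
    one_hot = torch.zeros_like(probs).scatter_(-1, index, 1.0)
    # ST trick: the output equals one_hot in the forward pass, but gradients
    # flow through the continuous `probs` term in the backward pass.
    return one_hot + probs - probs.detach()

# Toy usage: gradients reach the logits even though the sample is discrete.
logits = torch.randn(4, 10, requires_grad=True)
sample = straight_through_sample(logits)
loss = sample.sum()
loss.backward()
print(logits.grad.shape)  # torch.Size([4, 10])
```

Per the abstract, ReinMax replaces this first-order backward approximation with a second-order one obtained by integrating Heun's method, without computing Hessians or other second-order derivatives.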
