
Tight Mutual Information Estimation With Contrastive Fenchel-Legendre Optimization
Qing Guo · Junya Chen · Dong Wang · Yuewei Yang · Xinwei Deng · Jing Huang · Larry Carin · Fan Li · Chenyang Tao

Tue Nov 29 02:00 PM -- 04:00 PM (PST) @ Hall J #931

Successful applications of InfoNCE (Information Noise-Contrastive Estimation) and its variants have popularized the use of contrastive variational mutual information (MI) estimators in machine learning. While featuring superior stability, these estimators crucially depend on costly large-batch training, and they sacrifice bound tightness for variance reduction. To overcome these limitations, we revisit the mathematics of popular variational MI bounds through the lens of unnormalized statistical modeling and convex optimization. Our investigation yields a new unified theoretical framework encompassing popular variational MI bounds, and leads to a novel, simple, and powerful contrastive MI estimator we name FLO. Theoretically, we show that the FLO estimator is tight, and it converges under stochastic gradient descent. Empirically, the proposed FLO estimator overcomes the limitations of its predecessors and learns more efficiently. The utility of FLO is verified using extensive benchmarks, and we further inspire the community with novel applications in meta-learning. Our presentation underscores the foundational importance of variational MI estimation in data-efficient learning.
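To make the large-batch limitation concrete, the sketch below estimates the standard InfoNCE lower bound from a batch of critic scores. This is a minimal illustration of the baseline bound the abstract refers to (not of the FLO estimator itself); the function and variable names are illustrative, and the key point is that each per-sample term is capped at log K, so the estimate can never exceed the log of the batch size:

```python
import numpy as np

def infonce_lower_bound(scores: np.ndarray) -> float:
    """InfoNCE lower bound on I(X;Y) from a K x K critic matrix.

    scores[i, j] = f(x_i, y_j); the diagonal entry of row i pairs x_i
    with its true y_i, and the off-diagonal entries are negatives.
    """
    # Numerically stable log-mean-exp over each row.
    row_max = scores.max(axis=1, keepdims=True)
    log_mean_exp = row_max.squeeze(1) + np.log(
        np.exp(scores - row_max).mean(axis=1)
    )
    # E[ f(x_i, y_i) - log mean_j exp f(x_i, y_j) ], which is <= log K
    # term by term -- the batch-size ceiling criticized in the abstract.
    return float(np.mean(np.diag(scores) - log_mean_exp))

# Toy check: even a critic that scores every correct pair far above all
# negatives saturates just below the log K ceiling.
rng = np.random.default_rng(0)
K = 128
scores = rng.normal(size=(K, K))
scores[np.diag_indices(K)] += 10.0  # correct pairs score highest
estimate = infonce_lower_bound(scores)
```

Because the bound saturates at log K, estimating a large MI accurately forces a large batch K, which is exactly the cost the FLO estimator is designed to avoid.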

Author Information

Qing Guo (Virginia Tech)
Junya Chen (Duke University)
Dong Wang (Duke University)
Yuewei Yang (Duke University)
Xinwei Deng (Virginia Tech)
Jing Huang (JD AI Research)
Larry Carin
Fan Li (Duke University)
Chenyang Tao (Amazon)
