
Demystifying Orthogonal Monte Carlo and Beyond
Han Lin · Haoxian Chen · Krzysztof M Choromanski · Tianyi Zhang · Clement Laroche

Wed Dec 09 09:00 AM -- 11:00 AM (PST) @ Poster Session 3 #989

Orthogonal Monte Carlo (OMC) is a very effective sampling algorithm imposing structural geometric conditions (orthogonality) on samples for variance reduction. Due to its simplicity and superior performance compared to its Quasi Monte Carlo counterparts, OMC is used in a wide spectrum of challenging machine learning applications ranging from scalable kernel methods to predictive recurrent neural networks, generative models, and reinforcement learning. However, theoretical understanding of the method remains very limited. In this paper we shed new light on the theoretical principles behind OMC, applying the theory of negatively dependent random variables to obtain several new concentration results. As a corollary, we obtain the first uniform convergence results for OMCs and consequently substantially strengthen the best known downstream guarantees for kernel ridge regression via OMCs. We also propose novel extensions of the method, called Near-Orthogonal Monte Carlo (NOMC), leveraging the theory of algebraic varieties over finite fields and particle algorithms. We show that NOMC is the first algorithm consistently outperforming OMC in applications ranging from kernel methods to approximating distances in probabilistic metric spaces.
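The orthogonality condition that OMC imposes can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: `orthogonal_gaussian_samples` is a hypothetical helper following one standard construction (draw an i.i.d. Gaussian matrix, orthogonalize it via QR, then rescale each row by an independent chi-distributed norm so every sample keeps the Gaussian marginal while the samples remain exactly orthogonal).

```python
import numpy as np

def orthogonal_gaussian_samples(d, rng):
    """Return a (d, d) matrix whose rows are pairwise orthogonal
    and each marginally distributed (approximately) as N(0, I_d).

    Hypothetical helper for illustration only."""
    # Orthogonalize an i.i.d. Gaussian matrix; Q has orthonormal rows.
    G = rng.standard_normal((d, d))
    Q, _ = np.linalg.qr(G)
    # Rescale rows with chi-distributed norms (the norm of a d-dim
    # standard Gaussian vector) to recover the Gaussian marginal.
    norms = np.sqrt(rng.chisquare(df=d, size=d))
    return norms[:, None] * Q

rng = np.random.default_rng(0)
S = orthogonal_gaussian_samples(4, rng)
# Off-diagonal inner products vanish: the rows are exactly orthogonal.
gram = S @ S.T
print(np.abs(gram - np.diag(np.diag(gram))).max())  # numerically ~0
```

Such orthogonal blocks are what replace i.i.d. Gaussian samples in, e.g., random-feature kernel approximation, where the coupling between samples is the source of the variance reduction the paper analyzes.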

Author Information

Han Lin (Columbia University)
Haoxian Chen (Columbia University)
Krzysztof M Choromanski (Google Brain Robotics)
Tianyi Zhang (Columbia University)
Clement Laroche (Columbia University)
