Tutorial
Modern Bayesian Nonparametrics
Peter Orbanz · Yee Whye Teh

Mon Dec 12 03:00 AM -- 05:00 AM (PST) @ Manuel de Falla

A nonparametric model is a model on an infinite-dimensional parameter space. The parameter space represents the set of all possible solutions for a given learning problem -- for example, the set of smooth functions in nonlinear regression, or the set of all probability densities in a density estimation problem. A Bayesian nonparametric model defines a prior distribution on such an infinite-dimensional space, where traditional prior assumptions (e.g. "the parameter is likely to be small") are replaced by structural assumptions ("the function is likely to be smooth"), and learning then requires computing the posterior distribution given the data.
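As a concrete (and hedged) illustration of this setup, the sketch below implements Gaussian process regression in plain NumPy: the Gaussian process serves as a Bayesian nonparametric prior over smooth functions, the RBF kernel encodes the structural assumption "the function is likely to be smooth", and learning amounts to computing the posterior at a set of test points. The toy data, kernel hyperparameters, and noise level are illustrative assumptions, not part of the tutorial itself.

```python
# Minimal Gaussian process regression sketch (illustrative assumptions:
# RBF kernel, lengthscale 0.5, noise std 0.1, synthetic sine data).
import numpy as np

def rbf_kernel(a, b, lengthscale=0.5, variance=1.0):
    """Squared-exponential kernel: encodes the prior's structural
    assumption that sampled functions are smooth."""
    sq_dists = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / lengthscale ** 2)

rng = np.random.default_rng(0)
x_train = rng.uniform(-3.0, 3.0, size=10)                   # toy inputs
y_train = np.sin(x_train) + 0.1 * rng.standard_normal(10)   # noisy targets
x_test = np.linspace(-3.0, 3.0, 100)                        # prediction grid

noise_var = 0.1 ** 2  # assumed observation-noise variance

# Standard GP posterior equations: because prior and likelihood are both
# Gaussian, the posterior over function values is available in closed form.
K = rbf_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
K_star = rbf_kernel(x_test, x_train)
K_ss = rbf_kernel(x_test, x_test)

L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
posterior_mean = K_star @ alpha        # E[f(x_test) | data]
v = np.linalg.solve(L, K_star.T)
posterior_cov = K_ss - v.T @ v         # Cov[f(x_test) | data]

print(posterior_mean[:5])  # posterior predictions at a few test points
```

Note that the infinite-dimensional parameter (the unknown function) is never represented explicitly: the posterior is only ever evaluated at finitely many input points, which is what makes posterior computation tractable in this example.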

The tutorial will provide a high-level introduction to modern Bayesian nonparametrics. Since first attracting attention at NIPS a decade ago, the field has evolved substantially in the machine learning, statistics and probability communities. We now have a much improved understanding of how novel models can be used effectively in applications, of their theoretical properties, of techniques for posterior computation, and of how they can be combined to fit the requirements of a given problem. In the tutorial, we will survey the current state of the art with a focus on recent developments of interest in machine learning.

Author Information

Peter Orbanz (Columbia University)

Peter Orbanz is a research fellow at the University of Cambridge. He holds a PhD from ETH Zurich and will join the Statistics Faculty at Columbia University as an Assistant Professor in 2012. He is interested in the mathematical and algorithmic aspects of Bayesian nonparametric models and related learning methods.

Yee Whye Teh (University of Oxford, DeepMind)

I am a Professor of Statistical Machine Learning at the Department of Statistics, University of Oxford, and a Research Scientist at DeepMind. I am also an Alan Turing Institute Fellow and a European Research Council Consolidator Fellow. I obtained my Ph.D. at the University of Toronto (working with Geoffrey Hinton), and did postdoctoral work at the University of California, Berkeley (with Michael Jordan) and the National University of Singapore (as Lee Kuan Yew Postdoctoral Fellow). I was a Lecturer, then a Reader, at the Gatsby Computational Neuroscience Unit, UCL, and a tutorial fellow at University College, Oxford, prior to my current appointment. I am interested in the statistical and computational foundations of intelligence, and work on scalable machine learning, probabilistic models, Bayesian nonparametrics and deep learning. I was programme co-chair of ICML 2017 and AISTATS 2010.
