Workshop
Modern Nonparametrics 3: Automating the Learning Pipeline
Eric Xing · Mladen Kolar · Arthur Gretton · Samory Kpotufe · Han Liu · Zoltán Szabó · Alan Yuille · Andrew G Wilson · Ryan Tibshirani · Sasha Rakhlin · Damian Kozbur · Bharath Sriperumbudur · David Lopez-Paz · Kirthevasan Kandasamy · Francesco Orabona · Andreas Damianou · Wacha Bounliphone · Yanshuai Cao · Arijit Das · Yingzhen Yang · Giulia DeSalvo · Dmitry Storcheus · Roberto Valerio
Level 5, room 511 e
Sat 13 Dec, 5:30 a.m. PST
Nonparametric methods (kernel methods, kNN, classification trees, etc.) are designed to handle complex pattern-recognition problems. Such problems arise in modern applications such as genomic experiments, climate analysis, robotic control, and social network analysis. In fact, contemporary statistical procedures are making inroads into a variety of modern application areas as parts of solutions to larger problems. As such, there is a growing need for statistical procedures that can be used "off-the-shelf", i.e., procedures with as few tuning parameters as possible, or better yet, procedures that can "self-tune" to the particular application at hand.
The problem of devising "parameter-free" procedures has been addressed in separate areas of the pattern-recognition literature under various names and with differing emphases.
In traditional statistics, much effort has gone into so-called "adaptive" procedures, which can attain optimal risk over large sets of models of increasing complexity. Examples are model-selection approaches based on penalized empirical risk minimization, approaches based on the stability of estimates (e.g., Lepski's methods), thresholding approaches under sparsity assumptions, and model-averaging approaches. Most of these approaches rely on tight bounds on the risk of learning procedures (under any parameter setting); other approaches therefore concentrate on tight estimates of the actual risk, e.g., Stein's risk estimators, bootstrap methods, and data-dependent learning bounds.
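As a concrete, if toy, illustration of the penalized-empirical-risk idea, the sketch below selects a polynomial degree with a Mallows-Cp-style penalty. The synthetic data, the known noise variance, and the penalty constant are all assumptions made for this example, not methods prescribed by the workshop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: y = sin(2*pi*x) + noise (illustrative setup).
n = 200
x = rng.uniform(0, 1, n)
y = np.sin(2 * np.pi * x) + 0.3 * rng.normal(size=n)

def empirical_risk(x, y, degree):
    """Least-squares polynomial fit; returns in-sample mean squared error."""
    coeffs = np.polyfit(x, y, degree)
    return np.mean((np.polyval(coeffs, x) - y) ** 2)

# Penalized empirical risk: in-sample MSE plus a complexity penalty of
# 2 * sigma^2 * (#parameters) / n, in the style of Mallows' Cp / AIC.
sigma2 = 0.3 ** 2  # noise variance, assumed known here for simplicity
best_degree = min(
    range(1, 16),
    key=lambda d: empirical_risk(x, y, d) + 2 * sigma2 * (d + 1) / n,
)
print("selected degree:", best_degree)
```

The penalty counterbalances the in-sample fit, so the selected degree does not simply grow with the size of the model class.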
In theoretical machine learning, much of the work has focused on proper tuning of the actual optimization procedures used to minimize (penalized) empirical risks. In particular, great effort has gone into the automatic setting of important tuning parameters such as 'learning rates' and 'step sizes'.
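To make the step-size issue concrete, here is a minimal sketch of stochastic gradient descent on a streaming least-squares problem, run once with a hand-picked fixed step and once with a decaying schedule; both constants are illustrative assumptions, and removing the need to pick them is precisely the kind of automation discussed above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Streaming least-squares objective f(w) = E[(x.w - y)^2].
d = 5
w_true = rng.normal(size=d)

def sample():
    x = rng.normal(size=d)
    y = x @ w_true + 0.1 * rng.normal()
    return x, y

def sgd(step_rule, n_steps=5000):
    """Run SGD with a given step-size rule; return distance to w_true."""
    w = np.zeros(d)
    for t in range(1, n_steps + 1):
        x, y = sample()
        grad = 2 * (x @ w - y) * x  # stochastic gradient of squared loss
        w -= step_rule(t) * grad
    return np.linalg.norm(w - w_true)

# A fixed step size vs. a decaying eta_0 / sqrt(t) schedule; the constants
# 0.01 and 0.1 are hand-picked, which is exactly the problem.
print("fixed 0.01 :", sgd(lambda t: 0.01))
print("0.1/sqrt(t):", sgd(lambda t: 0.1 / np.sqrt(t)))
```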
Another line of work in machine learning arises in the kernel literature under the banner of "automatic representation learning". The aim here, as in theoretical work on model selection, is to automatically learn an appropriate (kernel) transformation of the data for use with kernel methods such as SVMs or Gaussian processes.
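A simple, widely used instance of such self-tuning is the median heuristic for the RBF kernel bandwidth, sketched below; the data and dimensions are arbitrary, and the heuristic is offered as an example rather than as the workshop's method.

```python
import numpy as np
from scipy.spatial.distance import pdist

def median_heuristic_bandwidth(X):
    """Set the RBF bandwidth sigma to the median pairwise distance.

    A common rule of thumb for 'self-tuning' the kernel
    k(x, x') = exp(-||x - x'||^2 / (2 sigma^2)) without cross-validation.
    """
    return np.median(pdist(X))

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 10))
sigma = median_heuristic_bandwidth(X)

# Gram matrix under the self-tuned bandwidth.
K = np.exp(-((X[:, None] - X[None, :]) ** 2).sum(-1) / (2 * sigma ** 2))
print("sigma:", sigma, " kernel matrix shape:", K.shape)
```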
In practice, the simplest self-tuning procedures take the form of cross-validation and its variants. Cross-validation can, however, be expensive, and it is impractical in various constrained settings: in streaming settings, in settings with large numbers of tuning parameters, and in unsupervised learning problems generally.
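For concreteness, a standard cross-validated grid search (shown here with scikit-learn, purely as an illustration): each grid point is refit once per fold, so the cost grows as (grid size) x (number of folds) x (cost of one fit), which is what becomes prohibitive with many tuning parameters or large data.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic classification data, standing in for a real application.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# 3 x 3 grid with 5-fold CV: 45 SVM fits just to pick two parameters.
grid = {"C": [0.1, 1, 10], "gamma": [1e-3, 1e-2, 1e-1]}
search = GridSearchCV(SVC(kernel="rbf"), grid, cv=5)
search.fit(X, y)
print("best params:", search.best_params_)
```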
More generally, many existing self-tuning or parameter-free methods are unfortunately expensive at modern data sizes and dimensionalities, while the cheaper methods tend to self-tune only over small model classes. Ideally, we want self-tuning procedures that adapt to both easy and difficult (nonparametric) problems while satisfying the practical constraints of modern applications.
A main aim of this workshop is to cover the various approaches proposed so far towards automating the learning pipeline, and the practicality of these approaches in light of modern constraints. We are particularly interested in understanding whether large data sizes and dimensionality might help the automation effort, since such datasets in fact provide more information about the patterns being learned.
Through a number of invited and contributed talks and a focused panel discussion, we plan to bring together both theoretical and applied researchers to discuss these challenges in detail, share insight on existing solutions, and lay out some of the important future directions towards answering the demands of modern applications.