Statistical learning theory suggests choosing large-capacity models that barely avoid over-fitting the training data. From that perspective, all datasets are small. Things become more complicated once one considers the computational cost of processing large datasets. Computationally challenging training sets appear when one wants to emulate intelligence: biological brains learn quite efficiently from the continuous streams of perceptual data generated by our six senses, using limited amounts of sugar as a source of power. Computationally challenging training sets also appear when one wants to analyze the masses of data that describe the life of our computerized society. The more of this data we understand, the greater our competitive advantage.
The first part of the tutorial clarifies the relation between statistical efficiency, the design of learning algorithms, and their computational cost. The second part explores specific learning algorithms and their implementation in detail, with both simple and complex examples. The third part considers algorithms that learn with a single pass over the data; certain such algorithms have optimal properties but are often too costly, so workarounds are discussed. Finally, the fourth part shows how active example selection provides greater speed and reduces the feedback pressure that constrains parallel implementations.
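The single-pass regime mentioned above can be illustrated with a minimal stochastic gradient descent sketch. This is an illustrative toy, not code from the tutorial: the function name, the step-size schedule, and the synthetic least-squares stream are all assumptions chosen for clarity.

```python
import random

def sgd_single_pass(stream, dim, lr0=0.1):
    """One pass of stochastic gradient descent over a data stream.

    Each example is visited exactly once, so the cost is linear in the
    amount of data and no example ever needs to be stored.
    (Illustrative sketch; squared loss, decreasing step size.)"""
    w = [0.0] * dim
    for t, (x, y) in enumerate(stream, start=1):
        lr = lr0 / (1.0 + lr0 * t) ** 0.5    # step size decays like 1/sqrt(t)
        pred = sum(wi * xi for wi, xi in zip(w, x))
        err = pred - y                        # gradient factor for squared loss
        for i in range(dim):
            w[i] -= lr * err * x[i]
    return w

# Hypothetical usage: a stream of noisy samples of y = 2*x0 - 3*x1.
random.seed(0)

def make_stream(n=100_000):
    for _ in range(n):
        x = [random.gauss(0, 1), random.gauss(0, 1)]
        yield x, 2 * x[0] - 3 * x[1] + random.gauss(0, 0.1)

w = sgd_single_pass(make_stream(), dim=2)  # w approaches [2, -3]
```

The point of the sketch is the cost profile, not the particular update rule: memory is constant in the size of the dataset, and each example contributes one cheap update before being discarded.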
Author Information
Leon Bottou (Facebook AI Research)
Léon Bottou received a Diplôme from l'École Polytechnique, Paris in 1987, a Magistère en Mathématiques Fondamentales et Appliquées et Informatiques from École Normale Supérieure, Paris in 1988, and a PhD in Computer Science from Université de Paris-Sud in 1991. He was with AT&T Bell Labs from 1991 to 1992 and with AT&T Labs from 1995 to 2002. Between 1992 and 1995 he was chairman of Neuristique in Paris, a small company pioneering machine learning for data mining applications. He has been with NEC Labs America in Princeton since 2002. Léon's primary research interest is machine learning. His contributions to this field address theory, algorithms and large-scale applications. Léon's secondary research interest is data compression and coding. His best-known contribution in this field is the DjVu document compression technology (http://www.djvu.org). Léon has published over 70 papers and serves on the editorial boards of JMLR and IEEE TPAMI. He also serves on the scientific advisory board of KXEN Inc.
Andrew W Moore (Carnegie Mellon University)
Andrew Moore is currently responsible for growing a new Google office on CMU's campus in Pittsburgh. The office focuses on numerous statistical and large-scale systems issues in structured data extraction, Google's infrastructure, internet advertising, and fraud prevention. Prior to joining Google in January 2006, Andrew was a Professor of Robotics and Computer Science at the School of Computer Science, Carnegie Mellon University. Andrew began his career writing video games for an obscure British personal computer (http://www.oric.org/index.php?page=software&fille=detail&num_log=2). He rapidly became a thousandaire and retired to academia, where he received a PhD from the University of Cambridge in 1991. His main research interest is data mining: statistical algorithms for finding all the potentially useful and statistically meaningful patterns in large sources of data.
More from the Same Authors
- 2021 : On the Relation between Distributionally Robust Optimization and Data Curation »
  Agnieszka Słowik · Leon Bottou
- 2021 : Poster: Algorithmic Bias and Data Bias: Understanding the Relation between Distributionally Robust Optimization and Data Curation »
  Agnieszka Słowik · Leon Bottou
- 2022 : Pre-train, fine-tune, interpolate: a three-stage strategy for domain generalization »
  Alexandre Rame · Jianyu Zhang · Leon Bottou · David Lopez-Paz
- 2022 Poster: The Effects of Regularization and Data Augmentation are Class Dependent »
  Randall Balestriero · Leon Bottou · Yann LeCun
- 2021 : Algorithmic Bias and Data Bias: Understanding the Relation between Distributionally Robust Optimization and Data Curation »
  Agnieszka Słowik · Leon Bottou
- 2019 Poster: Cold Case: The Lost MNIST Digits »
  Chhavi Yadav · Leon Bottou
- 2019 Spotlight: Cold Case: The Lost MNIST Digits »
  Chhavi Yadav · Leon Bottou
- 2018 Workshop: Causal Learning »
  Martin Arjovsky · Christina Heinze-Deml · Anna Klimovskaia · Maxime Oquab · Leon Bottou · David Lopez-Paz
- 2018 Workshop: Smooth Games Optimization and Machine Learning »
  Simon Lacoste-Julien · Ioannis Mitliagkas · Gauthier Gidel · Vasilis Syrgkanis · Eva Tardos · Leon Bottou · Sebastian Nowozin
- 2018 Poster: SING: Symbol-to-Instrument Neural Generator »
  Alexandre Defossez · Neil Zeghidour · Nicolas Usunier · Leon Bottou · Francis Bach
- 2017 : Geometrical Insights for Unsupervised Learning »
  Leon Bottou
- 2017 : Looking for a Missing Signal »
  Leon Bottou
- 2016 : Welcome »
  David Lopez-Paz · Alec Radford · Leon Bottou
- 2016 Workshop: Adversarial Training »
  David Lopez-Paz · Leon Bottou · Alec Radford
- 2015 Workshop: Optimization for Machine Learning (OPT2015) »
  Suvrit Sra · Alekh Agarwal · Leon Bottou · Sashank J. Reddi
- 2014 Workshop: Learning Semantics »
  Cedric Archambeau · Antoine Bordes · Leon Bottou · Chris J Burges · David Grangier
- 2014 Workshop: Deep Learning and Representation Learning »
  Andrew Y Ng · Yoshua Bengio · Adam Coates · Roland Memisevic · Sharanyan Chetlur · Geoffrey E Hinton · Shamim Nemati · Bryan Catanzaro · Surya Ganguli · Herbert Jaeger · Phil Blunsom · Leon Bottou · Volodymyr Mnih · Chen-Yu Lee · Rich M Schwartz
- 2013 Workshop: NIPS 2013 Workshop on Causality: Large-scale Experiment Design and Inference of Causal Mechanisms »
  Isabelle Guyon · Leon Bottou · Bernhard Schölkopf · Alexander Statnikov · Evelyne Viegas · james m robins
- 2011 Workshop: Learning Semantics »
  Antoine Bordes · Jason E Weston · Ronan Collobert · Leon Bottou
- 2007 Poster: The Tradeoffs of Large Scale Learning »
  Leon Bottou · Olivier Bousquet