Tutorial
Optimization Algorithms in Machine Learning
Stephen Wright

Mon Dec 06 01:00 PM -- 03:00 PM (PST) @ Regency E/F
Event URL: http://www.cs.wisc.edu/~swright/nips2010/

Optimization provides a valuable framework for thinking about, formulating, and solving many problems in machine learning. Since specialized techniques for the quadratic programming problem arising in support vector classification were developed in the 1990s, there has been increasing cross-fertilization between optimization and machine learning, with the large size and computational demands of machine learning applications driving much recent algorithmic research in optimization. This tutorial reviews the major computational paradigms in machine learning that are amenable to optimization algorithms, then discusses the algorithmic tools being brought to bear on such applications. We focus particularly on algorithmic tools of recent interest, including stochastic and incremental gradient methods, online optimization, augmented Lagrangian methods, and the various tools that have recently been applied in sparse and regularized optimization.
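To make two of the tools named above concrete, the sketch below combines a stochastic (sub)gradient step with a soft-thresholding (proximal) step for an L1-regularized least-squares problem. This is an illustrative example only, not code from the tutorial; the problem instance, step-size schedule, and all names (X, y, lam) are hypothetical choices for the demo.

```python
import numpy as np

# Illustrative sketch: proximal stochastic gradient for
#     min_w  (1/2n) * ||X w - y||^2 + lam * ||w||_1
# The soft-thresholding step handles the nonsmooth L1 term.

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def proximal_sgd(X, y, lam=0.1, epochs=20, step0=0.1, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):          # sample one data point at a time
            t += 1
            step = step0 / np.sqrt(t)         # diminishing step size
            grad = (X[i] @ w - y[i]) * X[i]   # gradient of the sampled loss term
            w = soft_threshold(w - step * grad, step * lam)
    return w

# Toy usage: recover a sparse weight vector from noisy linear measurements.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))
w_true = np.zeros(10)
w_true[:3] = [2.0, -1.0, 0.5]
y = X @ w_true + 0.01 * rng.standard_normal(200)
print(proximal_sgd(X, y))
```

The diminishing step size and the per-sample gradient are the defining features of the stochastic-gradient family discussed in the tutorial; the proximal step is one common way such methods are adapted to sparse and regularized problems.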

Author Information

Stephen Wright (UW-Madison)

Steve Wright is a Professor of Computer Sciences at the University of Wisconsin-Madison. His research interests lie in computational optimization and its applications to science and engineering. Prior to joining UW-Madison in 2001, Wright was a Senior Computer Scientist (1997-2001) and Computer Scientist (1990-1997) at Argonne National Laboratory, and Professor of Computer Science at the University of Chicago (2000-2001). He is the past Chair of the Mathematical Optimization Society (formerly the Mathematical Programming Society), the leading professional society in optimization, and a member of the Board of the Society for Industrial and Applied Mathematics (SIAM). Wright is the author or co-author of four widely used books in numerical optimization, including "Primal-Dual Interior-Point Methods" (SIAM, 1997) and "Numerical Optimization" (with J. Nocedal, Second Edition, Springer, 2006). He has also authored over 85 refereed journal papers on optimization theory, algorithms, software, and applications. He is coauthor of widely used interior-point software for linear and quadratic optimization. His recent research includes algorithms, applications, and theory for sparse optimization (including applications in compressed sensing and machine learning).
