In this talk I will present new inference tools for adaptive statistical procedures. These tools provide p-values and confidence intervals with correct "post-selection" properties: they account for the selection that has already been carried out on the same data. I will discuss the application of these ideas to a wide variety of problems, including forward stepwise regression, the lasso, PCA, and graphical models. I will also discuss computational issues and software implementing these ideas.
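To illustrate the problem these tools address (this is not the talk's own methodology, just a standard motivating simulation), the sketch below shows why naive p-values are invalid after selection. Under a global null where the response is pure noise, we "select" the predictor most correlated with the response and then compute an ordinary z-test p-value for it, ignoring the selection step. A valid p-value would be uniform under the null; the naive one is sharply anti-conservative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p, reps = 100, 10, 2000
naive_pvals = []
for _ in range(reps):
    # Global null: y is pure noise, unrelated to every predictor.
    X = rng.standard_normal((n, p))
    y = rng.standard_normal(n)
    z = X.T @ y / np.linalg.norm(X, axis=0)  # z_j ~ N(0, 1) for each *fixed* j
    j = np.argmax(np.abs(z))                 # data-driven selection step
    # Naive p-value that ignores how j was chosen:
    naive_pvals.append(2 * stats.norm.sf(abs(z[j])))

naive_pvals = np.asarray(naive_pvals)
# A valid null p-value falls below 0.05 about 5% of the time;
# the naive post-selection p-value does so far more often.
print(f"Fraction below 0.05: {np.mean(naive_pvals < 0.05):.2f}")
```

Post-selection inference methods such as those in the talk produce p-values that remain valid conditional on the selection event, rather than ignoring it as above.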
This talk represents work (some joint) with many people including Jonathan Taylor, Richard Lockhart, Ryan Tibshirani, Will Fithian, Jason Lee, Dennis Sun, Yuekai Sun and Yunjin Choi.
Robert Tibshirani (Stanford University)
Robert Tibshirani is a Professor in the Departments of Statistics and Health Research and Policy at Stanford University. He received a B.Math. from the University of Waterloo, an M.Sc. from the University of Toronto, and a Ph.D. from Stanford University. He was a Professor at the University of Toronto from 1985 to 1998. In his work he has made important contributions to the analysis of complex datasets, most recently in genomics and proteomics. Among his best-known contributions are the lasso, which uses L1 penalization in regression and related problems; generalized additive models; and Significance Analysis of Microarrays (SAM). He has also co-authored four books: "Generalized Additive Models", "An Introduction to the Bootstrap", "The Elements of Statistical Learning" (now in its second edition), and "Statistical Learning with Sparsity".