Poster
Asymptotically Optimal Regularization in Smooth Parametric Models
Percy Liang · Francis Bach · Guillaume Bouchard · Michael Jordan

Mon Dec 07 07:00 PM -- 11:59 PM (PST)

Many types of regularization schemes have been employed in statistical learning, each one motivated by some assumption about the problem domain. In this paper, we present a unified asymptotic analysis of smooth regularizers, which allows us to see how the validity of these assumptions impacts the success of a particular regularizer. In addition, our analysis motivates an algorithm for optimizing regularization parameters, which in turn can be analyzed within our framework. We apply our analysis to several examples, including hybrid generative-discriminative learning and multi-task learning.
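To make the setting concrete, the sketch below illustrates the basic problem the paper studies: an estimator with a smooth (here, squared-L2) regularizer whose strength must be chosen. The tuning here is done by simple held-out risk on synthetic data; it is only a hypothetical illustration of the setup, not the asymptotic plug-in analysis or algorithm from the paper.

```python
import numpy as np

# Hypothetical sketch: a smooth (L2) regularizer in a linear model,
# with the regularization strength lambda chosen by held-out risk.
# This is NOT the paper's asymptotic procedure, only the problem setup.

rng = np.random.default_rng(0)
n, d = 50, 10
theta_true = rng.normal(size=d)

# Training and validation data from the same linear model.
X = rng.normal(size=(n, d))
y = X @ theta_true + rng.normal(scale=2.0, size=n)
X_val = rng.normal(size=(n, d))
y_val = X_val @ theta_true + rng.normal(scale=2.0, size=n)

def ridge_fit(X, y, lam):
    """Closed-form minimizer of ||y - X theta||^2 + lam * ||theta||^2."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Sweep over regularization strengths and pick the one with lowest
# held-out squared error.
lambdas = np.logspace(-3, 2, 20)
risks = [np.mean((X_val @ ridge_fit(X, y, lam) - y_val) ** 2)
         for lam in lambdas]
best_lam = lambdas[int(np.argmin(risks))]
print(f"best lambda: {best_lam:.3g}")
```

The paper's contribution, in contrast, is to characterize the asymptotically optimal choice of such a parameter analytically, without a validation sweep.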

Author Information

Percy Liang (Stanford University)
Francis Bach (INRIA - École Normale Supérieure)
Guillaume Bouchard (Xerox Research Center Europe)
Michael Jordan (UC Berkeley)
