Poster
The Impact of Regularization on High-Dimensional Logistic Regression
Fariborz Salehi · Ehsan Abbasi · Babak Hassibi
Thu Dec 12 10:45 AM – 12:45 PM (PST) @ East Exhibition Hall B + C #246
Logistic regression is commonly used for modeling dichotomous outcomes. In the classical setting, where the number of observations is much larger than the number of parameters, the properties of the maximum likelihood estimator in logistic regression are well understood. Recently, Sur and Candès studied logistic regression in the high-dimensional regime, where the numbers of observations and parameters are comparable, and showed, among other things, that the maximum likelihood estimator is biased. In the high-dimensional regime the underlying parameter vector is often structured (sparse, block-sparse, finite-alphabet, etc.), and so in this paper we study regularized logistic regression (RLR), where a convex regularizer that encourages the desired structure is added to the negative of the log-likelihood function. An advantage of RLR is that it allows parameter recovery even for instances where the (unconstrained) maximum likelihood estimate does not exist. We provide a precise analysis of the performance of RLR via the solution of a system of six nonlinear equations, through which any performance metric of interest (mean, mean-squared error, probability of support recovery, etc.) can be explicitly computed. Our results generalize those of Sur and Candès, and we provide a detailed study for the cases of $\ell_2^2$-RLR and sparse ($\ell_1$-regularized) logistic regression. In both cases, we obtain explicit expressions for various performance metrics and can find the value of the regularization parameter that optimizes the desired performance. The theory is validated by extensive numerical simulations across a range of parameter values and problem instances.
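As a concrete illustration of the RLR objective described in the abstract (a convex regularizer added to the negative log-likelihood), the $\ell_2^2$-regularized case can be sketched in a few lines of NumPy. This is a toy sketch with an illustrative synthetic problem, step size, and regularization weight, not the authors' analysis code:

```python
import numpy as np

def fit_rlr(X, y, lam=0.05, lr=0.1, iters=2000):
    """Minimize (1/n) * negative log-likelihood + (lam/2) * ||beta||_2^2
    by plain gradient descent -- a minimal sketch of l2^2-RLR."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iters):
        mu = 1.0 / (1.0 + np.exp(-(X @ beta)))      # sigmoid of linear predictor
        grad = X.T @ (mu - y) / n + lam * beta      # NLL gradient + ridge term
        beta -= lr * grad
    return beta

# Tiny synthetic instance with a structured (partly sparse) true parameter.
rng = np.random.default_rng(0)
n, p = 200, 5
beta_true = np.array([2.0, -1.0, 0.0, 0.0, 1.0])
X = rng.standard_normal((n, p))
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-(X @ beta_true)))).astype(float)

beta_hat = fit_rlr(X, y)
```

Swapping the ridge term for a subgradient of $\lambda\|\beta\|_1$ (or using a proximal step) gives the sparse, $\ell_1$-regularized variant discussed in the paper.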
Author Information
Fariborz Salehi (California Institute of Technology)
Ehsan Abbasi (Caltech)
Babak Hassibi (Caltech)
More from the Same Authors

2020 Poster: Logarithmic Regret Bound in Partially Observable Linear Dynamical Systems »
Sahin Lale · Kamyar Azizzadenesheli · Babak Hassibi · Anima Anandkumar 
2019 Poster: Universality in Learning from Linear Measurements »
Ehsan Abbasi · Fariborz Salehi · Babak Hassibi 
2018 Poster: Learning without the Phase: Regularized PhaseMax Achieves Optimal Sample Complexity »
Fariborz Salehi · Ehsan Abbasi · Babak Hassibi 
2017 Poster: A Universal Analysis of Large-Scale Regularized Least Squares Solutions »
Ashkan Panahi · Babak Hassibi 
2017 Spotlight: A Universal Analysis of Large-Scale Regularized Least Squares Solutions »
Ashkan Panahi · Babak Hassibi 
2016 Poster: Fundamental Limits of Budget-Fidelity Trade-off in Label Crowdsourcing »
Farshad Lahouti · Babak Hassibi 
2015 Poster: LASSO with Nonlinear Measurements is Equivalent to One With Linear Measurements »
Christos Thrampoulidis · Ehsan Abbasi · Babak Hassibi 
2015 Spotlight: LASSO with Nonlinear Measurements is Equivalent to One With Linear Measurements »
Christos Thrampoulidis · Ehsan Abbasi · Babak Hassibi