We consider the problem of estimating the parameters of a probabilistic generative model specified by a natural exponential family of distributions. For this problem, the typical maximum likelihood estimator tends to overfit when the training sample is small, is sensitive to noise, and may perform poorly on downstream predictive tasks. To mitigate these issues, we propose a distributionally robust maximum likelihood estimator that minimizes the worst-case expected log-loss uniformly over a parametric Kullback-Leibler ball around a parametric nominal distribution. Leveraging the analytical expression for the Kullback-Leibler divergence between two distributions in the same natural exponential family, we show that the resulting min-max estimation problem is tractable in a broad range of settings, including the robust training of generalized linear models. Our robust estimator is also statistically consistent and delivers promising empirical results on both regression and classification tasks.
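To illustrate the min-max structure described in the abstract, the sketch below instantiates a distributionally robust maximum likelihood estimate for a one-parameter Poisson family (a natural exponential family), exploiting the closed-form KL divergence within the family. The function names and the grid-search solver are our own illustrative choices under simplifying assumptions; this is not the paper's algorithm.

```python
import numpy as np

def kl_poisson(lam_p, lam_q):
    """Closed-form KL divergence KL(Pois(lam_p) || Pois(lam_q)),
    available analytically because both laws lie in the same
    natural exponential family."""
    return lam_p * np.log(lam_p / lam_q) - lam_p + lam_q

def robust_mle_poisson(lam_nominal, rho, grid=None):
    """Grid-search sketch of
        min_lam  max_{lam' : KL(Pois(lam') || Pois(lam_nominal)) <= rho}
                 E_{Pois(lam')}[-log p(x; lam)].
    Up to an additive constant, the expected log-loss of rate lam under
    data generated with rate lam' is lam - lam' * log(lam)."""
    if grid is None:
        grid = np.linspace(0.1, 3.0 * lam_nominal, 2000)
    # Ambiguity set: all rates lam' inside the KL ball around the nominal.
    ball = grid[kl_poisson(grid, lam_nominal) <= rho]
    # losses[i, j] = expected log-loss of candidate grid[i] under ball[j].
    losses = grid[:, None] - ball[None, :] * np.log(grid[:, None])
    worst_case = losses.max(axis=1)        # inner maximization over the ball
    return grid[np.argmin(worst_case)]     # outer minimization over lam

lam_hat = robust_mle_poisson(lam_nominal=2.0, rho=0.1)
```

With a nominal rate of 2.0 and radius 0.1, the robust estimate lands inside the KL ball but away from the nominal rate, reflecting the hedge against the worst-case distribution in the ball.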
Author Information
Viet Anh Nguyen (Stanford University)
Xuhui Zhang (Stanford University)
Jose Blanchet (Stanford University)
Angelos Georghiou (University of Cyprus)
More from the Same Authors
- 2022: Minimax Optimal Kernel Operator Learning via Multilevel Training »
  Jikai Jin · Yiping Lu · Jose Blanchet · Lexing Ying
- 2022: Synthetic Principle Component Design: Fast Covariate Balancing with Synthetic Controls »
  Yiping Lu · Jiajin Li · Lexing Ying · Jose Blanchet
- 2022 Poster: Sobolev Acceleration and Statistical Optimality for Learning Elliptic Equations via Gradient Descent »
  Yiping Lu · Jose Blanchet · Lexing Ying
- 2022 Poster: Tikhonov Regularization is Optimal Transport Robust under Martingale Constraints »
  Jiajin Li · Sirui Lin · Jose Blanchet · Viet Anh Nguyen
- 2021: Statistical Numerical PDE: Fast Rate, Neural Scaling Law and When it’s Optimal »
  Yiping Lu · Haoxuan Chen · Jianfeng Lu · Lexing Ying · Jose Blanchet
- 2021 Poster: Adversarial Regression with Doubly Non-negative Weighting Matrices »
  Tam Le · Truyen Nguyen · Makoto Yamada · Jose Blanchet · Viet Anh Nguyen
- 2021 Poster: Modified Frank Wolfe in Probability Space »
  Carson Kent · Jiajin Li · Jose Blanchet · Peter W Glynn
- 2020 Poster: Quantifying the Empirical Wasserstein Distance to a Set of Measures: Beating the Curse of Dimensionality »
  Nian Si · Jose Blanchet · Soumyadip Ghosh · Mark Squillante
- 2020 Spotlight: Quantifying the Empirical Wasserstein Distance to a Set of Measures: Beating the Curse of Dimensionality »
  Nian Si · Jose Blanchet · Soumyadip Ghosh · Mark Squillante
- 2020 Poster: Distributionally Robust Local Non-parametric Conditional Estimation »
  Viet Anh Nguyen · Fan Zhang · Jose Blanchet · Erick Delage · Yinyu Ye
- 2019 Poster: Calculating Optimistic Likelihoods Using (Geodesically) Convex Optimization »
  Viet Anh Nguyen · Soroosh Shafieezadeh Abadeh · Man-Chung Yue · Daniel Kuhn · Wolfram Wiesemann
- 2019 Poster: Learning in Generalized Linear Contextual Bandits with Stochastic Delays »
  Zhengyuan Zhou · Renyuan Xu · Jose Blanchet
- 2019 Spotlight: Learning in Generalized Linear Contextual Bandits with Stochastic Delays »
  Zhengyuan Zhou · Renyuan Xu · Jose Blanchet
- 2019 Poster: Optimistic Distributionally Robust Optimization for Nonparametric Likelihood Approximation »
  Viet Anh Nguyen · Soroosh Shafieezadeh Abadeh · Man-Chung Yue · Daniel Kuhn · Wolfram Wiesemann
- 2019 Poster: Online EXP3 Learning in Adversarial Bandits with Delayed Feedback »
  Ilai Bistritz · Zhengyuan Zhou · Xi Chen · Nicholas Bambos · Jose Blanchet
- 2019 Poster: Multivariate Distributionally Robust Convex Regression under Absolute Error Loss »
  Jose Blanchet · Peter W Glynn · Jun Yan · Zhengqing Zhou
- 2019 Poster: Semi-Parametric Dynamic Contextual Pricing »
  Virag Shah · Ramesh Johari · Jose Blanchet