
Distributionally Robust Parametric Maximum Likelihood Estimation
Viet Anh Nguyen · Xuhui Zhang · Jose Blanchet · Angelos Georghiou

Thu Dec 10 09:00 AM -- 11:00 AM (PST) @ Poster Session 5 #1670

We consider the parameter estimation problem for a probabilistic generative model prescribed by a natural exponential family of distributions. For this problem, the typical maximum likelihood estimator tends to overfit when the training sample size is limited, is sensitive to noise, and may perform poorly on downstream predictive tasks. To mitigate these issues, we propose a distributionally robust maximum likelihood estimator that minimizes the worst-case expected log-loss uniformly over a parametric Kullback-Leibler ball around a parametric nominal distribution. Leveraging the analytical expression of the Kullback-Leibler divergence between two distributions in the same natural exponential family, we show that the min-max estimation problem is tractable in a broad setting, including the robust training of generalized linear models. Our novel robust estimator also enjoys statistical consistency and delivers promising empirical results in both regression and classification tasks.
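To make the idea concrete, here is a minimal numerical sketch (not the paper's algorithm or its tractable reformulation): a distributionally robust estimator for a one-parameter Poisson model, where the ambiguity set is a KL ball of radius `rho` around the nominal fitted distribution. It relies on the closed-form KL divergence between two members of the same exponential family, as the abstract describes, and solves the resulting min-max problem by brute-force grid search. The function names, the grid, and the radius value are illustrative assumptions.

```python
import numpy as np

def kl_poisson(lam1, lam2):
    # Closed-form KL( Pois(lam1) || Pois(lam2) ) within the exponential family
    return lam1 * np.log(lam1 / lam2) - lam1 + lam2

def expected_log_loss(lam, theta):
    # E_{x ~ Pois(lam)}[-log p_theta(x)], dropping the theta-free E[log x!] term
    return theta - lam * np.log(theta)

def dro_poisson_mle(data, rho, n_grid=2000):
    """Hypothetical sketch of a distributionally robust Poisson estimator:
    min over theta of the worst-case expected log-loss, where the worst
    case ranges over a KL ball of radius rho around the nominal MLE."""
    lam_hat = np.mean(data)  # classical MLE (nominal parameter)
    grid = np.linspace(0.2 * lam_hat, 5.0 * lam_hat, n_grid)
    # Feasible worst-case parameters: the KL ball around lam_hat
    ball = grid[kl_poisson(grid, lam_hat) <= rho]
    # Outer min over theta of the inner max over the ball (grid search)
    worst = np.array([expected_log_loss(ball, th).max() for th in grid])
    return grid[np.argmin(worst)], lam_hat

rng = np.random.default_rng(0)
data = rng.poisson(3.0, size=20)
theta_dro, theta_mle = dro_poisson_mle(data, rho=0.1)
```

A larger `rho` enlarges the ambiguity ball and yields a more conservative estimate; at `rho = 0` the ball collapses to the nominal distribution and the robust estimator coincides with the classical MLE.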

Author Information

Viet Anh Nguyen (Stanford University)
Xuhui Zhang (Stanford University)
Jose Blanchet (Stanford University)
Angelos Georghiou (University of Cyprus)
