Poster

The Robustness of Estimator Composition

Pingfan Tang · Jeff M Phillips

Area 5+6+7+8 #30

Keywords: [ Regularization and Large Margin Methods ] [ Large Scale Learning and Big Data ] [ (Other) Statistics ]


Abstract:

We formalize notions of robustness for composite estimators via the notion of a breakdown point. A composite estimator successively applies two (or more) estimators: the data is decomposed into disjoint parts, the first estimator is applied to each part, and the second estimator is then applied to the outputs of the first; and so on, if the composition involves more than two estimators. Informally, the breakdown point is the minimum fraction of data points that, if modified significantly, can also significantly modify the output of the estimator, so a large breakdown point is typically desirable. Our main result shows that, under mild conditions on the individual estimators, the breakdown point of the composite estimator is the product of the breakdown points of the individual estimators. We also demonstrate several scenarios, ranging from regression to statistical testing, where this analysis is easy to apply, useful in understanding worst-case robustness, and offers powerful insight into the associated data analysis.
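As an illustration of the product rule (a minimal sketch, not the paper's construction): the median has breakdown point roughly 1/2, so composing a median over disjoint parts with a median over the part outputs should have breakdown point roughly 1/2 × 1/2 = 1/4. An adversary must corrupt just over half the points in just over half the parts to move the composite estimate arbitrarily far. The function name `composite_median` and the toy data below are hypothetical, chosen only to demonstrate the behavior.

```python
import numpy as np

def composite_median(data, num_parts):
    """Composite estimator (illustrative): partition data into disjoint
    parts, apply the median to each part, then take the median of the
    part-medians."""
    parts = np.array_split(np.asarray(data, dtype=float), num_parts)
    first_stage = [np.median(p) for p in parts]   # first estimator, per part
    return float(np.median(first_stage))           # second estimator on outputs

# 100 clean points split into 10 parts of 10. The product rule predicts
# a breakdown point near 1/4 for this composition.
data = np.zeros(100)
corrupted = data.copy()
for part in range(6):                              # corrupt 6 of 10 parts ...
    corrupted[part * 10 : part * 10 + 6] = 1e9    # ... 6 of 10 points in each (36% total)

print(composite_median(data, 10))        # clean estimate stays at 0.0
print(composite_median(corrupted, 10))   # estimate dragged out to 1e9
```

Corrupting 36% of the points in this worst-case arrangement pulls the composite estimate arbitrarily far, consistent with a breakdown point of about 1/4; the same fraction of corruptions spread thinly across parts would leave every part-median, and hence the output, unchanged.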
