Poster

Relaxed Clipping: A Global Training Method for Robust Regression and Classification

Yao-Liang Yu · Min Yang · Linli Xu · Martha White · Dale Schuurmans


Abstract:

Robust regression and classification are often thought to require non-convex loss functions that prevent scalable, global training. However, such a view neglects the possibility of reformulated training methods that yield practically solvable alternatives. A natural way to make a loss function more robust to outliers is to truncate loss values that exceed a maximum threshold. We show that a relaxation of this form of "loss clipping" can be made globally solvable and applicable to any standard loss while guaranteeing robustness against outliers. We present a generic procedure for clipping standard loss functions and demonstrate improved robustness in regression and classification problems.
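The abstract's central construction, truncating a base loss at a threshold, can be stated in a few lines. Below is a minimal NumPy sketch, assuming squared loss as the base loss and an illustrative threshold tau = 1.0 (neither is specified in the abstract). It also shows the standard variational identity min(l, tau) = min over eta in [0, 1] of eta*l + (1 - eta)*tau, which is the kind of auxiliary-variable reformulation that relaxation methods build on; this is an illustration of the clipping idea, not the paper's relaxed training algorithm.

```python
# Minimal sketch of loss clipping, assuming squared loss and tau = 1.0.
# Illustrative only; not the paper's relaxed training procedure.
import numpy as np

def squared_loss(residual):
    return residual ** 2

def clipped_loss(residual, tau=1.0):
    # Truncate loss values that exceed the threshold tau, so a single
    # large outlier contributes at most tau to the objective.
    return np.minimum(squared_loss(residual), tau)

def variational_clipped_loss(residual, tau=1.0):
    # Equivalent variational form: min over eta in [0, 1] of
    # eta * loss + (1 - eta) * tau. The objective is linear in eta,
    # so the minimum sits at an endpoint, recovering min(loss, tau).
    # Optimizing over such auxiliary variables is the kind of
    # reformulation that relaxation approaches start from.
    loss = squared_loss(residual)
    eta = (loss <= tau).astype(float)  # optimal eta: 1 if loss <= tau, else 0
    return eta * loss + (1 - eta) * tau

residuals = np.array([0.1, -0.5, 0.9, 8.0])   # last entry is an outlier
print(clipped_loss(residuals))                # outlier's loss capped at tau
print(variational_clipped_loss(residuals))    # identical values
```

Note how the outlier residual of 8.0 contributes 64.0 under the plain squared loss but only tau under the clipped loss, which is what bounds the influence of any single corrupted example.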
