Poster
Trading off Consistency and Dimensionality of Convex Surrogates for Multiclass Classification
Enrique Nueve · Dhamma Kimpara · Bo Waggoner · Jessica Finocchiaro
East Exhibit Hall A-C #4405
Abstract:
In multiclass classification over $n$ outcomes, we typically optimize some surrogate loss assigning real-valued error to predictions in $\mathbb{R}^d$. In this paradigm, outcomes must be embedded into the reals with dimension $d \geq n-1$ in order to design a consistent surrogate loss. Consistent losses are well-motivated theoretically, yet for large $n$, such as in information retrieval and structured prediction tasks, their optimization may be computationally infeasible. In practice, outcomes are typically embedded into some $\mathbb{R}^d$ for $d \ll n$, with little known about their suitability for multiclass classification. We investigate two approaches for trading off consistency and dimensionality in multiclass classification while using a convex surrogate loss. We first formalize partial consistency when the optimized surrogate has dimension $d < n-1$. We then check whether partial consistency holds under a given embedding and low-noise assumption, providing insight into when to use a particular embedding into $\mathbb{R}^d$. Finally, we present a new method to construct (fully) consistent losses with $d < n-1$ out of multiple problem instances. Our practical approach leverages parallelism to sidestep lower bounds on $d$.
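To make the paradigm concrete, the following is a minimal sketch (not the paper's construction) of the setting the abstract describes: $n$ outcomes are embedded into $\mathbb{R}^d$ for $d < n-1$, a convex surrogate assigns real-valued error to predictions in $\mathbb{R}^d$, and a decoding (link) step maps a surrogate prediction back to an outcome. The binary-code embedding, squared-distance surrogate, and nearest-point decoder here are illustrative choices, not taken from the paper.

```python
import numpy as np

n = 8  # number of outcomes
d = 3  # embedding dimension; d < n - 1, so full consistency is not guaranteed

# Hypothetical embedding: outcome y is mapped to a point phi[y] in R^d.
# Here the 8 outcomes are the corners of the hypercube {0,1}^3.
phi = np.array([[(y >> k) & 1 for k in range(d)] for y in range(n)], dtype=float)

def surrogate_loss(u, y):
    """Convex surrogate: squared distance from prediction u in R^d to phi[y]."""
    return float(np.sum((u - phi[y]) ** 2))

def decode(u):
    """Link step: report the outcome whose embedded point is nearest to u."""
    return int(np.argmin(np.sum((phi - u) ** 2, axis=1)))

# For squared loss, the minimizer of the expected surrogate under a
# distribution p over outcomes is the mean embedding p @ phi; decoding it
# returns a prediction in the original outcome space.
p = np.full(n, 1.0 / n)
u_star = p @ phi
print(decode(u_star))
```

Whether decoding the surrogate minimizer recovers the Bayes-optimal outcome for every distribution $p$ is exactly the consistency question; with $d < n-1$ it can fail on some distributions, which is what motivates the partial-consistency analysis above.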