

Poster

Decomposition-Invariant Conditional Gradient for General Polytopes with Line Search

Mohammad Ali Bashiri · Xinhua Zhang

Pacific Ballroom #168

Keywords: [ Large Margin Methods ] [ Convex Optimization ] [ Kernel Methods ]


Abstract:

Frank-Wolfe (FW) algorithms with linear convergence rates have recently achieved great efficiency in many applications. Garber and Meshi (2016) designed a new decomposition-invariant pairwise FW variant with favorable dependency on the domain geometry. Unfortunately, it applies only to a restricted class of polytopes and cannot achieve theoretical and practical efficiency at the same time. In this paper, we show that by employing an away-step update, similar rates can be generalized to arbitrary polytopes with strong empirical performance. A new "condition number" of the domain is introduced which allows leveraging the sparsity of the solution. We apply the method to a reformulation of SVM, and for the first time the linear convergence rate depends on the number of support vectors.
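The abstract gives no pseudocode, so the following is only a minimal sketch of the away-step Frank-Wolfe template with exact line search that the work builds on, written for a quadratic objective over the probability simplex (a simple polytope). The function name, objective, and domain are illustrative assumptions and do not reproduce the paper's decomposition-invariant method for general polytopes.

```python
import numpy as np

def away_step_fw_simplex(Q, b, n_iters=200, tol=1e-8):
    """Away-step Frank-Wolfe with exact line search for
    f(x) = 0.5 x^T Q x - b^T x over the probability simplex.
    Illustrative sketch only, not the poster's algorithm."""
    n = Q.shape[0]
    x = np.full(n, 1.0 / n)              # start at the simplex barycenter
    for _ in range(n_iters):
        grad = Q @ x - b
        # Frank-Wolfe (toward) vertex: best simplex vertex for the linearized objective
        i_fw = int(np.argmin(grad))
        d_fw = -x.copy(); d_fw[i_fw] += 1.0
        gap_fw = -grad @ d_fw             # FW duality gap
        if gap_fw < tol:
            break
        # Away vertex: worst vertex currently in the support of x
        support = np.where(x > 1e-12)[0]
        i_away = support[int(np.argmax(grad[support]))]
        d_away = x.copy(); d_away[i_away] -= 1.0
        gap_away = -grad @ d_away
        # Choose the direction with larger descent; cap the away step
        if gap_fw >= gap_away:
            d, gamma_max = d_fw, 1.0
        else:
            alpha = x[i_away]             # weight of the away vertex
            d, gamma_max = d_away, alpha / (1.0 - alpha)
        # Exact line search for the quadratic objective
        curvature = d @ Q @ d
        gamma = gamma_max if curvature <= 0 else min(gamma_max, (-grad @ d) / curvature)
        x = x + gamma * d
    return x
```

For example, `away_step_fw_simplex(np.eye(3), np.array([1.0, 0.0, 0.0]))` converges to the vertex `[1, 0, 0]`; the away step is what lets the iterate drop coordinates from its support, which is the mechanism behind linear rates that depend on solution sparsity.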
