Efficient Optimization for Average Precision SVM
Pritish Mohapatra · C.V. Jawahar · M. Pawan Kumar

Tue Dec 09 04:00 PM -- 08:59 PM (PST) @ Level 2, room 210D

The accuracy of information retrieval systems is often measured using average precision (AP). Given a set of positive (relevant) and negative (non-relevant) samples, the parameters of a retrieval system can be estimated using the AP-SVM framework, which minimizes a regularized convex upper bound on the empirical AP loss. However, the high computational complexity of loss-augmented inference, which is required for learning an AP-SVM, prohibits its use with large training datasets. To alleviate this deficiency, we propose three complementary approaches. The first approach guarantees an asymptotic decrease in the computational complexity of loss-augmented inference by exploiting the problem structure. The second approach takes advantage of the fact that we do not require a full ranking during loss-augmented inference. This helps us to avoid the expensive step of sorting the negative samples according to their individual scores. The third approach approximates the AP loss over all samples by the AP loss over difficult samples (for example, those that are incorrectly classified by a binary SVM), while ensuring the correct classification of the remaining samples. Using the PASCAL VOC action classification and object detection datasets, we show that our approaches provide significant speed-ups during training without degrading the test accuracy of AP-SVM.
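As background for the abstract above, the quantity AP-SVM optimizes a surrogate for is the average precision of the ranking induced by the classifier's scores. The sketch below (not the authors' code; names are illustrative) computes AP for a scored sample set, making concrete the sorting over negative samples that the paper's second approach avoids:

```python
def average_precision(scores, labels):
    """AP of the ranking obtained by sorting samples by decreasing score.

    scores: real-valued classifier scores, one per sample.
    labels: 1 for positive (relevant), 0 for negative (non-relevant).
    """
    ranked = sorted(zip(scores, labels), key=lambda x: -x[0])
    num_pos = sum(labels)
    hits, precision_sum = 0, 0.0
    for rank, (_, y) in enumerate(ranked, start=1):
        if y == 1:
            hits += 1
            precision_sum += hits / rank  # precision at each recall point
    return precision_sum / num_pos

# Example: positives ranked 1st, 2nd, and 4th out of five samples.
scores = [0.9, 0.8, 0.7, 0.6, 0.5]
labels = [1, 1, 0, 1, 0]
ap = average_precision(scores, labels)  # (1/1 + 2/2 + 3/4) / 3
```

The empirical AP loss minimized (via a convex upper bound) during AP-SVM training is then simply 1 minus this quantity.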

Author Information

Pritish Mohapatra (International Institute of Information Technology, Hyderabad)
C.V. Jawahar (International Institute of Information Technology, Hyderabad)
M. Pawan Kumar (Ecole Centrale Paris)