Oral
Multi-Label Prediction via Compressed Sensing
Daniel Hsu · Sham M Kakade · John Langford · Tong Zhang

Tue Dec 08 09:30 AM -- 09:50 AM (PST)

We consider multi-label prediction problems with large output spaces under the assumption of output sparsity -- that the target (label) vectors have small support. We develop a general theory for a variant of the popular error correcting output code scheme, using ideas from compressed sensing for exploiting this sparsity. The method can be regarded as a simple reduction from multi-label regression problems to binary regression problems. We show that the number of subproblems need only be logarithmic in the total number of possible labels, making this approach radically more efficient than others. We also state and prove robustness guarantees for this method in the form of regret transform bounds (in general), and also provide a more detailed analysis for the linear prediction setting.
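The reduction described above can be sketched numerically: compress a sparse label vector into a small number of linear measurements (each of which would, in the full method, be the target of one learned regression subproblem), then decode a sparse label vector from the predicted measurements. The sketch below is illustrative only, with hypothetical dimensions (`d`, `k`, `m`), exact projections standing in for the learned regressors' predictions, and orthogonal matching pursuit as one concrete choice of sparse-recovery decoder; the paper's guarantees are stated for general reconstruction algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 1000   # size of the label space (illustrative)
k = 3      # output sparsity: the label vector has at most k nonzeros
m = 120    # number of regression subproblems, ~ O(k log d) with a generous constant

# Random Gaussian measurement matrix: each row defines one induced
# regression subproblem whose target is a linear projection of the labels.
A = rng.normal(size=(m, d)) / np.sqrt(m)

# A k-sparse label vector y (the quantity we ultimately want to predict).
y = np.zeros(d)
support = rng.choice(d, size=k, replace=False)
y[support] = 1.0

# In the reduction, m separate regressors would each be trained to predict
# one coordinate of A @ y from the input features; here we use the exact
# projections, i.e. we pretend the regressors predict perfectly.
z = A @ y

def omp(A, z, k):
    """Orthogonal matching pursuit: greedy k-sparse recovery of y from z = A y."""
    residual = z.copy()
    chosen = []
    coef = np.zeros(0)
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        chosen.append(j)
        # Re-fit least squares on the chosen columns and update the residual.
        sub = A[:, chosen]
        coef, *_ = np.linalg.lstsq(sub, z, rcond=None)
        residual = z - sub @ coef
    y_hat = np.zeros(A.shape[1])
    y_hat[chosen] = coef
    return y_hat

y_hat = omp(A, z, k)
recovered = set(np.flatnonzero(np.abs(y_hat) > 0.5))
print("recovered support:", sorted(int(i) for i in recovered))
```

Note that only `m = 120` regression targets stand in for a label space of size `d = 1000`: the number of subproblems grows logarithmically in the number of labels, which is the source of the efficiency gain claimed in the abstract.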

Author Information

Daniel Hsu (Columbia University)

See <https://www.cs.columbia.edu/~djhsu/>

Sham M Kakade (Harvard University & Amazon)
John Langford (Microsoft Research)

John Langford is a machine learning research scientist, a field which he says "is shifting from an academic discipline to an industrial tool". He is the author of the weblog hunch.net and the principal developer of Vowpal Wabbit. John works at Microsoft Research New York, of which he was one of the founding members, and was previously affiliated with Yahoo! Research, Toyota Technological Institute, and IBM's Watson Research Center. He studied Physics and Computer Science at the California Institute of Technology, earning a double bachelor's degree in 1997, and received his Ph.D. in Computer Science from Carnegie Mellon University in 2002. He was the program co-chair for the 2012 International Conference on Machine Learning.

Tong Zhang (The Hong Kong University of Science and Technology)
