Spotlight
Convex Learning with Invariances
Choon Hui Teo · Amir Globerson · Sam T Roweis · Alexander Smola

Mon Dec 03 08:10 PM -- 08:25 PM (PST)

Incorporating invariances into a learning algorithm is a common problem in machine learning. We provide a convex formulation which can deal with arbitrary loss functions and arbitrary invariances. In addition, it is a drop-in replacement for most optimization algorithms for kernels, including solvers of the SVMStruct family. The advantage of our setting is that it relies on column generation instead of modifying the underlying optimization problem directly.
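As a rough illustration only (not the paper's algorithm), the sketch below applies the general column-generation idea from the abstract to a plain linear hinge-loss model: each round adds, for every training example, the invariance transformation that currently violates the margin most, then re-fits a convex regularized model on the enlarged working set with a simple subgradient method. The function names, the choice of hinge loss, and the subgradient solver are all assumptions made for this sketch.

```python
import numpy as np

def most_violated_transform(w, x, y, transforms):
    """Among the given invariance transformations of x, return the one
    with the largest hinge-loss violation under the current weights w.
    (Illustrative helper, not part of the paper's code.)"""
    candidates = [t(x) for t in transforms]
    losses = [max(0.0, 1.0 - y * w.dot(c)) for c in candidates]
    return candidates[int(np.argmax(losses))]

def column_generation_train(X, y, transforms, lam=0.1, rounds=10,
                            lr=0.01, epochs=50):
    """Toy column-generation loop: alternate between
    (a) adding the worst-case transformed copy of each example and
    (b) re-fitting a regularized hinge-loss linear model on the working set."""
    n, d = X.shape
    w = np.zeros(d)
    working = [(X[i], y[i]) for i in range(n)]  # working set of "columns"
    for _ in range(rounds):
        # (a) column generation: append the most violated transform per example
        for i in range(n):
            working.append((most_violated_transform(w, X[i], y[i], transforms),
                            y[i]))
        # (b) subgradient descent on the regularized hinge loss over the set
        for _ in range(epochs):
            data_grad = np.zeros(d)
            for xi, yi in working:
                if yi * w.dot(xi) < 1.0:   # margin violated: hinge is active
                    data_grad -= yi * xi
            w -= lr * (lam * w + data_grad / len(working))
    return w

# Hypothetical usage: invariance to small additive shifts of the input
# transforms = [lambda x, s=s: x + s for s in (-0.1, 0.0, 0.1)]
# w = column_generation_train(X, y, transforms)
```

In this toy setup, only the currently worst-violating transformed copies are added, so the working set stays small while the inner solve remains a standard convex problem; this mirrors the abstract's point that column generation can be layered on top of an existing solver instead of modifying the underlying optimization problem.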

Author Information

Choon Hui Teo (Amazon)
Amir Globerson (Tel Aviv University, Google)

Amir Globerson is a senior lecturer at the School of Engineering and Computer Science at the Hebrew University. He received a PhD in computational neuroscience from the Hebrew University, and was a Rothschild postdoctoral fellow at MIT. He joined the Hebrew University in 2008. His research interests include graphical models and probabilistic inference, convex optimization, robust learning, and natural language processing.

Sam T Roweis (University of Toronto)
Alexander Smola (Amazon)

