Learning Monotonic Transformations for Classification
Andrew G Howard · Tony Jebara

Mon Dec 03 08:10 PM -- 08:25 PM (PST)

A discriminative method is proposed for learning monotonic transformations of the training data jointly with the estimation of a large-margin classifier. Fixed monotonic transformations can be useful as a preprocessing step in many domains, such as document classification, image histogram classification, and gene microarray experiments. However, most classifiers only explore such transformations through manual trial and error or via prior domain knowledge. The proposed method learns monotonic transformations automatically while training a large-margin classifier, without any prior knowledge of the domain at hand. A monotonic piecewise linear function is learned which transforms the data for subsequent processing by a linear hyperplane classifier. Two algorithmic implementations of the method are formalized. The first alternates between quadratic and linear programs until it converges to a locally optimal solution. The second uses a convex semidefinite relaxation that overcomes the initialization sensitivity of the original non-convex problem. The effectiveness of these learned transformations is demonstrated on synthetic problems, text data, and image data.
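The alternating scheme in the abstract can be sketched in a few lines of numpy. This is an illustrative simplification, not the paper's algorithm: the monotonic piecewise linear transform is parameterized as a nonnegative combination of fixed ramp basis functions applied elementwise, and plain (projected) subgradient steps on the hinge loss stand in for the paper's QP and LP solvers. All function names, the knot grid, and the step sizes are assumptions made for this sketch.

```python
import numpy as np

def ramp_basis(X, knots):
    """R[i, j, k] = clip((X[i,j] - knots[k]) / delta, 0, 1): monotone ramp features."""
    delta = knots[1] - knots[0]
    return np.clip((X[..., None] - knots) / delta, 0.0, 1.0)

def fit_monotone_svm(X, y, n_knots=8, n_outer=10, n_inner=100, lr=0.05, lam=1e-2, seed=0):
    """Alternate between the classifier (w, b) and the transform weights a >= 0.

    Nonnegative a guarantees the learned piecewise linear transform is monotonic.
    Subgradient steps replace the QP (classifier) and LP (transform) of the paper.
    """
    rng = np.random.default_rng(seed)
    knots = np.linspace(X.min(), X.max(), n_knots)
    R = ramp_basis(X, knots)                # shape (n, d, K)
    a = np.ones(n_knots) / n_knots          # initial transform ~ rescaled identity
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    n = len(y)
    for _ in range(n_outer):
        # --- classifier step: hinge-loss subgradient with fixed transform ---
        Z = R @ a                            # transformed data, shape (n, d)
        for _ in range(n_inner):
            margins = y * (Z @ w + b)
            active = margins < 1
            gw = lam * w - (y[active, None] * Z[active]).sum(0) / n
            gb = -y[active].sum() / n
            w -= lr * gw
            b -= lr * gb
        # --- transform step: projected subgradient, keeping a >= 0 ---
        G = np.einsum('ijk,j->ik', R, w)     # d(score)/da per sample, shape (n, K)
        for _ in range(n_inner):
            margins = y * (G @ a + b)
            active = margins < 1
            ga = -(y[active, None] * G[active]).sum(0) / n
            a = np.maximum(a - lr * ga, 0.0)  # projection preserves monotonicity
    return a, w, b, knots
```

Because the score is bilinear in (w, a), each half-step is a convex problem, which is what makes the alternation well defined; like the paper's first algorithm, this only reaches a local optimum, and the convex SDP relaxation is what removes the dependence on initialization.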

Author Information

Andrew G Howard (Columbia University)
Tony Jebara (Spotify)
