Inductive Regularized Learning of Kernel Functions
Prateek Jain · Brian Kulis · Inderjit Dhillon

Tue Dec 07 12:00 AM -- 12:00 AM (PST)
In this paper we consider the fundamental problem of semi-supervised kernel function learning. We propose a general regularized framework for learning a kernel matrix, and then demonstrate an equivalence between our proposed kernel matrix learning framework and a general linear transformation learning problem. Our result shows that the learned kernel matrices parameterize a linear transformation kernel function and can be applied inductively to new data points. Furthermore, our result gives a constructive method for kernelizing most existing Mahalanobis metric learning formulations. To make our results practical for large-scale data, we modify our framework to limit the number of parameters in the optimization process. We also consider the problem of kernelized inductive dimensionality reduction in the semi-supervised setting. We introduce a novel method for this problem by considering a special case of our general kernel learning framework where we select the trace norm function as the regularizer. We empirically demonstrate that our framework learns useful kernel functions, improving the $k$-NN classification accuracy significantly in a variety of domains. Furthermore, our kernelized dimensionality reduction technique significantly reduces the dimensionality of the feature space while achieving competitive classification accuracies.
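
The inductive claim above means the learned kernel matrix does not merely re-weight training pairs: it induces a kernel function that can be evaluated between arbitrary points using only base-kernel evaluations against the training set. The sketch below is a hypothetical illustration rather than code from the paper; it assumes the learned kernel takes the linear-transformation form $\kappa(x, y) = k(x, y) + k_x^\top S\, k_y$, where $k_x$ collects base-kernel values between $x$ and the training points, and the names rbf_kernel, recover_S, and learned_kernel are invented for this example.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Base kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq = (A ** 2).sum(1)[:, None] + (B ** 2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def recover_S(K_train, K_learned, reg=1e-8):
    """Recover S so that K_learned ~= K_train + K_train @ S @ K_train
    (assumed parameterization; K_learned comes from the regularized
    kernel-matrix optimization, which is not implemented here)."""
    K_inv = np.linalg.pinv(K_train + reg * np.eye(len(K_train)))
    return K_inv @ (K_learned - K_train) @ K_inv

def learned_kernel(X_train, A, B, S, gamma=1.0):
    """Evaluate kappa(x, y) = k(x, y) + k_x^T S k_y between arbitrary
    point sets A and B, using only base-kernel values against X_train."""
    K_A = rbf_kernel(A, X_train, gamma)   # k_x vectors for rows of A
    K_B = rbf_kernel(B, X_train, gamma)   # k_y vectors for rows of B
    return rbf_kernel(A, B, gamma) + K_A @ S @ K_B.T
```

Under these assumptions, one would recover S once from the learned training kernel matrix and then call learned_kernel to score unseen points, e.g. to build the kernel rows fed to a $k$-NN classifier at test time.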

Author Information

Prateek Jain (Microsoft Research)
Brian Kulis (Boston University)
Inderjit Dhillon (UT Austin & Amazon)
