Poster
Kernel Methods for Deep Learning
Youngmin Cho · Lawrence Saul
We introduce a new family of positive-definite kernel functions that mimic the computation in large, multilayer neural nets. These kernel functions can be used in shallow architectures, such as support vector machines (SVMs), or in deep kernel-based architectures that we call multilayer kernel machines (MKMs). We evaluate SVMs and MKMs with these kernel functions on problems designed to illustrate the advantages of deep architectures. On several problems, we obtain better results than previous leading benchmarks from both SVMs with Gaussian kernels and deep belief nets.
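The kernels introduced in this paper are the arc-cosine kernels: k_n(x, y) = (1/pi) ||x||^n ||y||^n J_n(theta), where theta is the angle between x and y, J_0(theta) = pi - theta, and J_1(theta) = sin(theta) + (pi - theta) cos(theta); deeper architectures are mimicked by composing the kernel with itself, layer by layer. Below is a minimal NumPy sketch of the degree-0 and degree-1 kernels and their multilayer composition. The function names, the restriction to degrees 0 and 1, and the numerical clipping are illustrative choices, not code from the paper.

```python
import numpy as np

def arc_cosine_kernel(X, Y, degree=1):
    """Arc-cosine kernel k_n(x, y) = (1/pi) ||x||^n ||y||^n J_n(theta)
    between the rows of X (m x d) and Y (k x d), for n in {0, 1}."""
    norm_x = np.linalg.norm(X, axis=1, keepdims=True)   # shape (m, 1)
    norm_y = np.linalg.norm(Y, axis=1, keepdims=True)   # shape (k, 1)
    # Angle between every pair of rows; clip to guard against round-off.
    cos_theta = (X @ Y.T) / np.clip(norm_x * norm_y.T, 1e-12, None)
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    return (norm_x * norm_y.T) ** degree * _J(theta, degree) / np.pi

def _J(theta, degree):
    """Angular part J_n(theta) for degrees n = 0 and n = 1."""
    if degree == 0:
        return np.pi - theta
    if degree == 1:
        return np.sin(theta) + (np.pi - theta) * np.cos(theta)
    raise ValueError("only degrees 0 and 1 are sketched here")

def multilayer_arc_cosine(X, Y, degree=1, layers=3):
    """Compose the arc-cosine kernel with itself `layers` times,
    mimicking the layer-by-layer computation of a deep net."""
    Kxy = arc_cosine_kernel(X, Y, degree)
    # Diagonal entries k_n(x, x): theta = 0 and J_n(0) = pi for n in {0, 1},
    # so k_n(x, x) = ||x||^(2n).
    kxx = np.linalg.norm(X, axis=1) ** (2 * degree)
    kyy = np.linalg.norm(Y, axis=1) ** (2 * degree)
    for _ in range(layers - 1):
        denom = np.sqrt(np.outer(kxx, kyy))
        cos_theta = np.clip(Kxy / np.clip(denom, 1e-12, None), -1.0, 1.0)
        theta = np.arccos(cos_theta)
        Kxy = denom ** degree * _J(theta, degree) / np.pi
        # Diagonals recurse the same way: k^(l+1)(x, x) = (k^(l)(x, x))^n.
        kxx = kxx ** degree
        kyy = kyy ** degree
    return Kxy

# Example: a 3-layer, degree-1 kernel matrix between two random datasets.
K = multilayer_arc_cosine(np.random.randn(5, 10), np.random.randn(7, 10))
```

The composed matrix is a valid Gram matrix, so it can be dropped into any standard kernel method, for example as a precomputed kernel for an SVM.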
Author Information
Youngmin Cho (University of California, San Diego)
Lawrence Saul (Flatiron Institute)
More from the Same Authors
- 2021 Poster: An online passive-aggressive algorithm for difference-of-squares classification (Lawrence Saul)
- 2012 Poster: Latent Coincidence Analysis: A Hidden Variable Model for Distance Metric Learning (Matthew F Der · Lawrence Saul)
- 2011 Poster: Maximum Covariance Unfolding: Manifold Learning for Bimodal Data (Vijay Mahadevan · Chi Wah Wong · Jose Costa Pereira · Tom Liu · Nuno Vasconcelos · Lawrence Saul)
- 2010 Talk: Manifold Learning (Lawrence Saul)
- 2010 Poster: Latent Variable Models for Predicting File Dependencies in Large-Scale Software Development (Diane Hu · Laurens van der Maaten · Youngmin Cho · Lawrence Saul · Sorin Lerner)
- 2006 Poster: Large Margin Gaussian Mixture Models for Automatic Speech Recognition (Fei Sha · Lawrence Saul)
- 2006 Talk: Large Margin Gaussian Mixture Models for Automatic Speech Recognition (Fei Sha · Lawrence Saul)
- 2006 Poster: Graph Regularization for Maximum Variance Unfolding with an Application to Sensor Localization (Kilian Q Weinberger · Fei Sha · Qihui Zhu · Lawrence Saul)