Poster

Learning with Fredholm Kernels

Qichao Que · Mikhail Belkin · Yusu Wang

Level 2, room 210D

Abstract:

In this paper we propose a framework for supervised and semi-supervised learning based on reformulating the learning problem as a regularized Fredholm integral equation. Our approach fits naturally into the kernel framework and can be interpreted as constructing new data-dependent kernels, which we call Fredholm kernels. We proceed to discuss the "noise assumption" for semi-supervised learning and provide evidence, both theoretical and experimental, that Fredholm kernels can effectively utilize unlabeled data under the noise assumption. We demonstrate that methods based on Fredholm learning show very competitive performance in the standard semi-supervised learning setting.
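The abstract does not spell out the kernel construction, but it describes a data-dependent kernel built from unlabeled data via the discretized Fredholm equation. The sketch below is one plausible reading, not the authors' exact method: it assumes the Fredholm kernel is a double sum over the unlabeled points combining an outer kernel k and an inner kernel k_H (both taken to be Gaussian here), and then plugs the resulting kernel matrix into ordinary kernel ridge regression on the labeled points. All names and parameter choices are illustrative.

```python
import numpy as np

def gaussian_kernel(A, B, sigma):
    """Pairwise Gaussian kernel matrix between rows of A and rows of B."""
    sq_dists = (np.sum(A**2, axis=1)[:, None]
                + np.sum(B**2, axis=1)[None, :]
                - 2.0 * A @ B.T)
    return np.exp(-sq_dists / (2.0 * sigma**2))

def fredholm_kernel(X1, X2, X_unlabeled, sigma_outer=1.0, sigma_inner=1.0):
    """Hypothetical data-dependent Fredholm-style kernel (an assumption,
    not taken from the abstract):

        K_F(x, z) = (1/u^2) * sum_{i,j} k(x, x_i) k_H(x_i, x_j) k(z, x_j),

    where the sums run over the u unlabeled points x_i, x_j and both
    k and k_H are Gaussian kernels here.
    """
    u = X_unlabeled.shape[0]
    K1 = gaussian_kernel(X1, X_unlabeled, sigma_outer)            # k(x, x_i)
    KH = gaussian_kernel(X_unlabeled, X_unlabeled, sigma_inner)   # k_H(x_i, x_j)
    K2 = gaussian_kernel(X2, X_unlabeled, sigma_outer)            # k(z, x_j)
    return (K1 @ KH @ K2.T) / (u**2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_lab = rng.normal(size=(20, 5))      # few labeled points
    y_lab = rng.normal(size=20)
    X_unlab = rng.normal(size=(200, 5))   # many unlabeled points

    # Kernel ridge regression with the data-dependent kernel.
    lam = 1e-2
    K = fredholm_kernel(X_lab, X_lab, X_unlab)
    alpha = np.linalg.solve(K + lam * np.eye(len(y_lab)), y_lab)

    X_test = rng.normal(size=(5, 5))
    K_test = fredholm_kernel(X_test, X_lab, X_unlab)
    print(K_test @ alpha)   # predictions on test points
```

The design point illustrated is that the unlabeled data enter only through the kernel: any off-the-shelf kernel method can then be used unchanged, which is what lets the approach "fit naturally into the kernel framework."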
