Poster

Inverse M-Kernels for Linear Universal Approximators of Non-Negative Functions

Hideaki Kim

East Exhibit Hall A-C #3608
Wed 11 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract:

Kernel methods have been widely used in machine learning to learn, from training data, a latent function in a reproducing kernel Hilbert space. It is well known that the resulting approximator admits a linear representation, which brings various computational benefits while retaining great representational power (i.e., universal approximation). However, when non-negativity constraints are imposed on the function's outputs, the literature has held that kernel-based approximators either retain a linear representation at the expense of limited model flexibility, or achieve good representational power only by taking nonlinear forms. The main contribution of this paper is a sufficient condition on a positive definite kernel under which it constructs flexible, linear approximators of non-negative functions. We call a kernel satisfying this condition an inverse M-kernel, by analogy with the inverse M-matrix. Furthermore, we show that for a one-dimensional input space, the universal exponential/Abel kernels are inverse M-kernels and construct linear universal approximators of non-negative functions. To the best of our knowledge, this is the first time that the existence of linear universal approximators of non-negative functions has been established. We confirm the effectiveness of our results experimentally on non-negativity-constrained regression, density estimation, and intensity estimation. Finally, we discuss issues and perspectives for multi-dimensional input settings.
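For orientation only, the sketch below shows the simplest linear, non-negativity-preserving kernel form mentioned in the abstract's contrast: f(x) = sum_i alpha_i k(x, x_i) with alpha_i >= 0 and a one-dimensional exponential (Laplace) kernel, fitted by non-negative least squares. This is not the paper's inverse M-kernel construction (whose condition is what restores universality); the kernel width, toy data, and NNLS fitting are illustrative assumptions.

```python
# Minimal sketch (not the paper's method): a linear approximator
#   f(x) = sum_i alpha_i * k(x, x_i),  alpha_i >= 0,
# with the exponential (Laplace) kernel in 1-D. Because the kernel is
# non-negative and the weights are constrained to be non-negative, the
# output is non-negative everywhere; the abstract notes that this simple
# restriction limits flexibility, which the inverse M-kernel condition
# is designed to overcome.
import numpy as np
from scipy.optimize import nnls

def exp_kernel(x, y, gamma=2.0):
    """Exponential (Laplace) kernel k(x, y) = exp(-gamma * |x - y|)."""
    return np.exp(-gamma * np.abs(x[:, None] - y[None, :]))

# Toy 1-D data: a non-negative target observed with noise (illustrative).
rng = np.random.default_rng(0)
x_train = np.sort(rng.uniform(0.0, 6.0, 50))
y_train = np.maximum(np.sin(x_train), 0.0) + 0.05 * rng.standard_normal(50)

# Fit non-negative coefficients by non-negative least squares.
K = exp_kernel(x_train, x_train)
alpha, _ = nnls(K, y_train)

# Evaluate the linear approximator on a test grid; outputs stay >= 0.
x_test = np.linspace(0.0, 6.0, 200)
f_test = exp_kernel(x_test, x_train) @ alpha
assert np.all(f_test >= 0.0)
```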
