NIPS 2010


Workshop

Tensors, Kernels, and Machine Learning

Tamara G Kolda · Vicente Malave · David F Gleich · Johan Suykens · Marco Signoretto · Andreas Argyriou

Westin: Nordic

Tensors are a generalization of vectors and matrices to higher
dimensions. This workshop explores the links between tensors and
information processing. We expect that many problems in, for example,
machine learning and kernel methods can benefit from being expressed
as tensor problems; conversely, the tensor community may learn from
the estimation techniques commonly used in information processing and
from some of the kernel extensions to nonlinear models.
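
As a minimal illustration of this generalization (a sketch in plain
NumPy; the shapes and values are arbitrary), a third-order tensor is a
three-way array, and it can be "unfolded" into a matrix so that
standard matrix tools apply mode by mode:

    import numpy as np

    # A third-order tensor of shape (I, J, K): the three-way analogue
    # of a vector of shape (I,) and a matrix of shape (I, J).
    X = np.arange(24, dtype=float).reshape(3, 4, 2)

    # One common layout of the mode-1 unfolding: a 3 x 8 matrix whose
    # columns enumerate the remaining two modes (column-ordering
    # conventions differ between references).
    X1 = X.reshape(3, -1)
    print(X.ndim, X.shape, X1.shape)  # 3 (3, 4, 2) (3, 8)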

On the other hand, standard tensor-based techniques can only deliver
multi-linear models. As a consequence, they may suffer from limited
discriminative power. A properly defined kernel-based extension might
overcome this limitation. The statistical machine learning community has
much to offer on different aspects such as learning (supervised,
unsupervised and semi-supervised) and generalization, regularization
techniques, loss functions and model selection.
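
To make the multi-linear restriction concrete, the rank-R CP
(CANDECOMP/PARAFAC) model is a standard example; the display below is
generic textbook notation, not a model specific to this workshop:

    % Rank-R CP model of a third-order tensor with entries x_{ijk}.
    % Fixing any two of the factor matrices A, B, C, the model is
    % linear in the third; it cannot capture interactions beyond
    % products of per-mode factors, hence "multi-linear".
    x_{ijk} \approx \sum_{r=1}^{R} a_{ir}\, b_{jr}\, c_{kr}

One natural route for the kernel-based extension mentioned above is to
compose such per-mode factors with nonlinear feature maps.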

The goal of this workshop is to promote the cross-fertilization between
machine learning and tensor-based techniques.

This workshop is appropriate for anyone who wishes to learn more about
tensor methods or to share machine learning and kernel techniques with
the tensor community; conversely, we invite contributions from tensor
experts seeking to apply tensors to problems in machine learning and
information processing.

We hope to discuss the following topics:

- Applications using tensors for information processing (e.g., image
  recognition, EEG, text analysis, diffusion tensor imaging), as well
  as the appropriateness of tensor models for various information
  processing tasks.

- Specialized tensor decompositions that may be of interest for
  information processing (e.g., nonnegative factorizations, specialized
  objective functions or constraints, symmetric factorizations, handling
  missing data, handling special types of noise); see the decomposition
  sketch after this list.

- Information processing techniques that have connections to tensor
  representations and factorizations, such as nonlinear kernel methods,
  multi-task learning, and specialized learning algorithms that can be
  adapted to tensor factorizations.

- Theoretical questions of interest in applying tensor information
  processing methods (e.g., questions surrounding tensor rank and the
  extension of the nuclear norm to tensors).
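
As a concrete companion to the decomposition topic above, here is a
minimal alternating least squares (ALS) sketch for a rank-R CP
decomposition of a third-order tensor. It is a plain NumPy/SciPy
illustration under common conventions (Kolda-and-Bader-style
unfoldings, random initialization, a fixed iteration count), not a
reference implementation:

    import numpy as np
    from scipy.linalg import khatri_rao

    def unfold(X, mode):
        # Mode-n unfolding: move the chosen mode first, then flatten
        # the remaining modes in Fortran (column-major) order.
        return np.reshape(np.moveaxis(X, mode, 0), (X.shape[mode], -1), order="F")

    def cp_als(X, rank, n_iters=100, seed=0):
        # Alternate over the three factors; each update is an ordinary
        # linear least-squares problem because the model is linear in
        # one factor once the other two are fixed.
        rng = np.random.default_rng(seed)
        I, J, K = X.shape
        A = rng.standard_normal((I, rank))
        B = rng.standard_normal((J, rank))
        C = rng.standard_normal((K, rank))
        for _ in range(n_iters):
            A = unfold(X, 0) @ khatri_rao(C, B) @ np.linalg.pinv((C.T @ C) * (B.T @ B))
            B = unfold(X, 1) @ khatri_rao(C, A) @ np.linalg.pinv((C.T @ C) * (A.T @ A))
            C = unfold(X, 2) @ khatri_rao(B, A) @ np.linalg.pinv((B.T @ B) * (A.T @ A))
        return A, B, C

    # Sanity check on an exactly rank-2 tensor.
    rng = np.random.default_rng(1)
    A0, B0, C0 = (rng.standard_normal((n, 2)) for n in (5, 4, 3))
    X = np.einsum("ir,jr,kr->ijk", A0, B0, C0)
    A, B, C = cp_als(X, rank=2)
    Xhat = np.einsum("ir,jr,kr->ijk", A, B, C)
    print(np.linalg.norm(X - Xhat) / np.linalg.norm(X))  # small if ALS converged

The specialized variants listed above (nonnegativity, missing data,
alternative objective functions) typically modify exactly these
per-factor least-squares subproblems.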
