NIPS 2010


Workshop

Transfer Learning Via Rich Generative Models.

Russ Salakhutdinov · Ryan Adams · Josh Tenenbaum · Zoubin Ghahramani · Tom Griffiths

Westin: Emerald A

Intelligent systems must be capable of transferring previously-learned abstract knowledge to new concepts, given only a small or noisy set of examples. This transfer of higher order information to new learning tasks lies at the core of many problems in the fields of computer vision, cognitive science, machine learning, speech perception and natural language processing.

Over the last decade, there has been considerable progress in developing cross-task transfer (e.g., multi-task learning and semi-supervised learning) using both discriminative and generative approaches. However, many existing learning systems cannot cope with new tasks for which they have not been specifically trained. Even when applied to related tasks, trained systems often display unstable behavior. More recently, researchers have begun developing new approaches to building rich generative models that are capable of extracting useful, high-level structured representations from high-dimensional sensory input. The learned representations have been shown to give promising results on a multitude of novel learning tasks, even though these tasks may be unknown when the generative model is being trained. A few notable examples include Deep Belief Networks, Deep Boltzmann Machines, deep nonparametric Bayesian models, as well as Bayesian models inspired by human learning.
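
To make this pattern concrete, here is a minimal illustrative sketch (not the specific deep or nonparametric Bayesian models discussed above), assuming scikit-learn's BernoulliRBM and LogisticRegression and synthetic placeholder data: a generative model is fit to unlabeled input without knowledge of the downstream task, and its learned hidden representation is then reused as features for a new task with only a small labeled set.

# Illustrative sketch: unsupervised generative pre-training with an RBM,
# followed by transfer of the learned representation to a small supervised task.
# The data here are synthetic placeholders, not a real benchmark.
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)

# Large pool of unlabeled binary "sensory" input.
X_unlabeled = (rng.rand(5000, 64) > 0.7).astype(np.float64)

# Small labeled set for a new task, unknown at pre-training time.
X_labeled = (rng.rand(100, 64) > 0.7).astype(np.float64)
y_labeled = rng.randint(0, 2, size=100)

# 1) Learn a generative model of the input distribution without labels.
rbm = BernoulliRBM(n_components=32, learning_rate=0.05, n_iter=20, random_state=0)
rbm.fit(X_unlabeled)

# 2) Transfer: reuse the learned hidden representation as features
#    for the new, data-poor supervised task.
features = rbm.transform(X_labeled)
clf = LogisticRegression(max_iter=1000).fit(features, y_labeled)
print("training accuracy on the new task:", clf.score(features, y_labeled))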

"Learning to learn" new concepts via rich generative models has emerged as one of the most promising areas of research in both machine learning and cognitive science. Although there has been recent progress, existing computational models are still far from being able to represent, identify and learn the wide variety of possible patterns and structure in real-world data. The goal of this workshop is to assess the current state of the field and explore new directions in both theoretical foundations and empirical applications.
