

Poster in Workshop: Information-Theoretic Principles in Cognitive Systems (InfoCog)

Variable Selection in GPDMs Using the Information Bottleneck Method

Jesse St. Amand · Martin Giese

Fri 15 Dec 12:40 p.m. PST — 1:30 p.m. PST

Abstract:

In computer graphics and robotics, there is an increasing need for real-time generative models of human motion. Neural networks are often the favored choice, yet their generalization properties are limited, especially on small data sets. This paper uses the Gaussian process dynamical model (GPDM) as an alternative. Despite their successes in various motion tasks, GPDMs face challenges such as high computational complexity and the need for many hyperparameters. This work addresses these issues by integrating the information bottleneck (IB) framework with GPDMs. The IB approach aims to optimally balance data fit and generalization through measures of mutual information. Our technique uses IB variable selection as a component of GPLVM back-constraints to select features for the latent space, reduce the parameter count, and increase the model's robustness to changes in latent space dimensionality.
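The abstract describes selecting features by balancing relevance against compression via mutual information. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' method: it estimates mutual information with simple histogram binning and keeps the features most informative about a target signal. The function names, binning scheme, and greedy univariate scoring are all assumptions for illustration; the full IB objective additionally penalizes the compression term I(X; T).

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram-based estimate of I(x; y) in nats for two 1-D arrays."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                 # joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)       # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)       # marginal p(y)
    nz = pxy > 0                              # avoid log(0) on empty cells
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def ib_style_selection(X, y, k=2, bins=8):
    """Rank features by estimated relevance I(x_i; y) and keep the top k.

    A greedy, univariate stand-in for IB variable selection: the true IB
    objective also trades this relevance off against compression I(X; T).
    """
    scores = [mutual_information(X[:, j], y, bins) for j in range(X.shape[1])]
    return sorted(np.argsort(scores)[-k:].tolist())

# Toy usage: y depends only on columns 0 and 2, so those should be kept.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 4))
y = X[:, 0] + 2 * X[:, 2] + 0.1 * rng.normal(size=2000)
selected = ib_style_selection(X, y, k=2)
```

In the GPDM setting described above, such a relevance ranking would be applied to observed motion features before constructing the back-constrained latent space, shrinking the number of hyperparameters the model must fit.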
