In many applications, it is desirable to extract only the relevant aspects of data. A principled way to do this is the information bottleneck (IB) method, in which one seeks a code that maximises information about a relevance variable, Y, while constraining the information encoded about the original data, X. Unfortunately, the IB method is computationally demanding when the data are high-dimensional and/or non-Gaussian. Here we propose an approximate variational scheme for maximising a lower bound on the IB objective, analogous to variational EM. Using this method, we derive an IB algorithm to recover features that are both relevant and sparse. Finally, we demonstrate how kernelised versions of the algorithm can be used to address a broad range of problems with a non-linear relation between X and Y.
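For context, the IB objective referred to in the abstract is conventionally written as a trade-off between relevance and compression (following Tishby, Pereira & Bialek, 1999); the sketch below uses that standard notation rather than quoting this paper:

```latex
% Standard information bottleneck Lagrangian (Tishby, Pereira & Bialek, 1999).
% X: data, Y: relevance variable, Z: the learned code; I(.;.) denotes mutual
% information, and the multiplier \beta >= 0 sets the price paid for
% information the code retains about X.
\[
  \max_{p(z \mid x)} \; I(Z;Y) \;-\; \beta \, I(Z;X)
\]
% The variational scheme described in the abstract maximises a tractable
% lower bound on an objective of this form.
```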
Author Information
Matthew Chalk (IST Austria)
Olivier Marre (Institut de la vision)
Gasper Tkacik (Institute of Science and Technology Austria)
More from the Same Authors
- 2020 Poster: A new inference approach for training shallow and deep generalized linear models of noisy interacting neurons »
  Gabriel Mahuas · Giulio Isacchini · Olivier Marre · Ulisse Ferrari · Thierry Mora
- 2020 Spotlight: A new inference approach for training shallow and deep generalized linear models of noisy interacting neurons »
  Gabriel Mahuas · Giulio Isacchini · Olivier Marre · Ulisse Ferrari · Thierry Mora
- 2016 Oral: Relevant sparse codes with variational information bottleneck »
  Matthew Chalk · Olivier Marre · Gasper Tkacik
- 2016 Poster: Estimating Nonlinear Neural Response Functions using GP Priors and Kronecker Methods »
  Cristina Savin · Gasper Tkacik