We introduce Invertible Dense Networks (i-DenseNets), a more parameter-efficient extension of Residual Flows. The method relies on an analysis of the Lipschitz continuity of the concatenation in DenseNets, where we enforce invertibility of the network by bounding its Lipschitz constant. Furthermore, we propose a learnable weighted concatenation, which not only improves model performance but also indicates the relative importance of the concatenated representations. Additionally, we introduce the Concatenated LipSwish as an activation function, for which we show how to enforce the Lipschitz condition and which boosts performance. The new architecture, i-DenseNet, outperforms Residual Flows and other flow-based models on density estimation, evaluated in bits per dimension under an equal parameter budget. Moreover, we show that the proposed model outperforms Residual Flows when trained as a hybrid model, in which the model is simultaneously generative and discriminative.
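The abstract names two architectural ingredients: the Concatenated LipSwish (CLipSwish) activation and a learnable weighted concatenation that keeps each dense block 1-Lipschitz. Below is a minimal PyTorch sketch of both, assuming the Swish/LipSwish definitions from Residual Flows; the class names, the softplus parametrization of β, and the 1.004 rescaling constant are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CLipSwish(nn.Module):
    """Concatenated LipSwish (sketch). Swish(x) = x * sigmoid(beta * x) is
    roughly 1.1-Lipschitz for any beta > 0, so LipSwish = Swish / 1.1 is at
    most 1-Lipschitz. Concatenating [LipSwish(x), LipSwish(-x)] raises the
    joint Lipschitz constant slightly; the extra 1.004 divisor (an assumed
    value) rescales the pair back to at most 1."""

    def __init__(self):
        super().__init__()
        # Unconstrained parameter; softplus keeps beta strictly positive.
        # The Lipschitz bound of Swish is independent of beta > 0.
        self.beta_raw = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        beta = F.softplus(self.beta_raw)
        x = torch.cat([x, -x], dim=1)  # doubles the channel dimension
        return x * torch.sigmoid(beta * x) / (1.1 * 1.004)


class LipschitzDenseLayer(nn.Module):
    """Learnable weighted concatenation (sketch). If g is 1-Lipschitz, the
    map x -> [eta1 * x, eta2 * g(x)] is sqrt(eta1^2 + eta2^2)-Lipschitz, so
    projecting (eta1, eta2) onto the unit circle keeps the concatenation
    1-Lipschitz while learning how much weight to place on the identity
    path versus the transformed features."""

    def __init__(self, g):
        super().__init__()
        self.g = g  # e.g. a spectrally normalized conv followed by CLipSwish
        self.eta1 = nn.Parameter(torch.ones(1))
        self.eta2 = nn.Parameter(torch.ones(1))

    def forward(self, x):
        norm = torch.sqrt(self.eta1 ** 2 + self.eta2 ** 2)
        return torch.cat([x * self.eta1 / norm,
                          self.g(x) * self.eta2 / norm], dim=1)
```

As in Residual Flows, keeping the stacked dense block strictly below 1-Lipschitz (e.g., via spectral normalization of the convolutions with a coefficient below 1) makes the residual map F(x) = x + h(x) invertible by fixed-point iteration, which is what the constraints above are designed to guarantee.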
Author Information
Yura Perugachi-Diaz (Vrije Universiteit Amsterdam)
Jakub Tomczak (Vrije Universiteit Amsterdam)
Sandjai Bhulai (Vrije Universiteit Amsterdam)
More from the Same Authors
- 2021: Semi-supervised Multiple Instance Learning using Variational Auto-Encoders · Ali Nihat Uzunalioglu · Tameem Adel · Jakub M. Tomczak
- 2022: Kendall Shape-VAE: Learning Shapes in a Generative Framework · Sharvaree Vadgama · Jakub Tomczak · Erik Bekkers
- 2022 Spotlight: Alleviating Adversarial Attacks on Variational Autoencoders with MCMC · Anna Kuzina · Max Welling · Jakub Tomczak
- 2022 Poster: Alleviating Adversarial Attacks on Variational Autoencoders with MCMC · Anna Kuzina · Max Welling · Jakub Tomczak
- 2022 Poster: On Analyzing Generative and Denoising Capabilities of Diffusion-based Deep Generative Models · Kamil Deja · Anna Kuzina · Tomasz Trzcinski · Jakub Tomczak
- 2021 Poster: Storchastic: A Framework for General Stochastic Automatic Differentiation · Emile van Krieken · Jakub Tomczak · Annette Ten Teije
- 2020 Poster: The Convolution Exponential and Generalized Sylvester Flows · Emiel Hoogeboom · Victor Garcia Satorras · Jakub Tomczak · Max Welling
- 2019 Poster: Combinatorial Bayesian Optimization using the Graph Cartesian Product · Changyong Oh · Jakub Tomczak · Stratis Gavves · Max Welling