The two key characteristics of a normalizing flow are that it is invertible (in particular, dimension preserving) and that it monitors the amount by which it changes the likelihood of data points as samples are propagated along the network. Recently, multiple generalizations of normalizing flows have been introduced that relax these two conditions \citep{nielsen2020survae,huang2020augmented}. Standard neural networks, on the other hand, only perform a forward pass on the input: there is neither a notion of the inverse of a neural network nor of its likelihood contribution. In this paper we argue that certain neural network architectures can be enriched with a stochastic inverse pass and that their likelihood contribution can be monitored so that they fall under the generalized notion of a normalizing flow mentioned above. We term this enrichment \emph{flowification}. We prove that neural networks containing only linear and convolutional layers and invertible activations, such as LeakyReLU, can be flowified, and we evaluate them in the generative setting on image datasets.
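As a rough illustration of the likelihood bookkeeping this involves, the sketch below (a PyTorch toy, not the paper's flowification construction; it covers only the dimension-preserving case with a square weight matrix, whereas the paper also handles non-square layers via a stochastic inverse) treats a linear layer followed by LeakyReLU as an exact flow step and tracks its log-determinant contribution. All names and shapes are illustrative assumptions.

```python
# Minimal sketch: a square linear layer + LeakyReLU as an exact flow step
# that tracks its log|det Jacobian| contribution to the likelihood.
import math

import torch
import torch.nn as nn


class InvertibleLinearLeakyReLU(nn.Module):
    def __init__(self, dim: int, negative_slope: float = 0.01):
        super().__init__()
        # Initialise near the identity so the layer starts out well-conditioned.
        self.weight = nn.Parameter(torch.eye(dim) + 0.01 * torch.randn(dim, dim))
        self.bias = nn.Parameter(torch.zeros(dim))
        self.negative_slope = negative_slope

    def forward(self, x):
        # Affine part: y = x W^T + b contributes log|det W| per sample.
        y = x @ self.weight.T + self.bias
        logdet = torch.slogdet(self.weight)[1].expand(x.shape[0])
        # LeakyReLU is invertible; its Jacobian is diagonal with entries
        # 1 (where y > 0) or negative_slope (where y <= 0).
        z = torch.where(y > 0, y, self.negative_slope * y)
        logdet = logdet + (y <= 0).sum(dim=1) * math.log(self.negative_slope)
        return z, logdet

    def inverse(self, z):
        # Exact inverse pass: undo the activation, then the affine map.
        y = torch.where(z > 0, z, z / self.negative_slope)
        return (y - self.bias) @ torch.inverse(self.weight).T


if __name__ == "__main__":
    layer = InvertibleLinearLeakyReLU(dim=4)
    x = torch.randn(8, 4)
    z, logdet = layer(x)
    print(torch.allclose(layer.inverse(z), x, atol=1e-5), logdet.shape)
```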
Author Information
Bálint Máté (University of Geneva)
Samuel Klein (University of Geneva, Switzerland)
Tobias Golling (University of Geneva)
François Fleuret (University of Geneva)
François Fleuret received a PhD in Mathematics from INRIA and the University of Paris VI in 2000, and a Habilitation degree in Mathematics from the University of Paris XIII in 2006. He is a Full Professor in the Department of Computer Science at the University of Geneva and an Adjunct Professor in the School of Engineering of the École Polytechnique Fédérale de Lausanne. He has published more than 80 papers in peer-reviewed international conferences and journals. He is an Associate Editor of the IEEE Transactions on Pattern Analysis and Machine Intelligence, serves as an Area Chair for NeurIPS, AAAI, and ICCV, and sits on the program committees of many top-tier international conferences in machine learning and computer vision. He has served as an expert for multiple funding agencies. He is the inventor of several patents in the field of machine learning and a co-founder of Neural Concept SA, a company specializing in the development and commercialization of deep learning solutions for engineering design. His main research interest is machine learning, with a particular focus on computational aspects and sample efficiency.
More from the Same Authors
- 2020 : Exact Preimages of Neural Network Aircraft Collision Avoidance Systems »
  Kyle Matoba · François Fleuret
- 2021 : Test time Adaptation through Perturbation Robustness »
  Prabhu Teja Sivaprasad · François Fleuret
- 2021 : Turbo-Sim: a generalised generative model with a physical latent space »
  Guillaume Quétant · Vitaliy Kinakh · Tobias Golling · Slava Voloshynovskiy
- 2021 : Funnels: Exact maximum likelihood with dimensionality reduction »
  Samuel Klein · John Raine · Tobias Golling · Slava Voloshynovskiy · Sebastion Pina-Otey
- 2021 : Generation of data on discontinuous manifolds via continuous stochastic non-invertible networks »
  Mariia Drozdova · Vitaliy Kinakh · Guillaume Quétant · Tobias Golling · Slava Voloshynovskiy
- 2021 : Information-theoretic stochastic contrastive conditional GAN: InfoSCC-GAN »
  Vitaliy Kinakh · Mariia Drozdova · Guillaume Quétant · Tobias Golling · Slava Voloshynovskiy
- 2022 : Decorrelation with Conditional Normalizing Flows »
  Samuel Klein · Tobias Golling
- 2022 : Deformations of Boltzmann Distributions »
  Bálint Máté · François Fleuret
- 2022 : Diversity through Disagreement for Better Transferability »
  Matteo Pagliardini · Martin Jaggi · François Fleuret · Sai Praneeth Karimireddy
- 2023 Poster: Faster Causal Attention Over Large Sequences Through Sparse Flash Attention »
  Matteo Pagliardini · Daniele Paliotta · Martin Jaggi · François Fleuret
- 2023 Poster: SUPA: A Lightweight Diagnostic Simulator for Machine Learning in Particle Physics »
  Atul Kumar Sinha · Daniele Paliotta · Bálint Máté · John Raine · Tobias Golling · François Fleuret
- 2022 : Transformers are Sample-Efficient World Models »
  Vincent Micheli · Eloi Alonso · François Fleuret
- 2022 Poster: Efficient Training of Low-Curvature Neural Networks »
  Suraj Srinivas · Kyle Matoba · Himabindu Lakkaraju · François Fleuret
- 2020 Poster: Fast Transformers with Clustered Attention »
  Apoorv Vyas · Angelos Katharopoulos · François Fleuret
- 2019 : Morning Coffee Break & Poster Session »
  Eric Metodiev · Keming Zhang · Markus Stoye · Randy Churchill · Soumalya Sarkar · Miles Cranmer · Johann Brehmer · Danilo Jimenez Rezende · Peter Harrington · AkshatKumar Nigam · Nils Thuerey · Lukasz Maziarka · Alvaro Sanchez Gonzalez · Atakan Okan · James Ritchie · N. Benjamin Erichson · Harvey Cheng · Peihong Jiang · Seong Ho Pahng · Samson Koelle · Sami Khairy · Adrian Pol · Rushil Anirudh · Jannis Born · Benjamin Sanchez-Lengeling · Brian Timar · Rhys Goodall · Tamás Kriváchy · Lu Lu · Thomas Adler · Nathaniel Trask · Noëlie Cherrier · Tomohiko Konno · Muhammad Kasim · Tobias Golling · Zaccary Alperstein · Andrei Ustyuzhanin · James Stokes · Anna Golubeva · Ian Char · Ksenia Korovina · Youngwoo Cho · Chanchal Chatterjee · Tom Westerhout · Gorka Muñoz-Gil · Juan Zamudio-Fernandez · Jennifer Wei · Brian Lee · Johannes Kofler · Bruce Power · Nikita Kazeev · Andrey Ustyuzhanin · Artem Maevskiy · Pascal Friederich · Arash Tavakoli · Willie Neiswanger · Bohdan Kulchytskyy · sindhu hari · Paul Leu · Paul Atzberger
- 2019 Poster: Reducing Noise in GAN Training with Variance Reduced Extragradient »
  Tatjana Chavdarova · Gauthier Gidel · François Fleuret · Simon Lacoste-Julien
- 2019 Demonstration: Real Time CFD simulations with 3D Mesh Convolutional Networks »
  Pierre Baque · Pascal Fua · François Fleuret
- 2019 Poster: Full-Gradient Representation for Neural Network Visualization »
  Suraj Srinivas · François Fleuret
- 2018 Poster: Practical Deep Stereo (PDS): Toward applications-friendly deep stereo matching »
  Stepan Tulyakov · Anton Ivanov · François Fleuret
- 2017 Poster: K-Medoids For K-Means Seeding »
  James Newling · François Fleuret
- 2017 Spotlight: K-Medoids For K-Means Seeding »
  James Newling · François Fleuret
- 2016 Poster: Nested Mini-Batch K-Means »
  James Newling · François Fleuret
- 2015 Poster: Kullback-Leibler Proximal Variational Inference »
  Mohammad Emtiyaz Khan · Pierre Baque · François Fleuret · Pascal Fua
- 2014 Demonstration: A 3D Simulator for Evaluating Reinforcement and Imitation Learning Algorithms on Complex Tasks »
  Leonidas Lefakis · François Fleuret · Cijo Jose
- 2013 Poster: Reservoir Boosting : Between Online and Offline Ensemble Learning »
  Leonidas Lefakis · François Fleuret
- 2011 Poster: Boosting with Maximum Adaptive Sampling »
  Charles Dubout · François Fleuret
- 2010 Demonstration: Platform to Share Feature Extraction Methods »
  François Fleuret
- 2010 Poster: Joint Cascade Optimization Using A Product Of Boosted Classifiers »
  Leonidas Lefakis · François Fleuret