Normalizing flows (NFs) have become a prominent method for deep generative modeling, allowing analytic probability density estimation and efficient synthesis. However, flow-based networks are considered inefficient in parameter complexity: the reduced expressiveness of each bijective mapping must be compensated by stacking many transformation stages, which renders the models prohibitively expensive in terms of parameters. We present an alternative parameterization scheme called NanoFlow, which uses a single neural density estimator to model multiple transformation stages. To this end, we propose an efficient parameter decomposition method and the concept of a flow indication embedding, which are the key missing components that enable density estimation from a single neural network. Experiments on audio and image models confirm that our method provides a new parameter-efficient solution for scalable NFs with significantly sublinear parameter complexity.
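To make the parameter-sharing idea concrete, below is a minimal PyTorch sketch (our own illustration, not the authors' implementation): a single coupling-layer estimator is reused across every flow step, and a learned per-step embedding, standing in for the flow indication embedding, tells the shared network which transformation stage it is computing. The layer sizes, the affine-coupling structure, and names such as TinySharedFlow are illustrative assumptions, not details taken from the paper.

```python
# Sketch of sharing one density estimator across K flow steps, distinguished
# by a learned per-step embedding (an assumed stand-in for the paper's
# "flow indication embedding"). Architecture details are illustrative only.
import torch
import torch.nn as nn


class SharedCouplingEstimator(nn.Module):
    """One network reused by every flow step; the step identity is injected
    by adding its embedding to the hidden activations."""

    def __init__(self, dim, hidden, num_steps):
        super().__init__()
        self.inp = nn.Linear(dim // 2, hidden)
        self.step_embed = nn.Embedding(num_steps, hidden)  # flow indication embedding (assumed form)
        self.body = nn.Sequential(nn.ReLU(), nn.Linear(hidden, hidden), nn.ReLU())
        self.out = nn.Linear(hidden, dim)  # predicts shift and log-scale for the other half

    def forward(self, x_half, step):
        h = self.inp(x_half) + self.step_embed(step)
        shift, log_scale = self.out(self.body(h)).chunk(2, dim=-1)
        return shift, torch.tanh(log_scale)


class TinySharedFlow(nn.Module):
    """K affine-coupling steps that all share a single estimator."""

    def __init__(self, dim=8, hidden=64, num_steps=4):
        super().__init__()
        self.num_steps = num_steps
        self.estimator = SharedCouplingEstimator(dim, hidden, num_steps)

    def forward(self, x):
        log_det = x.new_zeros(x.shape[0])
        for k in range(self.num_steps):
            step = torch.full((x.shape[0],), k, dtype=torch.long, device=x.device)
            x1, x2 = x.chunk(2, dim=-1)
            shift, log_scale = self.estimator(x1, step)
            x2 = x2 * torch.exp(log_scale) + shift
            log_det = log_det + log_scale.sum(dim=-1)
            x = torch.cat([x2, x1], dim=-1)  # swap halves between steps
        return x, log_det


if __name__ == "__main__":
    flow = TinySharedFlow()
    z, log_det = flow(torch.randn(16, 8))
    print(z.shape, log_det.shape)  # torch.Size([16, 8]) torch.Size([16])
```

The point of the sketch is that the per-step parameter count collapses to a single estimator plus small embeddings, which is what yields sublinear growth in parameters as the number of flow steps increases.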
Author Information
Sang-gil Lee (Seoul National University)
Sungwon Kim (Seoul National University)
Sungroh Yoon (Seoul National University)
Dr. Sungroh Yoon is an Associate Professor of Electrical and Computer Engineering at Seoul National University, South Korea. He received the B.S. degree from Seoul National University, South Korea, and the M.S. and Ph.D. degrees from Stanford University, CA, all in electrical engineering. He held research positions with Stanford University, CA; Intel Corporation, Santa Clara, CA; and Synopsys, Inc., Mountain View, CA. He was an Assistant Professor with the School of Electrical Engineering, Korea University, from 2007 to 2012. Prof. Yoon is the recipient of the 2013 IEEE/IEIE Joint Award for Young IT Engineers. His research interests include deep learning, machine learning, data-driven artificial intelligence, and large-scale applications including biomedicine.
More from the Same Authors
- 2022: Sample-efficient Adversarial Imitation Learning » Dahuin Jung · Hyungyu Lee · Sungroh Yoon
- 2021 Poster: Reducing Information Bottleneck for Weakly Supervised Semantic Segmentation » Jungbeom Lee · Jooyoung Choi · Jisoo Mok · Sungroh Yoon
- 2020 Poster: Glow-TTS: A Generative Flow for Text-to-Speech via Monotonic Alignment Search » Jaehyeon Kim · Sungwon Kim · Jungil Kong · Sungroh Yoon
- 2020 Oral: Glow-TTS: A Generative Flow for Text-to-Speech via Monotonic Alignment Search » Jaehyeon Kim · Sungwon Kim · Jungil Kong · Sungroh Yoon
- 2017 Poster: Deep Recurrent Neural Network-Based Identification of Precursor microRNAs » Seunghyun Park · Seonwoo Min · Hyun-Soo Choi · Sungroh Yoon
- 2016 Poster: Neural Universal Discrete Denoiser » Taesup Moon · Seonwoo Min · Byunghan Lee · Sungroh Yoon