Poster
iFlow: Numerically Invertible Flows for Efficient Lossless Compression via a Uniform Coder
Shifeng Zhang · Ning Kang · Tom Ryder · Zhenguo Li
It was estimated that the world produced $59\,\mathrm{ZB}$ ($5.9 \times 10^{13}\,\mathrm{GB}$) of data in 2020, resulting in enormous costs for both data storage and transmission. Fortunately, recent advances in deep generative models have spearheaded a new class of so-called "neural compression" algorithms, which significantly outperform traditional codecs in terms of compression ratio. Unfortunately, neural compression has attracted little commercial interest because of its limited coding bandwidth (throughput); therefore, developing highly efficient frameworks is of critical practical importance. In this paper, we address lossless compression using normalizing flows, which have demonstrated a great capacity for achieving high compression ratios. To this end, we introduce iFlow, a new method for efficient lossless compression. We first propose the Modular Scale Transform (MST) and a novel family of numerically invertible flow transformations based on MST. We then introduce the Uniform Base Conversion System (UBCS), a fast uniform-distribution codec incorporated into iFlow, enabling efficient compression. iFlow achieves state-of-the-art compression ratios and is $5\times$ faster than other high-performance schemes. Furthermore, the techniques presented in this paper can be used to accelerate coding for a broad class of flow-based algorithms.
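To make the coding side of the abstract concrete, the sketch below illustrates the general principle of coding uniformly distributed symbols by base (mixed-radix) conversion: a symbol uniform on an alphabet of size $b$ costs $\log_2 b$ bits, and a sequence of such symbols can be packed exactly into a single integer. This is only a minimal, assumed illustration of the idea behind a uniform-distribution codec; the function names, example bases, and naive arbitrary-precision arithmetic here are not the paper's UBCS implementation, which is engineered for throughput.

```python
# Illustrative only: packing uniform symbols via mixed-radix base conversion.
# This is NOT the paper's UBCS; it just shows the underlying principle that a
# sequence of symbols s_i, each uniform on {0, ..., b_i - 1}, can be stored
# losslessly in about sum(log2(b_i)) bits.

from math import log2

def encode_uniform(symbols, bases):
    """Pack each symbol s_i in [0, b_i) into a single integer code."""
    code = 0
    for s, b in zip(symbols, bases):
        assert 0 <= s < b
        code = code * b + s          # "push" the symbol in base b
    return code

def decode_uniform(code, bases):
    """Invert encode_uniform, recovering the symbols exactly."""
    symbols = []
    for b in reversed(bases):
        symbols.append(code % b)     # "pop" the most recently pushed symbol
        code //= b
    return list(reversed(symbols))

if __name__ == "__main__":
    bases = [3, 256, 17, 1000]       # hypothetical per-symbol alphabet sizes
    symbols = [2, 255, 5, 713]
    code = encode_uniform(symbols, bases)
    assert decode_uniform(code, bases) == symbols
    print(f"code uses {code.bit_length()} bits; "
          f"entropy bound is {sum(log2(b) for b in bases):.2f} bits")
```

In a flow-based lossless codec, discretized latents can be reduced to such uniform symbols before coding; per the abstract, UBCS fills the role of the fast uniform-distribution codec, whereas the big-integer arithmetic above would be far too slow at scale.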
Author Information
Shifeng Zhang (Department of Computer Science and Technology, Tsinghua University)
Ning Kang (The University of Hong Kong)
Tom Ryder (Huawei Technologies Ltd.)
Zhenguo Li (Noah's Ark Lab, Huawei Tech Investment Co Ltd)
Related Events (a corresponding poster, oral, or spotlight)
- 2021 Spotlight: iFlow: Numerically Invertible Flows for Efficient Lossless Compression via a Uniform Coder
More from the Same Authors
- 2021: One Million Scenes for Autonomous Driving: ONCE Dataset
  Jiageng Mao · Niu Minzhe · ChenHan Jiang · hanxue liang · Jingheng Chen · Xiaodan Liang · Yamin Li · Chaoqiang Ye · Wei Zhang · Zhenguo Li · Jie Yu · Hang Xu · Chunjing XU
- 2021: SODA10M: A Large-Scale 2D Self/Semi-Supervised Object Detection Dataset for Autonomous Driving
  Jianhua Han · Xiwen Liang · Hang Xu · Kai Chen · Lanqing Hong · Jiageng Mao · Chaoqiang Ye · Wei Zhang · Zhenguo Li · Xiaodan Liang · Chunjing XU
- 2021: How Well Does Self-Supervised Pre-Training Perform with Streaming ImageNet?
  Dapeng Hu · · Qizhengqiu Lu · Lanqing Hong · Hailin Hu · Yifan Zhang · Zhenguo Li · Jiashi Feng
- 2021: Architecture Personalization in Resource-constrained Federated Learning
  Mi Luo · Fei Chen · Zhenguo Li · Jiashi Feng
- 2022 Poster: CAGroup3D: Class-Aware Grouping for 3D Object Detection on Point Clouds
  Haiyang Wang · Lihe Ding · Shaocong Dong · Shaoshuai Shi · Aoxue Li · Jianan Li · Zhenguo Li · Liwei Wang
- 2022 Spotlight: Lightning Talks 2B-3
  Jie-Jing Shao · Jiangmeng Li · Jiashuo Liu · Zongbo Han · Tianyang Hu · Jiayun Wu · Wenwen Qiang · Jun WANG · Zhipeng Liang · Lan-Zhe Guo · Wenjia Wang · Yanan Zhang · Xiao-wen Yang · Fan Yang · Bo Li · Wenyi Mo · Zhenguo Li · Liu Liu · Peng Cui · Yu-Feng Li · Changwen Zheng · Lanqing Li · Yatao Bian · Bing Su · Hui Xiong · Peilin Zhao · Bingzhe Wu · Changqing Zhang · Jianhua Yao
- 2022 Spotlight: Understanding Square Loss in Training Overparametrized Neural Network Classifiers
  Tianyang Hu · Jun WANG · Wenjia Wang · Zhenguo Li
- 2022 Poster: DetCLIP: Dictionary-Enriched Visual-Concept Paralleled Pre-training for Open-world Detection
  Lewei Yao · Jianhua Han · Youpeng Wen · Xiaodan Liang · Dan Xu · Wei Zhang · Zhenguo Li · Chunjing XU · Hang Xu
- 2022 Poster: ZooD: Exploiting Model Zoo for Out-of-Distribution Generalization
  Qishi Dong · Awais Muhammad · Fengwei Zhou · Chuanlong Xie · Tianyang Hu · Yongxin Yang · Sung-Ho Bae · Zhenguo Li
- 2022 Poster: Understanding Square Loss in Training Overparametrized Neural Network Classifiers
  Tianyang Hu · Jun WANG · Wenjia Wang · Zhenguo Li
- 2021: Layer-Parallel Training of Residual Networks with Auxiliary Variables
  Qi Sun · Hexin Dong · Zewei Chen · WeiZhen Dian · Jiacheng Sun · Yitong Sun · Zhenguo Li · Bin Dong
- 2021: Contributed Talk 3: Architecture Personalization in Resource-constrained Federated Learning
  Mi Luo · Fei Chen · Zhenguo Li · Jiashi Feng
- 2021 Poster: On Effective Scheduling of Model-based Reinforcement Learning
  Hang Lai · Jian Shen · Weinan Zhang · Yimin Huang · Xing Zhang · Ruiming Tang · Yong Yu · Zhenguo Li
- 2021 Poster: OSOA: One-Shot Online Adaptation of Deep Generative Models for Lossless Compression
  Chen Zhang · Shifeng Zhang · Fabio Maria Carlucci · Zhenguo Li
- 2021 Poster: MixACM: Mixup-Based Robustness Transfer via Distillation of Activated Channel Maps
  Awais Muhammad · Fengwei Zhou · Chuanlong Xie · Jiawei Li · Sung-Ho Bae · Zhenguo Li
- 2021 Poster: Towards a Theoretical Framework of Out-of-Distribution Generalization
  Haotian Ye · Chuanlong Xie · Tianle Cai · Ruichen Li · Zhenguo Li · Liwei Wang
- 2020 Poster: Bridging the Gap between Sample-based and One-shot Neural Architecture Search with BONAS
  Han Shi · Renjie Pi · Hang Xu · Zhenguo Li · James Kwok · Tong Zhang
- 2020 Poster: Locally Differentially Private (Contextual) Bandits Learning
  Kai Zheng · Tianle Cai · Weiran Huang · Zhenguo Li · Liwei Wang
- 2020 Poster: Understanding and Exploring the Network with Stochastic Architectures
  Zhijie Deng · Yinpeng Dong · Shifeng Zhang · Jun Zhu