In most existing deep convolutional neural networks (CNNs) for classification, global average (first-order) pooling (GAP) has become a standard module for summarizing the activations of the last convolution layer into a final representation for prediction. Recent research shows that integrating higher-order pooling (HOP) methods clearly improves the performance of deep CNNs. However, both GAP and existing HOP methods assume unimodal distributions, which cannot fully capture the statistics of convolutional activations, limiting the representation ability of deep CNNs, especially for samples with complex contents. To overcome this limitation, this paper proposes a global Gated Mixture of Second-order Pooling (GM-SOP) method to further improve the representation ability of deep CNNs. To this end, we introduce a sparsity-constrained gating mechanism and propose a novel parametric SOP as the component of the mixture model. Given a bank of SOP candidates, our method adaptively chooses the Top-K (K > 1) candidates for each input sample through the sparsity-constrained gating module, and performs a weighted sum over the outputs of the K selected candidates as the representation of the sample. The proposed GM-SOP can flexibly accommodate a large number of personalized SOP candidates in an efficient way, leading to richer representations. Deep networks with our GM-SOP can be trained end-to-end and have the potential to characterize complex, multi-modal distributions. The proposed method is evaluated on two large-scale image benchmarks (i.e., downsampled ImageNet-1K and Places365), and experimental results show that our GM-SOP is superior to its counterparts and achieves very competitive performance. The source code will be available at http://www.peihuali.org/GM-SOP.
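The abstract's core mechanism (a gating module scores a bank of SOP candidates, keeps the Top-K, and returns a softmax-weighted sum of their second-order outputs) can be sketched in a few lines of NumPy. This is a simplified illustration under assumed shapes, not the authors' implementation: the projection matrices, the gating weights, and the function names here (`second_order_pooling`, `gm_sop`) are hypothetical, and the paper's parametric SOP and sparsity-constrained gate are richer than this toy version.

```python
import numpy as np

def second_order_pooling(X, W):
    """Toy parametric SOP: project activations with W, then take covariance.

    X: (C, N) activations (C channels, N spatial positions).
    W: (d, C) candidate-specific projection.  Returns a (d, d) matrix.
    """
    Z = W @ X                               # (d, N) projected activations
    Z = Z - Z.mean(axis=1, keepdims=True)   # center per projected channel
    return (Z @ Z.T) / Z.shape[1]           # (d, d) covariance

def gm_sop(X, sop_weights, gate_W, k=2):
    """Gated mixture of SOP: score M candidates, keep Top-K, weighted-sum."""
    gap = X.mean(axis=1)                    # (C,) global average pooling
    scores = gate_W @ gap                   # (M,) one gate score per candidate
    topk = np.argsort(scores)[-k:]          # indices of the Top-K candidates
    w = np.exp(scores[topk] - scores[topk].max())
    w = w / w.sum()                         # softmax over the selected scores
    return sum(wi * second_order_pooling(X, sop_weights[i])
               for wi, i in zip(w, topk))   # (d, d) mixture representation

rng = np.random.default_rng(0)
C, N, d, M = 8, 49, 4, 5                    # channels, positions, proj dim, bank size
X = rng.standard_normal((C, N))             # e.g. a 7x7 feature map, flattened
sop_weights = rng.standard_normal((M, d, C))
gate_W = rng.standard_normal((M, C))
out = gm_sop(X, sop_weights, gate_W, k=2)
print(out.shape)                            # (4, 4)
```

Because each candidate's output is a covariance matrix and the gate weights are non-negative, the mixture output stays symmetric; in the real network the gate is trained end-to-end and the hard Top-K selection keeps only K of the M candidates active per sample, which is what makes a large candidate bank affordable.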
Author Information
Qilong Wang (Tianjin University)
Zilin Gao (Dalian University of Technology)
Jiangtao Xie (Dalian University of Technology)
Wangmeng Zuo (Harbin Institute of Technology)
Peihua Li (Dalian University of Technology)
More from the Same Authors
- 2022 Poster: DropCov: A Simple yet Effective Method for Improving Deep Architectures »
  Qilong Wang · Mingze Gao · Zhaolin Zhang · Jiangtao Xie · Peihua Li · Qinghua Hu
- 2022 Spotlight: DropCov: A Simple yet Effective Method for Improving Deep Architectures »
  Qilong Wang · Mingze Gao · Zhaolin Zhang · Jiangtao Xie · Peihua Li · Qinghua Hu
- 2022 Spotlight: Lightning Talks 6A-1 »
  Ziyi Wang · Nian Liu · Yaming Yang · Qilong Wang · Yuanxin Liu · Zongxin Yang · Yizhao Gao · Yanchen Deng · Dongze Lian · Nanyi Fei · Ziyu Guan · Xiao Wang · Shufeng Kong · Xumin Yu · Daquan Zhou · Yi Yang · Fandong Meng · Mingze Gao · Caihua Liu · Yongming Rao · Zheng Lin · Haoyu Lu · Zhe Wang · Jiashi Feng · Zhaolin Zhang · Deyu Bo · Xinchao Wang · Chuan Shi · Jiangnan Li · Jiangtao Xie · Jie Zhou · Zhiwu Lu · Wei Zhao · Bo An · Jiwen Lu · Peihua Li · Jian Pei · Hao Jiang · Cai Xu · Peng Fu · Qinghua Hu · Yijie Li · Weigang Lu · Yanan Cao · Jianbin Huang · Weiping Wang · Zhao Cao · Jie Zhou
- 2021 Poster: Temporal-attentive Covariance Pooling Networks for Video Recognition »
  Zilin Gao · Qilong Wang · Bingbing Zhang · Qinghua Hu · Peihua Li
- 2020 Poster: Cross-Scale Internal Graph Neural Network for Image Super-Resolution »
  Shangchen Zhou · Jiawei Zhang · Wangmeng Zuo · Chen Change Loy