
Temporal Effective Batch Normalization in Spiking Neural Networks
Chaoteng Duan · Jianhao Ding · Shiyan Chen · Zhaofei Yu · Tiejun Huang


Spiking Neural Networks (SNNs) are promising for neuromorphic hardware owing to their use of spatio-temporal information and sparse, event-driven signal processing. However, training SNNs is challenging because the binary firing function is non-differentiable. Surrogate gradients alleviate this problem and enable SNNs to achieve performance comparable to that of Artificial Neural Networks (ANNs) with the same structure. Unfortunately, batch normalization, a key contributor to the success of ANNs, does not play a prominent role in SNNs because of the additional temporal dimension. To this end, we propose an effective normalization method called temporal effective batch normalization (TEBN). By rescaling the presynaptic inputs with different weights at every time-step, the temporal distributions become smoother and more uniform. Theoretical analysis shows that TEBN can be viewed as a smoother of the SNN's optimization landscape and helps stabilize the gradient norm. Experimental results on both static and neuromorphic datasets show that SNNs with TEBN surpass state-of-the-art accuracy with fewer time-steps, and achieve better robustness to hyper-parameters than other normalization methods.
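The core operation the abstract describes, normalizing presynaptic inputs and then rescaling each time-step with its own weight, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the input layout (time, batch, channel), the pooled batch-and-time statistics, and the per-time-step weight vector `p` are assumptions made here for clarity rather than the paper's exact notation.

```python
import numpy as np

def tebn(x, gamma, beta, p, eps=1e-5):
    """Sketch of temporal effective batch normalization (TEBN).

    x:     presynaptic inputs of shape (T, B, C) -- time, batch, channels.
    gamma: per-channel scale, shape (C,).
    beta:  per-channel shift, shape (C,).
    p:     per-time-step rescaling weights, shape (T,) -- the TEBN-specific
           parameter that gives each time-step its own effective scale.
    """
    # Normalize per channel, with statistics pooled over batch and time.
    mean = x.mean(axis=(0, 1), keepdims=True)
    var = x.var(axis=(0, 1), keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Rescale each time-step by its own weight p[t], then apply the
    # usual channel-wise affine transform.
    return p[:, None, None] * x_hat * gamma + beta
```

With `p` fixed to all ones this reduces to ordinary batch normalization applied jointly over the temporal and batch dimensions; learning `p` lets the network reshape the input distribution independently at every time-step.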

Author Information

Chaoteng Duan (Peking University)
Jianhao Ding (Peking University)
Shiyan Chen (Peking University)
Zhaofei Yu (Peking University)
Tiejun Huang (Peking University)
