Training Spiking Neural Networks with Local Tandem Learning
Qu Yang · Jibin Wu · Malu Zhang · Yansong Chua · Xinchao Wang · Haizhou Li

Thu Dec 01 02:00 PM -- 04:00 PM (PST) @ Hall J #125

Spiking neural networks (SNNs) are shown to be more biologically plausible and energy efficient than their predecessors. However, there is a lack of an efficient and generalized training method for deep SNNs, especially for deployment on analog computing substrates. In this paper, we put forward a generalized learning rule, termed Local Tandem Learning (LTL). The LTL rule follows the teacher-student learning approach by mimicking the intermediate feature representations of a pre-trained ANN. By decoupling the learning of network layers and leveraging highly informative supervisor signals, we demonstrate rapid network convergence within five training epochs on the CIFAR-10 dataset with low computational complexity. Our experimental results also show that the SNNs thus trained can achieve accuracies comparable to their teacher ANNs on the CIFAR-10, CIFAR-100, and Tiny ImageNet datasets. Moreover, the proposed LTL rule is hardware friendly. It can be easily implemented on-chip to perform fast parameter calibration and provide robustness against the notorious device non-ideality issues. It therefore opens up a myriad of opportunities for training and deployment of SNNs on ultra-low-power mixed-signal neuromorphic computing chips.
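The core idea in the abstract, training each SNN layer locally to mimic the corresponding pre-trained ANN layer's activations rather than backpropagating a global loss, can be illustrated with a toy sketch. The code below is an illustrative assumption, not the paper's implementation: it uses a single NumPy integrate-and-fire layer whose firing rate is fitted to a clipped ReLU teacher activation via a local MSE loss and a simple rate-based surrogate gradient (the paper's actual LTL rule, network depth, and surrogate details differ).

```python
import numpy as np

rng = np.random.default_rng(0)

def if_layer_rate(x, w, T=20, thresh=1.0):
    """Run an integrate-and-fire layer for T steps with constant input
    current x @ w and return the firing rate (spike count / T)."""
    i = x @ w                         # constant input current per step
    v = np.zeros(i.shape)             # membrane potential
    spikes = np.zeros(i.shape)
    for _ in range(T):
        v += i
        s = (v >= thresh).astype(float)
        spikes += s
        v -= s * thresh               # soft reset by subtraction
    return spikes / T                 # ~ clip(i, 0, 1), quantized in 1/T

# Teacher: a fixed linear + ReLU layer whose hidden activations the
# student layer should reproduce (stand-in for one pre-trained ANN layer).
x = rng.random((64, 8))
w_teacher = rng.normal(size=(8, 4)) * 0.5
target = np.clip(np.maximum(x @ w_teacher, 0.0), 0.0, 1.0)  # rates are in [0, 1]

# Student: spiking layer trained with a LOCAL loss only -- no gradient
# flows through any other layer, mirroring the layer-decoupled setup.
w_student = rng.normal(size=(8, 4)) * 0.1
lr = 0.5
initial_mse = np.mean((if_layer_rate(x, w_student) - target) ** 2)
for _ in range(200):
    r = if_layer_rate(x, w_student)
    err = r - target
    # Rate-based surrogate: treat rate ~ clip(x @ w, 0, 1), so the
    # gradient passes only where the clip is not saturated.
    pre = x @ w_student
    active = (pre > 0.0) & (pre < 1.0)
    grad = x.T @ (err * active) / len(x)
    w_student -= lr * grad

final_mse = np.mean((if_layer_rate(x, w_student) - target) ** 2)
```

Because the loss is purely local, each layer of a deeper network could be calibrated in this fashion independently, which is what makes the scheme attractive for on-chip parameter calibration on analog substrates.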

Author Information

Qu Yang (National University of Singapore)
Jibin Wu (National University of Singapore)
Malu Zhang (National University of Singapore)
Yansong Chua (Huawei Technologies Co., Ltd)
Xinchao Wang
Haizhou Li (The Chinese University of Hong Kong (Shenzhen); National University of Singapore)