Temporal Spike Sequence Learning via Backpropagation for Deep Spiking Neural Networks
Wenrui Zhang, Peng Li
Spotlight presentation: Orals & Spotlights Track 28: Deep Learning
on 2020-12-10T08:00:00-08:00 - 2020-12-10T08:10:00-08:00
Abstract: Spiking neural networks (SNNs) are well suited for spatio-temporal learning and for implementation on energy-efficient, event-driven neuromorphic processors. However, existing SNN error backpropagation (BP) methods lack proper handling of spiking discontinuities and underperform BP methods for traditional artificial neural networks. In addition, a large number of time steps is typically required to achieve decent performance, leading to high latency and rendering spike-based computation unscalable to deep architectures. We present a novel Temporal Spike Sequence Learning Backpropagation (TSSL-BP) method for training deep SNNs, which decomposes error backpropagation into two types of dependencies, inter-neuron and intra-neuron, and thereby improves temporal learning precision. It captures inter-neuron dependencies through presynaptic firing times, accounting for the all-or-none character of firing activity, and captures intra-neuron dependencies by handling the internal evolution of each neuronal state over time. TSSL-BP efficiently trains deep SNNs within a much shortened temporal window of a few steps while improving accuracy on various image classification datasets including CIFAR10.
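The two dependency types the abstract distinguishes map onto the forward dynamics of a leaky integrate-and-fire (LIF) layer: presynaptic spikes couple neurons across the layer boundary (inter-neuron), while each membrane potential evolves and resets over time (intra-neuron). The following is a minimal illustrative sketch of those dynamics only, not the paper's TSSL-BP gradient computation; the function name, decay constant, and threshold are assumptions, not values from the paper.

```python
import numpy as np

def lif_forward(in_spikes, w, v_th=1.0, decay=0.5):
    """Simulate one layer of LIF neurons over T discrete time steps.

    in_spikes: (T, n_in) binary presynaptic spike trains
    w:         (n_out, n_in) synaptic weight matrix
    Returns a (T, n_out) binary array of output spike trains.
    """
    T, _ = in_spikes.shape
    n_out = w.shape[0]
    v = np.zeros(n_out)                      # membrane potentials
    out = np.zeros((T, n_out))
    for t in range(T):
        # Inter-neuron dependency: synaptic current driven by
        # presynaptic firing at this time step.
        i_syn = w @ in_spikes[t]
        # Intra-neuron dependency: leaky integration of the
        # membrane potential across time steps.
        v = decay * v + i_syn
        fired = (v >= v_th).astype(float)    # all-or-none firing
        out[t] = fired
        v = v * (1.0 - fired)                # reset neurons that fired
    return out

rng = np.random.default_rng(0)
spikes_in = (rng.random((5, 3)) < 0.5).astype(float)
weights = rng.standard_normal((2, 3))
spikes_out = lif_forward(spikes_in, weights)
```

In the backward pass, TSSL-BP differentiates these two paths separately: the all-or-none `fired` step is handled via presynaptic firing times rather than a smoothed derivative, while the recurrence on `v` carries the intra-neuron temporal dependency.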