

Poster

Theoretically Provable Spiking Neural Networks

Shao-Qun Zhang · Zhi-Hua Zhou

Hall J (level 1) #924

Keywords: [ Computational Efficiency ] [ Self Connection ] [ Continuous Dynamical Systems ] [ Approximation Power ] [ Spiking Neural Networks ]


Abstract:

Spiking neural networks have attracted increasing attention in recent years due to their potential for handling time-dependent data. Many algorithms and techniques have been developed; however, the theoretical understanding of many aspects of spiking neural networks is far from clear. A recent work [Zhang and Zhou, 2021] showed that typical spiking neural networks can hardly work on spatio-temporal data due to their bifurcation dynamics, and suggested that a self-connection structure must be added. In this paper, we theoretically investigate the approximation ability and computational efficiency of spiking neural networks with self-connections, and show that the self-connection structure enables spiking neural networks to approximate discrete dynamical systems using a polynomial number of parameters within polynomial time complexity. Our theoretical results may provide insight for future studies of spiking neural networks.
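To give a concrete picture of the self-connection structure mentioned in the abstract, the following minimal sketch simulates a single discrete-time leaky integrate-and-fire neuron whose own previous spike is fed back through a self-connection weight. The function name, update rule, and parameter values are illustrative assumptions for exposition; they are not the formulation analyzed in the paper.

```python
import numpy as np

def simulate_self_connected_neuron(inputs, w_in, w_self,
                                   threshold=1.0, leak=0.9, v_reset=0.0):
    """Simulate one spiking neuron with a self-connection (illustrative sketch).

    inputs : array of shape (T,), external input at each time step
    w_in   : weight on the external input
    w_self : weight of the self-connection, feeding the neuron's own spike
             at the previous step back into its membrane potential
    Returns a 0/1 spike train of length T.
    """
    T = len(inputs)
    v = 0.0              # membrane potential
    last_spike = 0.0     # neuron's own output at the previous step
    spikes = np.zeros(T)
    for t in range(T):
        # Leaky integration of the external input plus the self-connection term.
        v = leak * v + w_in * inputs[t] + w_self * last_spike
        if v >= threshold:
            spikes[t] = 1.0
            v = v_reset  # reset after firing
        last_spike = spikes[t]
    return spikes

# Example usage: constant drive; the self-connection term alters the firing pattern.
spike_train = simulate_self_connected_neuron(np.full(50, 0.3), w_in=1.0, w_self=0.5)
print(spike_train)
```

Setting w_self to zero recovers an ordinary leaky integrate-and-fire neuron, which makes the role of the added self-connection easy to isolate in simulation.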
