AUTHOR=Liu Fangxin , Zhao Wenbo , Chen Yongbiao , Wang Zongwu , Yang Tao , Jiang Li TITLE=SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training JOURNAL=Frontiers in Neuroscience VOLUME=15 YEAR=2021 URL=https://www.frontiersin.org/journals/neuroscience/articles/10.3389/fnins.2021.756876 DOI=10.3389/fnins.2021.756876 ISSN=1662-453X ABSTRACT=Spiking Neural Networks (SNNs) empower low-power, event-driven neuromorphic hardware thanks to their spatio-temporal information processing capability and high biological plausibility. Currently, although SNNs are more efficient than artificial neural networks (ANNs), they are not as accurate. Error backpropagation is the most common method for directly training neural networks and has driven the success of ANNs across deep learning. However, since the signals transmitted in an SNN are non-differentiable discrete binary spike events, the spike-based activation function makes it difficult to apply gradient-based optimization directly to SNNs, leading to a performance (i.e., accuracy and latency) gap between SNNs and ANNs. In this paper, we introduce a new learning algorithm, called SSTDP, which bridges backpropagation (BP)-based learning and spike-time-dependent plasticity (STDP)-based learning to train SNNs efficiently. The scheme incorporates the global optimization process from BP and the efficient weight update derived from STDP. It not only avoids non-differentiable derivatives in the BP process, but also exploits the local feature extraction property of STDP. Consequently, our method lowers the likelihood of vanishing spikes during BP training and reduces the number of time steps, thereby reducing network latency.
Our experiments demonstrate the effectiveness of the proposed SSTDP learning algorithm, achieving the best classification accuracy among SNNs trained with other learning methods: $99.3\%$ on the Caltech 101 dataset, $98.1\%$ on the MNIST dataset, and $91.3\%$ on the CIFAR-10 dataset. It also surpasses the best inference accuracy of directly trained SNNs with $25\sim32\times$ lower inference latency. Moreover, we analyze event-based computations to demonstrate the efficacy of the SNN for inference in the spiking domain: the SSTDP method achieves $1.3\sim37.7\times$ fewer addition operations per inference. Code will be open-sourced in the final version.
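The core idea described in the abstract, combining a global BP-derived temporal error at the output layer with a local STDP-style timing update, can be illustrated with a minimal sketch. All names, the time constant, the toy spike times, and the loss are assumptions for illustration only, not the paper's implementation:

```python
import numpy as np

# Hedged sketch of a hybrid BP+STDP weight update in the spirit of SSTDP.
# Assumed values: tau, lr, and all spike times are illustrative, not from the paper.
tau = 5.0    # STDP time constant (ms), assumed
lr = 0.01    # learning rate, assumed

rng = np.random.default_rng(0)
n_pre, n_post = 4, 2
W = rng.uniform(0.2, 0.8, size=(n_pre, n_post))   # synaptic weights

t_pre = rng.uniform(0.0, 10.0, size=n_pre)        # presynaptic spike times (ms)
t_post = np.array([6.0, 9.0])                     # observed output spike times (ms)
t_target = np.array([5.0, 8.0])                   # desired spike times (ms)

# Global component (from BP): temporal error at the output layer.
# Positive error means the neuron fired later than desired.
err = t_post - t_target

# Local component (from STDP): exponential timing trace for causal
# (pre-before-post) spike pairs only; acausal pairs contribute nothing.
dt = t_post[None, :] - t_pre[:, None]             # shape (n_pre, n_post)
trace = np.where(dt > 0, np.exp(-dt / tau), 0.0)

# Hybrid update: a larger weight makes the postsynaptic neuron fire earlier,
# so a late spike (err > 0) potentiates its causal inputs.
W += lr * err[None, :] * trace
```

The key point of the sketch is that the error signal is computed globally (as in BP) but the per-synapse update is gated by a local, exponentially decaying spike-timing trace (as in STDP), so no derivative of the discrete spike function is ever needed.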