AUTHOR=Liao Wangdan, Chen Fei, Liu Changyue, Wang Weidong, Liu Hongyun
TITLE=SpikeAtConv: an integrated spiking-convolutional attention architecture for energy-efficient neuromorphic vision processing
JOURNAL=Frontiers in Neuroscience
VOLUME=19
YEAR=2025
URL=https://www.frontiersin.org/journals/neuroscience/articles/10.3389/fnins.2025.1536771
DOI=10.3389/fnins.2025.1536771
ISSN=1662-453X
ABSTRACT=Introduction: Spiking Neural Networks (SNNs) offer a biologically inspired alternative to conventional artificial neural networks, with potential advantages in power efficiency due to their event-driven computation. Despite their promise, SNNs have yet to achieve competitive performance on complex visual tasks such as image classification. Methods: This study introduces a novel SNN architecture, SpikeAtConv, designed to improve computational efficiency and task accuracy. The architecture features optimized spiking modules that facilitate the processing of spatio-temporal patterns in visual data, aiming to reconcile the computational demands of high-level vision tasks with the energy-efficient processing of SNNs. Results: Extensive experiments show that the proposed SpikeAtConv architecture outperforms or matches state-of-the-art SNNs on the evaluated datasets. Notably, the directly trained Large SpikeAtConv achieves a top-1 accuracy of 81.23% on ImageNet-1K, a state-of-the-art result among SNNs. Discussion: Evaluations on standard image classification benchmarks indicate that the proposed architecture narrows the performance gap with traditional neural networks, providing insights into the design of more efficient and capable neuromorphic computing systems.