REVIEW article
Front. Neurosci.
Sec. Neuromorphic Engineering
Volume 19 - 2025 | doi: 10.3389/fnins.2025.1623497
This article is part of the Research Topic: Brain-Inspired Computing under the Era of Large Model and Generative AI: Theory, Algorithms, and Applications
Review of Deep Learning Models with Spiking Neural Networks for Modelling and Analysis of Multimodal Neuroimaging Data
Provisionally accepted
1 The University of Auckland, Auckland, New Zealand
2 Auckland University of Technology, Auckland, New Zealand
Medical imaging has become an essential tool for identifying and treating neurological conditions. Traditional deep learning (DL) models have made tremendous advances in neuroimaging analysis; however, they struggle to model complex spatiotemporal brain data. Spiking Neural Networks (SNNs), inspired by biological neurons, offer a promising alternative for efficiently processing spatiotemporal data. This review discusses recent advances in applying SNNs to multimodal neuroimaging analysis. Quantitative and thematic analyses were conducted on 21 selected publications to assess trends, research topics, and geographical contributions. The results show that SNNs outperform traditional DL approaches in classification, feature extraction, and prediction tasks, especially when combining multiple modalities. Despite this potential, challenges remain in multimodal data fusion, computational demands, and the scarcity of large-scale datasets. We discuss the growth of SNNs for the analysis, prediction, and diagnosis of neurological data, and highlight future directions and improvements toward more efficient and clinically applicable models.
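To make the spiking-neuron idea concrete for readers unfamiliar with SNNs, the following is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the model most commonly used in the networks this review covers. The function name, parameters, and values here are illustrative assumptions for exposition, not taken from any of the reviewed works: a membrane potential integrates input current over time, emits a binary spike when it crosses a threshold, and then resets — which is why SNNs naturally encode spatiotemporal data as spike trains.

```python
import numpy as np

def lif_simulate(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Simulate a single leaky integrate-and-fire (LIF) neuron.

    The membrane potential decays with time constant `tau` while
    integrating the input current; a spike (1) is emitted whenever the
    potential crosses `v_thresh`, after which it resets to `v_reset`.
    All parameter values are illustrative.
    """
    v = v_reset
    spikes = []
    for i_t in input_current:
        # Euler step of the leaky membrane equation dv/dt = (-v + I) / tau
        v += dt * (-v + i_t) / tau
        if v >= v_thresh:
            spikes.append(1)   # threshold crossed: emit a spike
            v = v_reset        # reset the membrane potential
        else:
            spikes.append(0)
    return np.array(spikes)

# A constant supra-threshold input yields a regular spike train whose
# rate encodes the input intensity.
spike_train = lif_simulate(np.full(100, 1.5))
```

In a full SNN, many such units are connected by weighted synapses, and the resulting spike trains — rather than continuous activations — carry information through the network, which is what makes these models well suited to time-resolved neuroimaging signals such as EEG.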
Keywords: neuroimaging, multimodalities, deep learning, machine learning, spiking neurons, spiking neural networks
Received: 06 May 2025; Accepted: 22 Oct 2025.
Copyright: © 2025 Khan, Shim, Fernandez, Kasabov and Wang. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence: Alan Wang, alan.wang@auckland.ac.nz
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.