AUTHOR=Abuhajar Nidal, Wang Zhewei, Baltes Marc, Yue Ye, Xu Li, Karanth Avinash, Smith Charles D., Liu Jundong TITLE=Three-stage hybrid spiking neural networks fine-tuning for speech enhancement JOURNAL=Frontiers in Neuroscience VOLUME=19 YEAR=2025 URL=https://www.frontiersin.org/journals/neuroscience/articles/10.3389/fnins.2025.1567347 DOI=10.3389/fnins.2025.1567347 ISSN=1662-453X ABSTRACT=Introduction: In the past decade, artificial neural networks (ANNs) have revolutionized many AI-related fields, including speech enhancement (SE). However, achieving high performance with ANNs often requires substantial power and memory resources. Recently, spiking neural networks (SNNs) have emerged as a promising low-power alternative to ANNs, leveraging their inherent sparsity to enable efficient computation while maintaining performance. Method: While SNNs offer improved energy efficiency, they are generally more challenging to train than ANNs. In this study, we propose a three-stage hybrid ANN-to-SNN fine-tuning scheme and apply it to Wave-U-Net and ConvTasNet, two major network solutions for speech enhancement. Our framework first trains the ANN models, then converts them into their corresponding spiking versions. The converted SNNs are subsequently fine-tuned with a hybrid training scheme in which the forward pass uses spiking signals and the backward pass uses ANN signals to enable backpropagation. To maintain the performance of the original ANN models, various modifications to the original network architectures were made. Our SNN models operate entirely in the temporal domain, eliminating the need to convert wave signals into the spectral domain for input and back to the waveform for output.
Moreover, our models use spiking neurons exclusively, setting them apart from many models that incorporate regular ANN neurons in their architectures. Results and discussion: Experiments on the noisy VCTK and TIMIT datasets demonstrate the effectiveness of the hybrid training: the fine-tuned SNNs show significant improvement and robustness over the baseline models.
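The abstract's hybrid forward/backward scheme (spiking signals forward, ANN-style gradients backward) can be illustrated with a minimal surrogate-gradient sketch. This is an assumption-laden illustration, not the paper's implementation: the threshold, the fast-sigmoid surrogate, and the `alpha` sharpness parameter are all hypothetical choices introduced here for clarity.

```python
import numpy as np

def spike_forward(v, threshold=0.5):
    """Forward pass: a hard threshold emits binary spikes.
    The step function is non-differentiable, which is what makes
    direct SNN training difficult."""
    return (v >= threshold).astype(np.float32)

def spike_backward(v, grad_out, threshold=0.5, alpha=2.0):
    """Backward pass: replace the step's zero-almost-everywhere
    derivative with a smooth surrogate (here, the derivative of a
    fast sigmoid centered at the threshold), so gradients flow as
    they would through an ANN-style activation."""
    surrogate = alpha / (2.0 * (1.0 + alpha * np.abs(v - threshold)) ** 2)
    return grad_out * surrogate

# Toy membrane potentials for four neurons at one time step
v = np.array([0.1, 0.6, 0.45, 0.9], dtype=np.float32)
spikes = spike_forward(v)                    # binary spike outputs
grads = spike_backward(v, np.ones_like(v))   # nonzero gradients everywhere
```

In a full fine-tuning loop, `spike_forward` would drive the network's temporal dynamics while `spike_backward` supplies the dense gradient signal during backpropagation, matching the hybrid scheme the abstract describes.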