AUTHOR=Otarbay Zhenis, Kyzyrkanov Abzal
TITLE=SVM-enhanced attention mechanisms for motor imagery EEG classification in brain-computer interfaces
JOURNAL=Frontiers in Neuroscience
VOLUME=19
YEAR=2025
URL=https://www.frontiersin.org/journals/neuroscience/articles/10.3389/fnins.2025.1622847
DOI=10.3389/fnins.2025.1622847
ISSN=1662-453X
ABSTRACT=Brain-Computer Interfaces (BCIs) leverage brain signals to facilitate communication and control, particularly benefiting individuals with motor impairments. Motor imagery (MI)-based BCIs, utilizing non-invasive electroencephalography (EEG), face challenges due to high signal variability, noise, and class overlap. Deep learning architectures, such as CNNs and LSTMs, have improved EEG classification but still struggle to fully capture discriminative features for overlapping motor imagery classes. This study introduces a hybrid deep neural architecture that integrates Convolutional Neural Networks, Long Short-Term Memory networks, and a novel SVM-enhanced attention mechanism. The proposed method embeds the margin-maximization objective of Support Vector Machines directly into the self-attention computation to improve interclass separability during feature learning. We evaluate our model on four benchmark datasets: Physionet, Weibo, and BCI Competition IV 2a and 2b, using a Leave-One-Subject-Out (LOSO) protocol to ensure robustness and generalizability. Results demonstrate consistent improvements in classification accuracy, F1-score, and sensitivity compared with conventional attention mechanisms and baseline CNN-LSTM models. Additionally, the model significantly reduces computational cost, supporting real-time BCI applications. Our findings highlight the potential of SVM-enhanced attention to improve EEG decoding performance by enforcing feature relevance and geometric class separability simultaneously.
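
NOTE: The abstract names the key architectural idea (an SVM margin-maximization objective embedded in self-attention over CNN-LSTM features) but gives no implementation details. The Python/PyTorch sketch below is one plausible, illustrative reading and is not the authors' published code: the class name SVMEnhancedAttention, the parameters margin and c_reg, the mean-pooled trial feature, and the use of a multiclass hinge loss on a linear head attached to the attended features are all assumptions.

# Illustrative sketch only (assumed design, not the paper's code): scaled
# dot-product self-attention whose attended features also feed a linear
# max-margin (SVM-style) head trained with a multiclass hinge loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SVMEnhancedAttention(nn.Module):
    """Self-attention block with an auxiliary max-margin (SVM-style) objective."""

    def __init__(self, d_model: int, n_classes: int, margin: float = 1.0, c_reg: float = 1e-3):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        self.svm_head = nn.Linear(d_model, n_classes)  # linear "SVM" on attended features
        self.margin = margin
        self.c_reg = c_reg  # L2 weight on the head, standing in for ||w||^2 in the SVM objective

    def forward(self, x):
        # x: (batch, time, d_model), e.g. CNN/LSTM features of one EEG trial
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.transpose(-2, -1) / (x.size(-1) ** 0.5)
        attn = scores.softmax(dim=-1)
        attended = attn @ v                  # (batch, time, d_model)
        pooled = attended.mean(dim=1)        # trial-level feature vector
        logits = self.svm_head(pooled)       # per-class margins for the MI classes
        return attended, logits

    def margin_loss(self, logits, labels):
        # Multiclass hinge loss plus L2 on the head weights: a stand-in for the
        # margin-maximization objective mentioned in the abstract.
        hinge = F.multi_margin_loss(logits, labels, margin=self.margin)
        l2 = self.svm_head.weight.pow(2).sum()
        return hinge + self.c_reg * l2

In a setup like this, the hinge term would be added to the usual classification loss during training, so gradients from the margin objective flow back through the attention weights; this is one way to realize the abstract's claim of enforcing feature relevance and geometric class separability simultaneously.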