Deep Feature Mining via Attention-based BiLSTM-GCN for Human Motor Imagery Recognition

Provisionally accepted
The final version of the article will be published here soon, pending final quality checks.
  • 1Northeast Electric Power University, China
  • 2Northwestern Polytechnical University, China
  • 3The University of Sydney, Australia

Recognition accuracy and response time are both essential for building a practical electroencephalography (EEG)-based brain-computer interface (BCI). Recent approaches, however, have compromised either classification accuracy or response time. This paper presents a novel deep learning approach for accurate and responsive motor imagery (MI) recognition based on scalp EEG. A Bidirectional Long Short-Term Memory (BiLSTM) network with an attention mechanism derives relevant features from raw EEG signals. A cascaded graph convolutional neural network (GCN) then improves decoding performance by exploiting the topological structure of the features, which is estimated from the overall data. The 0.4-second detection framework achieves effective and efficient prediction under both individual and group-wise training, with 98.81% and 94.64% accuracy, respectively, outperforming all state-of-the-art studies. The introduced deep feature mining approach can precisely recognize human motion intents from raw EEG signals, paving the way toward translating EEG-based MI recognition into practical BCI systems.
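The GCN stage described above propagates channel-wise features over a graph whose edges encode the topological relationships between EEG electrodes, estimated from the data. A minimal NumPy sketch of one such graph-convolution layer is shown below; the correlation-based adjacency matrix, layer sizes, and the use of random features as a stand-in for BiLSTM outputs are illustrative assumptions, not the authors' exact construction.

```python
import numpy as np

def gcn_layer(H, A, W):
    """One graph-convolution layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))   # symmetric degree normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

rng = np.random.default_rng(0)
n_channels, feat_dim, out_dim = 64, 32, 16

# Stand-in for attention-BiLSTM features: one feature vector per EEG channel.
X = rng.standard_normal((n_channels, feat_dim))

# Hypothetical adjacency: absolute correlation between channel feature vectors.
A = np.abs(np.corrcoef(X))
np.fill_diagonal(A, 0.0)

W = rng.standard_normal((feat_dim, out_dim)) * 0.1
H_out = gcn_layer(X, A, W)
print(H_out.shape)  # (64, 16)
```

Stacking a few such layers and pooling over channels would yield a graph-level representation suitable for the final MI classification head.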

Keywords: brain-computer interface (BCI), electroencephalography (EEG), motor imagery (MI), bidirectional long short-term memory (BiLSTM), graph convolutional neural network (GCN)

Received: 07 May 2021; Accepted: 30 Dec 2021.

Copyright: © 2021 Hou, Jia, Lun, Zhang, Chen, Wang and Lv. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Mx. Shuyue Jia, Northeast Electric Power University, Jilin, 132012, Jilin Province, China