
ORIGINAL RESEARCH article

Front. Hum. Neurosci.

Sec. Brain-Computer Interfaces

This article is part of the Research Topic "Deep learning in brain-computer interfaces".

Exploring Physical and Functional EEG Connectivity with Multilayer Graph Transformer Convolutional Networks for Emotion Recognition

Provisionally accepted
S M Atoar Rahman1, Md Ibrahim Khalil1, Hui Zhou1*, Ziyun Ding2, Yu Guo1
  • 1Nanjing University of Science and Technology, Nanjing, China
  • 2University of Birmingham, Birmingham, United Kingdom

The final, formatted version of the article will be published soon.

Electroencephalogram (EEG)–based emotion recognition has emerged as a compelling direction in affective computing, driven by its ability to provide objective, neural-level insights into emotional states. However, the high-dimensional and complex spatial and functional characteristics of EEG data present substantial challenges for accurate modelling. To address this, we propose Multilayer-GTCN (Multilayer Graph Transformer Convolutional Network), which combines the strengths of Graph Convolutional Networks (GCNs) and Graph Transformer layers to capture both local and global dependencies in EEG signals. The framework employs a dual-graph design over feature nodes: a physical proximity graph instantiated as a complete topology to stabilize information flow, and a functional connectivity graph whose edge weights are correlations derived from inter-feature relationships. Within this representation, GCN layers consolidate stable relational patterns, while transformer-based graph convolutions capture long-range dependencies and transient interactions across the feature space. Combining the two encoded views yields representations that jointly capture localized structure and global context, providing a robust basis for affective decoding. Extensive experiments on benchmark datasets confirm the effectiveness of our approach, achieving 98.24 ± 1.74% on SEED, 95.82 ± 1.89% on SEED-IV, and 93.35 ± 4.08% (valence) / 94.11 ± 2.98% (arousal) on DEAP. These results highlight the efficiency and flexibility of Multilayer-GTCN across varied datasets. By merging a physical-proximity graph with correlation-based functional connectivity in a multilayer architecture, this work lays a foundation for scalable affective-computing systems and provides a framework to guide future advances in neural signal analysis.
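The dual-graph design described above can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the authors' implementation: the function name `build_dual_graphs` is hypothetical, and the use of absolute Pearson correlation as functional edge weights is one plausible reading of "edges are correlations derived from inter-feature relationships".

```python
import numpy as np

def build_dual_graphs(features: np.ndarray):
    """Sketch of the two adjacency matrices in the dual-graph design.

    features: (n_samples, n_nodes) matrix of per-node EEG features.
    Returns (physical, functional) adjacencies of shape (n_nodes, n_nodes).
    """
    n_nodes = features.shape[1]

    # Physical proximity graph: instantiated as a complete topology
    # (all-ones off-diagonal), connecting every pair of feature nodes
    # to stabilize information flow.
    physical = np.ones((n_nodes, n_nodes)) - np.eye(n_nodes)

    # Functional connectivity graph: edge weights taken here as absolute
    # Pearson correlations between feature nodes across samples
    # (an assumed instantiation of "correlation-derived" edges).
    functional = np.abs(np.corrcoef(features, rowvar=False))
    np.fill_diagonal(functional, 0.0)
    return physical, functional
```

Each adjacency would then feed its own stack of graph layers (GCN on the stable physical view, transformer-based graph convolutions on the functional view), with the two encoded views combined before classification.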

Keywords: EEG emotion recognition, graph transformer convolutional layers, graph convolutional neural networks, physical proximity, functional connectivity

Received: 29 Sep 2025; Accepted: 28 Nov 2025.

Copyright: © 2025 Rahman, Khalil, Zhou, Ding and Guo. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Hui Zhou

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.