
ORIGINAL RESEARCH article

Front. Neurosci.

Sec. Brain Imaging Methods

Volume 19 - 2025 | doi: 10.3389/fnins.2025.1643554

This article is part of the Research Topic "Neurocinematics: How the Brain Perceives Audiovisuals".

Time-frequency feature calculation of multi-stage audiovisual neural processing via electroencephalogram microstates

Provisionally accepted
Yang Xi1*, Lu Zhang2, Cunzhen Li2, Xiaopeng Lv3, Zhu Lan2
  • 1School of Computer Science and Technology, Changchun University of Science and Technology, Changchun, China
  • 2Northeast Electric Power University, Jilin, China
  • 3Jilin City Hospital of Chemical Industry, Jilin, China

The final, formatted version of the article will be published soon.

Audiovisual (AV) perception, an effective and commonly used modality for environmental cognition and social communication, involves multisensory processing of large-scale neuronal activity that exhibits nonlinear characteristics modulated by attentional mechanisms; however, a precise characterization of AV processing remains elusive. We designed an AV semantic discrimination experiment to acquire electroencephalogram (EEG) data under attended and unattended conditions, and developed an EEG microstate-based method for segmenting AV processing into functional sub-stages (temporally resolved neural signatures). Through hierarchical clustering of global field power (GFP)-peak topographic maps, with the clustering solution evaluated via the Krzanowski-Lai criterion and Global Explained Variance (GEV), we identified distinct, temporally continuous microstate sequences characterizing attended and unattended processing. By analyzing EEG data filtered into separate frequency bands, we quantified microstate attributes to derive time-frequency features that achieved 97.8% accuracy in classifying attended versus unattended states and 98.6% accuracy in discriminating unimodal (visual/auditory) versus multimodal processing across multiple machine learning models. Our method effectively characterizes AV processing dynamics and provides neurophysiologically interpretable explanations for the classification outcomes.
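The core of the microstate pipeline described above — extracting topographies at GFP peaks, clustering them hierarchically, and scoring the solution by GEV — can be sketched as follows. This is an illustrative outline on synthetic data, not the authors' actual pipeline; the channel count, number of classes `k`, and Ward linkage are assumptions for demonstration.

```python
import numpy as np
from scipy.signal import find_peaks
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Synthetic EEG stand-in: 32 channels x 1000 samples (illustrative only)
eeg = rng.standard_normal((32, 1000))

# Global field power: spatial standard deviation across channels per sample
gfp = eeg.std(axis=0)

# Topographic maps at GFP peaks are the inputs to microstate clustering
peaks, _ = find_peaks(gfp)
maps = eeg[:, peaks].T                                   # (n_peaks, n_channels)
maps /= np.linalg.norm(maps, axis=1, keepdims=True)      # unit-norm maps

# Hierarchical (Ward) clustering into k candidate microstate classes
k = 4
labels = fcluster(linkage(maps, method="ward"), t=k, criterion="maxclust")

# Class centroids (mean map per cluster), re-normalized
centroids = np.vstack([maps[labels == c].mean(axis=0) for c in range(1, k + 1)])
centroids /= np.linalg.norm(centroids, axis=1, keepdims=True)

# Global Explained Variance: GFP-weighted fit of each peak map to its
# best-matching centroid (absolute value makes the fit polarity-invariant)
corr = maps @ centroids.T
best_fit = np.abs(corr).max(axis=1)
gev = np.sum((gfp[peaks] * best_fit) ** 2) / np.sum(gfp[peaks] ** 2)
```

In practice, `k` would be chosen by repeating the clustering over a range of class counts and applying the Krzanowski-Lai criterion to the resulting dispersion curve, and the fitted microstate classes would then be back-projected onto the continuous EEG to obtain the temporally continuous sequences analyzed in the paper.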

Keywords: Audiovisual processing, Electroencephalography, Microstates, Time-frequency features, Attentional mechanism

Received: 10 Jun 2025; Accepted: 22 Jul 2025.

Copyright: © 2025 Xi, Zhang, Li, Lv and Lan. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Yang Xi, School of Computer Science and Technology, Changchun University of Science and Technology, Changchun, China

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.