Original Research Article, provisionally accepted. The full text will be published soon.

Front. Neurosci. | doi: 10.3389/fnins.2019.01111

Network representations of facial and bodily expressions: Evidence from multivariate connectivity pattern classification

 Yin Liang1, 2*, Baolin Liu2, 3, Junzhong Ji1 and Xianglin Li4
  • 1Faculty of Information Technology, Beijing University of Technology, China
  • 2Key Laboratory of Cognitive Computing and Application, School of Computer Science and Technology, Tianjin University, China
  • 3State Key Laboratory of Intelligent Technology and Systems, National Laboratory for Information Science and Technology, Tsinghua University, China
  • 4Medical Imaging Research Institute, Binzhou Medical University, China

Emotions can be perceived from both facial and bodily expressions. Our previous study found that facial expressions could be successfully decoded based on functional connectivity (FC) patterns. However, the role of FC patterns in the recognition of bodily expressions remains unclear, and no neuroimaging study has adequately addressed whether emotions perceived from facial and bodily expressions are processed by common or distinct neural networks. To address this, the present study collected functional magnetic resonance imaging (fMRI) data in a block-design experiment with facial and bodily expression videos as stimuli (three emotions: anger, fear, joy) and conducted multivariate pattern classification analysis based on the estimated FC patterns. We found that, in addition to facial expressions, bodily expressions could also be successfully decoded from large-scale FC patterns, and that emotion classification accuracies for facial expressions were higher than those for bodily expressions. Further analysis of the contributive FCs showed that emotion-discriminative networks were widely distributed across both hemispheres, comprising regions ranging from primary visual areas to higher-level cognitive areas. Moreover, for a given emotion, the discriminative FCs for facial and bodily expressions were distinct. Together, our findings highlight the key role of FC patterns in emotion processing, indicate how large-scale FC patterns reconfigure during the processing of facial and bodily expressions, and suggest a distributed neural representation for emotion recognition. Furthermore, our results suggest that the human brain employs separate network representations for facial and bodily expressions of the same emotions. This study provides new evidence for network representations underlying emotion perception and may further our understanding of the mechanisms of body language emotion recognition.
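For readers unfamiliar with FC-based decoding, the minimal Python sketch below illustrates the general idea, not the authors' actual pipeline: per-block ROI time series are converted into vectorized correlation (FC) matrices and fed to a linear classifier under cross-validation. The ROI count, block structure, classifier, and random data used here are placeholder assumptions.

```python
# Hypothetical sketch of FC-pattern-based emotion classification (not the authors' code).
# Assumes per-block ROI time series have already been extracted from the fMRI data.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import StratifiedKFold, cross_val_score

def fc_features(roi_timeseries):
    """Vectorize the upper triangle of the ROI-by-ROI correlation matrix."""
    corr = np.corrcoef(roi_timeseries.T)      # (n_rois, n_rois) FC matrix
    iu = np.triu_indices_from(corr, k=1)      # keep each pair once, drop the diagonal
    return np.arctanh(corr[iu])               # Fisher z-transform of the correlations

# Placeholder data: one FC pattern per stimulus block, labeled by emotion category.
rng = np.random.default_rng(0)
n_blocks, n_timepoints, n_rois = 60, 30, 90
blocks = rng.standard_normal((n_blocks, n_timepoints, n_rois))
labels = np.repeat(["anger", "fear", "joy"], n_blocks // 3)

X = np.array([fc_features(b) for b in blocks])  # (n_blocks, n_rois*(n_rois-1)/2)
clf = LinearSVC(max_iter=10000)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, labels, cv=cv)  # cross-validated decoding accuracy
print(f"Mean decoding accuracy: {scores.mean():.3f}")
```

With real data, above-chance cross-validated accuracy (here, chance is 1/3) would indicate that the FC patterns carry emotion-discriminative information; the learned classifier weights could then be inspected to identify contributive connections.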

Keywords: facial expressions, bodily expressions, functional magnetic resonance imaging, functional connectivity, multivariate pattern classification

Received: 25 Jul 2019; Accepted: 02 Oct 2019.

Copyright: © 2019 Liang, Liu, Ji and Li. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Dr. Yin Liang, Faculty of Information Technology, Beijing University of Technology, Beijing, China, yinliang@tju.edu.cn