%A Piwek, Lukasz
%A Pollick, Frank
%A Petrini, Karin
%D 2015
%J Frontiers in Psychology
%C
%F
%G English
%K multisensory integration, social interactions, point-light display, voice, happiness, anger
%Q
%R 10.3389/fpsyg.2015.00611
%W
%L
%M
%P
%7
%8 2015-May-08
%9 Original Research
%+ Lukasz Piwek, Behaviour Research Lab, Bristol Business School, University of the West of England, Bristol, UK, lpiwek@gmail.com
%#
%! Audiovisual integration of emotional signals from others' social interactions
%*
%<
%T Audiovisual integration of emotional signals from others' social interactions
%U https://www.frontiersin.org/articles/10.3389/fpsyg.2015.00611
%V 6
%0 JOURNAL ARTICLE
%@ 1664-1078
%X Audiovisual perception of emotions has typically been examined using displays of a solitary character (e.g., the face-voice and/or body-sound of one actor). However, in real life humans often face more complex multisensory social situations involving more than one person. Here we ask whether the audiovisual facilitation in emotion recognition previously found in simpler social situations extends to more complex and ecological situations. Stimuli consisting of the biological motion and voice of two interacting agents were used in two experiments. In Experiment 1, participants were presented with visual, auditory, auditory filtered/noisy, and audiovisual congruent and incongruent clips. We asked participants to judge whether the two agents were interacting happily or angrily. In Experiment 2, another group of participants repeated the same task as in Experiment 1 while trying to ignore either the visual or the auditory information. The findings from both experiments indicate that when the reliability of the auditory cue was decreased, participants gave more weight to the visual cue in their emotional judgments. This in turn translated into increased emotion recognition accuracy for the multisensory condition. Our findings thus point to a common mechanism of multisensory integration of emotional signals irrespective of social stimulus complexity.