ORIGINAL RESEARCH article

Front. Psychol., 06 February 2018
Sec. Emotion Science

Neural Correlates of Facial Mimicry: Simultaneous Measurements of EMG and BOLD Responses during Perception of Dynamic Compared to Static Facial Expressions

  • 1Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, SWPS University of Social Sciences and Humanities, Warsaw, Poland
  • 2Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland

Facial mimicry (FM) is an automatic response that imitates the facial expressions of others. However, the neural correlates of the phenomenon are not yet well established. We investigated this issue using simultaneously recorded EMG and BOLD signals during perception of dynamic and static emotional facial expressions of happiness and anger. During display presentations, BOLD signals and zygomaticus major (ZM), corrugator supercilii (CS), and orbicularis oculi (OO) EMG responses were recorded simultaneously from 46 healthy individuals. Subjects reacted spontaneously to happy facial expressions with increased EMG activity in the ZM and OO muscles and decreased CS activity, which was interpreted as FM. Facial muscle responses correlated with BOLD activity in regions associated with motor simulation of facial expressions [i.e., the inferior frontal gyrus, part of the classical Mirror Neuron System (MNS)]. We also found correlations for regions associated with emotional processing (i.e., the insula, part of the extended MNS). We conclude that FM involves both motor and emotional brain structures, especially during perception of natural emotional expressions.

Introduction

Facial mimicry (FM) is an unconscious, unintentional, and automatic response to the facial expressions of others. Numerous studies have shown that observing the emotional states of others leads to congruent facial muscle activity. For example, observing angry facial expressions can enhance activity in the viewer's muscle responsible for frowning (CS), while viewing happy images increases activity in the facial muscle involved in smiling (ZM) and decreases CS activity (Hess et al., 1998; Dimberg and Petterson, 2000). However, it has recently been suggested that FM may not be an exclusively automatic reaction but rather a multifactorial response dependent on properties such as stimulus modality (e.g., static or dynamic) or interpersonal characteristics (e.g., susceptibility to emotional contagion) (for review see Seibt et al., 2015).

Two main psychological approaches attempt to explain the mechanisms of FM. One is the perception-behavior link model, which assumes that the perception and execution of a specific action overlap (Chartrand and Bargh, 1999). According to this theory, mere perception of the emotional facial expressions of others automatically evokes the same behavior in the perceiver, and the facial expression is copied spontaneously (Chartrand and Bargh, 1999; Dimberg et al., 2000). This notion is supported by recent neuroimaging evidence showing that both the perception and execution of emotional facial expressions engage overlapping brain structures, such as the inferior frontal gyrus (IFG) and inferior parietal lobule (IPL) (Carr et al., 2003; Rizzolatti and Craighero, 2004; Iacoboni and Dapretto, 2006), regions that constitute the classical mirror neuron system (MNS). Empirical support for this assumption comes from a study of patients with Parkinson's disease, who demonstrated difficulties with both the execution of emotional expressions and the identification of emotions (Livingstone et al., 2016). Another approach describes FM as a consequence of contagion with the emotional states of others (Hatfield et al., 1993; Bastiaansen et al., 2009). In other words, the observation of others' emotional facial expressions triggers corresponding emotions in the observer. Contagion is suggested to occur through direct activation of the neural substrate involved in the experience of the observed emotion (Wicker et al., 2003). These emotion-related brain structures, i.e., the insula and amygdala, among others belonging to the extended MNS, are activated during both the observation and execution of emotional facial expressions (Carr et al., 2003; van der Gaag et al., 2007; Kircher et al., 2013).

It is worth noting that most of what we know about the neural correlates of automatic FM has been derived from functional neuroimaging studies of the passive viewing or imitation of emotional facial displays. Direct investigation of the neural correlates of FM, such as simultaneous measurement of BOLD responses (using functional magnetic resonance imaging, fMRI) and facial muscular reactions (using electromyography, EMG), may improve understanding of the neural basis of FM. To date, only one study (Likowski et al., 2012) has examined the brain structures involved in automatic facial reactions by simultaneously measuring BOLD and facial EMG signals in an MRI scanner. These investigators found that automatic and spontaneous FM of happiness, sadness, and anger displays led to activation of a prominent part of the classical MNS (i.e., the IFG), as well as areas responsible for emotional processing (i.e., the insula). They concluded that the perception of emotional facial expressions activated a variety of structures presumed to belong to the classical and extended MNS, but only a small number correlated with the magnitude of FM. It is currently unknown whether the perception of real, dynamic emotional facial expressions, rather than the static avatars used in that study (Likowski et al., 2012), would reveal more associations between the strength of FM reactions and regional brain activation. Importantly, recent neuroimaging studies (Trautmann et al., 2009; Arsalidou et al., 2011; Kessler et al., 2011) have found that the perception of dynamic emotional stimuli, in comparison to static stimuli, engages a widespread activation pattern that involves parts of the MNS, including the IFG (Sato et al., 2004, 2015; Kessler et al., 2011), and other emotion-related structures like the amygdala and insula (Kilts et al., 2003; Trautmann et al., 2009). Indeed, it has been demonstrated that dynamic emotional facial expressions can improve recognition of subtle facial expressions (Ambadar et al., 2005; Trautmann et al., 2009), enhance emotional arousal (Sato and Yoshikawa, 2007), and elicit stronger FM than static presentations (Weyers et al., 2006; Sato et al., 2008; Rymarczyk et al., 2011). In light of these studies, determining which brain structures are involved in automatic, spontaneous FM could be addressed, at least in part, by simultaneous measurement of facial muscular activity (EMG) and BOLD responses (fMRI) during passive perception of real, dynamic emotional facial expressions.

In the present study, we simultaneously recorded EMG and BOLD signals during the perception of realistic dynamic and static emotional facial expressions. We measured facial EMG responses from three muscles, the ZM, CS, and OO, while participants passively viewed happy, angry, and neutral displays. Following earlier research, we measured facial muscle activity over the cheek region (ZM), involved in smiling, and over the brow region (CS), responsible for frowning (e.g., Andréasson and Dimberg, 2008). Activity over the eye region (OO), typically linked with genuine smiles of joy (Hess and Blairy, 2001; Hess and Bourgeois, 2010; Korb et al., 2014), was also measured. It has been proposed that contraction of the OO transforms a non-Duchenne smile into a Duchenne smile (Ekman and Rosenberg, 2012). Other researchers have suggested that contractions of the OO could additionally be indicative of the negative signal value of anger configurations, discomfort-pain, or distress-cry situations (Russell and Fernandez-Dols, 1997). Based on previous studies (van der Gaag et al., 2007; Jabbi and Keysers, 2008; Likowski et al., 2012), we anticipated that motor and emotional brain structures would be responsible for differences in automatic FM during perception of dynamic compared to static displays. We examined which of the classical and extended MNS regions showed a relationship with the strength of facial reactions. Furthermore, since dynamic facial expressions constitute a more powerful medium for emotional communication than static presentations, we anticipated that regional brain activation and muscle responses would be more pronounced for dynamic emotional facial expressions. We predicted that presentations of dynamic happy facial expressions would engage brain areas associated with the representation of pleasant feelings and reward (such as basal ganglia structures, in particular the nucleus accumbens) and would correlate with increased activity of the ZM and OO muscles. For dynamic angry expressions, we predicted that co-activation of limbic structures (i.e., the amygdala), proposed to be involved in the automatic detection of evolutionary threats (van der Zwaag et al., 2012), would be associated with CS activity.

Methods

Subjects

Forty-six healthy individuals (21 females, 26 males; mean age = 23.7 ± 2.5 years) participated in this study. The subjects had normal or corrected-to-normal eyesight, and none reported neurological diseases. This study was carried out in accordance with the recommendations of the Ethics Committee at the University of Social Sciences and Humanities, which approved the protocol. All subjects gave written informed consent in accordance with the Declaration of Helsinki after the experimental procedures had been clearly explained. After the scanning session, subjects were informed about the aim of the study.

Facial Stimuli and Apparatus

We used video clips and static pictures of forward-facing facial expressions of happiness and anger taken from the Amsterdam Dynamic Facial Expression Set (van der Schalk et al., 2011). Additionally, we included neutral conditions (no visible emotional facial expression) presented as static and dynamic displays. Clips of three male and three female actors were used (F01, F03, F09, M03, M07, M11). Each actor presented happy, angry, and neutral facial expressions, and participants observed only one type of expression at a time (as a photo or a video). In the neutral dynamic condition, motion was visible because actors either closed their eyes or slightly changed the position of their head. Each stimulus in the neutral static condition was one frame taken from the corresponding neutral dynamic video clip. Stimuli were 576 pixels high and 720 pixels wide and were presented on a gray background. All procedures were controlled using Presentation® software running on a Microsoft Windows computer and were displayed on a 32-inch NNL LCD MRI-compatible monitor (1,920 × 1,080 pixel resolution; 32-bit color depth; 60 Hz refresh rate) at a viewing distance of approximately 140 cm.

EMG Acquisition

Data were recorded using an MRI-compatible BrainCap (Brain Products) consisting of three bipolar electrode pairs and one reference electrode, each electrode 2 mm in diameter and filled with electrode paste. The electrodes were positioned in pairs over three muscles—the CS, ZM, and OO—on the left side of the face (Cacioppo et al., 1986; Fridlund and Cacioppo, 1986). The reference electrode was attached to the forehead. Before the electrodes were attached, the skin was cleaned with alcohol and a thin coating of electrode paste was applied. This procedure was repeated until electrode impedance was reduced to 5 kΩ or less. The EMG signals were recorded using a BrainAmp MR plus ExG amplifier and BrainVision Recorder. The hardware low-pass filtered the signal at 250 Hz. Finally, data were digitized at a sampling rate of 5 kHz and stored on a computer running MS Windows 7 for offline analysis.

Image Acquisition

MRI data were acquired on a Siemens Trio 3 T MR scanner equipped with a 12-channel phased-array head coil. Functional images were acquired using a T2*-weighted EPI gradient-echo pulse sequence with the following parameters: TR = 2,000 ms, TE = 25 ms, flip angle = 90°, FOV = 250 mm, matrix = 64 × 64, voxel size = 3.5 × 3.5 × 3.5 mm, slice thickness = 3.5 mm, 39 slices, interleaved even acquisition.

Procedure

Each volunteer was introduced to the experimental procedure and signed a consent form. To conceal the true purpose of the facial electromyography recordings, participants were told that sweat gland activity was being recorded while they watched the faces of actors selected for commercials by an external marketing company. Following the attachment of the electrodes of the FaceEMGCap-MR, participants were reminded to carefully observe the actors presented on the screen and were positioned in the scanner. The subjects were verbally encouraged to feel comfortable and behave naturally.

The scanning session started with a reminder of the subject's task. During the session, subjects were presented with 72 trials lasting approximately 15 min in total. Each trial started with a white fixation cross, 80 pixels in size, visible for 2 s in the center of the screen. Next, one of the facial expression stimuli (happy, angry, or neutral, each presented as a static image or a dynamic video clip) was shown for 6 s, followed by a blank gray screen presented for 2.75–5.25 s (see Figure 1). All stimuli were presented in the center of the screen. Each stimulus was repeated once, for a total of 6 presentations within each type of expression (e.g., 6 dynamic presentations of happiness). Stimuli appeared in an event-related manner, pseudo-randomized trial by trial with the following constraints: no two consecutive trials showed the same actor, and no more than two consecutive trials showed actors of the same sex or the same emotion (a sketch of such a constrained randomization is given below). In total, 6 randomized event-related sequences satisfying these constraints were counterbalanced across subjects.

Figure 1. Scheme of the procedure used in the study.
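For illustration, the trial-order constraints above can be implemented as a greedy constrained shuffle with restarts. The following Python sketch is hypothetical—the experiment itself was controlled with Presentation® software—and assumes trials are (actor, sex, emotion) tuples; all names are illustrative.

```python
import random

def fits(order, trial):
    """Check the sequencing constraints against the tail of the order built so far."""
    if order and order[-1][0] == trial[0]:            # same actor twice in a row
        return False
    if len(order) >= 2:
        if order[-1][1] == order[-2][1] == trial[1]:  # third consecutive same sex
            return False
        if order[-1][2] == order[-2][2] == trial[2]:  # third consecutive same emotion
            return False
    return True

def randomize(trials, seed=0, max_restarts=10_000):
    """Greedily pick admissible trials; restart from scratch on a dead end."""
    rng = random.Random(seed)
    for _ in range(max_restarts):
        pool, order = list(trials), []
        while pool:
            candidates = [t for t in pool if fits(order, t)]
            if not candidates:
                break                                  # dead end: restart
            pick = rng.choice(candidates)
            pool.remove(pick)
            order.append(pick)
        if len(order) == len(trials):
            return order
    raise RuntimeError("No valid trial order found")

# Example: (actor, sex, emotion) tuples for the six ADFES actors, one modality.
actors = [("F01", "f"), ("F03", "f"), ("F09", "f"),
          ("M03", "m"), ("M07", "m"), ("M11", "m")]
trials = [(a, s, e) for a, s in actors for e in ("happy", "angry", "neutral")]
order = randomize(trials * 2)  # duplicated to mimic repeated presentations
```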

Data Analysis

EMG Analysis

Pre-processing was carried out using BrainVision Analyzer 2 (version 2.1.0.327). First, MR gradient artifacts were removed using the average artifact subtraction (AAS) method (Allen et al., 2000) implemented in the Analyzer, based on a sliding average over 11 consecutive functional volumes marked in the data logs. Accurate AAS correction was possible thanks to synchronization hardware and markers created from the triggers received from the MR system. Next, data were filtered with 30 Hz high-pass and 500 Hz low-pass filters. After rectification and integration over 125 ms, the signal was resampled to 10 Hz. EMG artifacts were detected in two ways. First, when activity of a single muscle exceeded 8 μV at baseline (while the fixation cross was visible) (Weyers et al., 2006; Likowski et al., 2008, 2011), the trial was classified as an artifact and excluded from further analysis (M = 3.8 trials excluded per participant). All remaining trials were blind-coded and visually checked for artifacts. Trials were then baseline corrected, such that the EMG response was measured as the difference in averaged signal activity between the stimulus period (6 s) and the baseline period (2 s). Finally, the signal was averaged for each condition and each participant and imported into SPSS 21 for statistical analysis.
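As a rough illustration of the post-AAS steps (band-pass filtering, rectification, 125-ms integration, resampling to 10 Hz, the 8-μV baseline artifact criterion, and baseline correction), a minimal Python sketch might look as follows. The numeric parameters follow the text; the use of NumPy/SciPy and all function names are our assumptions, not the authors' BrainVision Analyzer/SPSS implementation.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 5000        # EMG sampling rate (Hz), as reported above
TARGET_FS = 10   # rate after integration and resampling (Hz)

def preprocess_emg(raw_uv, fs=FS):
    """Band-pass 30-500 Hz, rectify, integrate over 125 ms, resample to 10 Hz."""
    sos = butter(4, [30, 500], btype="bandpass", fs=fs, output="sos")
    rectified = np.abs(sosfiltfilt(sos, raw_uv))
    win = int(0.125 * fs)                              # 125-ms moving average
    integrated = np.convolve(rectified, np.ones(win) / win, mode="same")
    return integrated[::fs // TARGET_FS]               # decimate to 10 Hz

def score_trial(trial_10hz, baseline_s=2.0, stim_s=6.0, artifact_uv=8.0):
    """Baseline-corrected response, or None if the baseline is artifactual."""
    n_base = int(baseline_s * TARGET_FS)
    n_stim = int(stim_s * TARGET_FS)
    baseline = trial_10hz[:n_base]
    if np.max(baseline) > artifact_uv:                 # 8-uV rejection criterion
        return None
    stim = trial_10hz[n_base:n_base + n_stim]
    return float(stim.mean() - baseline.mean())
```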

To test for differences in EMG responses, a two-way repeated-measures ANOVA with two within-subject factors (expression: happiness, anger, neutral; stimulus modality: dynamic, static) was used. Separate ANOVAs were calculated for each muscle and reported with Bonferroni correction. To confirm that EMG activity changed from baseline and that FM occurred, the EMG data for each significant effect were tested against zero (baseline) using one-sample, two-tailed t-tests.
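A sketch of these tests in Python, assuming a long-format table with columns subject, expression, modality, and emg (the baseline-corrected response per condition). The authors used SPSS 21; pingouin and SciPy are substitutes chosen here purely for illustration.

```python
import pandas as pd
import pingouin as pg
from scipy import stats

def analyze_muscle(df: pd.DataFrame) -> None:
    """Run the 3 (expression) x 2 (modality) repeated-measures ANOVA for one muscle."""
    aov = pg.rm_anova(data=df, dv="emg", within=["expression", "modality"],
                      subject="subject", detailed=True)
    print(aov)

    # One-sample, two-tailed t-tests against zero (the pre-stimulus baseline)
    for (expr, mod), cell in df.groupby(["expression", "modality"]):
        t, p = stats.ttest_1samp(cell["emg"], popmean=0.0)
        print(f"{expr}/{mod}: t = {t:.3f}, p = {p:.3f}")
```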

Image Processing and Analysis

Image processing and analysis were carried out using SPM12 (6470) run in MATLAB 2013b (The MathWorks Inc., 2013). Functional images were motion-corrected and co-registered to the mean functional image. Structural images were segmented into tissue classes—gray matter, white matter, and non-brain (cerebrospinal fluid, skull)—using the segmentation module. Next, the DARTEL algorithm was used to create a study-specific template for all participants based on the segmented structural images. The template was then affine-registered to MNI space, and the functional images were warped to this template, resliced to 2 × 2 × 2 mm isotropic voxels, and smoothed with an 8 × 8 × 8 mm full-width-at-half-maximum Gaussian kernel. Single-subject design matrices included six experimental conditions (dynamic: happiness, anger, neutral; static: happiness, anger, neutral) modeled with the standard hemodynamic response function, plus covariates produced by the Artifact Detection Toolbox (ART), including head-movement parameters and regressors excluding artifactual fMRI volumes. The same set of contrasts of interest was then calculated for each subject and used in group-level analysis (i.e., one-sample t-tests) for statistical region-of-interest (ROI) analysis, performed separately for each ROI using the MarsBaR toolbox (Brett et al., 2002). Anatomical ROI masks were created with the WFU PickAtlas (Wake Forest University, 2014) (primary motor cortex, premotor cortex, IPL, BA44, BA45, amygdala, ACC, insula, caudate head, putamen, nucleus accumbens, globus pallidus) and the SPM Anatomy Toolbox (Eickhoff, 2016) [MT+/V5, primary somatosensory cortex (Areas 1, 2, 3a, 3b)]. The STS and pre-SMA ROIs were based on activation peaks from the literature (Van Overwalle, 2009) and a meta-analysis (Kohn et al., 2014), and were defined as overlapping spheres of 8 mm radius around those peaks. Data were extracted as mean values of each ROI, and statistics of brain activity were reported with Bonferroni correction (i.e., the significance threshold divided by the number of ROIs).
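The ROI step can be pictured with the following Python sketch: average each subject's contrast image within a binary ROI mask, then Bonferroni-correct the group-level one-sample t-tests across ROIs. The authors used MarsBaR within SPM12/MATLAB; this nibabel/NumPy version is an analogous illustration with placeholder file paths, not their code.

```python
import numpy as np
import nibabel as nib
from scipy import stats

def roi_mean(contrast_path, mask_path):
    """Mean contrast value over all voxels inside a binary ROI mask."""
    con = nib.load(contrast_path).get_fdata()
    mask = nib.load(mask_path).get_fdata() > 0
    return float(np.nanmean(con[mask]))

def group_roi_tests(roi_values, alpha=0.05):
    """roi_values: dict mapping ROI name -> per-subject array of ROI means."""
    threshold = alpha / len(roi_values)      # Bonferroni across ROIs
    for name, vals in roi_values.items():
        t, p = stats.ttest_1samp(vals, popmean=0.0)
        print(f"{name}: t = {t:.2f}, p = {p:.4f}, significant = {p < threshold}")
```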

Correlation Analysis

Pearson correlation coefficients were calculated between selected contrasts of brain activity (happiness dynamic, happiness static, anger dynamic, anger static) and the corresponding mimicry muscle activity, in order to examine the relationship between brain activity and facial muscle activity. Additionally, bias-corrected and accelerated (BCa) bootstrap 95% confidence intervals (1,000 samples) were computed for the Pearson correlation coefficients.

Each brain ROI was represented by a single mean value (averaged over all voxels of the anatomical ROI in each hemisphere), specific to each participant and ROI. Muscle activity was defined as the average of the baseline-corrected EMG trials of the same muscle and condition. Correlations were thus computed for pairs of muscle responses (in specific conditions) and ROI activity, e.g., happiness_static_ZM with happiness_static_insulaRight.
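A minimal sketch of one such correlation, assuming per-subject NumPy arrays for a single condition (e.g., mean ZM response and mean right-insula contrast value for happiness static). SciPy's paired bootstrap with method="BCa" mirrors the confidence intervals described above; variable names are illustrative.

```python
import numpy as np
from scipy import stats

def muscle_brain_corr(emg, roi, n_boot=1000):
    """Pearson r between per-subject EMG responses and ROI contrast values,
    with a BCa bootstrap 95% CI based on paired resampling of subjects."""
    emg, roi = np.asarray(emg), np.asarray(roi)
    r, p = stats.pearsonr(emg, roi)
    boot = stats.bootstrap((emg, roi),
                           statistic=lambda x, y: stats.pearsonr(x, y)[0],
                           paired=True, vectorized=False,
                           n_resamples=n_boot, confidence_level=0.95,
                           method="BCa")
    ci = boot.confidence_interval
    return r, p, (ci.low, ci.high)

# e.g., r, p, ci = muscle_brain_corr(zm_happiness_static, insula_right_happiness_static)
```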

Results

EMG Measures

M. corrugator supercilii

ANOVA showed a significant main effect of expression [F(1.722, 65.422) = 30.676, p < 0.001, η2 = 0.447], indicating that activity of the CS for happiness (M = −0.366, SE = 0.072) was lower than for angry [M = 0.168, SE = 0.067; t(36) = 6.271, p < 0.001] and neutral expressions [M = 0.067, SE = 0.030; t(36) = 6.186, p < 0.001]. The main effect of modality [F(1, 38) = 4.020, p = 0.052, η2 = 0.096] approached significance, with CS activity generally higher for static (M = 0.007, SE = 0.047) than dynamic [M = −0.094, SE = 0.050] facial expressions. A significant expression × modality interaction [F(1.389, 52.774) = 3.964, p = 0.039, η2 = 0.094] revealed that CS activity for dynamic and static happiness was lower than for angry [t(33)dynamic = 5.044, p < 0.001; t(33)static = 5.219, p < 0.001] and neutral [t(33)dynamic = 4.815, p < 0.001; t(33)static = 3.959, p < 0.01] facial expressions, respectively (see Figure 2). The decrease in CS activity was greater for the dynamic than for the static happiness condition [t(33) = 2.269, p = 0.029].

Figure 2. Mean (±SE) EMG activity changes and corresponding statistics for corrugator supercilii during presentation conditions. Asterisks with lines beneath indicate significant differences between conditions (simple effects) in EMG responses: *p < 0.05, ***p < 0.001. Separate asterisks indicate significant differences from baseline EMG responses: +p < 0.1, *p < 0.05, **p < 0.01, ***p < 0.001.

One-sample t-tests revealed significantly lower CS activity for dynamic [t(40) = −4.595, p < 0.001] and static [t(40) = −2.618, p = 0.012] happiness conditions, compared to baseline. CS responses for static anger [t(41) = 2.724, p = 0.009] were higher than baseline. All other conditions were marginally higher than baseline [t(39)anger_dynamic = 2.016, p = 0.051; t(39)neutral_dynamic = 1.858, p = 0.071; t(39)neutral_static = 1.827, p = 0.075].

M. orbicularis oculi

ANOVA showed a significant main effect of expression [F(2, 76) = 15.561, p < 0.001, η2 = 0.291], indicating that activity of the OO for happiness (M = 0.207, SE = 0.075) was higher than for angry [M = −0.054, SE = 0.055; t(36) = 4.279, p < 0.001] and neutral expressions [M = −0.111, SE = 0.045; t(36) = 4.746, p < 0.001]. A significant expression × modality interaction [F(1.688, 64.132) = 5.217, p = 0.011, η2 = 0.121] revealed that OO activity was higher for dynamic than for static happiness [t(33) = 3.099, p = 0.009]. In addition, OO activity was higher for dynamic happiness than for dynamic angry [t(33) = 4.303, p < 0.001] and dynamic neutral [t(33) = 4.679, p < 0.001] facial expressions (see Figure 3).

Figure 3. Mean (±SE) EMG activity changes and corresponding statistics for orbicularis oculi during presentation conditions. Asterisks with lines beneath indicate significant differences between conditions (simple effects) in EMG responses: *p < 0.05, ***p < 0.001. Separate asterisks indicate significant differences from baseline EMG responses: **p < 0.01.

One-sample t-tests revealed increased OO activity, compared to baseline, for dynamic happiness [t(40) = 3.328, p = 0.002] and reduced activity for dynamic neutral [t(40) = −2.862, p = 0.007] facial expressions. All other OO activities did not differ from baseline [t(40)happiness_static = 1.032, p = 0.308; t(39)anger_dynamic = −0.916, p = 0.365; t(41)anger_static = −0.113, p = 0.911; t(39)neutral_static = −0.857, p = 0.397].

M. zygomaticus major

ANOVA showed a significant main effect of expression [F(1.142, 43.404) = 11.060, p < 0.001, η2 = 0.225], indicating that activity of the ZM for happiness [M = 0.404, SE = 0.138] was increased compared to angry [M = −0.125, SE = 0.054; t(36) = 3.458, p = 0.004] and neutral expressions [M = −0.140, SE = 0.043; t(36) = 3.358, p = 0.005]. The main effect of modality approached significance [F(1, 38) = 3.545, p = 0.067, η2 = 0.085], with activity of the ZM greater for dynamic [M = 0.091, SE = 0.091] than static [M = 0.003, SE = 0.043] facial expressions. A significant expression × modality interaction [F(1.788, 67.943) = 4.385, p = 0.020, η2 = 0.103] revealed that ZM activity was higher for dynamic than for static happiness [t(33) = 2.681, p = 0.011]. ZM activity was also higher for dynamic happiness than for dynamic angry [t(33) = 3.541, p = 0.003] and dynamic neutral [t(33) = 3.354, p = 0.006] facial expressions, and higher for static happiness than for static angry [t(33) = 3.124, p = 0.011] and static neutral [t(33) = 3.050, p = 0.013] conditions (see Figure 4).

Figure 4. Mean (±SE) EMG activity changes and corresponding statistics for zygomaticus major during presentation conditions. Asterisks with lines beneath indicate significant differences between conditions (simple effects) in EMG responses: *p < 0.05, ***p < 0.001. Separate asterisks indicate significant differences from baseline EMG responses: *p < 0.05, **p < 0.01.

One-sample t-tests revealed increased ZM activity, compared to baseline, for dynamic [t(40) = 3.217, p = 0.003] and static [t(40) = 2.415, p = 0.020] happiness and lower activity for dynamic [t(39) = −2.307, p = 0.026] and static [t(39) = 3.612, p = 0.001] neutral facial expressions. Mean ZM activity for anger did not differ from baseline [t(39) dynamic = −0.688, p = 0.498; t(41) static = −1.589, p = 0.120].

fMRI Data

Region of interest (ROI) analyses were carried out for contrasts comparing brain activation across conditions, resulting in 11 contrasts of interest: happiness dynamic > happiness static, anger dynamic > anger static, neutral dynamic > neutral static, emotion dynamic > emotion static (emotion dynamic: pooled dynamic happiness and anger conditions; emotion static: analogous pooling), all dynamic > all static (all dynamic: pooled dynamic happiness, anger, and neutral conditions; all static: analogous pooling), happiness dynamic > neutral dynamic, happiness static > neutral static, anger dynamic > neutral dynamic, anger static > neutral static, emotion dynamic > neutral dynamic, and emotion static > neutral static. These contrasts addressed two types of questions: the emotional > neutral contrasts (e.g., happiness dynamic > neutral dynamic) address the neural correlates of FM of emotional expressions, whereas the dynamic > static contrasts (e.g., happiness dynamic > happiness static) relate to differences in processing between dynamic and static stimuli.

ROI analyses indicated that for the happiness dynamic > happiness static contrast, V5/MT+ and STS were activated bilaterally. Other structures in this contrast were activated only in the right hemisphere (i.e., pre-SMA, IPL, BA45) (see Table 1; for whole brain analysis see Supplementary Table 1).

Table 1. Summary statistics of ROIs' activations for happiness dynamic > happiness static contrast.

For the anger dynamic > anger static contrast, V5/MT+ and STS were also activated bilaterally. However, this contrast revealed additional bilateral activation of the amygdala. Other structures revealed by this contrast were visible only in the right hemisphere (i.e., pre-SMA and BA45) (see Table 2; for whole brain analysis see Supplementary Table 2).

Table 2. Summary statistics of ROIs' activations for anger dynamic > anger static contrast.

For the neutral dynamic > neutral static contrast, only V5/MT+ and STS were activated bilaterally (see Table 3; for whole brain analysis see Supplementary Tables 3–5).

Table 3. Summary statistics of ROIs' activations for neutral dynamic > neutral static contrast.

The emotion dynamic > emotion static contrast revealed bilateral activations of V5/MT+, STS, BA45, BA44, and the amygdala. Additionally, this contrast revealed activations of the putamen, globus pallidus, IPL, and pre-SMA in the right hemisphere (see Table 4; for whole brain analysis see Supplementary Table 4).

Table 4. Summary statistics of ROIs' activations for emotion dynamic > emotion static contrast.

The all dynamic > all static contrast, illustrating general processing of dynamic compared to static expressions, revealed bilateral activations of V5/MT+, STS, BA44, BA45, and the amygdala. Moreover, several cortical and subcortical structures were activated only in the right hemisphere [i.e., premotor cortex (trend effect), pre-SMA, IPL, caudate head, putamen, and globus pallidus] (see Table 5; for whole brain analysis see Supplementary Table 5).

Table 5. Summary statistics of ROIs' activations for all dynamic > all static expressions contrast.

The happiness dynamic > neutral dynamic contrast showed bilateral activations of V5/MT+, STS, pre-SMA, IPL, BA45, the amygdala, the anterior cingulate cortex, the caudate head, the putamen, and the globus pallidus. Activation of BA44 was visible only in the left hemisphere (see Table 6; for whole brain analysis see Supplementary Table 6).

Table 6. Summary statistics of ROIs' activations for happiness dynamic > neutral dynamic contrast.

For the happiness static > neutral static contrast, only pre-SMA was activated bilaterally (see Table 7; for whole brain analysis see Supplementary Table 7).

Table 7. Summary statistics of ROIs' activations for happiness static > neutral static contrast.

For the anger dynamic > neutral dynamic contrast, the analysis revealed bilateral activations of V5/MT+, STS, the amygdala, and BA45. Pre-SMA activation was visible only in the right hemisphere (see Table 8; for whole brain analysis see Supplementary Table 8).

Table 8. Summary statistics of ROIs' activations for anger dynamic > neutral dynamic contrast.

The anger static > neutral static contrast revealed no significant activations (see Table 9; for whole brain analysis see Supplementary Table 9).

Table 9. Summary statistics of ROIs' activations for anger static > neutral static contrast.

The emotion dynamic > neutral dynamic contrast showed that V5/MT+, STS, the amygdala, BA45, pre-SMA, and the globus pallidus were activated bilaterally. Additionally, this contrast revealed IPL activation in the left hemisphere and caudate head activation in the right hemisphere (see Table 10; for whole brain analysis see Supplementary Table 10).

Table 10. Summary statistics of ROIs' activations for emotion dynamic > neutral dynamic contrast.

For the emotion static > neutral static contrast, ROI analysis revealed only pre-SMA activations (see Table 11; for whole brain analysis see Supplementary Table 11).

Table 11. Summary statistics of ROIs' activations for emotion static > neutral static contrast.

Correlation Analysis

Muscle-Brain Correlations of Dynamic and Static Happiness Conditions

Correlation analyses computed for the happiness dynamic condition and ZM activity revealed positive relationships bilaterally in the pre-SMA (trend effect), putamen, nucleus accumbens, and globus pallidus. Trend-level effects were found for activations of the right BA44 and insular cortex. No relationships were found between brain activity in the happiness dynamic conditions and OO muscle activity. For the CS, negative relationships were found for V5/MT+, STS, and BA45 in the left hemisphere, and for the IPL and ACC in the right hemisphere. Negative trend-level relationships were found bilaterally in the caudate head (see Table 12).

Table 12. Muscles-brain correlations of dynamic and static happiness conditions.

Correlation analyses computed for the happiness static condition and ZM activity indicated positive relationships for the left insula, putamen, and globus pallidus. Trend-level positive effects of ZM and brain activity were found in the right insula and putamen. Positive relationships between OO activity and brain activity during perception of the happiness static condition were found in the right primary motor cortex, right primary somatosensory cortex, and left insula. Trend-level effects for the OO were observed in the left primary motor cortex, premotor cortex, and caudate head. Negative relationships between CS activity and brain activity were found for the premotor cortex and BA44 in the left hemisphere. Moreover, CS activity was negatively related to activity of the primary somatosensory cortex (bilaterally), primary motor cortex (bilaterally), and premotor cortex (right) (see Table 12).

Muscle-Brain Correlations of Dynamic and Static Anger Conditions

Correlation analyses performed for the anger dynamic condition indicated a negative relationship between CS activity and activity in left BA44. Positive relationships between OO activity and brain activity during perception of dynamic angry expressions were found in the STS (bilaterally) and right premotor cortex. Trend-level positive relationships were found in the primary motor cortex (bilaterally) and in the right BA45, amygdala, and insula (see Table 13).

Table 13. Muscles-brain correlations of dynamic and static anger conditions.

Positive relationships between brain activity and CS activity for static anger were observed in the right STS and right IPL (trend effect). Activity in the pre-SMA (bilaterally) was positively related to OO activity during perception of static angry pictures. Trend-level relationships between OO activity and brain activity in the angry static condition were observed in the right caudate (positive), left BA45 (positive), and V5/MT+ (negative) (see Table 13).

Discussion

The present study examined neural correlates of FM during the observation of dynamic compared to static facial expressions, based on facial EMG, fMRI, and combined EMG-fMRI analyses. First, the anticipated patterns of mimicry were observed: increased ZM and OO activity and decreased CS activity for happiness (Rymarczyk et al., 2016), as well as increased CS activity for anger (Dimberg and Petterson, 2000). Moreover, we found that dynamic presentations of happy facial expressions induced higher EMG amplitudes in the ZM, OO, and CS than static presentations, whereas angry facial expressions were not associated with differences in the CS response between static and dynamic displays. Analysis of fMRI data revealed that dynamic (compared to static) emotional expressions activated the bilateral STS, V5/MT+, and frontal and parietal areas. In contrast, the perception of neutral dynamic compared to neutral static facial displays activated only structures related to biological motion, i.e., bilateral V5/MT+ and STS. Furthermore, some interaction effects of emotion and modality were found: dynamic compared to static displays induced greater activity in the bilateral amygdala for anger, while this effect was found in the right IPL for happiness. The correlations between brain activity and facial muscle reactions implicated regions related to the motor simulation of facial expressions, such as the IFG, part of the classical MNS, as well as regions involved in emotional processing, such as the insula, part of the extended MNS.

EMG Response for Dynamic Compared to Static Facial Expressions

The recorded EMG data showed that the subjects reacted spontaneously to happy facial expressions with increased ZM and OO activity (Rymarczyk et al., 2016) and decreased CS activity, interpretable as FM (Dimberg and Thunberg, 1998). EMG responses observed in our study were low in amplitude but comparable to other reports (Sato et al., 2008; Dimberg et al., 2011; Rymarczyk et al., 2011). In all muscles, the response was more pronounced when dynamic happy stimuli were presented (Weyers et al., 2006; Sato et al., 2008; Rymarczyk et al., 2011), which points to the benefits of applying dynamic stimuli (Murata et al., 2016). The pattern of ZM and OO reactions observed for dynamic happiness could be interpreted as a Duchenne smile (Ekman et al., 1990), suggesting that subjects may have experienced true and genuine positive emotion. Moreover, we observed higher CS reactions, similar for static and dynamic anger conditions, showing typical evidence of FM for this emotion (Sato et al., 2008; Dimberg et al., 2011). Increased CS response was found for neutral facial expressions as well. Some studies have reported increased CS activity as a function of mental effort (Neumann and Strack, 2000), disapproval (Cannon et al., 2011), or global negative affect (Larsen et al., 2003). In our study, we interpret the increased CS activity for neutral facial expressions as a consequence of the instruction, which asked subjects to pay careful attention (i.e., mental effort) to the observed actors.

Neural Network for Dynamic Compared to Static Facial Expressions

We found that passive viewing of emotional dynamic, compared to neutral dynamic, stimuli activated a wide network of brain regions. This network included the inferior frontal gyrus (left BA44 and bilateral BA45), left IPL, bilateral pre-SMA, STS, and V5/MT+, as well as the left and right amygdala, right caudate head, and bilateral globus pallidus. In contrast, emotional static displays compared to neutral displays activated only the bilateral pre-SMA, a pattern that was significant for happiness. Furthermore, dynamic happiness evoked greater activity than static happiness in the right IPL, while dynamic vs. static angry faces evoked greater bilateral activity in the amygdala. As expected, we found that, regardless of the specific emotion, dynamic stimuli selectively activated the bilateral visual area V5/MT+ and the superior temporal sulcus, structures associated with motion and biological motion perception, respectively (Robins et al., 2009; Arsalidou et al., 2011; Foley et al., 2012; Furl et al., 2015). Recently, in a magnetoencephalography (MEG) study, Sato et al. (2015) explored temporal profiles and dynamic interaction patterns of brain activity during perception of dynamic emotional facial expressions compared to dynamic mosaics. Notably, they found that, apart from V5/MT+ and STS, the right IFG exhibited higher activity for dynamic faces vs. dynamic mosaics. Furthermore, they found direct functional connectivity between the STS and IFG, closely related to FM.

Our findings concerning the IFG are in line with previous studies (Carr et al., 2003; Leslie et al., 2004) suggesting that perception of emotional facial displays involves the classical MNS, which is sensitive to goal-directed actions. The assumption is that during the observation of another individual's actions, the brain simulates the same action by activating neurons localized in the IFG that are involved in executing the same behavior (Jabbi and Keysers, 2008). For example, Carr et al. (2003) asked subjects to observe and imitate static emotional facial expressions and found that both tasks induced extensive activity in the IFG. Activation of the right IFG and parietal cortex has also been found during passive viewing of dynamic compared to static emotional facial expressions (Arsalidou et al., 2011; Foley et al., 2012) and during viewing and executing smiles (Hennenlotter et al., 2005). It has been suggested that mirror neurons localized in the IFG and parietal regions could convert observed emotional facial expressions into a pattern of neural activity suitable for producing similar facial expressions, providing the basis for a motor simulation of facial expressions (Gazzola et al., 2006; van der Gaag et al., 2007; Jabbi and Keysers, 2008). Our results seem to be in line with the perception-behavior link model (Chartrand and Bargh, 1999), which assumes that an observer's motor system "resonates" with and facilitates the understanding of the perceived action; the classical MNS is believed to be responsible for such processes (for a review, see Bastiaansen et al., 2009). Moreover, dynamic emotional facial expressions might be a stronger social signal for inducing imitation processes in the MNS, since we did not observe IFG activity in the neutral dynamic vs. neutral static contrast.

To summarize, our findings revealed that the functional properties of the classical MNS manifest mainly during perception of dynamic compared to static facial displays. This may be because dynamic stimuli, which are relevant for social interaction, engage a wide network of brain regions sensitive to motion (Kilts et al., 2003; Kessler et al., 2011) and to signaled intentions (Gallagher et al., 2000; Pelphrey et al., 2003), and thus may be a strong social signal for inducing simulation processes in the MNS.

Relationships between Neural Activity and Facial Muscle Responses

One of the fundamental questions regarding the neural basis of FM is whether this phenomenon involves motor and/or affective representations of observed expressions (for a review, see Bastiaansen et al., 2009). So far, only one study has examined this question with simultaneous recording of EMG and BOLD signals (Likowski et al., 2012); however, only static avatar emotional expressions were used. In our study, where both static and dynamic natural displays were applied, the associations between brain activity and facial muscle reactions revealed regions related to motor simulation of facial expressions (IFG, pre-SMA, IPL), but also regions related to emotional processing. For happiness displays, correlations with muscle responses were found in basal ganglia structures (right caudate head, bilateral globus pallidus and putamen), the nucleus accumbens, and the insula, while for angry displays they were found in the right amygdala and insula, among other regions.

The IFG and pre-SMA activations observed in our study coincide with earlier studies (Hennenlotter et al., 2005; Lee et al., 2006; Jabbi and Keysers, 2008; Likowski et al., 2012; Kircher et al., 2013) claiming that these regions constitute a representation network for the observation and imitation of emotional facial expressions (for a review, see Bastiaansen et al., 2009). For example, Lee et al. (2006), who also explored the relation between brain activity and facial muscle movement (facial markers), emphasized the role of the IFG in intentional imitation of emotional expressions.

Interestingly, our results indicated that activation of the pre-SMA correlated with the magnitude of the facial muscle response for happy dynamic displays. Similar results were observed by Iwase et al. (2002) during spontaneous smiling. Activation of the pre-SMA could be understood as contagion of happy facial expressions (Dimberg et al., 2002), given pre-SMA connections to the striatum (Lehéricy et al., 2004), a critical component of the motor and reward systems. Moreover, it is well known that smiles evoke a positive response (Sims et al., 2012), serving as socially rewarding stimuli (Heerey and Crossley, 2013) in face-to-face interactions. This interpretation fits with our finding of the involvement of the basal ganglia and nucleus accumbens, structures constituting reward-related circuitry (for a review, see Kringelbach and Berridge, 2010), in the processing of positive facial expressions. Basal ganglia activity correlated positively with ZM (nucleus accumbens, putamen) and negatively with CS (caudate head) activity for happiness dynamic displays, which is consistent with previous findings in the literature. The nucleus accumbens has been shown to respond to various positive stimuli, such as money (Clithero et al., 2011), erotic pictures (Sabatinelli et al., 2007), or happy facial displays (Monk et al., 2008), and is thought to be involved in the experience of pleasure (Ernst et al., 2004). This interpretation is supported by the fact that for happiness displays we also found a significant EMG response in the OO muscle, which could mean that subjects recognized the happy displays as "real" smiles. Furthermore, our results agree with an earlier study (Likowski et al., 2012) in which stronger ZM reactions to happy faces were associated with increased activity in the right caudate, and with Vrticka et al. (2013), who showed that the left putamen is more activated during imitation than during passive observation of happy displays.

Interestingly, in our study, activation in the right caudate also correlated positively with OO reactions to anger expressions. The caudate nucleus, part of the dorsal striatum, is known to be involved in motor and non-motor processes, including procedural learning, associative learning, and inhibitory control of action (Soghomonian, 2016). Moreover, it has been suggested that activity of the basal ganglia also reflects approach motivation and could represent reward (O'Doherty et al., 2003; Lee et al., 2006). Recently, Mühlberger et al. (2011) reported that perception of both happy and angry dynamic facial expressions was related to dorsal striatum activity. Furthermore, the activity of the caudate nuclei during perception of anger may reflect a more general role in the detection of danger signals. For example, it has been shown that subjects with Parkinson's disease exhibit selective impairments in the recognition of negative facial emotions, e.g., anger (Sprengelmeyer et al., 2003; Clark et al., 2008), fear (Livingstone et al., 2016), and sadness and disgust (Sprengelmeyer et al., 2003; Dujardin et al., 2004). Accordingly, neuroimaging data from healthy subjects tend to confirm the role of the caudate nuclei in the processing of negative emotions, particularly in the recognition of angry expressions (Beyer et al., 2017).

Importantly, we observed that during perception of dynamic anger displays, the OO response correlated not only with caudate nucleus activity but also with right amygdala activity. Historically, the amygdala has been viewed as playing a substantial role in the processing and expression of fear, but it has more recently been linked to other emotions, both positive and negative. For example, some studies have found amygdala activation during the observation and execution of both negative and positive facial expressions (Carr et al., 2003; van der Gaag et al., 2007), suggesting that this structure may reflect not only imitation but also the experience of a particular emotion (Kircher et al., 2013). The contraction of the OO for anger expressions could be interpreted as a reaction to negative signal value, or as a sign of arousal or interest (Witvliet and Vrana, 1995).

Further, we observed correlations between insula activity and facial responses during perception of both happy and angry facial expressions. A considerable number of studies (Carr et al., 2003; van der Gaag et al., 2007; Jabbi and Keysers, 2008) have suggested that the anterior insula and adjacent inferior frontal operculum (IFO) may represent an emotional component of the MNS. These structures have been implicated not only in observing but also in experiencing emotions [i.e., unpleasant odors (Wicker et al., 2003) or tastes (Jabbi et al., 2007)]. Moreover, the insula is involved in the experience of positive emotions, such as during the viewing of pleasing facial expressions (Jabbi et al., 2007) or during the observation and execution of smiles (Hennenlotter et al., 2005). Regarding the nature of FM, it has been proposed that the insula and IFO may underlie a simulation of emotional feeling states (referred to as hot simulation), whereas the IFG (which activates during observation of neutral and emotional facial expressions) may reflect a form of motor simulation (referred to as cold simulation) (for a review, see Bastiaansen et al., 2009). Support for this idea comes from a connectivity analysis of IFO and IFG activity while subjects experienced unpleasant and neutral tastes: using Granger causality, Jabbi and Keysers (2008) showed that activity in the IFO (a structure functionally related to the insula) is causally triggered by activity in the IFG. In other words, motor simulation in the IFG seems to trigger, in the IFO, an affective simulation of what the other person is feeling. Our results regarding the correlated activity of the IFG and muscle responses, as well as the separate correlations between those muscle responses and the insula, seem to be in line with this interpretation.

It should be noted that in our study, as in others (Lee et al., 2006; van der Gaag et al., 2007), we did not observe activity in the motor or somatosensory cortex during passive viewing of emotional expressions. There is a theoretical assumption that FM processes activate motor as well as somatosensory neuronal structures involved in processing facial expressions (Korb et al., 2015; Paracampo et al., 2016; Wood et al., 2016). However, based on neuroimaging data, it seems that the magnitude of facial muscular change during emotional expression resonates with activity related to emotion processing, i.e., in the insula or amygdala (Lee et al., 2006; van der Gaag et al., 2007), rather than with the motor and somatosensory cortex. Moreover, it has been shown that explicit imitation, rather than passive observation, of facial expressions engages the somatosensory and premotor cortices more strongly; accordingly, activity in the IFG was more pronounced during imitation than during passive viewing of emotional expressions (Carr et al., 2003).

In conclusion, our study supports the general agreement among researchers that dynamic facial expressions are a valuable source of information in social communication. This was evident in the stronger FM and greater neural network activations during dynamic compared to static facial expressions of happiness and anger. Moreover, the direct relationships between FM responses and brain activity revealed that the associated structures belong to both motor and emotional components of the FM phenomenon. The activity of the IFG and pre-SMA (classical MNS) appears to reflect action representation (i.e., the motor aspects of observed facial expressions), while the insula and amygdalae (extended MNS) process the emotional content of facial expressions. Furthermore, our results agree with the proposal that FM is not a pure motor copy of behavior but rather engages neural networks involved in emotion processing. Based on current knowledge, FM seems to include both motor imitation and emotional contagion processes; however, their mutual relations have not yet been conclusively established. For example, motor imitation could lead to emotional contagion, or vice versa, and other factors could play an important role in social interactions.

Author Contributions

Conceived and designed the experiments: KR and ŁZ. Performed the experiments: KR and ŁZ. Analyzed the data: KR and ŁZ. Contributed materials: KR and ŁZ. Wrote the paper: KR, ŁZ, KJ-S, and IS.

Funding

This study was supported by grant no. 2011/03/B/HS6/05161 from the Polish National Science Centre provided to KR.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fpsyg.2018.00052/full#supplementary-material

References

Allen, P. J., Josephs, O., and Turner, R. (2000). A method for removing imaging artifact from continuous EEG recorded during functional MRI. Neuroimage 12, 230–239. doi: 10.1006/nimg.2000.0599

Ambadar, Z., Schooler, J. W., and Cohn, J. F. (2005). Deciphering the enigmatic face: the importance of facial dynamics in interpreting subtle facial expressions. Psychol. Sci. 16, 403–410. doi: 10.1111/j.0956-7976.2005.01548.x

Andréasson, P., and Dimberg, U. (2008). Emotional empathy and facial feedback. J. Nonverbal Behav. 32, 215–224. doi: 10.1007/s10919-008-0052-z

Arsalidou, M., Morris, D., and Taylor, M. J. (2011). Converging evidence for the advantage of dynamic facial expressions. Brain Topogr. 24, 149–163. doi: 10.1007/s10548-011-0171-4

Bastiaansen, J. A. C. J., Thioux, M., and Keysers, C. (2009). Evidence for mirror systems in emotions. Philos. Trans. R. Soc. B Biol. Sci. 364, 2391–2404. doi: 10.1098/rstb.2009.0058

Beyer, F., Krämer, U. M., and Beckmann, C. F. (2017). Anger-sensitive networks: characterising neural systems recruited during aggressive social interactions using data-driven analysis. Soc. Cogn. Affect. Neurosci. 26, 2480–2492. doi: 10.1093/scan/nsx117

Brett, M., Anton, J. L., Valabregue, R., and Poline, J. B. (2002). “Region of interest analysis using an SPM toolbox,” in Neuroimage (Sendai: 8th International Conference on Functional Mapping of the Human Brain). Available online at: http://matthew.dynevor.org/research/abstracts/marsbar/marsbar_abstract.pdf

Cacioppo, J. T., Petty, R. E., Losch, M. E., and Kim, H. S. (1986). Electromyographic activity over facial muscle regions can differentiate the valence and intensity of affective reactions. J. Pers. Soc. Psychol. 50, 260–268. doi: 10.1037/0022-3514.50.2.260

Cannon, P. R., Schnall, S., and White, M. (2011). Transgressions and expressions: affective facial muscle activity predicts moral judgments. Soc. Psychol. Personal. Sci. 2, 325–331. doi: 10.1177/1948550610390525

Carr, L., Iacoboni, M., Dubeau, M. C., Mazziotta, J. C., and Lenzi, G. L. (2003). Neural mechanisms of empathy in humans: a relay from neural systems for imitation to limbic areas. Proc. Natl. Acad. Sci. U.S.A. 100, 5497–5502. doi: 10.1073/pnas.0935845100

Chartrand, T. L., and Bargh, J. A. (1999). The chameleon effect: the perception-behavior link and social interaction. J. Pers. Soc. Psychol. 76, 893–910. doi: 10.1037/0022-3514.76.6.893

Clark, U. S., Neargarder, S., and Cronin-Golomb, A. (2008). Specific impairments in the recognition of emotional facial expressions in Parkinson's disease. Neuropsychologia 46, 2300–2309. doi: 10.1016/j.neuropsychologia.2008.03.014

Clithero, J. A., Reeck, C., Carter, R. M., Smith, D. V., and Huettel, S. A. (2011). Nucleus accumbens mediates relative motivation for rewards in the absence of choice. Front. Hum. Neurosci. 5:87. doi: 10.3389/fnhum.2011.00087

Dimberg, U., and Petterson, M. (2000). Facial reactions to happy and angry facial expressions: evidence for right hemisphere dominance. Psychophysiology 37, 693–696. doi: 10.1111/1469-8986.3750693

Dimberg, U., and Thunberg, M. (1998). Rapid facial reactions to emotional facial expressions. Scand. J. Psychol. 39, 39–45. doi: 10.1111/1467-9450.00054

Dimberg, U., Andréasson, P., and Thunberg, M. (2011). Emotional empathy and facial reactions to facial expressions. J. Psychophysiol. 25, 26–31. doi: 10.1027/0269-8803/a000029

Dimberg, U., Thunberg, M., and Elmehed, K. (2000). Unconscious facial reactions to emotional facial expressions. Psychol. Sci. J. Am. Psychol. Soc. Austr. Psychol. Soc. 11, 86–89. doi: 10.1111/1467-9280.00221

Dimberg, U., Thunberg, M., and Grunedal, S. (2002). Facial reactions to emotional stimuli: automatically controlled emotional responses. Cogn. Emot. 16, 449–471. doi: 10.1080/02699930143000356

Dujardin, K., Blairy, S., Defebvre, L., Duhem, S., Noël, Y., Hess, U., et al. (2004). Deficits in decoding emotional facial expressions in Parkinson's disease. Neuropsychologia 42, 239–250. doi: 10.1016/S0028-3932(03)00154-4

Ekman, P., and Rosenberg, E. L. (2012). What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System (FACS). doi: 10.1093/acprof:oso/9780195179644.001.0001

Ekman, P., Davidson, R. J., and Friesen, W. V. (1990). The Duchenne smile: emotional expression and brain physiology. II. J. Pers. Soc. Psychol. 58, 342–353. doi: 10.1037/0022-3514.58.2.342

Ernst, M., Nelson, E. E., McClure, E. B., Monk, C. S., Munson, S., Eshel, N., et al. (2004). Choice selection and reward anticipation: an fMRI study. Neuropsychologia 42, 1585–1597. doi: 10.1016/j.neuropsychologia.2004.05.011

Foley, E., Rippon, G., Thai, N. J., Longe, O., and Senior, C. (2012). Dynamic facial expressions evoke distinct activation in the face perception network: a connectivity analysis study. J. Cogn. Neurosci. 24, 507–520. doi: 10.1162/jocn_a_00120

Fridlund, A. J., and Cacioppo, J. T. (1986). Guidelines for human electromyographic research. Psychophysiology 23, 567–589. doi: 10.1111/j.1469-8986.1986.tb00676.x

Furl, N., Henson, R. N., Friston, K. J., and Calder, A. J. (2015). Network interactions explain sensitivity to dynamic faces in the superior temporal sulcus. Cereb. Cortex 25, 2876–2882. doi: 10.1093/cercor/bhu083

Gallagher, H. L., Happé, F., Brunswick, N., Fletcher, P. C., Frith, U., and Frith, C. D. (2000). Reading the mind in cartoons and stories: an fMRI study of “theory of mind” in verbal and nonverbal tasks. Neuropsychologia 38, 11–21. doi: 10.1016/S0028-3932(99)00053-6

Gazzola, V., Aziz-Zadeh, L., and Keysers, C. (2006). Empathy and the somatotopic auditory mirror system in humans. Curr. Biol. 16, 1824–1829. doi: 10.1016/j.cub.2006.07.072

Hatfield, E., Cacioppo, J. T., and Rapson, R. L. (1993). Emotional contagion. Curr. Dir. Psychol. Sci. 2, 96–99. doi: 10.1111/1467-8721.ep10770953

Heerey, E. A., and Crossley, H. M. (2013). Predictive and reactive mechanisms in smile reciprocity. Psychol. Sci. 24, 1446–1455. doi: 10.1177/0956797612472203

Hennenlotter, A., Schroeder, U., Erhard, P., Castrop, F., Haslinger, B., Stoecker, D., et al. (2005). A common neural basis for receptive and expressive communication of pleasant facial affect. Neuroimage 26, 581–591. doi: 10.1016/j.neuroimage.2005.01.057

Hess, U., and Blairy, S. (2001). Facial mimicry and emotional contagion to dynamic emotional facial expressions and their influence on decoding accuracy. Int. J. Psychophysiol. 40, 129–141. doi: 10.1016/S0167-8760(00)00161-6

Hess, U., and Bourgeois, P. (2010). You smile–I smile: emotion expression in social interaction. Biol. Psychol. 84, 514–520. doi: 10.1016/j.biopsycho.2009.11.001

Hess, U., Philippot, P., and Blairy, S. (1998). Facial reactions to emotional facial expressions: affect or cognition? Cogn. Emot. 12, 509–531. doi: 10.1080/026999398379547

Iacoboni, M., and Dapretto, M. (2006). The mirror neuron system and the consequences of its dysfunction. Nat. Rev. Neurosci. 7, 942–951. doi: 10.1038/nrn2024

Iwase, M., Ouchi, Y., Okada, H., Yokoyama, C., Nobezawa, S., Yoshikawa, E., et al. (2002). Neural substrates of human facial expression of pleasant emotion induced by comic films: a PET study. Neuroimage 17, 758–768. doi: 10.1006/nimg.2002.1225

Jabbi, M., and Keysers, C. (2008). Inferior frontal gyrus activity triggers anterior insula response to emotional facial expressions. Emotion 8, 775–780. doi: 10.1037/a0014194

Jabbi, M., Swart, M., and Keysers, C. (2007). Empathy for positive and negative emotions in the gustatory cortex. Neuroimage 34, 1744–1753. doi: 10.1016/j.neuroimage.2006.10.032

Kessler, H., Doyen-Waldecker, C., Hofer, C., Hoffmann, H., Traue, H. C., and Abler, B. (2011). Neural correlates of the perception of dynamic versus static facial expressions of emotion. Psychosoc. Med. 8, 8. doi: 10.3205/psm000072

Kilts, C. D., Egan, G., Gideon, D. A., Ely, T. D., and Hoffman, J. M. (2003). Dissociable neural pathways are involved in the recognition of emotion in static and dynamic facial expressions. Neuroimage 18, 156–168. doi: 10.1006/nimg.2002.1323

Kircher, T., Pohl, A., Krach, S., Thimm, M., Schulte-Rüther, M., Anders, S., et al. (2013). Affect-specific activation of shared networks for perception and execution of facial expressions. Soc. Cogn. Affect. Neurosci. 8, 370–377. doi: 10.1093/scan/nss008

Kohn, N., Eickhoff, S. B., Scheller, M., Laird, A. R., Fox, P. T., and Habel, U. (2014). Neural network of cognitive emotion regulation — an ALE meta-analysis and MACM analysis. Neuroimage 87, 345–355. doi: 10.1016/j.neuroimage.2013.11.001

Korb, S., Malsert, J., Rochas, V., Rihs, T. A., Rieger, S. W., Schwab, S., et al. (2015). Gender differences in the neural network of facial mimicry of smiles – an rTMS study. Cortex 70, 101–114. doi: 10.1016/j.cortex.2015.06.025

Korb, S., With, S., Niedenthal, P., Kaiser, S., and Grandjean, D. (2014). The perception and mimicry of facial movements predict judgments of smile authenticity. PLoS ONE 9:e99194. doi: 10.1371/journal.pone.0099194

Kringelbach, M. L., and Berridge, K. C. (2010). The functional neuroanatomy of pleasure and happiness. Discov. Med. 9, 579–587.

Larsen, J. T., Norris, C. J., and Cacioppo, J. T. (2003). Effects of positive and negative affect on electromyographic activity over zygomaticus major and corrugator supercilii. Psychophysiology 40, 776–785. doi: 10.1111/1469-8986.00078

Lee, T. W., Josephs, O., Dolan, R. J., and Critchley, H. D. (2006). Imitating expressions: emotion-specific neural substrates in facial mimicry. Soc. Cogn. Affect. Neurosci. 1, 122–135. doi: 10.1093/scan/nsl012

Lehéricy, S., Ducros, M., Krainik, A., Francois, C., Van de Moortele, P. F., Ugurbil, K., et al. (2004). 3-D diffusion tensor axonal tracking shows distinct SMA and pre-SMA projections to the human striatum. Cereb. Cortex 14, 1302–1309. doi: 10.1093/cercor/bhh091

Leslie, A. M., Friedman, O., and German, T. P. (2004). Core mechanisms in “theory of mind.” Trends Cogn. Sci. 8, 528–533. doi: 10.1016/j.tics.2004.10.001

Likowski, K. U., Mühlberger, A., Gerdes, A. B., Wieser, M. J., Pauli, P., and Weyers, P. (2012). Facial mimicry and the mirror neuron system: simultaneous acquisition of facial electromyography and functional magnetic resonance imaging. Front. Hum. Neurosci. 6:214. doi: 10.3389/fnhum.2012.00214

Likowski, K. U., Mühlberger, A., Seibt, B., Pauli, P., and Weyers, P. (2008). Modulation of facial mimicry by attitudes. J. Exp. Soc. Psychol. 44, 1065–1072. doi: 10.1016/j.jesp.2007.10.007

Likowski, K. U., Mühlberger, A., Seibt, B., Pauli, P., and Weyers, P. (2011). Processes underlying congruent and incongruent facial reactions to emotional facial expressions. Emotion 11, 457–467. doi: 10.1037/a0023162

Livingstone, S. R., Vezer, E., McGarry, L. M., Lang, A. E., and Russo, F. A. (2016). Deficits in the mimicry of facial expressions in Parkinson's disease. Front. Psychol. 7:780. doi: 10.3389/fpsyg.2016.00780

Monk, C. S., Klein, R. G., Telzer, E. H., Schroth, E. A., Mannuzza, S., Moulton, J. L., et al. (2008). Amygdala and nucleus accumbens activation to emotional facial expressions in children and adolescents at risk for major depression. Am. J. Psychiatry 165, 90–98. doi: 10.1176/appi.ajp.2007.06111917

Mühlberger, A., Wieser, M. J., Gerdes, A. B., Frey, M. C. M., Weyers, P., and Pauli, P. (2011). Stop looking angry and smile, please: start and stop of the very same facial expression differentially activate threat- and reward-related brain networks. Soc. Cogn. Affect. Neurosci. 6, 321–329. doi: 10.1093/scan/nsq039

Murata, A., Saito, H., Schug, J., Ogawa, K., and Kameda, T. (2016). Spontaneous facial mimicry is enhanced by the goal of inferring emotional states: evidence for moderation of “automatic” mimicry by higher cognitive processes. PLoS ONE 11:e0153128. doi: 10.1371/journal.pone.0153128

Neumann, R., and Strack, F. (2000). Approach and avoidance: the influence of proprioceptive and exteroceptive cues on encoding of affective information. J. Pers. Soc. Psychol. 79, 39–48. doi: 10.1037/0022-3514.79.1.39

O'Doherty, J., Winston, J., Critchley, H., Perrett, D., Burt, D., and Dolan, R. (2003). Beauty in a smile: the role of medial orbitofrontal cortex in facial attractiveness. Neuropsychologia 41, 147–155. doi: 10.1016/S0028-3932(02)00145-8

Paracampo, R., Tidoni, E., Borgomaneri, S., di Pellegrino, G., and Avenanti, A. (2016). Sensorimotor network crucial for inferring amusement from smiles. Cereb. Cortex 27, 5116–5129. doi: 10.1093/cercor/bhw294

Pelphrey, K. A., Singerman, J. D., Allison, T., and McCarthy, G. (2003). Brain activation evoked by perception of gaze shifts: the influence of context. Neuropsychologia 41, 156–170. doi: 10.1016/S0028-3932(02)00146-X

Rizzolatti, G., and Craighero, L. (2004). The mirror-neuron system. Annu. Rev. Neurosci. 27, 169–192. doi: 10.1146/annurev.neuro.27.070203.144230

Robins, D. L., Hunyadi, E., and Schultz, R. T. (2009). Superior temporal activation in response to dynamic audio-visual emotional cues. Brain Cogn. 69, 269–278. doi: 10.1016/j.bandc.2008.08.007

Russell, J. A., and Fernández-Dols, J. M. (eds.). (1997). The Psychology of Facial Expression. Cambridge: Cambridge University Press.

Rymarczyk, K., Biele, C., Grabowska, A., and Majczynski, H. (2011). EMG activity in response to static and dynamic facial expressions. Int. J. Psychophysiol. 79, 330–333. doi: 10.1016/j.ijpsycho.2010.11.001

Rymarczyk, K., Żurawski, Ł., Jankowiak-Siuda, K., and Szatkowska, I. (2016). Do dynamic compared to static facial expressions of happiness and anger reveal enhanced facial mimicry? PLoS ONE 11:e0158534. doi: 10.1371/journal.pone.0158534

Sabatinelli, D., Bradley, M. M., Lang, P. J., Costa, V. D., and Versace, F. (2007). Pleasure rather than salience activates human nucleus accumbens and medial prefrontal cortex. J. Neurophysiol. 98, 1374–1379. doi: 10.1152/jn.00230.2007

Sato, W., and Yoshikawa, S. (2007). Enhanced experience of emotional arousal in response to dynamic facial expressions. J. Nonverbal Behav. 31, 119–135. doi: 10.1007/s10919-007-0025-7

Sato, W., Fujimura, T., and Suzuki, N. (2008). Enhanced facial EMG activity in response to dynamic facial expressions. Int. J. Psychophysiol. 70, 70–74. doi: 10.1016/j.ijpsycho.2008.06.001

Sato, W., Kochiyama, T., and Uono, S. (2015). Spatiotemporal neural network dynamics for the processing of dynamic facial expressions. Sci. Rep. 5:12432. doi: 10.1038/srep12432

Sato, W., Kochiyama, T., Yoshikawa, S., Naito, E., and Matsumura, M. (2004). Enhanced neural activity in response to dynamic facial expressions of emotion: an fMRI study. Cogn. Brain Res. 20, 81–91. doi: 10.1016/j.cogbrainres.2004.01.008

Seibt, B., Mühlberger, A., Likowski, K. U., and Weyers, P. (2015). Facial mimicry in its social setting. Front. Psychol. 6:1122. doi: 10.3389/fpsyg.2015.01122

Sims, T. B., Van Reekum, C. M., Johnstone, T., and Chakrabarti, B. (2012). How reward modulates mimicry: EMG evidence of greater facial mimicry of more rewarding happy faces. Psychophysiology 49, 998–1004. doi: 10.1111/j.1469-8986.2012.01377.x

Soghomonian, J.-J. (ed.). (2016). The Basal Ganglia. Cham: Springer International Publishing. doi: 10.1007/978-3-319-42743-0

Sprengelmeyer, R., Young, A. W., Mahn, K., Schroeder, U., Woitalla, D., Büttner, T., et al. (2003). Facial expression recognition in people with medicated and unmedicated Parkinson's disease. Neuropsychologia 41, 1047–1057. doi: 10.1016/S0028-3932(02)00295-6

The MathWorks Inc. (2013). MATLAB 2013b. Natick, MA.

Trautmann, S. A., Fehr, T., and Herrmann, M. (2009). Emotions in motion: dynamic compared to static facial expressions of disgust and happiness reveal more widespread emotion-specific activations. Brain Res. 1284, 100–115. doi: 10.1016/j.brainres.2009.05.075

van der Gaag, C., Minderaa, R. B., and Keysers, C. (2007). Facial expressions: what the mirror neuron system can and cannot tell us. Soc. Neurosci. 2, 179–222. doi: 10.1080/17470910701376878

van der Schalk, J., Hawk, S. T., Fischer, A. H., and Doosje, B. (2011). Moving faces, looking places: validation of the Amsterdam dynamic facial expression set (ADFES). Emotion 11, 907–920. doi: 10.1037/a0023853

van der Zwaag, W., Da Costa, S. E., Zürcher, N. R., Adams, R. B. Jr., and Hadjikhani, N. (2012). A 7 tesla fMRI study of amygdala responses to fearful faces. Brain Topogr. 25, 125–128. doi: 10.1007/s10548-012-0219-0

Van Overwalle, F. (2009). Social cognition and the brain: a meta-analysis. Hum. Brain Mapp. 30, 829–858. doi: 10.1002/hbm.20547

Vrticka, P., Simioni, S., Fornari, E., Schluep, M., Vuilleumier, P., and Sander, D. (2013). Neural substrates of social emotion regulation: a fMRI study on imitation and expressive suppression to dynamic facial signals. Front. Psychol. 4:95. doi: 10.3389/fpsyg.2013.00095

Wake Forest University (2014). WFU PickAtlas 3.0.3.

Weyers, P., Mühlberger, A., Hefele, C., and Pauli, P. (2006). Electromyographic responses to static and dynamic avatar emotional facial expressions. Psychophysiology 43, 450–453. doi: 10.1111/j.1469-8986.2006.00451.x

Wicker, B., Keysers, C., Plailly, J., Royet, J. P., Gallese, V., and Rizzolatti, G. (2003). Both of us disgusted in My insula: the common neural basis of seeing and feeling disgust. Neuron 40, 655–664. doi: 10.1016/S0896-6273(03)00679-2

Witvliet, C. V., and Vrana, S. R. (1995). Psychophysiological responses as indices of affective dimensions. Psychophysiology 32, 436–443. doi: 10.1111/j.1469-8986.1995.tb02094.x

Wood, A., Rychlowska, M., Korb, S., and Niedenthal, P. (2016). Fashioning the face: sensorimotor simulation contributes to facial expression recognition. Trends Cogn. Sci. 20, 227–240. doi: 10.1016/j.tics.2015.12.010

Keywords: facial mimicry, EMG, fMRI, mirror neuron system, emotional expressions, dynamic, happiness, anger

Citation: Rymarczyk K, Żurawski Ł, Jankowiak-Siuda K and Szatkowska I (2018) Neural Correlates of Facial Mimicry: Simultaneous Measurements of EMG and BOLD Responses during Perception of Dynamic Compared to Static Facial Expressions. Front. Psychol. 9:52. doi: 10.3389/fpsyg.2018.00052

Received: 20 July 2017; Accepted: 12 January 2018;
Published: 06 February 2018.

Edited by: Alessio Avenanti, Università di Bologna, Italy

Reviewed by: Sebastian Korb, University of Vienna, Austria; Frank A. Russo, Ryerson University, Canada

Copyright © 2018 Rymarczyk, Żurawski, Jankowiak-Siuda and Szatkowska. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Krystyna Rymarczyk, krymarczyk@swps.edu.pl
Łukasz Żurawski, l.zurawski@nencki.gov.pl
