Original Research ARTICLE
Oscillatory brain correlates of live joint attention: a dual-EEG study
- 1 Université Pierre et Marie Curie-Paris 6, Centre de Recherche de l’Institut du Cerveau et de la Moelle épinière, UMRS975, Paris, France
- 2 CNRS, UMR 7225, CRICM, Paris, France
- 3 Inserm, U975, CRICM, Paris, France
- 4 Inserm, U960, Laboratoire de Neurosciences Cognitives, DEC-ENS, Paris, France
- 5 Université Paris 8, LNP EA2027, Saint-Denis, France
Joint attention consists in following another’s gaze onto an environmental object, which leads to the alignment of both subjects’ attention onto this object. It is a fundamental mechanism of non-verbal communication, and it is essential for dynamic, online, interindividual synchronization during interactions. Here we aimed at investigating the oscillatory brain correlates of joint attention in a face-to-face paradigm where dyads of participants dynamically oriented their attention toward the same or different objects during joint and no-joint attention periods respectively. We also manipulated task instruction: in socially driven instructions, the participants had to explicitly follow their partner’s gaze, while in color-driven instructions, the objects to be looked at were designated by their color so that no explicit gaze following was required. We focused on oscillatory activities in the 10 Hz frequency range, where the parieto-occipital alpha and centro-parietal mu rhythms have been described, as these rhythms have been associated with attention and social coordination processes respectively. We tested the hypothesis of a modulation of these oscillatory activities by joint attention. We used dual-EEG to record simultaneously the brain activities of the participant dyads during our live, face-to-face joint attention paradigm. We showed that joint attention periods – as compared to the no-joint attention periods – were associated with a decrease of signal power between 11 and 13 Hz over a large set of left centro-parieto-occipital electrodes, encompassing the scalp regions where alpha and mu rhythms have been described. This 11–13 Hz signal power decrease was observed independently of the task instruction: it was similar when joint versus no-joint attention situations were socially driven and when they were color-driven.
These results are interpreted in terms of the processes of attention mirroring, social coordination, and mutual attentiveness associated with the state of joint attention.
We live in a social world. A lot of our time and cognitive resources are devoted to the processing of information conveyed by others. Synchronizing our actions with those of others and appropriately responding to social signals are essential to adaptive behavior. Among social signals, a particularly important cue for interindividual synchronization is gaze (Argyle et al., 1973; Patterson, 1982). Eye contact and gaze following are pervasive components of social exchanges. Gaze regulates interpersonal interactions and turns of conversation. In humans, eye gaze has evolved as an essential cue to social attention, which is used to detect others’ focus of interest in the environment and infer others’ intentions.
A basic but omnipresent element of social synchronization is constituted by the process of joint attention (Emery, 2000): Seeing someone directing his/her attention to an environmental object induces a shift of attention in the observer, resulting in the alignment of both subjects’ attention onto the same object. This attentional shift is automatic insofar as it cannot be suppressed by instructing the observer to ignore the seen eye gaze or by informing the observer that the eye gaze most likely cues an irrelevant spatial location (for review, see Frischen et al., 2007). The shift of attention induced by others’ gaze can occur overtly – it is then accompanied by an eye movement of the observer toward the object – or covertly – no eye movement occurs and the observer’s attention is covertly aligned onto the object gazed at by the other person (for a recent review, see Shepherd, 2010).
With the exception of a few exemplary recent studies (Redcay et al., 2010; Saito et al., 2010; Pönkänen et al., 2011), joint attention has typically been studied in tasks manipulating computerized face and gaze stimuli. Yet, joint attention is a fulcrum of everyday social interactions; it plays a pivotal role in our ability to understand others, to infer their intentions, desires, thoughts, and beliefs (Baron-Cohen, 1995). It is a social act that involves a triadic and dynamic relation between two agents and an external object (Emery, 2000; Grossmann et al., 2007), implying mutual attentiveness as well as coordination – two core elements of rapport (Tickle-Degnen and Rosenthal, 1990; Tickle-Degnen, 2006). Joint attention also implies a dynamic perception-action coupling between interacting agents, and it has been proposed to involve attention mirroring (Shepherd et al., 2009; Gangopadhyay and Schilbach, 2012); it may thus also be considered as pertaining to mimicry behaviors, which are thought to involve the mirror neuron system and are known to play a role in affiliation (Lakin and Chartrand, 2003; Lakin et al., 2003; Iacoboni, 2009). Thus, altogether, joint attention may be best captured in online interactions between two persons, during dynamic sequences of eye movements toward external objects and gaze following.
Here, we aimed at investigating the oscillatory brain correlates of joint attention in such a real-time, live situation of face-to-face interaction between two participants. We used a setup developed and validated in our laboratory that allows seating two persons face-to-face in a joint attention paradigm, getting closer to a real-life situation while retaining the advantages of a laboratory-based experiment (Lachat et al., 2012). We recorded the brain activity of dyads of participants using dual electroencephalography (dual-EEG; Dumas et al., 2011). This technique enables the simultaneous recording of the brain activities of two persons interacting with each other. Recent studies have used such dual recording with EEG as well as functional Magnetic Resonance Imaging (fMRI), paving the way to the emerging field of hyperscanning studies for the investigation of embodied, live social interactions (for a review, see Dumas et al., 2011).
Oscillatory activities centered around 10 Hz constitute prominent rhythms in the EEG power spectrum that have been observed since the very first EEG recordings (e.g., Berger, 1929). These oscillatory activities, measured typically between 8 and 13 Hz, are often designated as “idling rhythms” as they correspond to a resting state of the brain, and sensory stimuli typically engender a suppression of these oscillations in the corresponding sensory area of the brain (Pfurtscheller et al., 1996; Palva and Palva, 2007). Yet, these rhythms have been associated with multiple cognitive processes over the years, and particularly with attentional processes as well as with the mechanisms of social interaction (e.g., Başar et al., 1997; Ward, 2003; Klimesch et al., 2007; Palva and Palva, 2007; Foxe and Snyder, 2011; Perry et al., 2011); they were therefore of particular interest for the present study.
Oscillations within the 8–13 Hz frequency band were first described over parieto-occipital regions. These parieto-occipital activities are known as the alpha rhythm. This rhythm is primarily modulated by visual inputs: it is attenuated by visual stimulation as well as when the eyes are open compared to when the eyes are closed (Adrian and Matthews, 1934); it is also modulated by the position of the eyes, with the elevation of the eyes increasing the amplitude of alpha oscillations (Mulholland and Evans, 1965, 1966, but see Chapman et al., 1970). Moreover, the parieto-occipital alpha rhythm is held to reflect arousal and attention mechanisms (Ward, 2003; for a review, see Foxe and Snyder, 2011). Alpha oscillatory activities are reduced under conditions of high arousal and/or of increased attentiveness as well as enhanced for stimuli that have to be ignored. The parieto-occipital alpha rhythm has been proposed to reflect an attentional distractor suppression mechanism: alpha activity would be invoked in cortical regions processing irrelevant or distracting information during attention-related tasks, acting as a suppression mechanism for stimuli or stimulus features that are to be ignored (Foxe and Snyder, 2011). In addition, it has been shown that alpha oscillations may be modulated in relation with gaze perception: in a face-to-face paradigm, Gale et al. (1972) demonstrated that the alpha oscillations recorded in an observer were reduced under conditions of gaze contact with an interlocutor, as compared to a condition of closed eyes or averted gaze of the interlocutor. This result was interpreted in terms of the arousing value of mutual gaze.
Moreover, Gastaut (1952) and Gastaut and Bert (1954) described for the first time the Rolandic mu rhythm, which occurs in the same frequency band as the alpha rhythm and typically culminates over centro-parietal regions. The mu rhythm shows a peak in the 8–13 Hz frequency band but it also has a beta band component (15–25 Hz; for reviews, see Hari et al., 1997; Pineda, 2005). The mu rhythm was first associated with the execution of a motor activity (Pfurtscheller and Berghold, 1989). When a movement is performed, mu oscillations are reduced as compared to a situation of no movement (Cochin et al., 1998; Babiloni et al., 1999). Furthermore, Hari et al. (1998) demonstrated that the oscillatory activities in the 15–25 Hz band over the rolandic region are modulated by an action performed by the subject or by the observation of the same action performed by somebody else. Mu rhythm suppression in the ∼10 Hz band has also been revealed when participants observe or imagine a motor action (Pineda et al., 2000; Perry and Bentin, 2009). These results suggest that decreased mu signal power may reflect the activity of the human mirror system (Muthukumaraswamy and Johnson, 2004; for a review, see Pineda, 2005). Recent studies further proposed a more specific role of the mu rhythm as an electrophysiological signature of social skills: signal power modulations in the mu frequency band have been linked to the perception of socially relevant stimuli and the processing of social interactive situations (Oberman et al., 2007; Perry et al., 2011), to empathy and the representation of others’ pain (Cheng et al., 2008; Perry et al., 2010), to the social perceptive component of theory of mind (Pineda and Hecht, 2009), to the processing of social context, and to the interindividual coordination of action (Naeem et al., 2012). Tognoli et al. (2007) proposed that a particular oscillatory component within the alpha and mu frequency band, the so-called phi complex (9–11 Hz), recorded over the lateral centro-parietal regions of the scalp, would be specific to the social coordination of movements. More recently, inter-brain phase synchronizations have been observed in the same frequency range between pairs of subjects engaged in spontaneous reciprocal imitation (Dumas et al., 2010).
It is worth noticing that, except for the studies of Tognoli et al. (2007), Dumas et al. (2010), and Naeem et al. (2012), the mu rhythm has only been investigated with participants facing computerized stimuli rather than during live interactions with a human partner. Yet, recent studies have emphasized the importance of using natural settings to investigate human social cognition (Kingstone et al., 2003; Zaki and Ochsner, 2009; Schilbach, 2010; Wilms et al., 2010). Moreover, to our knowledge, the oscillatory correlates of joint attention remain unexplored. Since oscillatory activities in the alpha and mu frequency band have been associated with attention processes as well as with social interaction and coordination processes, they should be good candidates as electrophysiological correlates of joint attention.
The aim of our study was to investigate the influence of joint attention on oscillatory activities within the 8–13 Hz frequency band. As detailed above, joint attention is a fulcrum of social interaction and interindividual synchronization. It is a social act that takes place in the online interaction between two social agents. Jointly attending to the same object with a physically present partner requires interpersonal coordination, mutual attentiveness as well as attentional mirroring mechanisms. These multiple processes associated with joint attention predict a widespread decrease of alpha and mu signal power over centro-parietal and parieto-occipital scalp regions, in comparison with situations matched for their motor component but involving no joint attention. To test this hypothesis, we set up a live joint attention paradigm, where dyads of participants seated face-to-face had to direct attention to the same or opposite objects (colored light-emitting diodes, LEDs) during different blocks of trials. In addition, we manipulated task instruction: The participants were either explicitly told to follow each other’s gaze or instructed to look at a given color LED so that joint attention was then color-driven rather than socially driven. This aimed at examining whether alpha or mu activities may be more strongly influenced by joint attention processes when the alignment of attention of the participants resulted from explicit gaze following. On the one hand, it could be predicted that alpha and mu modulations by joint attention should be enhanced under the gaze following instruction compared to the color-driven instruction, because more social coordination and enhanced mutual attentiveness may be elicited in the former case. On the other hand, the presence of a partner may be very difficult to ignore during a face-to-face, live paradigm, and this may dampen the observation of differences between the socially driven and the color-driven joint attention processes.
Oscillatory activities were analyzed during the time periods where both participants focused on the same or different LEDs (i.e., after having moved their eyes toward the LEDs). Although the time period of the subjects’ eye movement following the lighting of the LEDs was of potentially great interest, reflecting overt and dynamic attention orienting processes, it was very transient and heavily contaminated by task-related ocular activities. By contrast, we were interested in the sustained states of joint attention associated with the periods of gaze focus on the LEDs; these should induce sustained feelings of mutual attentiveness and shared attention that should be strong enough to be observed over the whole time period during which both subjects gazed at the same LED.
Materials and Methods
Thirty-two healthy volunteers took part in the experiment (16 female, mean age = 23.5 ± 3.5 years). They provided informed written consent and were paid for their participation. All procedures were approved by the local ethics committee (CPP No. 07024). All participants were right-handed and had normal or corrected-to-normal vision. None of the participants had a history of neurological or psychiatric illness. All participants were in the normal range of the Autism-Spectrum Quotient (AQ; Baron-Cohen et al., 2001; mean score = 16.5 ± 1) as well as of the Rosenberg Self-Esteem Scale (RSES; Rosenberg, 1965; mean score = 21 ± 0.6). These questionnaires were given to the participants because previous studies have suggested an influence of self-esteem and autistic traits on the sensitivity to gaze cues in attention orienting paradigms (Bayliss et al., 2005; Wilkowski et al., 2009). Furthermore, 1–8 months before the dual-EEG experiment, every participant took part in a behavioral experiment consisting of a face-to-face paradigm of attention orienting induced by gaze (Lachat et al., 2012). This allowed us to assess the gaze cueing effect in each participant (mean gaze cueing effect, expressed as the reaction time difference between the detection of targets cued by gaze versus the detection of targets not cued by gaze = 17 ± 4 ms). We then distributed the participants into 16 same-sex dyads, matched on age, AQ, RSES score, and gaze cueing effect (in this order of priority), for the dual-EEG study. In each dyad, the participants had never met, except for 4 dyads in which the subjects had occasionally come across each other in the past, yet not within the last 6 months. We excluded three subjects from the analyses due to excessive eye blinks or muscle artifacts in the analyzed time intervals. Thus, we here report the data of 29 subjects (15 female, mean age = 24 ± 1 years).
The experiment was conducted in a dimly lit, electrically shielded room. Our experimental device was placed on a table in the middle of the room. It consisted of two identical black wooden rectangular boards (100 cm × 70 cm) bound together. The device was pierced in its center by a circular hole (30 cm diameter). Four LEDs (5 mm diameter) were fixed on the edge of the hole, symmetrically to the right and left borders, the first two at the level of its horizontal diameter and the other two 45° below (Figure 1A). These LEDs were composed of two filaments, one lighting in green and the other one lighting in red. An orange color was obtained by lighting the two filaments simultaneously. The LEDs could be switched on in red, green, or orange via the parallel port of a computer. Their luminance was calibrated by using a variable serial resistance (mean = 111 ± 0.4 cd/m² for green and red, and 236 ± 4 cd/m² for orange).
Figure 1. Dual-EEG setup and experimental conditions. (A) Photograph of the device, with two LEDs turned on, and two participants facing each other. (B) The four experimental conditions are illustrated on a view of the device. The arrows represent the gaze direction of subject A (in pink) and subject B (in blue).
Experimental Design and Task Procedure
For each dyad, the participants were introduced to each other when they arrived at the laboratory for the EEG recording session. The installation of the EEG caps took place in the experimental room for the two participants at the same time – it was performed by two experimenters. The participants interacted with each other during this period. Then, at the beginning of the experiment, the two participants sat face-to-face on each side of the device so that they could see each other through the device hole as well as see the four LEDs on the hole border. Both subjects sat at 40 cm from the center of the device with their eyes at the level of the “upper” LEDs, resulting in a visual angle of 21° for every LED.
The experiment consisted in a two-by-two factorial design where we manipulated two conditions of joint attention (Joint attention/No-joint attention) as well as the task instructions that led to these joint-/no-joint attention situations (Social instruction/Color instruction; Figure 1B). The resulting four experimental conditions were run in separate blocks. In the joint attention blocks, both subjects had to look at the same LED across a series of trials; in the no-joint attention blocks, the subjects had to look at opposite LEDs. In the Social instruction condition, joint attention situations were socially driven: one subject was instructed to randomly choose the LED he/she looked at on every trial, whereas the other subject was requested to look at either the same LED as his/her partner (in the joint attention blocks) or at the opposite one (in the no-joint attention blocks). Each subject took on the driver and follower roles alternately, in different blocks. By contrast, in the Color instruction condition, the LED to be gazed at was indicated to each subject by its color (green or red), with either both subjects having to look at the same color LED (joint attention blocks) or one subject instructed to look at the green LED and the other one instructed to look at the red LED (no-joint attention blocks). Thus, joint attention situations were here externally driven.
In every block, each trial started with a 2–3 s period of mutual gaze (where the subjects looked at each other) with all LEDs switched off. Then, two LEDs (one on each side of the subjects) turned on: one LED switched on in red, and the other one in green. The subjects looked at the same or opposite LED as fast and accurately as possible according to the instruction they had received for a given block. After 3.5 s, both lighted LEDs changed their color to orange. They remained orange for 3 s; the subjects were allowed to blink during this period. Finally, the LEDs switched off again; both participants moved their eyes back to the center of the device to look at each other, and a new trial started. Every block comprised 34 trials preceded by two baseline periods of 6 trials each. During the baselines, a black opaque cardboard with only small holes at its outer border was placed on the device so that the subjects could still see the LEDs but could not see each other anymore. During the baseline periods, the timing of the trials (the LEDs switching on and off) was exactly the same as during the experimental block. In the first baseline period, the subjects were asked not to move their eyes and to keep looking straight at the center of the cardboard throughout the six trials. This baseline period could not be analyzed due to excessive eye blinks in the time intervals of interest. During the second baseline period, the subjects were requested to move their eyes toward the red or green LED as during the experimental blocks, except that they could not see each other. For this baseline, the subjects were given written directives, so that neither subject knew the directive given to the other. At the end of the baselines, the cardboard was removed, and the directives for the experimental block (Joint/No-Joint attention under Social or Color instruction) were given orally to both subjects.
The EEG recording session comprised 12 experimental blocks distributed across the 4 experimental conditions: socially driven joint attention, color-driven joint attention, socially driven no-joint attention, and color-driven no-joint attention. The order of the blocks was randomized for each dyad. The experimental blocks were preceded by a short training block for each experimental condition.
EEG Data Acquisition
Electroencephalography data from both participants were recorded simultaneously using two identical actiCaps (Brain Products GmbH, Munich) with 60 active electrodes each, placed according to the international extended 10/10 system. Ground electrodes were placed on the right shoulder of each participant. Continuous EEG was recorded with respect to a nose reference, at a sampling rate of 500 Hz. The signal was amplified and band-pass filtered online between 0.16 and 250 Hz. Electrode impedances were maintained below 10 kΩ. We used four bipolar derivations to monitor eye movements: two electrodes were placed above and below the dominant eye for the vertical eye movements, and two electrodes were placed at the outer canthi of the eyes for the horizontal movements. The data acquisition from each cap was performed using two identical Brainamp MR amplifiers (Brain Products GmbH, Munich), which were connected to the same computer and recorded through the same software interface to ensure synchronous acquisition of both EEG data sets.
On every trial and for each participant, we ensured that the participant had moved his/her eyes to the LED by placing a “post-saccade” marker about 200 ms after the end of the saccade that followed the lighting of the LEDs (see Figure 2).
Figure 2. Time course of an example trial illustrated on the vertical electrooculograms of a participant dyad. The upper and lower time courses represent the vertical EOGs of the subjects A and B of a given dyad. Every trial started with a 2–3 s period of mutual gaze. Then, two LEDs turned on (one in green, the other one in red) and both subjects moved their eyes to one LED, according to the directives for this block. After 3.5 s, the LEDs turned orange and the subjects were allowed to blink. Post-saccade markers were manually inserted after the saccade onto the LED for both subject A (post A) and B (post B). The gray rectangle represents the time window of analysis; the dark gray border corresponds to the 300 ms Blackman window used for the time-frequency transform, which was excluded from measurement, resulting in a 1.5 s time window of analysis (shaded in light gray).
First, we evaluated the saccadic response times of the participants by measuring, for every participant and under each experimental condition of joint attention and instruction, the mean time interval between the LEDs turning on and the post-saccade marker.
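As a minimal sketch of this measurement (the function names and data layout are ours, for illustration only, not the exact analysis code used in the study), the per-trial and per-condition response times can be computed as:

```python
import numpy as np

def saccadic_rts(led_on_times, post_saccade_times):
    """Per-trial saccadic response times: the interval between the LEDs
    turning on and the post-saccade marker, in seconds."""
    return np.asarray(post_saccade_times) - np.asarray(led_on_times)

def condition_means(rts_by_condition):
    """Mean response time per experimental condition, from a dict mapping
    condition labels to arrays of per-trial response times."""
    return {cond: float(np.mean(rts)) for cond, rts in rts_by_condition.items()}
```

For example, `saccadic_rts([0.0, 5.0], [0.45, 5.65])` yields the per-trial intervals `[0.45, 0.65]`, whose condition mean is 0.55 s.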
Second, for time-frequency analysis, we focused on the period where both participants fixated the same or opposite, red or green LED. We took the latest of the two participants’ post-saccade markers as our time reference landmark for each trial in order to analyze the time periods where the participants’ attention was aligned onto the same or opposite objects (Figure 2). Trials containing eye blinks, muscle artifacts, or other artifacts (>50 μV) were removed. This led to a mean trial rejection rate of 20.4 ± 2.9% (on average, 7 out of 34 trials per block) across participants. After rejection, the number of trials taken into account did not significantly differ between our experimental conditions. A time-frequency wavelet transform was applied from 0.1 s before to 2 s after this time reference landmark, for each trial, at each EEG sensor. We used a family of complex Morlet wavelets, with an m parameter of 10 and a Blackman window of 300 ms, resulting in an estimate of signal power at each time sample and at each frequency between 4 and 120 Hz, with a frequency step of 1 Hz (for details and review, see Tallon-Baudry and Bertrand, 1999). The time-frequency transformed data were then averaged across trials for each experimental block and for each subject, separately for the baseline trials and the face-to-face trials. The obtained signal power data were then averaged over a 1.5 s time interval between +0.2 and +1.7 s taking into account the Blackman window, for each frequency (see Figures 3A,B). An index of signal power, defined at each frequency as the log-transformation of the ratio between the mean signal power for face-to-face trials and the mean signal power for baseline trials, was then computed for each block. The log-transformation of the data was used to approach a normal distribution.
Finally, the data were averaged across the 4 conditions of interest: socially driven joint attention, color-driven joint attention, socially driven no-joint attention, and color-driven no-joint attention, for each subject and for the grand mean of the 29 subjects.
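The core of this pipeline can be sketched in a few lines of Python (a simplified illustration using NumPy only; the unit-energy wavelet normalization and the function names are our own choices, not the exact implementation used in the study, which also applied a 300 ms Blackman window):

```python
import numpy as np

def morlet_power(signal, sfreq, freqs, m=10):
    """Signal power over time via complex Morlet wavelets.
    signal: 1-D array (one trial, one channel); m: number of cycles.
    Returns an array of shape (n_freqs, n_times)."""
    powers = []
    for f in freqs:
        sigma_t = m / (2 * np.pi * f)  # temporal width of the wavelet
        t = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / sfreq)
        wavelet = (np.exp(2j * np.pi * f * t)
                   * np.exp(-t ** 2 / (2 * sigma_t ** 2)))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))  # unit-energy normalization (our choice)
        powers.append(np.abs(np.convolve(signal, wavelet, mode='same')) ** 2)
    return np.array(powers)

def log_power_ratio(task_power, baseline_power):
    """Log of the ratio between mean task power and mean baseline power,
    per frequency (averaging over the time axis)."""
    return np.log(task_power.mean(axis=-1) / baseline_power.mean(axis=-1))
```

For a given block, `log_power_ratio` would be applied per electrode to the trial-averaged power of face-to-face versus baseline trials, after averaging over the 1.5 s window of interest.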
Figure 3. Overall view of the EEG power spectrum and of the time-frequency analysis of the data. (A) Time-frequency plot representing the power of oscillatory activities between 4 and 120 Hz across time (between 0.2 and 1.7 s, before the averaging of data over this 1.5 s period); the grand mean of the data, averaged across conditions, is represented for a selected electrode (C3). (B) EEG power spectrum between 4 and 40 Hz, averaged over the 1.5 s time interval, on the same electrode. This illustrates the peak of oscillatory activities obtained in the 10 Hz frequency band. (C) Result of the overall ANOVA performed on every electrode (in ordinate) and every frequency between 4 and 120 Hz (in abscissa). p-Values obtained for the main effect of Joint versus No-Joint attention are shown. This analysis did not reveal any other significant effect – with at least three electrodes reaching p < 0.001 – besides the identified 11–13 Hz band modulation.
First, we analyzed the saccadic response times of the participants. A first ANOVA was performed with Attention (Joint/No-Joint attention) and Instruction (Social/Color) as within-subject factors. A second ANOVA was restricted to the conditions of socially driven instructions and included Attention (Joint/No-Joint attention) and Participant status (Driver/Follower) as within-subject factors.
For oscillatory activities, our main interest was in the modulation of alpha and mu rhythms; we thus focused on the 8–13 Hz frequency range. A first ANOVA performed at each electrode and for each frequency, with Attention (Joint/No-Joint attention) and Instruction (Social/Color) as within-subject factors did not reveal any effect in the lower alpha and mu frequency range (8–10 Hz). Thus, we averaged the data in the higher alpha and mu band, between 11 and 13 Hz, and reported the result of statistical analyses in this frequency band. Furthermore, under the social instruction, the status of the participant as the driver or follower of his/her partner’s attention was a factor of interest. We thus performed an ANOVA restricted to the conditions of socially driven instructions with Attention (Joint/No-Joint attention) and Participant status (Driver/Follower) as within-subject factors.
Since our analyses involved multiple comparisons (over electrodes and frequencies), we used a statistical threshold of 0.01 and checked that at least three electrodes yielded a p < 0.001 in the identified clusters.
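This acceptance rule can be made explicit as a small helper (a sketch; the function name and the encoding of a cluster as an array of electrode p-values are ours):

```python
import numpy as np

def cluster_passes(p_values, alpha=0.01, strict_alpha=0.001, min_strict=3):
    """Accept an electrode cluster only if every electrode in it is below
    the 0.01 threshold and at least `min_strict` electrodes reach p < 0.001."""
    p = np.asarray(p_values)
    return bool(np.all(p < alpha) and np.sum(p < strict_alpha) >= min_strict)
```

For instance, a cluster with p-values `[0.005, 0.0005, 0.0002, 0.0009]` is accepted (three electrodes below 0.001), whereas a cluster whose electrodes all sit between 0.001 and 0.01 is rejected.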
In addition, in order to test whether any other frequency band yielded some significant effects, we performed an additional ANOVA on every electrode and every frequency between 4 and 120 Hz. This analysis did not reveal any other significant effect besides the identified 11–13 Hz band modulation (Figure 3C).
Finally, in order to test the lateralization of the effect obtained in the 11–13 Hz band and to investigate whether a dissociation between the centro-parietal and parieto-occipital regions could emerge, we performed an ANOVA on four right and left clusters composed of six electrodes each (see Muthukumaraswamy et al., 2004 for a similar approach). For the left centro-parietal cluster we considered the electrodes C5, C3, C1, CP5, CP3, and CP1. For the left parieto-occipital cluster we considered the electrodes P5, P3, P1, PO7, PO3, and O1. The symmetrical electrodes were taken into account for the right centro-parietal and the right parieto-occipital clusters. We averaged the log of the power ratio in the 11–13 Hz band over the electrodes in each cluster and we performed an ANOVA with Hemisphere (Left/Right), Cluster (Centro-Parietal/Parieto-Occipital), Attention (Joint/No-Joint attention), and Instruction (Social/Color) as within-subject factors.
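The cluster-averaging step preceding this ANOVA can be sketched as follows (the electrode-to-column mapping and function names are ours for illustration; only the four six-electrode clusters come from the text, with the right-hemisphere clusters built from the symmetrical even-numbered electrodes):

```python
import numpy as np

# Six-electrode clusters, as defined in the text (right clusters are the
# mirror-symmetric electrodes of the left ones).
CLUSTERS = {
    ('left', 'centro-parietal'): ['C5', 'C3', 'C1', 'CP5', 'CP3', 'CP1'],
    ('left', 'parieto-occipital'): ['P5', 'P3', 'P1', 'PO7', 'PO3', 'O1'],
    ('right', 'centro-parietal'): ['C6', 'C4', 'C2', 'CP6', 'CP4', 'CP2'],
    ('right', 'parieto-occipital'): ['P6', 'P4', 'P2', 'PO8', 'PO4', 'O2'],
}

def cluster_means(log_ratio, ch_names):
    """Average the 11-13 Hz log power ratio (n_subjects, n_channels)
    over each six-electrode cluster; returns one value per subject
    and cluster, ready for the within-subject ANOVA."""
    idx = {name: i for i, name in enumerate(ch_names)}
    return {key: log_ratio[:, [idx[ch] for ch in chans]].mean(axis=1)
            for key, chans in CLUSTERS.items()}
```

The resulting per-subject cluster means would then be entered into the Hemisphere × Cluster × Attention × Instruction repeated-measures ANOVA.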
First, we analyzed the saccadic response times of the participants. A first ANOVA with Attention (Joint/No-Joint attention) and Instruction (Social/Color) as within-subject factors showed a main effect of Instruction [F(1, 28) = 113.6, p < 0.0001] reflecting faster saccades under the color-driven (mean response time = 445 ± 18 ms) than under the socially driven instructions (mean response time = 577 ± 12 ms). There was also a main effect of Attention [F(1, 28) = 5.2, p < 0.05] demonstrating faster saccades to the LEDs in the joint attention (mean reaction time = 496 ± 19 ms) than in the no-joint attention conditions (mean reaction time = 526 ± 12 ms). The interaction between Attention and Instruction was not significant (F < 1). In addition, under the social instruction situations, the status of the participant as the driver or follower of his/her partner’s attention was a potential factor of interest. Note that this factor could only be analyzed under the socially driven attention conditions, since under the color-driven attention conditions the participants were instructed to attend to the LEDs according to their colors and no driver or follower of attention was designated. Thus, we performed a second ANOVA restricted to the social instruction conditions, with Attention (Joint/No-Joint attention) and participant’s Status (Driver/Follower) as within-subject factors. This confirmed the effect of Attention [F(1, 28) = 11.3, p < 0.005] and showed a massive effect of participant’s status [F(1, 28) = 467.9, p < 0.0001]: the response times under the socially driven attention conditions were shorter for the participant who was designated as the driver of attention as compared to the participant who had to follow his/her partner’s gaze. This demonstrates that the participants complied with the gaze following instruction. The interaction between Joint attention and Status did not reach significance [F(1, 28) = 3.7, p > 0.05].
We then turned to the analysis of oscillatory activities. The two-by-two ANOVA performed on the 11–13 Hz frequency band showed a significant effect of Attention (Joint versus No-Joint attention) over a large set of left centro-parietal electrodes extending to occipital electrodes (Figure 4; Table 1): the mean 11–13 Hz signal power was reduced in the joint relative to the no-joint attention condition. This effect was not influenced by the socially driven versus color-driven instructions: there was no significant interaction between Attention and Instruction (Figure 4), and t-tests contrasting joint and no-joint attention conditions under each type of instruction showed that the effect of joint attention on 11–13 Hz oscillatory activities was significant both when joint attention resulted from the social, explicit gaze following instruction and when it resulted from the color-related instruction (Figure 5; Table 1). There was no significant main effect of Instruction (Figure 4). Furthermore, we checked whether any difference in the power of alpha and mu oscillations during socially induced joint versus no-joint attention conditions depended on the role of the subject as the driver or the follower of his/her partner's attention. The ANOVA restricted to the social instruction conditions, with Attention (Joint/No-Joint) and Status (Driver/Follower) as within-subject factors, did not reveal any significant effect of Status in the 11–13 Hz frequency band.
Figure 4. The effects of joint attention and instruction. (A) Main effect of Joint attention: Maps of the mean signal power (log ratio) between 11 and 13 Hz under the Joint attention and the No-joint attention conditions are represented together with the corresponding difference map between Joint and No-joint attention conditions. The grand mean signal power within the 11–13 Hz frequency band is represented on a top view of the head. (B) Main effect of Instruction: Maps of the mean signal power (log ratio) between 11 and 13 Hz under the Social and the Color instructions are represented together with the corresponding difference map between Social and Color instruction. The grand mean signal power within the 11–13 Hz frequency band is represented on a top view of the head. (C) Results of the two-by-two ANOVA with Joint attention and Instruction as within-subject factors. The maps (top views of the head) of the p-values for the main effects of Joint attention and of Instruction are represented, as well as the map of the p-values for the interaction between Joint attention and Instruction. For (A,C), electrodes for which the p-value was below 0.01 (p < 0.01) are represented in white. No electrode yielded a significant effect of Instruction or an interaction between Joint attention and Instruction. (D) Illustration of the main effect of Joint attention in five example subjects. Difference maps between Joint and No-joint attention conditions are represented.
Figure 5. Maps of the signal power in the 11–13 Hz frequency band under each experimental condition. Maps (top views of the head) of the grand mean of the signal power between 11 and 13 Hz are represented under the Joint attention (left column) and the No-joint attention (middle column) conditions for the Social (upper row) and the Color (lower row) instructions. The difference maps corresponding to the grand mean difference in 11–13 Hz signal power for the Joint versus the No-joint attention conditions, under the Social and the Color instruction respectively, are represented in the rightmost column. The white dots on the maps represent the electrodes on which the t-tests revealed a significant difference between Joint and No-joint attention conditions (p < 0.01) for the socially driven and color-driven instructions respectively.
Altogether, these results showed that jointly attending to the same object reduced oscillatory activities recorded in the 11–13 Hz frequency band over centro-parietal as well as parieto-occipital regions. These effects did not depend significantly on the type of instruction (whether joint attention periods were socially driven or color-driven; Table 1). In order to further check the lateralization of the effect obtained in the 11–13 Hz band and to verify whether a dissociation between the centro-parietal and parieto-occipital regions could emerge, we performed an additional analysis based on an electrode clustering approach. We thus defined four left and right, centro-parietal and parieto-occipital clusters centered on the scalp regions where alpha and mu oscillatory activities have classically been reported (see Materials and Methods). The ANOVA with Hemisphere (Left/Right), Cluster (Centro-Parietal/Parieto-Occipital), Attention (Joint/No-Joint attention), and Instruction (Social/Color) as within-subject factors confirmed the main effect of Attention [F(1, 28) = 12.21, p < 0.005, η² = 0.30]; this effect was significant under both the social and the color instructions [F(1, 28) = 4.6, p < 0.05, η² = 0.14 and F(1, 28) = 5.7, p < 0.05, η² = 0.17 respectively; no significant interaction between Instruction and Attention: F < 1]. Moreover, this ANOVA revealed an interaction between Hemisphere and Attention [F(1, 28) = 5.61, p < 0.01, η² = 0.17]. The effect of Attention was present in the left hemisphere only [F(1, 28) = 20.82, p < 0.0001, η² = 0.42]. There was no significant effect of Cluster (F < 1), and the joint versus no-joint attention effect was highly significant in both left hemisphere clusters [Centro-parietal: F(1, 28) = 20.47, p < 0.001, η² = 0.42; Parieto-occipital: F(1, 28) = 16.5, p < 0.0001, η² = 0.37]. This demonstrated that online joint attention was associated with both alpha and mu suppressions.
The aim of this study was to investigate whether oscillatory activities in the alpha and mu frequency bands may constitute electrophysiological correlates of joint attention in a face-to-face, online interaction paradigm. We showed that oscillatory activities between 11 and 13 Hz were modulated by joint attention over a large set of left centro-parieto-occipital electrodes, with a decrease in 11–13 Hz signal power during the periods where the participants' attention was aligned onto the same object as compared to the periods where they looked at opposite objects. These effects were found both when participants' attention was socially driven and when it was color-driven.
To our knowledge, this study is the first to associate joint attention with alpha and mu rhythm modulations. Our finding can be interpreted in the framework of the functional roles that have been proposed for parieto-occipital alpha and centro-parietal mu rhythms.
Indeed, first, mu rhythm suppression has been associated with the mirroring of action and the activation of the human mirror system (Muthukumaraswamy and Johnson, 2004; Perry and Bentin, 2009; for a review see Pineda, 2005). Furthermore, Shepherd et al. (2009) proposed that gaze following and gaze-induced attention orienting involve a mechanism of attention mirroring, subtended by a mirror-like neuron system in the posterior parietal cortex of macaque monkeys. Although the identification of the brain regions involved was beyond the scope of the present study and cannot be inferred from scalp data only, our finding of a modulation of the mu rhythm by joint attention over a large set of centro-parietal electrodes fits with Shepherd's view of joint attention as involving an attention mirroring mechanism (for review, Shepherd, 2010).
Moreover, mu rhythm suppression has been linked to social interactions, and particularly to interindividual coordination processes. More precisely, mu rhythm modulation has been implicated in the processing of socially relevant stimuli and in the engagement in social interactive situations. For instance, using a computerized ball throwing game, Oberman et al. (2007) showed a decrease of the mu rhythm according to the level of involvement of the participant in the game: the more the participant was involved (i.e., received the ball from the on-screen players), the more mu oscillatory activities were reduced. More recently, Perry et al. (2011) found a similar result with participants viewing or playing a game of Rock-Paper-Scissors. In line with these studies, our results suggest that sharing an object of attention with another person elicited greater engagement of the participants in the social interaction than did the no-joint attention conditions, where the participants attended to different objects. This may be particularly true in the live, face-to-face joint attention paradigm that we used, which promoted a naturalistic – although very basic – social interaction between the participants. It is however interesting to note that the effect of Joint attention was not modulated by the factor of Instruction. This raises some questions about the precise functional nature of our mu modulation. Pineda and Hecht (2009) demonstrated that the mu rhythm is involved in the social perceptive component of theory of mind, which implies in particular the processing of social signals conveyed by faces, and they further suggested that it correlates with the inferences made by participants about person-object interactions in a theory of mind task. Following this study, it may be suggested that the joint attention conditions in our experiment engaged these components of theory of mind more strongly than the no-joint attention conditions did.
However, if this proposal were true, one would have expected greater mu suppression under the socially driven than under the color-driven instructions, because the socially driven instruction required the processing of person-object interactions to a greater extent than the color-driven instruction did. Yet, our results did not support this view. Rather, the same level of mu reduction under joint relative to no-joint attention conditions was observed for both types of instructions. Thus, our results are more in line with studies that have associated mu oscillatory activities with the interpersonal coordination component of social interactions. In particular, our finding is reminiscent of the result of Naeem et al. (2012) based on another modality, namely finger movements. These authors showed that the coordination of movements between two participants modulated the power of mu oscillations, with a relative synchronization in the 10–12 Hz frequency band observed when participants moved their fingers independently of each other and a decrease of these oscillatory activities when they moved in coordination (see also Tognoli et al., 2007). Our results extend these findings to gaze following, and support the view that the joint attention conditions elicited more coordination of the participants' actions than the no-joint attention condition, as reflected by signal power reduction in the high (11–13 Hz) mu frequency band. This interpersonal coordination component of joint attention may have been recruited to the same extent under both the socially and the color-driven instructions, as discussed in detail below.
In addition, the modulation of oscillatory activities in the 11–13 Hz band extended onto posterior, parieto-occipital electrodes, where the visual alpha rhythm is classically measured. This supports the view that the parieto-occipital alpha rhythm is also involved in joint attention. Historically, the modulation of the alpha rhythm was first associated with visual processing (Adrian and Matthews, 1934). Later, the alpha rhythm was associated with arousal as well as attentional mechanisms (Ray and Cole, 1985). Both types of mechanisms may have contributed to our results. Indeed, Gale et al. (1972) demonstrated that alpha oscillations were reduced under a condition of gaze contact between a participant and a physically present experimenter as compared to situations where the experimenter displayed closed eyes or averted gaze. This was interpreted as reflecting an arousal increase induced by mutual attention. Under this view, the reduction of alpha signal power may reflect a greater component of mutual attentiveness in the joint attention than in the no-joint attention condition. This mutual attentiveness component of joint attention (Emery, 2000) would induce a higher arousal during joint than no-joint attention conditions. This interpretation also fits with the finding of Striano et al. (2006), in an infant study, that the mid-latency negative component (Nc) of event-related potentials in response to objects – associated with attention and arousal – was enhanced in a live joint attention context as compared to a non-joint attention context. Moreover, the alpha rhythm has been related to attentional suppression mechanisms (for a review, see Worden et al., 2000; Sauseng et al., 2005; Foxe and Snyder, 2011).
In this latter framework, it may be suggested that the modulation of alpha activity that we observed reflected a process of attentional suppression of the other's gaze and/or of the object of the other's attention under the condition where subjects had to attend to different objects (no-joint attention condition) as compared to the condition of attention alignment onto the same object (joint attention condition).
It should be noted that our effect on the alpha band cannot be construed as reflecting task difficulty (i.e., reduced alpha signal power being related to greater task difficulty) since, if anything, the more difficult condition in our paradigm was the no-joint attention condition, as reflected by the slower saccadic response times in this condition than in the joint attention condition. Indeed, jointly attending with someone to the same object is easier – more natural and automatic – than looking at different objects, which requires inhibiting the tendency to follow the other's gaze. Yet, the no-joint attention conditions elicited increased alpha oscillatory activities as compared to the joint attention conditions.
We did not find any effect on alpha or mu rhythms of the roles played in turn by each participant of the dyads as the driver or the follower of gaze in the social instruction. This may be due to our choice of the window for time-frequency analysis: we chose to analyze the time period in which the attention of both participants was settled either onto the same LED or onto opposite LEDs. This might not have favored the capture of the physiological responses associated with the driver versus follower roles.
Interestingly, the reduction of oscillatory activities in the alpha and mu frequency bands in the joint versus no-joint attention conditions was observed under both the socially driven and the color-driven instructions. This may seem at odds with the fact that, to perform the task under the color instruction, the participants did not need their partner. Thus, it may have been expected that the color instruction conditions required less social coordination and mutual attentiveness than the social instruction conditions. Yet, the analysis of saccadic response times showed that although participants were faster under the color than the social instructions, response times were overall shorter in the joint than in the no-joint attention conditions, and there was no significant interaction between Attention and Instruction. This reveals an influence of the partner's behavior on participants' performance that did not seem to depend on the type of instruction, corroborating the finding of an overall decrease of alpha and mu rhythms under joint relative to no-joint attention conditions. Altogether, these results are likely explained by our setup: the participants sat face-to-face, in physical co-presence, and shared periods of mutual gaze between every trial of all experimental blocks. In this condition, the presence of a partner may have always been relevant to the participants, emphasizing joint attention-related processes under both types of task instructions.
The modulation of alpha and mu rhythms was restricted to the left scalp regions. This left lateralization of our results may not be straightforwardly related to a preferential left hemisphere involvement, as such an interpretation would require source localization. Yet, this aspect of our findings deserves discussion because it stands in contrast with studies that have reported right-lateralized brain responses in social interaction paradigms, whether in fMRI or in scalp EEG data (Tognoli et al., 2007; Dumas et al., 2010; Redcay et al., 2010). In a recent fMRI study using a live joint attention paradigm similar to ours, Saito et al. (2010) found that following a partner's gaze toward an object elicited activation in the left intraparietal sulcus. They suggested that this region may be specifically involved in shared attention mechanisms, encoding dyadic relations – between the partner and the object, and between the self and the object – during attention orienting and gaze following. The activation of such a mechanism may thus explain the left lateralization of our results. In addition, it has been proposed that in tasks involving perspective taking during the performance (or imagination) of action, the hemispheric lateralization of the regions involved in processing first- versus third-person perspective, particularly in the temporo-parietal regions, may critically depend on the actual context of the task at hand (Vogeley and Fink, 2003). For example, a study reported left hemisphere activations specific to first-person perspective in a task of action simulation (Ruby and Decety, 2001). Another study reported left-lateralized activations during imitation relative to observation of action, as well as under first- relative to third-person perspective (Jackson et al., 2006).
Left-lateralized activation of the temporo-parietal region was also reported in a task involving perspective taking with the participants facing a human figure opposite to them (Zacks et al., 1999). It is thus possible that the balance between first- and third-person perspective taking involved in our experiment under joint versus no-joint attention conditions favored the observation of left-lateralized effects.
In line with the program of cognitive ethology (Kingstone et al., 2003), we designed an ecological setup to study joint attention in a face-to-face situation, which we combined with dual-EEG recording. This allowed us to investigate the oscillatory brain correlates of live joint attention processes. In our design, we wanted to get as close as possible to the real-life joint attention phenomenon and therefore designed a paradigm where participants were dynamically engaged in alternating periods of joint attention and mutual gaze subtended by eye movements. Under this design, we chose to focus our analysis on the time period during which both participants' attention was aligned onto the same object, with the hypothesis that this should elicit a sustained state of joint attention with maintained associated feelings of mutual attentiveness and rapport (Tickle-Degnen and Rosenthal, 1990). We showed that joint attention periods (relative to no-joint attention periods) yielded a decrease in the 11–13 Hz frequency band over a large set of left-lateralized centro-parieto-occipital electrodes. This can be interpreted as reflecting the processes of attention mirroring, social coordination, and mutual attentiveness associated with the time periods where participants' attention was aligned onto the same object. This is the first time that alpha and mu oscillatory activities have been demonstrated to be electrophysiological correlates of joint attention. In order to make the most of the dual-EEG technique, it will be interesting in future studies to examine the modulation of these oscillatory activities by joint attention at an interindividual level.
Conflict of Interest Statement
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
We thank Florence Bouchet for her help during the recording of dual-EEG. We also thank Denis Schwartz for his help in the time-frequency analyses.
Babiloni, C., Carducci, F., Cincotti, F., Rossini, P. M., Neuper, C., Pfurtscheller, G., and Babiloni, F. (1999). Human movement-related potentials vs desynchronization of EEG alpha rhythm: a high-resolution EEG study. Neuroimage 10, 658–665.
Baron-Cohen, S., Wheelwright, S., Skinner, R., Martin, J., and Clubley, E. (2001). The autism-spectrum quotient (AQ): evidence from Asperger syndrome/high-functioning autism, males and females, scientists and mathematicians. J. Autism Dev. Disord. 31, 5–17.
Cheng, Y., Yang, C.-Y., Lin, C.-P., Lee, P.-L., and Decety, J. (2008). The perception of pain in others suppresses somatosensory oscillations: a magnetoencephalography study. Neuroimage 40, 1833–1840.
Hari, R., Forss, N., Avikainen, S., Kirveskari, E., Salenius, S., and Rizzolatti, G. (1998). Activation of human primary motor cortex during action observation: a neuromagnetic study. Proc. Natl. Acad. Sci. U.S.A. 95, 15061–15065.
Lakin, J. L., Jefferis, V. E., Cheng, C. M., and Chartrand, T. L. (2003). The chameleon effect as social glue: evidence for the evolutionary significance of nonconscious mimicry. J. Nonverbal Behav. 27, 145–162.
Perry, A., and Bentin, S. (2009). Mirror activity in the human brain while observing hand movements: a comparison between EEG desynchronization in the mu-range and previous fMRI results. Brain Res. 1282, 126–132.
Perry, A., Bentin, S., Bartal, I. B.-A., Lamm, C., and Decety, J. (2010). “Feeling” the pain of those who are different from us: modulation of EEG in the mu/alpha range. Cogn. Affect. Behav. Neurosci. 10, 493–504.
Pfurtscheller, G., Stancák, A. Jr., and Neuper, C. (1996). Event-related synchronization (ERS) in the alpha band – an electrophysiological correlate of cortical idling: a review. Int. J. Psychophysiol. 24, 39–46.
Pineda, J. A., Allison, B. Z., and Vankov, A. (2000). The effects of self-movement, observation, and imagination on mu rhythms and readiness potentials (RP’s): toward a brain-computer interface (BCI). IEEE Trans. Rehabil. Eng. 8, 219–222.
Pönkänen, L. M., Alhoniemi, A., Leppänen, J. M., and Hietanen, J. K. (2011). Does it make a difference if I have an eye contact with you or with your picture? An ERP study. Soc. Cogn. Affect. Neurosci. 6, 486–494.
Redcay, E., Dodell-Feder, D., Pearrow, M. J., Mavros, P. L., Kleiner, M., Gabrieli, J. D. E., and Saxe, R. (2010). Live face-to-face interaction during fMRI: a new tool for social cognitive neuroscience. Neuroimage 50, 1639–1647.
Saito, D. N., Tanabe, H. C., Izuma, K., Hayashi, M. J., Morito, Y., Komeda, H., Uchiyama, H., Kosaka, H., Okazawa, H., Fujibayashi, Y., and Sadato, N. (2010). “Stay tuned”: inter-individual neural synchronization during mutual gaze and joint attention. Front. Integr. Neurosci. 4, 127. doi:10.3389/fnint.2010.00127
Sauseng, P., Klimesch, W., Stadler, W., Schabus, M., Doppelmayr, M., Hanslmayr, S., Gruber, W. R., and Birbaumer, N. (2005). A shift of visual spatial attention is selectively associated with human EEG alpha activity. Eur. J. Neurosci. 22, 2917–2926.
Wilms, M., Schilbach, L., Pfeiffer, U., Bente, G., Fink, G. R., and Vogeley, K. (2010). It’s in your eyes – using gaze-contingent stimuli to create truly interactive paradigms for social cognitive and affective neuroscience. Soc. Cogn. Affect. Neurosci. 5, 98–107.
Worden, M. S., Foxe, J. J., Wang, N., and Simpson, G. V. (2000). Anticipatory biasing of visuospatial attention indexed by retinotopically specific alpha-band electroencephalography increases over occipital cortex. J. Neurosci. 20, RC63.
Keywords: joint attention, dual-EEG, alpha, mu, social coordination, time-frequency analysis
Citation: Lachat F, Hugueville L, Lemaréchal J-D, Conty L and George N (2012) Oscillatory brain correlates of live joint attention: a dual-EEG study. Front. Hum. Neurosci. 6:156. doi: 10.3389/fnhum.2012.00156
Received: 31 December 2011; Accepted: 16 May 2012;
Published online: 01 June 2012.
Edited by: Leonhard Schilbach, Max-Planck-Institute for Neurological Research, Germany
Reviewed by: Stefanie Hoehl, University of Heidelberg, Germany
Scott Kelso, Florida Atlantic University, USA
Copyright: © 2012 Lachat, Hugueville, Lemaréchal, Conty and George. This is an open-access article distributed under the terms of the Creative Commons Attribution Non Commercial License, which permits non-commercial use, distribution, and reproduction in other forums, provided the original authors and source are credited.
*Correspondence: Fanny Lachat, CRICM, UMR 7225/UMR-S 975, UPMC/CNRS/INSERM, Equipe Cogimage (1e étage), Institut du Cerveau et de la Moelle Epiniere, Hopital de la Salpetriere, 47, Bd de l’Hopital, F-75651 Paris CEDEX 13, France. e-mail: firstname.lastname@example.org