Original Research Article | Provisionally accepted | The full text will be published soon.

Front. Neurosci. | doi: 10.3389/fnins.2019.01088

Musical Role Asymmetries in Piano Duet Performance Influence Alpha-Band Neural Oscillation and Behavioral Synchronization

Auriel Washburn1,2*, Irán Román1, Madeline Huberth1, Nick Gang1, Tysen Dauer1, Wisam Reid1, Chryssie Nanou1, Matthew Wright1 and Takako Fujioka1,3
  • 1 Center for Computer Research in Music and Acoustics, Stanford University, United States
  • 2 University of California, San Diego, United States
  • 3 Wu Tsai Neurosciences Institute, Stanford University, United States

Recent work on interpersonal coordination has revealed that neural oscillations, occurring spontaneously in the human brain, are modulated during the sensory, motor, and cognitive processes involved in interpersonal interactions. In particular, alpha-band (8-12 Hz) activity, linked to attention in general, is related to coordination dynamics and empathy traits. Researchers have also identified an association between each individual's attentiveness to their co-actor and the relative similarity of the co-actors' roles, which in turn influences their behavioral synchronization patterns. We employed music ensemble performance to evaluate patterns of behavioral and neural activity when roles between co-performers were systematically varied with complete counterbalancing. Specifically, we designed a piano duet task with three types of co-actor dissimilarity, or asymmetry: 1) musical role (starting vs. joining), 2) musical task similarity (similar vs. dissimilar melodic parts), and 3) performer animacy (human-to-human vs. human-to-non-adaptive computer). We examined how the experience of these asymmetries across four initial musical phrases, played in alternation by the co-performers, influenced the pianists' performance of a subsequent unison phrase. Electroencephalography was recorded simultaneously from both performers while they played keyboards. We evaluated note-onset timing and alpha modulation around the unison phrase. We also investigated whether each individual's self-reported empathy was related to behavioral and neural activity. Our findings revealed closer behavioral synchronization when pianists played with a human vs. computer partner, likely because the computer was non-adaptive. When performers played with a human partner, or when a joining performer played with a computer partner, having a similar vs. dissimilar musical part had no significant effect on alpha modulation immediately prior to the unison phrase. However, when starting performers played with a computer partner with a dissimilar vs. similar part, there was significantly greater alpha synchronization. In other words, starting players attended less to a computer partner playing a similar accompaniment, operating in a solo-like mode. Moreover, this alpha difference based on melodic similarity was related to a difference in note-onset adaptivity, which was in turn correlated with performer trait empathy. Collectively, our results extend previous findings by showing that musical ensemble performance gives rise to a socialized context whose lasting effects encompass attentiveness, perceptual-motor coordination, and empathy.
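For readers unfamiliar with how event-related alpha modulation is typically quantified, the following is a minimal illustrative sketch and not the authors' analysis pipeline: it band-pass filters a single EEG channel to 8-12 Hz, estimates instantaneous power from the Hilbert envelope, and expresses power in a post-event window as a percent change from a pre-event baseline. The channel data, sampling rate, event sample, and window choices are all hypothetical placeholders.

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def alpha_power_modulation(eeg, sfreq, event_sample,
                               baseline=(-1.0, -0.5), window=(0.0, 0.5)):
        """Percent change in alpha (8-12 Hz) power in `window` relative to
        `baseline`, both given in seconds relative to the event onset."""
        # Band-pass filter the signal to the alpha band.
        b, a = butter(4, [8.0, 12.0], btype="bandpass", fs=sfreq)
        alpha = filtfilt(b, a, eeg)
        # Instantaneous alpha power from the squared Hilbert envelope.
        power = np.abs(hilbert(alpha)) ** 2

        def mean_power(t0, t1):
            i0 = int(event_sample + t0 * sfreq)
            i1 = int(event_sample + t1 * sfreq)
            return power[i0:i1].mean()

        base = mean_power(*baseline)
        return 100.0 * (mean_power(*window) - base) / base

    # Hypothetical usage: 10 s of synthetic data at 256 Hz, event at 5 s.
    sfreq = 256.0
    eeg = np.random.default_rng(0).standard_normal(int(10 * sfreq))
    print(alpha_power_modulation(eeg, sfreq, event_sample=int(5 * sfreq)))

In practice, such measures are computed per trial and per electrode and then averaged or compared across conditions; the specific baseline and analysis windows used in the study are described in the full text.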

Keywords: EEG, neural oscillation, alpha oscillations, perceptual-motor coordination, role asymmetries, social neuroscience, interpersonal coordination, musical performance

Received: 01 May 2019; Accepted: 27 Sep 2019.

Copyright: © 2019 Washburn, Román, Huberth, Gang, Dauer, Reid, Nanou, Wright and Fujioka. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Dr. Auriel Washburn, Center for Computer Research in Music and Acoustics, Stanford University, Stanford, CA 94305, United States, alwashburn@eng.ucsd.edu