%A Yoo, Sejin
%A Lee, Kyoung-Min
%D 2013
%J Frontiers in Human Neuroscience
%G English
%K verbal repetition, inferior frontal gyrus, articulation-based codes, sound perception, functional near-infrared spectroscopy, hemoglobin concentration, sensorimotor integration
%R 10.3389/fnhum.2013.00540
%8 2013-September-05
%9 Original Research
%+ Prof Kyoung-Min Lee, Seoul National University, Department of Neurology and Interdisciplinary Program in Cognitive Science, Seoul, Republic of Korea, kminlee@snu.ac.kr
%! A fNIRS study on verbal repetition
%T Articulation-based sound perception in verbal repetition: a functional NIRS study
%U https://www.frontiersin.org/articles/10.3389/fnhum.2013.00540
%V 7
%0 JOURNAL ARTICLE
%@ 1662-5161
%X Verbal repetition is a fundamental language capacity in which listening and speaking are inextricably coupled. We have recently reported that the left inferior frontal gyrus (IFG) harbors articulation-based codes, as evidenced by activation during repetition of meaningless speech sounds, i.e., pseudowords. In this study, we aimed to confirm this finding and to investigate further the possibility that sound perception, as well as articulation, is subserved by neural circuits in this region. Using functional near-infrared spectroscopy (fNIRS), we monitored changes in hemoglobin (Hb) concentration at the IFG bilaterally while subjects verbally repeated pseudowords and words. The results revealed that the proportion of oxygenated hemoglobin (O2Hb) over total Hb was significantly higher at the left IFG during repetition of pseudowords than of words, replicating the earlier functional MRI observation and indicating that the region processes articulatory codes for verbal repetition. More importantly for this study, hemodynamic modulations were observed at both IFGs during passive listening (without repetition) to various sounds, including natural environmental sounds, animal vocalizations, and human non-speech sounds. Furthermore, the O2Hb concentration increased at the left IFG but decreased at the right IFG for both speech and non-speech sounds. These findings suggest that both speech and non-speech sounds may be processed and maintained by a neural mechanism for sensorimotor integration using articulatory codes at the left IFG.