AUTHOR=Morucci Piermatteo, Giannelli Francesco, Richter Craig G., Molinaro Nicola
TITLE=Spoken words affect visual object recognition via the modulation of alpha and beta oscillations
JOURNAL=Frontiers in Neuroscience
VOLUME=19
YEAR=2025
URL=https://www.frontiersin.org/journals/neuroscience/articles/10.3389/fnins.2025.1467249
DOI=10.3389/fnins.2025.1467249
ISSN=1662-453X
ABSTRACT=Hearing spoken words can enhance the recognition of visual object categories. Yet, the mechanisms that underpin this facilitation remain incompletely understood. Recent proposals suggest that words can alter visual processing by activating category-specific representations in sensory regions. Here, we tested the hypothesis that neural oscillations serve as a mechanism for activating language-generated visual representations. Participants performed a cue-picture matching task in which cues were either spoken words, in their native or second language, or natural sounds, while their EEG and reaction times were recorded. Behaviorally, we found that images cued by words were recognized faster than those cued by natural sounds, indicating that language activates more accurate semantic representations than natural sounds do. A time-frequency analysis of the cue-target interval revealed that this label-advantage effect was associated with enhanced power in posterior alpha (9–11 Hz) and beta (17–19 Hz) oscillations, both of which were larger when the image was preceded by a word than by a natural sound. These results suggest that alpha and beta rhythms may play distinct functional roles in supporting language-mediated visual object recognition: alpha might function to amplify sensory representations in posterior regions, while beta may (re)activate the network states elicited by the auditory cue.
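
The abstract describes a time-frequency analysis contrasting posterior alpha (9–11 Hz) and beta (17–19 Hz) power in the cue-target interval between word-cued and sound-cued trials. Below is a minimal sketch of how such a band-power contrast could be computed with MNE-Python; the toolbox choice, epochs file name, event labels, and channel picks are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch (assumptions, not the authors' pipeline): contrast posterior
# alpha/beta power between word-cued and sound-cued trials using MNE-Python.
import numpy as np
import mne
from mne.time_frequency import tfr_morlet

# Hypothetical epochs file spanning the cue-target interval, with event
# labels "word" and "sound" (names assumed for illustration).
epochs = mne.read_epochs("cue_epochs-epo.fif")

def band_power(epochs, fmin, fmax, picks):
    """Mean Morlet-wavelet power in a frequency band over selected channels."""
    freqs = np.arange(fmin, fmax + 1, 1.0)  # e.g., 9-11 Hz or 17-19 Hz
    power = tfr_morlet(epochs, freqs=freqs, n_cycles=freqs / 2.0,
                       return_itc=False, average=True, picks=picks)
    # Collapse over channels, frequencies, and time points of the epoch
    return power.data.mean()

posterior = ["O1", "Oz", "O2", "P3", "Pz", "P4"]  # hypothetical posterior picks

for band, (fmin, fmax) in {"alpha": (9, 11), "beta": (17, 19)}.items():
    p_word = band_power(epochs["word"], fmin, fmax, posterior)
    p_sound = band_power(epochs["sound"], fmin, fmax, posterior)
    print(f"{band}: word-cue power {p_word:.3e} vs. sound-cue power {p_sound:.3e}")
```

In practice, condition differences of this kind are typically evaluated with a cluster-based permutation test over channels, frequencies, and time rather than a simple mean, but the sketch above conveys the basic band-power contrast the abstract refers to.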