ORIGINAL RESEARCH article
Front. Comput. Sci.
Sec. Human-Media Interaction
Volume 7 - 2025 | doi: 10.3389/fcomp.2025.1570249
This article is part of the Research Topic "Embodied Perspectives on Sound and Music AI."
BrAIn Jam: Neural Signal-Informed Adaptive System for Drumming Collaboration with an AI-Driven Virtual Musician
Provisionally accepted
1 University of Colorado Boulder, Boulder, United States
2 National University of Singapore, Singapore, Singapore
Collaboration between improvising musicians requires a dynamic exchange of subtleties in human musical communication. Many musicians can intuit this information; however, translating this knowledge to embodied computer-driven musicianship systems, whether robotic or virtual, remains an ongoing challenge. Musical information has traditionally been communicated to such systems through an array of sensing techniques, including MIDI, audio, and video, whereas musical information from the human brain has been explored only in limited social and musical contexts. This paper presents "BrAIn Jam," a system that uses functional near-infrared spectroscopy (fNIRS) to monitor human drummers' brain states during musical collaboration with an AI-driven virtual musician. Our system includes a real-time algorithm for preprocessing and classifying brain data, enabling dynamic adjustments to the AI's rhythm based on neural signal processing. Our formative study was conducted in two phases: 1) training individualized machine learning models on data collected during a controlled experiment, and 2) using these models to inform an embodied AI-driven virtual musician in a real-time improvised drumming collaboration. We discuss our experimental approach to isolating a network of brain areas involved in musical improvisation with embodied AI-driven musicians, a comparative analysis of several machine learning models, and a post hoc analysis of brain activation that corroborates our findings. We then synthesize findings from participant interviews and report on the challenges and opportunities of designing music systems with fNIRS, as well as the applicability of other physiological sensing techniques to communication between humans and AI-driven musicians.
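To make the abstract's pipeline concrete, the sketch below illustrates one plausible shape for a real-time fNIRS loop: smooth a haemoglobin-concentration segment, extract simple features, and pass them to a small per-participant linear classifier whose output could steer the virtual musician's rhythm. This is an illustrative assumption, not the authors' implementation; the filter, features, classifier weights, and the `preprocess`/`extract_features`/`classify` names are all hypothetical.

```python
import numpy as np

def preprocess(raw, fs=10.0, window_s=5.0):
    """Smooth and baseline-correct one channel segment (hypothetical pipeline).

    A moving-average low-pass filter stands in for the cardiac/respiratory
    noise suppression a real fNIRS system would perform.
    """
    w = max(1, int(fs * window_s))
    kernel = np.ones(w) / w
    smoothed = np.convolve(raw, kernel, mode="same")
    return smoothed - smoothed.mean()  # baseline-correct against segment mean

def extract_features(segment):
    """Mean level and linear slope, two common fNIRS summary features."""
    t = np.arange(len(segment))
    slope = np.polyfit(t, segment, 1)[0]
    return np.array([segment.mean(), slope])

def classify(features, weights, bias):
    """Tiny linear classifier standing in for the individualized models.

    Returns 1 (e.g. 'elevated activation', prompting the AI to adapt its
    rhythm) or 0; weights/bias would be fit per participant in phase one.
    """
    return 1 if features @ weights + bias > 0 else 0
```

In a live session, a loop of this shape would run on each incoming window of optical data, with the classifier's label driving the rhythm-adjustment logic the abstract describes.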
Keywords: fNIRS (functional near-infrared spectroscopy), brain-computer interfaces, embodied AI, music, neuroscience, machine learning, music improvisation, human-computer interaction
Received: 03 Feb 2025; Accepted: 04 Jul 2025.
Copyright: © 2025 Hopkins, Sun, Weng, Ma, Crum, Hirshfield and Do. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence: Torin Hopkins, University of Colorado Boulder, Boulder, United States
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.