ORIGINAL RESEARCH article
Front. Digit. Health
Sec. Connected Health
Volume 7 - 2025 | doi: 10.3389/fdgth.2025.1670464
Establishing a Real-Time Biomarker-to-LLM Interface: A Modular Pipeline for HRV Signal Acquisition, Processing, and Physiological State Interpretation via Generative AI
Provisionally accepted
1 University of Zurich, Zürich, Switzerland
2 Ruhr-Universität Bochum, Bochum, Germany
Large language models can summarize research, support clinical reasoning, and engage in coherent conversations. However, their inputs are limited to user-generated text, which reflects subjective reports, delayed responses, and consciously filtered impressions. Integrating physiological signals adds clear value: it allows language models to consider real-time indicators of autonomic state alongside linguistic input, enabling more adaptive and context-sensitive interactions in learning, decision-making, and healthcare. We therefore present a streamlined architecture for routing real-time heart rate variability (HRV) data from a wearable sensor directly into a generative AI environment. Using a validated HRV sensor, we decoded Bluetooth-transmitted R-R intervals with a custom Python script and derived core HRV metrics (HR, RMSSD, SDNN, LF/HF ratio, pNN50) in real time. These values were published via REST and WebSocket endpoints through a FastAPI backend, making them continuously accessible to external applications, including OpenAI's GPT models. The result is a live data pipeline from autonomic input to conversational output: a language model that does not just talk back, but responds to real-time physiological shifts in natural language. In multiple proof-of-concept scenarios, ChatGPT accessed real-time HRV data, performed descriptive analyses, generated visualizations, and adapted its feedback in response to autonomic shifts induced by low and high cognitive load. This system represents an early prototype of bioadaptive AI, in which physiological signals are incorporated as part of the model's input context.
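To make the metric-derivation step concrete, the following is a minimal sketch of how the named HRV metrics (HR, RMSSD, SDNN, pNN50, LF/HF) could be computed from a window of R-R intervals in Python. This is an illustrative reconstruction, not the authors' published script; the `hrv_metrics` helper, the 4 Hz resampling rate, and the Welch-based LF/HF estimate are assumptions.

```python
# Illustrative sketch (not the authors' script): derive core HRV metrics
# from a window of Bluetooth-decoded R-R intervals, given in milliseconds.
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

def hrv_metrics(rr_ms: np.ndarray) -> dict:
    """Compute HR, RMSSD, SDNN, pNN50, and LF/HF from R-R intervals (ms)."""
    rr_s = rr_ms / 1000.0
    diff = np.diff(rr_ms)

    # Time-domain metrics.
    hr = 60.0 / rr_s.mean()                      # mean heart rate (bpm)
    rmssd = np.sqrt(np.mean(diff ** 2))          # RMSSD (ms)
    sdnn = rr_ms.std(ddof=1)                     # SDNN (ms)
    pnn50 = 100.0 * np.mean(np.abs(diff) > 50)   # % successive diffs > 50 ms

    # Frequency domain (assumed approach): resample the irregularly spaced
    # R-R series at 4 Hz, then integrate Welch spectral power in the
    # LF (0.04-0.15 Hz) and HF (0.15-0.40 Hz) bands.
    t = np.cumsum(rr_s) - rr_s[0]
    fs = 4.0
    t_even = np.arange(t[0], t[-1], 1.0 / fs)
    rr_even = interp1d(t, rr_ms, kind="cubic")(t_even)
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs,
                   nperseg=min(256, len(rr_even)))
    lf_band = (f >= 0.04) & (f < 0.15)
    hf_band = (f >= 0.15) & (f < 0.40)
    lf = np.trapz(pxx[lf_band], f[lf_band])
    hf = np.trapz(pxx[hf_band], f[hf_band])

    return {"hr": hr, "rmssd": rmssd, "sdnn": sdnn,
            "pnn50": pnn50, "lf_hf": lf / hf if hf > 0 else float("nan")}
```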
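The publication layer described in the abstract (REST plus WebSocket via FastAPI) might look like the sketch below. The endpoint paths `/hrv` and `/ws/hrv`, the shared `latest_metrics` store, and the one-second push interval are assumptions for illustration; the paper does not specify them.

```python
# Illustrative sketch of a FastAPI backend that exposes the most recent
# HRV snapshot over REST and streams it over a WebSocket.
import asyncio
from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()
latest_metrics: dict = {}  # updated by the BLE decoding loop (not shown)

@app.get("/hrv")
async def get_hrv() -> dict:
    """REST endpoint: return the most recent HRV metric snapshot."""
    return latest_metrics

@app.websocket("/ws/hrv")
async def stream_hrv(ws: WebSocket):
    """WebSocket endpoint: push the current snapshot once per second."""
    await ws.accept()
    try:
        while True:
            await ws.send_json(latest_metrics)
            await asyncio.sleep(1.0)
    except WebSocketDisconnect:
        pass  # client closed the stream
```

Run with, e.g., `uvicorn app:app --port 8000`; any external application can then poll the REST endpoint or subscribe to the stream.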
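On the consumption side, one plausible way to place live HRV values into a GPT model's input context is to fetch the snapshot and embed it in the prompt, as sketched here. The model name, host/port, and prompt wording are illustrative assumptions, not the study protocol.

```python
# Illustrative sketch: fetch the live HRV snapshot from the (assumed)
# /hrv endpoint and include it in a GPT request's context.
import requests
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

hrv = requests.get("http://localhost:8000/hrv", timeout=5).json()

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model for illustration
    messages=[
        {"role": "system",
         "content": "You receive live HRV metrics and adapt your feedback "
                    "to the user's current autonomic state."},
        {"role": "user",
         "content": f"Current HRV snapshot: {hrv}. "
                    "How should I pace the next learning block?"},
    ],
)
print(response.choices[0].message.content)
```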
Keywords: Embodied AI, Physiologically Coupled Language Models, Biofeedback-Enhanced LLM Interaction, Stress Detection via AI, Affective Computing
Received: 22 Jul 2025; Accepted: 11 Sep 2025.
Copyright: © 2025 Gellisch and Burr. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence: Morris Gellisch, University of Zurich, Zürich, Switzerland
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.