ORIGINAL RESEARCH article
Front. Phys.
Sec. Radiation Detectors and Imaging
Volume 13 - 2025 | doi: 10.3389/fphy.2025.1588715
This article is part of the Research Topic: Multi-Sensor Imaging and Fusion: Methods, Evaluations, and Applications, Volume III.
Multi-Sensor Fusion for AI-Driven Behavior Planning in Medical Applications
Provisionally accepted
1 Nanjing University of Chinese Medicine, Nanjing, China
2 Southeast University, Dhaka, Bangladesh
Multi-sensor fusion has emerged as a transformative approach in AI-driven behavior planning for medical applications, significantly enhancing perception, decision-making, and adaptability in complex and dynamic environments. Traditional fusion methods rely primarily on deterministic techniques such as Kalman filters or rule-based decision models. While effective in structured settings, these methods often struggle to maintain robustness under sensor degradation, occlusions, and environmental uncertainties. Such limitations pose critical challenges for real-time decision-making in medical applications, where precision, reliability, and adaptability are paramount. To address these challenges, we propose an Adaptive Probabilistic Fusion Network (APFN), a novel framework that dynamically integrates multi-modal sensor data based on estimated sensor reliability and contextual dependencies. Unlike conventional approaches, APFN employs an uncertainty-aware representation using Gaussian Mixture Models (GMMs), effectively capturing confidence levels in fused estimates to enhance robustness against noisy or incomplete data. In addition, we incorporate an attention-driven deep fusion mechanism to extract high-level spatial-temporal dependencies, improving interpretability and adaptability. By dynamically weighting sensor inputs and optimizing feature selection, APFN ensures superior decision-making under varying medical conditions. We rigorously evaluate our approach on multiple large-scale medical datasets, comprising over one million trajectory samples across four public benchmarks. Experimental results demonstrate that APFN outperforms state-of-the-art methods, achieving up to an 8.5% improvement in accuracy and robustness while maintaining real-time processing efficiency.
These results validate APFN's effectiveness in AI-driven medical behavior planning, providing a scalable and resilient solution for next-generation healthcare technologies, with the potential to revolutionize autonomous decision-making in medical diagnostics, monitoring, and robotic-assisted interventions.
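To make the idea of reliability-based, uncertainty-aware fusion concrete, the sketch below shows a classical building block for this kind of system: inverse-variance fusion of per-sensor Gaussian estimates, with each sensor's variance inflated by an estimated reliability score so that degraded sensors are down-weighted. This is an illustrative sketch of the general principle only, not the authors' APFN; the function name, the reliability scores, and the simple scalar setting are all assumptions for demonstration.

```python
import numpy as np

def fuse_gaussian_estimates(means, variances, reliabilities):
    """Fuse per-sensor Gaussian estimates with reliability-scaled
    inverse-variance weights (a classical uncertainty-aware rule).

    Each sensor i reports (mean_i, var_i); a reliability score in (0, 1]
    inflates the effective variance of sensors judged less trustworthy,
    so they contribute less to the fused estimate.
    """
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    reliabilities = np.asarray(reliabilities, dtype=float)

    # A less reliable sensor behaves as if its variance were larger.
    effective_var = variances / np.clip(reliabilities, 1e-9, None)

    # Inverse-variance weights, normalized to sum to one.
    weights = 1.0 / effective_var
    weights /= weights.sum()

    fused_mean = float(np.dot(weights, means))
    fused_var = float(1.0 / np.sum(1.0 / effective_var))
    return fused_mean, fused_var

# Two healthy sensors agree near 1.0; a third, degraded sensor reads 5.0
# but is heavily down-weighted by its low reliability score.
m, v = fuse_gaussian_estimates(
    means=[1.0, 1.1, 5.0],
    variances=[0.1, 0.1, 0.1],
    reliabilities=[1.0, 1.0, 0.05],
)
```

In this toy case the fused mean stays close to the two agreeing sensors, and the fused variance is smaller than any single sensor's, reflecting the confidence gained from combining consistent evidence.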
Keywords: Multi-sensor fusion, AI-driven Behavior Planning, Uncertainty-aware Modeling, deep learning, Medical applications
Received: 06 Mar 2025; Accepted: 07 Jul 2025.
Copyright: © 2025 Wu, Qin, Chang and Li. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence: Mianhua Wu, Nanjing University of Chinese Medicine, Nanjing, China
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.