ORIGINAL RESEARCH article
Front. Bioeng. Biotechnol.
Sec. Biosensors and Biomolecular Electronics
Volume 13 - 2025 | doi: 10.3389/fbioe.2025.1568690
This article is part of the Research Topic: Biomechanics, Sensing and Bio-inspired Control in Rehabilitation and Assistive Robotics, Volume II.
A Novel Multi-Modal Rehabilitation Monitoring Over Human Motion Intention Recognition
Provisionally accepted
- 1 Air University, Islamabad, Pakistan
- 2 King Khalid University, Abha, Saudi Arabia
- 3 Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia
- 4 University of Bremen, Bremen, Germany
Human Motion Intention Recognition (HMIR) plays a vital role in advancing medical rehabilitation and assistive technologies by enabling the early detection of pain-indicative actions such as sneezing, coughing, or back discomfort. However, existing systems struggle to recognize such subtle movements due to complex postural variations and environmental noise. This paper presents a novel multi-modal framework that integrates RGB and depth data to extract high-resolution spatio-temporal and anatomical features for accurate HMIR. Our method combines kinetic energy, optical flow, angular geometry, and depth-based features (e.g., 2.5D point clouds and random occupancy patterns) to represent full-body dynamics robustly. Stochastic Gradient Descent (SGD) is employed to optimize the feature space, and a deep neuro-fuzzy classifier is proposed to balance interpretability and predictive accuracy. Evaluated on three benchmark datasets (NTU RGB+D 120, PKUMMD, and UWA3DII), our model achieves classification accuracies of 94.50%, 91.23%, and 88.60%, respectively, significantly outperforming state-of-the-art methods. This research lays the groundwork for future real-time HMIR systems in smart rehabilitation and medical monitoring applications.
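The sketch below is a minimal, illustrative example of the kind of RGB-D feature fusion the abstract describes, not the authors' implementation: it computes a grid-based "kinetic energy" descriptor from Farneback optical flow on RGB frames, a simple random-occupancy descriptor from depth maps, and trains an SGD-optimized linear classifier (scikit-learn's SGDClassifier standing in for the SGD step; the deep neuro-fuzzy classifier is not reproduced). Frame sizes, grid and box counts, and the synthetic data are assumptions for a runnable demo.

```python
# Illustrative sketch only (assumptions noted above), not the paper's pipeline.
import numpy as np
import cv2
from sklearn.linear_model import SGDClassifier


def kinetic_energy_features(rgb_frames, grid=4):
    """Per-cell mean squared optical-flow magnitude over a frame sequence."""
    feats = np.zeros(grid * grid)
    prev = cv2.cvtColor(rgb_frames[0], cv2.COLOR_BGR2GRAY)
    for frame in rgb_frames[1:]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        energy = (flow ** 2).sum(axis=2)  # squared flow magnitude per pixel
        h, w = energy.shape
        cells = (energy[:h - h % grid, :w - w % grid]
                 .reshape(grid, h // grid, grid, w // grid)
                 .mean(axis=(1, 3)))      # average energy per spatial cell
        feats += cells.ravel()
        prev = gray
    return feats / max(len(rgb_frames) - 1, 1)


def random_occupancy_features(depth_frames, n_boxes=32, seed=0):
    """Fraction of occupied (non-zero) depth pixels in randomly placed boxes."""
    rng = np.random.default_rng(seed)
    h, w = depth_frames[0].shape
    feats = []
    for _ in range(n_boxes):
        y, x = rng.integers(0, h // 2), rng.integers(0, w // 2)
        bh, bw = rng.integers(h // 8, h // 2), rng.integers(w // 8, w // 2)
        patch = np.stack([d[y:y + bh, x:x + bw] for d in depth_frames])
        feats.append((patch > 0).mean())
    return np.array(feats)


def clip_descriptor(rgb_frames, depth_frames):
    """Fuse RGB motion features and depth occupancy features for one clip."""
    return np.concatenate([kinetic_energy_features(rgb_frames),
                           random_occupancy_features(depth_frames)])


if __name__ == "__main__":
    # Synthetic RGB-D clips stand in for real dataset samples.
    rng = np.random.default_rng(1)
    X, y = [], []
    for label in (0, 1):
        for _ in range(10):
            rgb = [rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
                   for _ in range(8)]
            depth = [(rng.random((64, 64)) > 0.5 + 0.2 * label)
                     .astype(np.uint16) for _ in range(8)]
            X.append(clip_descriptor(rgb, depth))
            y.append(label)
    clf = SGDClassifier(max_iter=1000, tol=1e-3).fit(np.array(X), y)
    print("training accuracy:", clf.score(np.array(X), y))
```

Concatenating per-modality descriptors before a single SGD-trained classifier is only one plausible fusion choice; the paper's multi-modal integration and neuro-fuzzy stage would replace the linear model shown here.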
Keywords: Motion Intention Recognition, Human-Machine Interaction, Rehabilitation, Multimodal Sensor Integration
Received: 30 Jan 2025; Accepted: 19 Jun 2025.
Copyright: © 2025 Kamal, Alshehri, Alqahtani, Alshahrani, Almujally, Jalal and Liu. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence:
Ahmad Jalal, Air University, Islamabad, Pakistan
Hui Liu, University of Bremen, 28359 Bremen, Germany
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.