ORIGINAL RESEARCH article

Front. Bioeng. Biotechnol.

Sec. Biosensors and Biomolecular Electronics

Volume 13 - 2025 | doi: 10.3389/fbioe.2025.1558529

This article is part of the Research Topic: Biomechanics, Sensing and Bio-inspired Control in Rehabilitation and Assistive Robotics, Volume II.

Intelligent Biosensors for Human Movement Rehabilitation and Intention Recognition

Provisionally accepted
Mehrab Rafiq1, Nouf Abdullah Almujally2, Asaad Algarni3, Ahmad Jalal1*, Hui Liu4*
  • 1Air University, Islamabad, Pakistan
  • 2Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia
  • 3Northern Border University, Arar, Northern Borders, Saudi Arabia
  • 4University of Bremen, Bremen, Bremen, Germany

The final, formatted version of the article will be published soon.

Advances in sensing technology have made it possible to incorporate inertial sensors, such as accelerometers and gyroscopes, into wearables and smartphones. Although these sensors now serve a variety of purposes, they were originally intended to enhance the capabilities of the device itself. Human locomotion recognition (HLR), the capacity of machines to perceive human locomotion, is an intriguing field of study with many applications in sports, fitness, medicine, and health monitoring. This study develops a sophisticated system to recognize different human movement and localization characteristics. Two datasets are used in this paper: the Extrasensory dataset and the KU-HAR dataset. The Extrasensory dataset contains data from 60 participants, including thousands of data samples from smartphone and smartwatch sensors labeled with a wide array of human activities. Our study integrates novel feature extraction techniques for signal, GPS, and audio sensor data. Specifically, GPS, audio, and IMU sensors are utilized for localization, while IMU and ambient sensors are employed for locomotion activity recognition. The KU-HAR dataset contains 18 activities gathered from 90 participants (75 males and 15 females) using smartphone sensors (accelerometer and gyroscope); it consists of 1,945 raw activity samples collected directly from the participants and 20,750 subsamples extracted from them. Because the raw sensor data are noisy, we first describe our noise removal procedure, which applies a second-order Butterworth filter to clean the raw sensor data before segmenting the signal with Hamming windows. Features are then extracted for the various sensors: skewness, energy, kurtosis, linear prediction cepstral coefficients (LPCC), and dynamic time warping (DTW) for locomotion, and step count and step length for localization. The Yeo-Johnson power transformation is employed to optimize the features. The system achieves an accuracy of 90% on the Extrasensory dataset and 91% on the KU-HAR dataset, outperforming existing state-of-the-art techniques. The proposed system not only demonstrates high accuracy on two diverse datasets but also shows strong generalization capabilities, validated through statistical testing and additional experiments.
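The preprocessing pipeline described in the abstract (a second-order Butterworth filter followed by Hamming-window segmentation) can be illustrated with a minimal sketch. The sampling rate, cut-off frequency, window length, and step size below are illustrative assumptions, not values reported by the authors.

# Sketch of the described preprocessing: second-order Butterworth filtering
# followed by Hamming-window segmentation. Sampling rate, cut-off frequency,
# window length, and overlap are illustrative assumptions only.
import numpy as np
from scipy.signal import butter, filtfilt

def denoise(signal, fs=50.0, cutoff=5.0, order=2):
    """Apply a second-order low-pass Butterworth filter to a 1-D sensor signal."""
    b, a = butter(order, cutoff / (0.5 * fs), btype="low")
    return filtfilt(b, a, signal)

def segment(signal, win_len=128, step=64):
    """Split the filtered signal into overlapping frames weighted by a Hamming window."""
    window = np.hamming(win_len)
    frames = [signal[start:start + win_len] * window
              for start in range(0, len(signal) - win_len + 1, step)]
    return np.asarray(frames)

# Example on synthetic accelerometer-like data
raw = np.random.randn(1000) + np.sin(np.linspace(0, 20 * np.pi, 1000))
frames = segment(denoise(raw))
print(frames.shape)  # (n_frames, win_len)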
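Likewise, the statistical features and the Yeo-Johnson feature optimization mentioned in the abstract can be sketched as follows. This is a simplified subset of the reported feature set (LPCC, DTW, step count, and step length are omitted), and scikit-learn's PowerTransformer is used here as one common Yeo-Johnson implementation; the article does not specify which implementation the authors used.

# Sketch of per-frame feature extraction (skewness, energy, kurtosis) and
# Yeo-Johnson feature optimization. Simplified subset of the paper's features.
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.preprocessing import PowerTransformer

def frame_features(frames):
    """Compute skewness, signal energy, and kurtosis for each frame."""
    return np.column_stack([
        skew(frames, axis=1),
        np.sum(frames ** 2, axis=1),   # energy
        kurtosis(frames, axis=1),
    ])

features = frame_features(frames)  # frames from the preprocessing sketch above
optimized = PowerTransformer(method="yeo-johnson").fit_transform(features)
print(optimized.shape)  # (n_frames, 3)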

Keywords: wearable sensors, remote sensing, rehabilitation, intelligent perception, human movement, motion intention recognition

Received: 10 Jan 2025; Accepted: 17 Jun 2025.

Copyright: © 2025 Rafiq, Almujally, Algarni, Jalal and Liu. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence:
Ahmad Jalal, Air University, Islamabad, Pakistan
Hui Liu, University of Bremen, 28359 Bremen, Germany

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.