ORIGINAL RESEARCH article

Front. Bioeng. Biotechnol.

Sec. Biosensors and Biomolecular Electronics

Volume 13 - 2025 | doi: 10.3389/fbioe.2025.1544968

Multimodal Intelligent Biosensors Framework for Fall Disease Detection and Healthcare Monitoring

Provisionally accepted
Iqra Aijaz Abro1, Shuaa S. Alharbi2, Naif S. Alshammari3, Asaad Algarni4, Nouf Abdullah Almujally5, Ahmad Jalal1*, Hui Liu6*
  • 1Air University, Islamabad, Pakistan
  • 2Qassim University, Ar Rass, Saudi Arabia
  • 3Majmaah University, Al Majma'ah, Riyadh, Saudi Arabia
  • 4Northern Border University, Arar, Northern Borders, Saudi Arabia
  • 5Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia
  • 6University of Bremen, Bremen, Germany

The final, formatted version of the article will be published soon.

In the field of human action recognition, the fusion of multimodal data from RGB and inertial modalities provides an effective technique for identifying activities of daily living (ADL) and falls. Our approach uses two reference datasets of ADL and fall events: UR-Fall Detection and UMA_Fall Detection. First, data preprocessing is conducted separately for each sensor modality, and the signals are then windowed and segmented. Key features are then extracted: the RGB data yield 2.5D point clouds, kinetic energy, angles, curve points, and ridge features, while the inertial signals yield GCC, GMM, LPCC, and SSCE coefficients. Next, the Adam optimizer is applied to improve the discriminability of the selected features. For classification, we employed a Deep Neural Network (DNN) for ADL and fall detection on the UR-Fall and UMA_Fall datasets. The classification accuracy achieved on the UMA_Fall dataset is 97% for ADL activities and 96% for fall activities, while for the UR-Fall dataset it is 94% for ADL activities and 92% for fall activities. This classifier configuration accommodates the heterogeneity of the data and optimizes the system for differentiating between ADL and fall events. The proposed system achieves strong results in recognizing these activities on both datasets and shows that multimodal data fusion can improve human activity recognition systems for health and safety purposes.
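
The following Python sketch illustrates the general shape of such a pipeline for the inertial branch only; it is not the authors' implementation. The window length, overlap, statistical features (used here as stand-ins for the GCC/GMM/LPCC/SSCE coefficients), the choice of NumPy and scikit-learn, and the network sizes are all illustrative assumptions, and the Adam solver here only trains the network rather than reproducing the feature-level Adam optimization described in the abstract.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def sliding_windows(signal, win_len=128, overlap=0.5):
        # Segment a (T, channels) inertial stream into overlapping windows.
        step = int(win_len * (1 - overlap))
        starts = range(0, signal.shape[0] - win_len + 1, step)
        return np.stack([signal[s:s + win_len] for s in starts])

    def window_features(windows):
        # Per-window mean, standard deviation, and mean energy per axis;
        # simple stand-ins for the GCC/GMM/LPCC/SSCE coefficients named above.
        mean = windows.mean(axis=1)
        std = windows.std(axis=1)
        energy = (windows ** 2).mean(axis=1)
        return np.concatenate([mean, std, energy], axis=1)

    # Synthetic tri-axial accelerometer stream and placeholder labels
    # (0 = ADL, 1 = fall); real data would come from UR-Fall / UMA_Fall.
    rng = np.random.default_rng(0)
    stream = rng.standard_normal((5000, 3))
    X = window_features(sliding_windows(stream))
    y = rng.integers(0, 2, size=X.shape[0])

    # Small fully connected network as a stand-in for the DNN classifier.
    clf = MLPClassifier(hidden_layer_sizes=(64, 32), solver="adam",
                        max_iter=300, random_state=0)
    clf.fit(X, y)
    print("training accuracy:", clf.score(X, y))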

Keywords: biosensing devices, artificial intelligence, machine learning, body pose, disease detection, decision-making, healthcare management

Received: 13 Dec 2024; Accepted: 30 Apr 2025.

Copyright: © 2025 Abro, Alharbi, Alshammari, Algarni, Almujally, Jalal and Liu. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence:
Ahmad Jalal, Air University, Islamabad, Pakistan
Hui Liu, University of Bremen, 28359 Bremen, Germany

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.