AUTHOR=Abro Iqra Aijaz, Alharbi Shuaa S., Alshammari Naif S., Algarni Asaad, Almujally Nouf Abdullah, Jalal Ahmad, Liu Hui
TITLE=Multimodal intelligent biosensors framework for fall disease detection and healthcare monitoring
JOURNAL=Frontiers in Bioengineering and Biotechnology
VOLUME=13
YEAR=2025
URL=https://www.frontiersin.org/journals/bioengineering-and-biotechnology/articles/10.3389/fbioe.2025.1544968
DOI=10.3389/fbioe.2025.1544968
ISSN=2296-4185
ABSTRACT=
Introduction: In the field of human action recognition, fusing multimodal data from RGB and inertial sensors provides an effective technique for identifying activities of daily living (ADL) and falls.
Methods: Our approach uses two benchmark datasets, UR-Fall Detection and UMA_Fall Detection, covering both ADL and fall events. Data preprocessing is first performed separately for each sensor type, after which the signals are windowed and segmented. Key features are then extracted: from the RGB data we obtain 2.5D point clouds, kinetic energy, angles, curve points, and ridge features, while the inertial signals yield GCC, GMM, LPCC, and SSCE coefficients. Adam optimization is then applied to improve the discriminability of the selected features. For classification, we employ a Deep Neural Network (DNN) for ADL and fall detection on both the UR-Fall and UMA_Fall datasets.
Results: The classification accuracy on the UMA_Fall dataset is 97% for ADL activities and 96% for fall activities, while on the UR-Fall dataset it is 94% for ADL activities and 92% for fall activities. This classifier configuration accommodates the variety of the data and optimizes the system for distinguishing ADL from fall events.
Discussion: The proposed system achieves strong recognition results on both datasets and demonstrates that multimodal data fusion can improve human activity recognition for health and safety applications.
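
The abstract outlines a pipeline of windowing, per-window feature extraction, and DNN classification, but gives no implementation details. The following Python sketch is a minimal, hypothetical illustration of that pipeline shape only: the window parameters, the placeholder statistical features (standing in for the paper's GCC/GMM/LPCC/SSCE and RGB-derived features), the synthetic data, and the small network are all assumptions, not the authors' code.

```python
# Hypothetical sketch of the windowing -> features -> DNN pipeline from the
# abstract. Placeholder features stand in for the paper's GCC, GMM, LPCC,
# and SSCE coefficients; labels and data are synthetic, for shape only.
import numpy as np
from sklearn.neural_network import MLPClassifier

def sliding_windows(signal, win_len=128, stride=64):
    """Segment a (time, channels) inertial stream into overlapping windows."""
    starts = range(0, len(signal) - win_len + 1, stride)
    return np.stack([signal[s:s + win_len] for s in starts])

def window_features(window):
    """Simple per-window statistics as placeholders for the real features."""
    return np.concatenate([
        window.mean(axis=0),                       # per-channel mean
        window.std(axis=0),                        # per-channel variability
        np.abs(np.diff(window, axis=0)).mean(axis=0),  # mean jerk magnitude
    ])

# Synthetic 3-axis accelerometer stream with stand-in labels
# (0 = ADL, 1 = fall), for illustration only.
rng = np.random.default_rng(0)
stream = rng.normal(size=(4096, 3))
X = np.array([window_features(w) for w in sliding_windows(stream)])
y = rng.integers(0, 2, size=len(X))

# A small feed-forward network; sklearn's MLPClassifier trains with the
# Adam solver by default, loosely mirroring the Adam-based optimization
# mentioned in the abstract.
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```

In practice, the RGB-derived features (2.5D point clouds, kinetic energy, angles, curve points, ridge features) would be computed per window from the video modality and concatenated with the inertial coefficients before classification; the sketch above shows the inertial branch only.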