- 1School of Engineering, University of Birmingham, Birmingham, United Kingdom
- 2School of Automation Engineering, Nanjing University of Information Science and Technology, Nanjing, China
- 3Faculty of Medical Rehabilitation Science, King Abdulaziz University, Jeddah, Saudi Arabia
Introduction: Accurate joint kinematics estimation is essential for understanding human movement and supporting biomechanical applications. Although optical motion capture systems are accurate, their high cost, complex setup, and limited portability restrict use outside laboratory environments. This study proposes a lightweight, physics-informed neural network for real-time joint kinematics estimation using inertial measurement units (IMUs).
Methods: We developed OrientationNN, which integrates orientation-based physical constraints into a compact multi-layer perceptron architecture to ensure biomechanically consistent joint kinematics estimation. The model was evaluated on a publicly available dataset and compared with a physics-based inverse kinematics framework (OpenSense) and conventional learning-based models including MLP, LSTM, CNN, and Transformer.
Results: OrientationNN achieved an average joint angle estimation error below 5° during ambulatory motion and consistently outperformed OpenSense across all kinematic variables. The model required only 4.9 × 10³ FLOPs per frame and 10.8 KB of parameters, demonstrating high computational efficiency suitable for real-time applications.
Conclusion: OrientationNN enables accurate and computationally efficient joint kinematics estimation from IMU data. The results highlight its potential as a cost-effective and scalable solution for wearable biomechanical and motion analysis applications.
1 Introduction
Accurate estimation of joint kinematics is fundamental to understanding human movement, advancing biomechanical modelling, and supporting the design, assessment, and optimisation of interventions and assistive technologies. When achieved in real time, such estimation enables personalised rehabilitation therapy (Patel et al., 2012) and closed-loop control of powered prostheses and exoskeletons (Kawamoto et al., 2003), contributing to objective management of motor recovery and functional performance (Baker, 2006). While optical motion capture systems remain the gold standard for laboratory-based motion analysis, their reliance on controlled environments, expensive infrastructure, and complex calibration procedures severely limits their applicability in clinical settings and daily-living scenarios (Cappozzo et al., 2005). This gap between laboratory precision and real-world accessibility has motivated the development of wearable sensing technologies for continuous motion monitoring.
Inertial measurement units (IMUs) have emerged as a promising alternative, offering compact form factors, low cost, and environmental independence (Roetenberg et al., 2009). Recent studies have demonstrated that IMU-based systems can achieve accuracies comparable to optical systems for gait analysis in controlled settings (Cutti et al., 2010; Al-Amri et al., 2018), with comparative validation studies showing strong agreement between inertial sensor-based and camera-based measurements during treadmill walking and running (Nüesch et al., 2017), and confirming concurrent validity and within-session reliability when proper calibration protocols are applied (Berner et al., 2020). However, translating IMU-derived orientations into anatomically meaningful joint angles remains a challenging inverse problem, particularly when sensor-to-segment misalignment, soft tissue artifacts, and calibration errors are present (Cooper et al., 2009).
Two primary paradigms have been developed to address this challenge: physics-based methods and data-driven machine learning approaches. Physics-based methods, such as OpenSense (Delp et al., 2007) and Xsens MVN (Roetenberg et al., 2009), employ biomechanical models combined with inverse kinematics optimisation to reconstruct joint angles from IMU orientations. By incorporating anatomical constraints — including bone-to-bone articulation, segmental parameter consistency, and physiologically valid joint ranges of motion — these methods ensure biomechanical plausibility. However, they require iterative numerical optimisation, which incurs substantial computational cost and limits their suitability for real-time use on wearable devices.
In contrast, data-driven machine learning approaches have demonstrated superior performance by learning direct mappings from IMU data to joint kinematics without explicit modelling assumptions. Recent advances include convolutional neural networks (CNNs) for spatial feature extraction (Huang et al., 2018; Yi et al., 2022), long short-term memory (LSTM) networks for temporal dependency modelling (Mundt et al., 2020; Khant et al., 2023), attention-based architectures for adaptive feature weighting (Wang et al., 2020), and transformer models for global sequence modelling (Kwon et al., 2020). These methods have achieved RMSEs below 3° in controlled settings (Mundt et al., 2020; Lea et al., 2017; Kwon et al., 2020), outperforming physics-based approaches across walking, running, and stair ambulation tasks. Furthermore, hybrid sensor fusion approaches combining IMUs with surface electromyography (sEMG) have shown promise for capturing neuromuscular dynamics (Phinyomark et al., 2018; Chen et al., 2018).
Despite these impressive results, purely data-driven models face critical limitations that hinder their translation to clinical practice and wearable deployment. First, they lack physical interpretability, operating as black boxes without guarantees of biomechanical consistency, which raises concerns for safety-critical applications such as prosthetic control (Tucker et al., 2020). Second, state-of-the-art architectures such as transformers demand substantial computational resources, which is at odds with the memory and power budgets of wearable edge devices.
In recent years, physics-informed neural networks (PINNs) have emerged as a framework that integrates physical laws with neural networks (Raissi et al., 2019; Karniadakis et al., 2021). By embedding physical laws, such as Newton-Euler equations, kinematic constraints, or conservation principles, into neural network architectures or loss functions, PINNs achieve improved data efficiency and physical consistency (Cuomo et al., 2022). Recent applications in fluid dynamics (Raissi et al., 2020), structural mechanics (Zhang et al., 2022), and robotics (Westenbroek et al., 2022) have demonstrated that physics-informed models can match or exceed purely data-driven approaches with orders of magnitude fewer parameters. However, despite growing interest in biomechanics applications (Linka et al., 2021), PINNs have rarely been applied to IMU-based joint kinematics estimation, and no existing work has systematically addressed the computational efficiency requirements for edge deployment in wearable systems.
To address these gaps, this study introduces OrientationNN, a lightweight physics-informed neural network that integrates orientation-based kinematic constraints with compact multi-layer perceptron (MLP) subnetworks for real-time lower limb joint angle estimation from IMU data. Unlike purely data-driven models that learn arbitrary input-output mappings, OrientationNN explicitly encodes the rotational relationships between adjacent body segments using learnable rotation matrices that represent sensor-to-segment calibrations, combined with dynamic MLP modules that capture subject-specific non-rigid motion artifacts. This hybrid architecture preserves biomechanical interpretability while achieving computational efficiency through modular, joint-specific processing with minimal parameter overhead.
Our research objectives are:
1. To develop a lightweight physics-informed neural network (OrientationNN) that integrates orientation-based physical constraints with a compact MLP architecture for accurate (RMSEs below 5°) and efficient (model size under 20 KB) estimation of lower-limb joint kinematics from IMU data.
2. To analyse and compare the error distribution of the proposed OrientationNN and the physics-based biomechanics model (i.e., OpenSense, an IMU-driven inverse kinematics toolbox), providing insights into model behavior and optimisation throughout the gait cycle.
3. To ensure efficient, real-time, and edge-deployable kinematics estimation by reducing computational cost and parameter dependency.
2 Methods
2.1 Problem statement
Our objective was to provide a high-accuracy and real-time automated estimation of lower limb joint kinematics using wearable IMU sensors.
We used a publicly available treadmill walking dataset comprising recordings from 14 healthy adults (7 males and 7 females), each conducting five approximately 7-min walking trials at a self-selected speed (Bailey et al., 2021). The dataset included measurements from eight IMU sensors (Xsens) mounted on the trunk, lower back, left and right thighs, left and right shanks, and left and right feet. In this study, the trunk IMU was excluded from modelling, following previous work (Ma et al., 2025). The signals were sampled at 60 Hz, resulting in around 25,000 data points per 7-min trial, with variations depending on trial durations.
In addition, the dataset provides lower limb joint angles computed through OpenSim’s inverse kinematics (IK) based on marker trajectory data. As OpenSim is a validated and widely accepted framework for estimating joint kinematics (Holzbaur et al., 2005), the OpenSim-derived joint angles were used as the reference standard for evaluating the proposed neural network model (Figure 1).
Figure 1. Schematic of the experimental setup and data acquisition system. IMU sensors were attached to the lower back, thighs, shanks, and feet to record lower limb kinematics, while a marker-based optical motion capture system was used to obtain the ground truth joint angles for model evaluation.
Following data quality screening based on inter-system synchronisation accuracy, recordings with synchronization errors exceeding 50 ms were excluded. As a result, recordings from ten participants were retained for further analysis. Both IMU-driven (OpenSense) and optoelectronic-driven (OpenSim) inverse kinematics solutions were used for model comparison. The IMU data were synchronised with the marker-based data in MATLAB to eliminate signal delay.
Lower limb joint kinematics, including hip flexion, hip adduction, hip rotation, knee flexion, ankle dorsiflexion, and ankle inversion, were estimated. Neural network models were trained and evaluated under an intra-subject scenario, in which model training and testing were performed using data from the same individual. For each participant, the dataset was partitioned into training (60%), validation (20%), and test (20%) subsets. The training data were used to fit the model, the validation data to optimise the network architecture and hyperparameters, and the test data to evaluate final performance.
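The chronological 60/20/20 per-participant partition described above can be sketched as follows; the function `split_trial` and the exact slicing boundaries are illustrative assumptions, not the authors' pipeline.

```python
# Hypothetical sketch of the per-participant chronological split
# (60% train / 20% validation / 20% test). Exact boundary handling
# in the original pipeline is an assumption.
def split_trial(samples, train_frac=0.6, val_frac=0.2):
    """Partition a time-ordered sequence of samples into train/val/test."""
    n = len(samples)
    n_train = int(n * train_frac)
    n_val = int(n * val_frac)
    train = samples[:n_train]
    val = samples[n_train:n_train + n_val]
    test = samples[n_train + n_val:]
    return train, val, test

# Example: ~25,000 frames from one 7-min trial sampled at 60 Hz
# (7 min x 60 s x 60 Hz = 25,200 frames).
train, val, test = split_trial(list(range(25_200)))
```

Splitting chronologically (rather than shuffling) keeps the test segment temporally disjoint from training data, which better reflects real-time deployment.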
2.2 OrientationNN
2.2.1 Model summary
We proposed OrientationNN, a lightweight architecture that integrates orientation-based physical information with compact multi-layer perceptrons (MLPs), ensuring computational efficiency and adherence to segmental constraints. The modelling process began by grouping adjacent IMU orientations as shown in Figure 2b to calculate the corresponding joint angles. The model takes the orientations of two adjacent segments as input and estimates their relative rotation matrix, representing the corresponding joint motion (Figure 2c). These joint motion matrices were converted into Euler angles to compute the loss function, as segmental kinematics are conventionally represented using Euler angles.
Figure 2. OrientationNN architecture. (a) Model input. (b) Grouping of adjacent IMU orientations for joint kinematics estimation. (c) Computation flow from relative rotation matrices to Euler angles.
The proposed model incorporates joint-specific dynamic modules. Each joint employs a tiny MLP subnetwork (input = 3, hidden = 32, output = 3) to predict dynamic offsets that compensate for subject-specific, non-rigid motion artifacts.
2.2.2 Basic module
Assume that we have two adjacent segment orientations R_1 and R_2, expressed in the global frame. The joint rotation matrix R_J can then be calculated as shown in Equation 1:

R_J = R_1ᵀ R_2   (1)

where R_1 and R_2 denote the orientation matrices of the proximal and distal segments, respectively, and R_J represents the relative rotation at the joint connecting them.
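A minimal numeric sketch of this basic step, assuming a Z-Y-X Euler sequence for illustration (the exact biomechanical rotation sequence used in the paper is not reproduced here):

```python
import math

# Sketch (not the authors' code): relative rotation between two adjacent
# segment orientations, followed by Euler-angle extraction.

def transpose(R):
    return [[R[j][i] for j in range(3)] for i in range(3)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def relative_rotation(R_prox, R_dist):
    # Joint rotation expressed in the proximal-segment frame: R_prox^T R_dist.
    return matmul(transpose(R_prox), R_dist)

def euler_zyx(R):
    # Extract Z-Y-X (yaw-pitch-roll) Euler angles in radians.
    yaw = math.atan2(R[1][0], R[0][0])
    pitch = math.asin(max(-1.0, min(1.0, -R[2][0])))
    roll = math.atan2(R[2][1], R[2][2])
    return yaw, pitch, roll

# A 30-degree relative rotation about z is recovered from two absolute
# orientations of 10 and 40 degrees.
R1 = rot_z(math.radians(10))
R2 = rot_z(math.radians(40))
yaw, pitch, roll = euler_zyx(relative_rotation(R1, R2))
```

In practice a library routine (e.g. a quaternion/rotation-matrix toolbox) would replace the hand-rolled matrix code; the sketch only makes the R_proxᵀ R_dist structure explicit.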
2.2.3 Learnable static module
However, only IMU orientations are measured directly in this study. Thus, we added two learnable static rotation matrices to replace the sensor-to-segment orientations (Mundt et al., 2020). Then the joint rotation matrix can be calculated as shown in Equation 2:

R_J = (R_1 C_1)ᵀ (R_2 C_2)   (2)

where C_1 and C_2 are the learnable static rotation matrices representing the sensor-to-segment calibrations of the proximal and distal segments, respectively.
2.2.4 Dynamic module
To further help the model approximate the joint motion, we also introduced an extra dynamic module containing tiny MLP models. Two dynamic orientation matrices were first calculated as shown in Equations 3, 4, where each tiny MLP subnetwork predicts three per-frame offset angles that are converted into the corresponding dynamic rotation matrices.
2.2.5 Concatenation module
Then, the joint angles can be calculated as shown in Equations 5, 6, in which the static and dynamic rotation matrices are composed into the final joint rotation matrix, which is subsequently converted into Euler angles representing the joint kinematics.
2.2.6 Weighted Euler loss
The loss function is calculated as shown in Equation 7, a weighted mean squared error over the Euler-angle components, where the weights assign greater importance to biomechanically critical directions such as flexion/extension and rotation.
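As a concrete illustration, a weighted mean-squared Euler-angle loss can be written as below; the weight values are hypothetical, since the paper specifies only that biomechanically critical directions receive larger weights.

```python
# Illustrative weighted Euler-angle loss (weights are assumptions
# for demonstration, not the values used in the study).
def weighted_euler_loss(pred, target, weights):
    """Weighted mean squared error over Euler-angle components (degrees)."""
    assert len(pred) == len(target) == len(weights)
    total = sum(w * (p - t) ** 2 for p, t, w in zip(pred, target, weights))
    return total / len(pred)

# Flexion/extension weighted 2x relative to the other two axes.
loss = weighted_euler_loss([10.0, 2.0, 1.0], [8.0, 2.0, 1.0], [2.0, 1.0, 1.0])
```

Weighting the loss this way steers gradient updates toward the angle components that matter most clinically, at the cost of slightly larger errors in the down-weighted axes.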
2.3 Baseline machine learning models
For comparison with our proposed model OrientationNN, we adopted multiple neural network models from the literature. The model architectures and hyperparameters were optimised using Optuna (Akiba et al., 2019), which applies Bayesian optimisation to identify the optimal architecture and hyperparameters for each model. The optimal hyperparameters of the baseline machine learning models are listed in Table 1.
2.3.1 MLP
The MLP network comprises two fully connected hidden layers, each containing 128 neurons with rectified linear unit (ReLU) activation functions. To prevent overfitting, a dropout layer with a rate of 0.2 is applied after each hidden layer. The input layer receives 63 features, and the output layer produces 12 joint angle estimations. Model parameters are optimized using the Adam optimizer with a learning rate of 0.01 and weight decay regularization.
2.3.2 LSTM
The LSTM model consists of a single recurrent layer with 128 hidden units and an input size of 63. A dropout rate of 0.2 is applied to the LSTM outputs to reduce overfitting. The final fully connected layer maps the 128-dimensional hidden representation to 12 output joint angles. Training is performed for 100 epochs using the Adam optimizer (learning rate = 0.01) with weight decay regularization.
2.3.3 CNN
The CNN model performs spatiotemporal feature extraction using a two-dimensional convolutional layer (Conv2D) with 64 filters and a kernel size of 3, followed by ReLU activation and a dropout rate of 0.2. The feature maps are then flattened and passed through a fully connected layer that outputs 12 joint angle predictions. The input features are arranged as a two-dimensional tensor before convolution.
2.3.4 Transformer
The Transformer-based model begins with a linear projection layer that maps the input to a 64-dimensional embedding space. This embedding is processed by three Transformer encoder layers with a model dimension of 64, two attention heads, a feedforward dimension of 256, and a dropout rate of 0.1. The output sequence is flattened and passed through a final linear layer to produce 12 joint angle outputs. Training is performed for 100 epochs using the Adam optimizer (learning rate = 0.005) with weight decay regularization.
2.4 Model evaluation
2.4.1 Performance metrics
Model accuracy was assessed using the root mean squared error (RMSE), defined as RMSE = sqrt((1/N) Σᵢ (θ̂ᵢ − θᵢ)²), where θ̂ᵢ and θᵢ denote the estimated and reference joint angles and N is the number of samples. Two performance requirements were defined:
1. The estimation of joint angles should achieve an RMSE of less than 5°.
2. Joint angle estimation should be feasible on resource-limited wearable devices; thus, the memory requirements and computational complexity of the solutions should be minimized. In line with our research objectives, a practical efficiency threshold of a model size under 20 KB was adopted.
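The RMSE metric above can be computed straightforwardly; a minimal sketch with illustrative angle values in degrees:

```python
import math

# RMSE between estimated and reference joint angles (degrees).
def rmse(estimates, references):
    assert len(estimates) == len(references)
    sq = [(e - r) ** 2 for e, r in zip(estimates, references)]
    return math.sqrt(sum(sq) / len(sq))

err = rmse([10.0, 12.0, 14.0], [11.0, 10.0, 14.0])
```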
2.4.2 Statistics
To evaluate the statistical significance of performance differences between OrientationNN and the baseline models, we conducted an independent samples t-test. Before performing the t-test, we assessed the normality of the performance metrics to ensure that the data met the assumption of a normal distribution. For each model (MLP, CNN, LSTM, Transformer, and OrientationNN), we trained 10 separate instances using optimised architectures and hyperparameters. The mean test performance across 10 participants was recorded for each training instance. This resulted in 10 independent performance values per model (N = 10), which were then compared using MATLAB’s two-tailed independent samples t-test.
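The comparison described above amounts to a pooled-variance two-sample t statistic; a sketch follows. The RMSE values used here are illustrative only, not results from the study.

```python
import math

# Two-tailed independent-samples (pooled-variance) t statistic,
# comparable to MATLAB's default two-sample t-test. Inputs would be
# the N = 10 mean test RMSEs per model; the numbers below are made up.
def t_statistic(a, b):
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variance of a
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)  # sample variance of b
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Lower RMSEs for model A than model B give a large negative t value.
t = t_statistic([3.1, 3.2, 3.0, 3.1], [4.0, 4.2, 3.9, 4.1])
```

The p-value would then be obtained from the t distribution with na + nb − 2 degrees of freedom, as MATLAB does internally.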
3 Results
3.1 OrientationNN vs. OpenSense
Figure 3 shows that OrientationNN achieved significantly lower RMSEs than the OpenSense inverse kinematics approach across all 12 joint kinematic variables.
Figure 3. Comparison of average RMSE between OrientationNN and OpenSense across 12 lower limb joint angles.
The intra-subject evaluation demonstrated that OrientationNN achieved significantly lower RMSEs across all 12 kinematic variables compared with the other machine learning models.
3.2 Joint angle profiles
Figure 4 illustrates the estimated lower limb joint angles over a complete gait cycle (0%–100%) for 12 kinematic variables, including hip, knee, and ankle joints on both right and left limbs. Ground-truth trajectories obtained from optical motion capture are shown for reference (Baseline, cyan).
Figure 4. Joint angle estimation results over the whole gait cycle. Shaded regions represent the across-subject standard deviation at each normalized time point, illustrating the variability of joint-angle estimates across participants. In the time-normalized gait cycle (0%–100%), heel strike corresponds to 0% and 100%, while toe-off occurs around 50%–60%.
Across all twelve kinematic variables, the error distributions differed between OrientationNN and OpenSense. For OrientationNN, small deviations from the baseline were mainly observed around heel strike (0%–10%) and toe-off (50%–60%), corresponding to rapid transitions in segment motion. Most local fluctuations were mild.
3.3 Model efficiency analysis
The bubble chart (Figure 5) illustrates the trade-off between computational complexity (FLOPs) and prediction error (RMSE) for different neural network models. The bubble size represents the model parameter size (in KB). The proposed OrientationNN achieved an RMSE of 3.13° while requiring only 4.9 × 10³ FLOPs per frame and 10.8 KB of parameters, occupying the most favourable region of the accuracy-efficiency trade-off.
4 Discussion
This work highlights a significant advancement in IMU-based joint angle estimation, delivering three main contributions. First, we proposed OrientationNN, a lightweight AI model that integrates physical information with tiny MLP subnetworks while achieving clinically relevant accuracy. Our results indicate its superiority over the physics-model-based solution. Second, we revealed the error distributions of both the proposed AI model and the OpenSense model, providing insight into where model design should focus over the whole gait cycle. Finally, by introducing physical information into the AI model, we significantly reduced its reliance on computational resources, enabling more cost-effective applications.
4.1 OrientationNN vs. OpenSense
Compared with the traditional physics-based OpenSense approach, OrientationNN demonstrated superior accuracy and stability across all joint angle channels. The experimental results showed that the average RMSEs of OrientationNN were below 5° for all 12 gait kinematic channels, significantly lower than those of OpenSense (5°–10°).
From an algorithmic perspective, OpenSense relies on inverse kinematics optimization using inertial sensor signals, whose performance is easily affected by sensor drift, noise accumulation, and misalignment errors (McConnochie et al., 2025). In contrast, OrientationNN directly learns the mapping between IMU orientations and joint rotations in an end-to-end manner, effectively mitigating cumulative errors introduced by optimization. Furthermore, the introduction of the weighted Euler-angle loss enhances the model’s sensitivity to biomechanically critical directions (Liu and Popović, 2002), such as flexion/extension and rotation.
4.2 Joint kinematics profiles
The gait cycle analysis further revealed that OrientationNN achieved superior phase responsiveness in kinematic estimation. When compared with ground-truth trajectories obtained from optical motion capture, the predicted joint angle curves from OrientationNN exhibited high consistency across the entire gait cycle, with minimal phase lag in key movements such as hip flexion, abduction, and rotation. Conversely, OpenSense showed substantial deviations and fluctuations, particularly in hip rotation and ankle inversion during highly dynamic phases (McConnochie et al., 2025; Suvorkin et al., 2024).
In terms of error distribution, OrientationNN’s deviations were mainly concentrated in transition phases (0%–10% heel strike, 50%–60% toe-off), which are characterized by rapid angular acceleration and high inertial variability (Burnfield, 2010). Even in these challenging segments, local errors remained acceptable, and the distribution was symmetric between limbs, indicating robust inter-limb consistency. In contrast, OpenSense exhibited larger phase-dependent drift, with errors up to 8°–10° during mid-swing (70%–90%) for hip and knee flexion.
These results demonstrate that OrientationNN not only improves overall accuracy but also enhances the smoothness and physiological plausibility of the estimated trajectories. Such stability is crucial for clinical applications including gait abnormality detection and neurorehabilitation evaluation (Patel et al., 2012), where reliable and continuous kinematic signals are required for real-time feedback and control.
4.3 Model efficiency analysis
The efficiency analysis highlights the strong trade-off achieved by OrientationNN between computational complexity and predictive performance. The proposed model achieved an average RMSE of 3.13° while requiring only 4.9 × 10³ FLOPs per frame and 10.8 KB of parameters, far less than the baseline deep learning models.
The lightweight advantage of OrientationNN arises from its modular design: each joint is modeled by a compact MLP subnetwork combined with learnable static rotation matrices, preserving biomechanical interpretability while minimizing parameter redundancy (Han et al., 2015). Moreover, the dynamic rotation compensation module further refines non-linear rotational behavior without significantly increasing computational cost. These features make OrientationNN particularly suitable for deployment on wearable devices, rehabilitation robots (Dollar and Herr, 2008), and prosthetic systems, where low power consumption and real-time feedback are critical.
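As a rough sanity check on the reported efficiency, the per-frame FLOPs of the joint-specific dynamic modules can be approximated from the architecture described above. The allowance of three 3×3 matrix compositions per joint is an assumption for illustration, not the paper's exact operation count.

```python
# Approximate per-frame FLOPs: six joints, each with a tiny MLP
# (3 -> 32 -> 3, dense layers, one multiply + one add per weight)
# plus an assumed three 3x3 rotation-matrix compositions.
def dense_flops(n_in, n_out):
    return 2 * n_in * n_out  # multiply-accumulate counted as 2 FLOPs

per_joint_mlp = dense_flops(3, 32) + dense_flops(32, 3)   # 384 FLOPs
matmul_3x3 = 2 * 3 * 3 * 3                                # 54 FLOPs per matmul
n_joints = 6
total = n_joints * (per_joint_mlp + 3 * matmul_3x3)       # ~3.3e3 FLOPs
```

The estimate lands in the same order of magnitude as the reported 4.9 × 10³ FLOPs per frame, consistent with the dominant cost being the tiny MLP subnetworks.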
4.4 Limitation and future work
Although OrientationNN achieved strong performance in both accuracy and computational efficiency, several limitations remain. First, this study was conducted using treadmill walking data from healthy adults under controlled laboratory conditions. Although this setup allows reliable baseline evaluation, real-world variability such as sensor noise, placement changes, and environmental disturbances may affect model performance, and robustness under more complex scenarios, such as outdoor walking, uneven terrain (Hamacher et al., 2011), or sensor displacement, has yet to be validated (Prisco et al., 2025). Moreover, the current framework focuses solely on joint kinematics estimation, without explicitly modelling underlying joint dynamics such as torques or interaction forces (Thelen and Anderson, 2006). Finally, although OrientationNN reduces dependence on explicit calibration by learning static rotation matrices, a minimal sensor-to-segment calibration is still required for real-time deployment, and these orientations cannot be entirely pre-trained. Future work will therefore focus on validating the proposed method under more diverse conditions, including data augmentation and real-world walking scenarios, before extending it to pathological or rehabilitation applications.
Future work will focus on several significant directions. First, we plan to extend the current framework from kinematic estimation to dynamic modelling (Schwartz et al., 2008), enabling the prediction of joint moments and interaction forces directly from IMU data. Second, we aim to deploy and validate OrientationNN on real embedded and edge devices, assessing its real-time performance, energy efficiency, and latency in practical scenarios such as wearable gait monitoring and robotic assistance (Sze et al., 2017). Moreover, the framework can be integrated with more powerful machine learning models to achieve superior prediction accuracy over current baseline AI models.
5 Conclusion
This study presented OrientationNN, a lightweight and physics-informed neural network framework for IMU-based lower limb joint kinematics estimation. By embedding orientation-based physical constraints within a compact MLP architecture, the proposed model achieved both high estimation accuracy and biomechanical interpretability. Experimental results demonstrated that OrientationNN outperformed the traditional physics-based OpenSense method, achieving average RMSEs below 5° across twelve gait kinematic channels, thus meeting clinical relevance for gait assessment (Armand et al., 2016). Additionally, OrientationNN achieved this performance with substantially reduced computational complexity and parameter size, making it well suited for deployment on edge and wearable devices (Taylor et al., 2017).
Data availability statement
The original contributions presented in the study are included in the article/supplementary material, further inquiries can be directed to the corresponding author.
Author contributions
QB: Conceptualization, Data curation, Formal Analysis, Investigation, Methodology, Software, Validation, Visualization, Writing – original draft, Writing – review and editing. HW: Conceptualization, Formal Analysis, Investigation, Validation, Writing – review and editing. KA: Investigation, Project administration, Resources, Supervision, Writing – review and editing. ZD: Funding acquisition, Project administration, Resources, Supervision, Writing – review and editing.
Funding
The author(s) declared that financial support was received for this work and/or its publication. This work was supported by the Engineering and Physical Sciences Research Council (EPSRC) under Grant EP/V057138/1.
Conflict of interest
The author(s) declared that this work was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Generative AI statement
The author(s) declared that generative AI was used in the creation of this manuscript, to polish the language.
Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
Akiba, T., Sano, S., Yanase, T., Ohta, T., and Koyama, M. (2019). “Optuna: a next-generation hyperparameter optimization framework,” in Proceedings of the 25th ACM SIGKDD international conference on knowledge discovery and data mining, 2623–2631.
Al-Amri, M., Nicholas, K., Button, K., Sparkes, V., Sheeran, L., and Davies, J. L. (2018). Inertial measurement units for clinical movement analysis: reliability and concurrent validity. Sensors 18, 719. doi:10.3390/s18030719
Armand, S., Decoulon, G., and Bonnefoy-Mazure, A. (2016). Gait analysis in children with cerebral palsy. EFORT Open Reviews 1, 448–460. doi:10.1302/2058-5241.1.000052
Bailey, C. A., Uchida, T. K., Nantel, J., and Graham, R. B. (2021). Validity and sensitivity of an inertial measurement unit-driven biomechanical model of motor variability for gait. Sensors 21, 7690. doi:10.3390/s21227690
Baker, R. (2006). Gait analysis methods in rehabilitation. J. Neuroengineering Rehabilitation 3, 4. doi:10.1186/1743-0003-3-4
Berner, K., Cockcroft, J., Morris, L. D., and Louw, Q. (2020). Concurrent validity and within-session reliability of gait kinematics measured using an inertial motion capture system with repeated calibration. J. Bodyw. Mov. Ther. 24, 251–260. doi:10.1016/j.jbmt.2020.06.008
Bian, Q., Castellani, M., Shepherd, D., Duan, J., and Ding, Z. (2024). Gait intention prediction using a lower-limb musculoskeletal model and long short-term memory neural networks. IEEE Trans. Neural Syst. Rehabilitation Eng. 32, 822–830. doi:10.1109/TNSRE.2024.3365201
Cappozzo, A., Della Croce, U., Leardini, A., and Chiari, L. (2005). Human movement analysis using stereophotogrammetry: part 1: theoretical background. Gait and Posture 21, 186–196. doi:10.1016/j.gaitpost.2004.01.010
Chen, C., Zhuang, Y., Nie, F., Yang, Y., Wu, F., and Xiao, J. (2011). Learning a 3d human pose distance metric from geometric pose descriptor. IEEE Trans. Vis. Comput. Graph. 17, 1676–1689. doi:10.1109/TVCG.2010.272
Chen, J., Zhang, X., Cheng, Y., and Xi, N. (2018). Surface emg based continuous estimation of human lower limb joint angles by using deep belief networks. Biomed. Signal Process. Control 40, 335–342. doi:10.1016/j.bspc.2017.10.002
Cooper, G., Sheret, I., McMillian, L., Siliverdis, K., Sha, N., Hodgins, D., et al. (2009). Inertial sensor-based knee flexion/extension angle estimation. J. Biomechanics 42, 2678–2685. doi:10.1016/j.jbiomech.2009.08.004
Cuomo, S., Di Cola, V. S., Giampaolo, F., Rozza, G., Raissi, M., and Piccialli, F. (2022). Scientific machine learning through physics–informed neural networks: where we are and what’s next. J. Sci. Comput. 92, 88. doi:10.1007/s10915-022-01939-z
Cutti, A. G., Ferrari, A., Garofalo, P., Raggi, M., Cappello, A., and Ferrari, A. (2010). ‘outwalk’: a protocol for clinical gait analysis based on inertial and magnetic sensors. Med. and Biological Engineering and Computing 48, 17–25. doi:10.1007/s11517-009-0545-x
Delp, S. L., Anderson, F. C., Arnold, A. S., Loan, P., Habib, A., John, C. T., et al. (2007). Opensim: open-source software to create and analyze dynamic simulations of movement. IEEE Transactions Biomedical Engineering 54, 1940–1950. doi:10.1109/TBME.2007.901024
Dollar, A. M., and Herr, H. (2008). Lower extremity exoskeletons and active orthoses: challenges and state-of-the-art. IEEE Trans. Robotics 24, 144–158. doi:10.1109/tro.2008.915453
Fedus, W., Zoph, B., and Shazeer, N. (2022). Switch transformers: scaling to trillion parameter models with simple and efficient sparsity. J. Mach. Learn. Res. 23, 1–39.
Hamacher, D., Singh, N., Van Dieën, J. H., Heller, M., and Taylor, W. R. (2011). Kinematic measures for assessing gait stability in elderly individuals: a systematic review. J. R. Soc. Interface 8, 1682–1698. doi:10.1098/rsif.2011.0416
Han, S., Pool, J., Tran, J., and Dally, W. (2015). Learning both weights and connections for efficient neural network. Adv. Neural Information Processing Systems 28.
Holzbaur, K. R., Murray, W. M., and Delp, S. L. (2005). A model of the upper extremity for simulating musculoskeletal surgery and analyzing neuromuscular control. Ann. Biomedical Engineering 33, 829–840. doi:10.1007/s10439-005-3320-7
Huang, Y., Kaufmann, M., Aksan, E., Black, M. J., Hilliges, O., and Pons-Moll, G. (2018). Deep inertial poser: learning to reconstruct human pose from sparse inertial measurements in real time. ACM Trans. Graph. (TOG) 37, 1–15. doi:10.1145/3272127.3275108
Karniadakis, G. E., Kevrekidis, I. G., Lu, L., Perdikaris, P., Wang, S., and Yang, L. (2021). Physics-informed machine learning. Nat. Rev. Phys. 3, 422–440. doi:10.1038/s42254-021-00314-5
Kawamoto, H., Lee, S., Kanbe, S., and Sankai, Y. (2003). “Power assist method for hal-3 using emg-based feedback controller,” in SMC’03 conference proceedings. 2003 IEEE international conference on systems, man and cybernetics. Conference theme - system security and assurance (Cat. No.03CH37483), 2, 1648–1653. doi:10.1109/ICSMC.2003.1244649
Khant, M., Gouwanda, D., Gopalai, A. A., Lim, K. H., and Foong, C. C. (2023). Estimation of lower extremity muscle activity in gait using the wearable inertial measurement units and neural network. Sensors 23, 556. doi:10.3390/s23010556
Kingma, D. P., and Ba, J. (2014). Adam: a method for stochastic optimization. arXiv preprint arXiv:1412.6980.
Kok, M., Hol, J. D., and Schön, T. B. (2017). Using inertial sensors for position and orientation estimation. Found. Trends Signal Process. 11, 1–153. doi:10.1561/2000000094
Kwon, H., Tong, C., Haresamudram, H., Gao, Y., Abowd, G. D., Lane, N. D., et al. (2020). Imutube: automatic extraction of virtual on-body accelerometry from video for human activity recognition. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 4, 1–29. doi:10.1145/3411841
Lane, N. D., Bhattacharya, S., Georgiev, P., Forlivesi, C., and Kawsar, F. (2015). “An early resource characterization of deep learning on wearables, smartphones and internet-of-things devices,” in Proceedings of the 2015 international workshop on internet of things towards applications, 7–12.
Lea, C., Flynn, M. D., Vidal, R., Reiter, A., and Hager, G. D. (2017). “Temporal convolutional networks for action segmentation and detection,” in Proceedings of the IEEE conference on computer vision and pattern recognition, 156–165.
Linka, K., Hillgärtner, M., Abdolazizi, K. P., Aydin, R. C., Itskov, M., and Cyron, C. J. (2021). Constitutive artificial neural networks: a fast and general approach to predictive data-driven constitutive modeling by deep learning. J. Comput. Phys. 429, 110010. doi:10.1016/j.jcp.2020.110010
Liu, C. K., and Popović, Z. (2002). Synthesis of complex dynamic character motion from simple animations. ACM Trans. Graph. (TOG) 21, 408–416. doi:10.1145/566570.566596
Ma, P., Bian, Q., Kim, J. M., Alsayed, K., and Ding, Z. (2025). Measuring lower-limb kinematics in walking: wearable sensors achieve comparable reliability to motion capture systems and smartphone cameras. Sensors 25, 2899. doi:10.3390/s25092899
McConnochie, G., Fox, A. S., Bellenger, C., and Thewlis, D. (2025). Optimal control simulations tracking wearable sensor signals provide comparable running gait kinematics to marker-based motion capture. PeerJ 13, e19035. doi:10.7717/peerj.19035
McGinley, J. L., Baker, R., Wolfe, R., and Morris, M. E. (2009). The reliability of three-dimensional kinematic gait measurements: a systematic review. Gait and Posture 29, 360–369. doi:10.1016/j.gaitpost.2008.09.003
Mundt, M., Koeppe, A., David, S., Witter, T., Bamer, F., Potthast, W., et al. (2020). Estimation of gait mechanics based on simulated and measured imu data using an artificial neural network. Front. Bioeng. Biotechnol. 8, 41. doi:10.3389/fbioe.2020.00041
Nüesch, C., Roos, E., Pagenstert, G., and Mündermann, A. (2017). Measuring joint kinematics of treadmill walking and running: comparison between an inertial sensor based system and a camera-based system. J. Biomech. 57, 32–38. doi:10.1016/j.jbiomech.2017.03.015
Patel, S., Park, H., Bonato, P., Chan, L., and Rodgers, M. (2012). A review of wearable sensors and systems with application in rehabilitation. J. Neuroeng. Rehabil. 9, 21. doi:10.1186/1743-0003-9-21
Peters, D. M., O’Brien, E. S., Kamrud, K. E., Roberts, S. M., Rooney, T. A., Thibodeau, K. P., et al. (2021). Utilization of wearable technology to assess gait and mobility post-stroke: a systematic review. J. Neuroeng. Rehabil. 18, 67. doi:10.1186/s12984-021-00863-x
Phinyomark, A., Khushaba, R. N., and Scheme, E. (2018). Feature extraction and selection for myoelectric control based on wearable emg sensors. Sensors 18, 1615. doi:10.3390/s18051615
Prisco, G., Pirozzi, M. A., Santone, A., Esposito, F., Cesarelli, M., Amato, F., et al. (2025). Validity of wearable inertial sensors for gait analysis: a systematic review. Diagnostics 15, 36. doi:10.3390/diagnostics15010036
Raissi, M., Perdikaris, P., and Karniadakis, G. E. (2019). Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. J. Comput. Phys. 378, 686–707. doi:10.1016/j.jcp.2018.10.045
Raissi, M., Yazdani, A., and Karniadakis, G. E. (2020). Hidden fluid mechanics: learning velocity and pressure fields from flow visualizations. Science 367, 1026–1030. doi:10.1126/science.aaw4741
Reddi, V. J., Cheng, C., Kanter, D., Mattson, P., Schmuelling, G., Wu, C.-J., et al. (2020). “Mlperf inference benchmark,” in 2020 ACM/IEEE 47th annual international symposium on computer Architecture (ISCA) (IEEE), 446–459.
Roetenberg, D., Luinge, H., and Slycke, P. (2009). Xsens mvn: full 6dof human motion tracking using miniature inertial sensors. Xsens Motion Technol. BV, Tech. Rep. 1, 1–7.
Schwartz, M. H., Rozumalski, A., and Trost, J. P. (2008). The effect of walking speed on the gait of typically developing children. J. Biomech. 41, 1639–1650. doi:10.1016/j.jbiomech.2008.03.015
Seel, T., Raisch, J., and Schauer, T. (2014). Imu-based joint angle measurement for gait analysis. Sensors 14, 6891–6909. doi:10.3390/s140406891
Seth, A., Hicks, J. L., Uchida, T. K., Habib, A., Dembia, C. L., Dunne, J. J., et al. (2018). Opensim: simulating musculoskeletal dynamics and neuromuscular control to study human and animal movement. PLoS Comput. Biol. 14, e1006223. doi:10.1371/journal.pcbi.1006223
Solà, J., Deray, J., and Atchuthan, D. (2018). A micro lie theory for state estimation in robotics. arXiv preprint arXiv:1812.01537.
Suvorkin, V., Garcia-Fernandez, M., González-Casado, G., Li, M., and Rovira-Garcia, A. (2024). Assessment of noise of mems imu sensors of different grades for gnss/imu navigation. Sensors 24, 1953. doi:10.3390/s24061953
Sze, V., Chen, Y.-H., Yang, T.-J., and Emer, J. S. (2017). Efficient processing of deep neural networks: a tutorial and survey. Proc. IEEE 105, 2295–2329. doi:10.1109/jproc.2017.2761740
Taylor, L., Miller, E., and Kaufman, K. R. (2017). Static and dynamic validation of inertial measurement units. Gait and Posture 57, 80–84. doi:10.1016/j.gaitpost.2017.05.026
Thelen, D. G., and Anderson, F. C. (2006). Using computed muscle control to generate forward dynamic simulations of human walking from experimental data. J. Biomech. 39, 1107–1115. doi:10.1016/j.jbiomech.2005.02.010
Tucker, M., Cheng, M., Novoseller, E., Cheng, R., Yue, Y., Burdick, J. W., et al. (2020). “Human preference-based learning for high-dimensional optimization of exoskeleton walking gaits,” in Proceedings of the IEEE/RSJ international conference on intelligent robots and systems, 3423–3430.
Wang, W., Li, Y., Zou, T., Wang, X., You, J., and Luo, Y. (2020). A novel image classification approach via dense-mobilenet models. Mob. Inf. Syst. 2020, 7602384–7602388. doi:10.1155/2020/7602384
Westenbroek, T., Castaneda, F., Agrawal, A., Sastry, S., and Sreenath, K. (2022). Lyapunov design for robust and efficient robotic reinforcement learning. arXiv preprint arXiv:2208.06721.
Yi, X., Zhou, Y., Habermann, M., Shimada, S., Golyanik, V., Theobalt, C., et al. (2022). “Physical inertial poser (pip): Physics-aware real-time human motion tracking from sparse inertial sensors,” in Proceedings of the IEEE/CVF conference on computer vision and pattern recognition, 13167–13178.
Keywords: biomechanics, IMU, joint kinematics estimation, lightweight, physics-informed neural network
Citation: Bian Q, Wang H, Alsayed K and Ding Z (2026) OrientationNN: a physics-informed lightweight neural network for real-time joint kinematics estimation from IMU data. Front. Bioeng. Biotechnol. 13:1737916. doi: 10.3389/fbioe.2025.1737916
Received: 02 November 2025; Accepted: 29 December 2025;
Published: 12 January 2026.
Edited by:
Filipa João, University of Lisbon, Portugal
Reviewed by:
Munho Ryu, Jeonbuk National University, Republic of Korea
Diogo Ricardo, Instituto Politécnico de Lisboa, Portugal
Copyright © 2026 Bian, Wang, Alsayed and Ding. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Ziyun Ding, z.ding@bham.ac.uk