ORIGINAL RESEARCH article
Front. Plant Sci.
Sec. Sustainable and Intelligent Phytoprotection
UAV Multi-Source Data Fusion with Super-Resolution for Accurate Soybean Leaf Area Index Estimation
Provisionally accepted
1 College of Electrical Engineering and Information, Northeast Agricultural University, Harbin 150030, China
2 College of Agriculture, Northeast Agricultural University, Harbin 150030, China
 
Leaf Area Index (LAI) is a key parameter that reflects the structure and growth status of crop canopies. Accurate measurement and assessment of LAI provide valuable insights into crop health. With the advancement of unmanned aerial vehicle (UAV) technology, multispectral sensors mounted on UAVs enable high-throughput estimation of LAI. However, UAV flight altitude directly affects both operational efficiency and image resolution, which in turn influences the accuracy of LAI estimation. In this study, we acquired RGB and multispectral images using UAVs flying at four altitudes: 15 m, 30 m, 45 m, and 60 m. We then conducted super-resolution (SR) image reconstruction and modeling analysis. The main findings are as follows: (1) SR algorithms varied significantly in their ability to reconstruct canopy images, and their effectiveness decreased as flight altitude increased. Among the tested methods, SwinIR provided the best performance in terms of PSNR and SSIM, surpassing Real-ESRGAN, SRCNN, and EDSR. (2) Texture features extracted from RGB images demonstrated strong sensitivity to LAI. The XGBoost algorithm, which integrated both RGB and multispectral data, achieved the highest accuracy, with a relative error of 4.16%, significantly outperforming models based on RGB data alone (5.25%) or multispectral data alone (9.17%). (3) The incorporation of SR techniques significantly enhanced model accuracy at altitudes of 30 m and 45 m. At 30 m, models incorporating Real-ESRGAN and SwinIR achieved an average coefficient of determination (R²) of 0.86; at 45 m, the same SR methods yielded models with an average R² of 0.77. Overall, the results underscore the potential of integrating SR techniques with multi-sensor data fusion to improve the accuracy of soybean LAI estimation. This approach provides a reliable and efficient solution for UAV-based crop monitoring and supports data-driven decision-making in precision agriculture.
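The abstract evaluates SR reconstructions with PSNR and SSIM. As a minimal illustration of how these two metrics are computed (not the authors' pipeline; a standard PSNR plus a simplified single-window SSIM using global image statistics rather than the usual sliding Gaussian window), a NumPy-only sketch might look like this:

```python
import numpy as np

def psnr(ref, img, max_val=255.0):
    # Peak signal-to-noise ratio between a reference and a reconstructed image.
    mse = np.mean((ref.astype(np.float64) - img.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

def global_ssim(ref, img, max_val=255.0):
    # Simplified SSIM computed from global means/variances/covariance
    # (library implementations average SSIM over local windows instead).
    x, y = ref.astype(np.float64), img.astype(np.float64)
    c1, c2 = (0.01 * max_val) ** 2, (0.03 * max_val) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

# Toy example: a "reconstruction" that adds mild noise scores high on both metrics.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64)).astype(np.float64)
rec = np.clip(ref + rng.normal(0.0, 2.0, ref.shape), 0, 255)
print(f"PSNR: {psnr(ref, rec):.1f} dB, SSIM: {global_ssim(ref, rec):.3f}")
```

In practice, library implementations such as `skimage.metrics.peak_signal_noise_ratio` and `skimage.metrics.structural_similarity` would be used; the sketch above only makes the definitions concrete.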
Keywords: UAV remote sensing, machine learning, super-resolution, leaf area index, multi-source data fusion
Received: 07 Sep 2025; Accepted: 04 Nov 2025.
Copyright: © 2025 Zhao, Yao, Zeng, Jiang and Zhang. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence: Xihai Zhang, xhzhang@neau.edu.cn
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.
