
ORIGINAL RESEARCH article

Front. Plant Sci.

Sec. Sustainable and Intelligent Phytoprotection

This article is part of the Research Topic "Advancing Plant Science with UAVs: Precision in Agricultural Sensing, Targeted Protection, and Phenotyping".

Research on the Application of Remote Sensing Image Super-Resolution Reconstruction Techniques in Crop Phenology Extraction

Provisionally accepted
  • 1 Shenyang Agricultural University, Shenyang, China
  • 2 Liaoning Provincial Key Laboratory of Smart Agriculture Technology, Shenyang, China
  • 3 National Digital Agriculture Regional Innovation Center (Northeast), Shenyang, China
  • 4 High-resolution Earth Observation System, Liaoning Forest and Grass Resources and Environment Remote Sensing Research and Application Center, Shenyang, China

The final, formatted version of the article will be published soon.

Crop phenology is one of the most critical physiological attributes of agricultural crops, serving as a direct indicator of growth status throughout the developmental cycle. As phenological research has advanced, satellite remote sensing has become a primary monitoring tool owing to its large spatial coverage and convenient data acquisition. However, the high-resolution satellites essential for precise phenological observation often have long revisit intervals, and adverse atmospheric conditions such as cloud cover frequently render images from multiple dates unusable. As a result, high-resolution time-series data for crop phenology monitoring are typically sparse, limiting the ability to capture rapid phenological changes during the growing season.

To address this challenge, this study takes paddy and dryland fields as experimental sites and proposes a novel method for filling temporal gaps in remote sensing data using generative image processing techniques. Specifically, a lightweight super-resolution generative adversarial network (GAN) is developed for image reconstruction. Using the reconstructed dataset, dense time-series monitoring and phenological metric extraction were conducted throughout the crop growing season. (1) The proposed super-resolution reconstruction method achieves a structural similarity index (SSIM) of 0.834 and a peak signal-to-noise ratio (PSNR) of 28.69 dB, outperforming mainstream approaches in reconstructing heterogeneous remote sensing data. (2) After temporal reconstruction, the mean revisit intervals of usable imagery at the two test sites improved from 6.40 and 6.63 days to 5.70 and 5.88 days, respectively. For the subsequent analysis of phenological metrics, four smoothing techniques were compared, among which Savitzky–Golay filtering yielded the most accurate and robust results.
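The smoothing step described above can be sketched with a Savitzky–Golay filter from SciPy. This is a minimal illustration, not the study's pipeline: the NDVI values and dates below are invented, linear interpolation stands in for the paper's GAN-based gap filling, and the `window_length`/`polyorder` choices are illustrative rather than the parameters used by the authors.

```python
import numpy as np
from scipy.signal import savgol_filter

# Hypothetical sparse NDVI observations (day of year -> NDVI).
# Values are illustrative, not data from the study.
doy_obs = np.array([130, 142, 151, 163, 176, 188, 199, 213, 227, 241, 256, 270])
ndvi_obs = np.array([0.22, 0.30, 0.41, 0.55, 0.68, 0.78,
                     0.83, 0.82, 0.74, 0.60, 0.44, 0.31])

# Resample to a regular daily grid so the filter sees evenly spaced samples
# (linear interpolation here stands in for the paper's generative gap filling).
doy_daily = np.arange(doy_obs.min(), doy_obs.max() + 1)
ndvi_daily = np.interp(doy_daily, doy_obs, ndvi_obs)

# Savitzky-Golay filtering: fit a local polynomial in a sliding window,
# preserving the shape of the seasonal curve while suppressing noise.
ndvi_smooth = savgol_filter(ndvi_daily, window_length=15, polyorder=3)
```

Phenological turning points such as the start and end of the season are then typically read off the smoothed curve, e.g. via thresholds or inflection points.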
Although discrepancies were observed between results obtained from the reconstructed data and those based on the original datasets, the proposed method showed smaller deviations from benchmark datasets. Compared with conventional interpolation-based gap-filling approaches, the framework markedly improved the accuracy of phenological extraction, while also delivering superior spatial resolution and robustness relative to the Harmonized Landsat and Sentinel-2 (HLS) dataset. The experimental results confirm that the proposed approach effectively fills temporal gaps in satellite imagery, enhances data continuity, accurately captures key phenological turning points, and enables precise crop phenology monitoring at high spatial and temporal resolution.
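The SSIM and PSNR scores reported above are standard reconstruction-quality metrics and can be computed with scikit-image. The sketch below uses synthetic arrays as stand-ins for a ground-truth tile and its reconstruction; the array sizes, noise level, and variable names are assumptions for illustration only.

```python
import numpy as np
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

rng = np.random.default_rng(0)

# Stand-ins for a reference high-resolution tile and its reconstruction;
# the study compares super-resolved outputs against real reference imagery.
reference = rng.random((64, 64))
reconstructed = np.clip(reference + rng.normal(0.0, 0.05, reference.shape),
                        0.0, 1.0)

# data_range must match the dynamic range of the images (here 0..1).
ssim = structural_similarity(reference, reconstructed, data_range=1.0)
psnr = peak_signal_noise_ratio(reference, reconstructed, data_range=1.0)
```

Higher is better for both metrics: SSIM approaches 1.0 for structurally identical images, and PSNR (in dB) grows as the pixel-wise error shrinks.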

Keywords: phenology extraction, satellite remote sensing, super-resolution reconstruction, crop phenology, generative adversarial network

Received: 17 Aug 2025; Accepted: 24 Oct 2025.

Copyright: © 2025 Han, Feng, Cai, Li, Du and Xu. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence:
Ziyi Feng, fengziyi@syau.edu.cn
Tongyu Xu, xutongyu@syau.edu.cn

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.