AUTHOR=Wang Jianliang, Chen Chen, Huang Senpeng, Wang Hui, Zhao Yuanyuan, Wang Jiacheng, Yao Zhaosheng, Sun Chengming, Liu Tao
TITLE=Monitoring of agricultural progress in rice-wheat rotation area based on UAV RGB images
JOURNAL=Frontiers in Plant Science
VOLUME=Volume 15 - 2024
YEAR=2025
URL=https://www.frontiersin.org/journals/plant-science/articles/10.3389/fpls.2024.1502863
DOI=10.3389/fpls.2024.1502863
ISSN=1664-462X
ABSTRACT=Real-time monitoring of rice-wheat rotation areas is crucial for improving agricultural productivity and ensuring the overall yield of rice and wheat. However, current monitoring methods rely mainly on manual recording and observation, making the process inefficient, time-consuming, and labor-intensive. By integrating unmanned aerial vehicle (UAV) image analysis with deep learning techniques, we propose a method for precise monitoring of agricultural progress in rice-wheat rotation areas. The method first extracts color, texture, and convolutional features from RGB images for model construction; redundant features are then removed through feature correlation analysis. In addition, activation-layer features suited to agricultural progress classification are identified within the deep learning framework, further improving classification accuracy. The results showed that the classification accuracies obtained by combining Color+Texture, Color+L08CON, Color+ResNet50, and Color+Texture+L08CON with the random forest model were 0.91, 0.99, 0.98, and 0.99, respectively. In contrast, the model using only color features achieved an accuracy of 0.853, significantly lower than the multi-feature combination models. Color feature extraction required the shortest processing time for a single image (0.19 s). The proposed Color+L08CON method achieved high accuracy with a processing time of 1.25 s, much faster than applying a deep learning model directly. This method effectively meets the need for real-time monitoring of agricultural progress.
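The pipeline the abstract describes (hand-crafted image features fed to a random forest classifier) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the color features (per-channel mean and standard deviation), the synthetic image tiles, and the two-class labels are all assumptions for demonstration; the paper's L08CON and ResNet50 features are not reproduced here.

```python
# Hypothetical sketch of feature extraction + random forest classification,
# loosely following the abstract's Color-feature branch. Data is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def color_features(img):
    """Simple color descriptor: per-channel mean and std of an RGB array (H, W, 3)."""
    return np.concatenate([img.mean(axis=(0, 1)), img.std(axis=(0, 1))])

rng = np.random.default_rng(0)
# Synthetic stand-ins for UAV image tiles of two agricultural stages;
# class 1 tiles are shifted brighter so the classes are separable.
images = [rng.integers(0, 256, (64, 64, 3)).astype(float) + 40.0 * label
          for label in (0, 1) for _ in range(20)]
labels = np.array([0] * 20 + [1] * 20)

# Stack one feature vector per image and fit the classifier.
X = np.stack([color_features(img) for img in images])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(clf.score(X, labels))
```

In the paper, this feature vector would be concatenated with texture or activation-layer features before training, and redundant columns would be pruned via correlation analysis beforehand.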