
METHODS article

Front. Plant Sci.

Sec. Technical Advances in Plant Science

This article is part of the Research Topic: Plant Phenotyping for Agriculture

Accurate Detection and Density Estimation of Peach Tree Inflorescences Using an Improved YOLOv11 Model

Provisionally accepted
Jiangtao Ji, Xiaoxuan Lu, Hao Ma, Xinyi Lu, Yaqing Yang, Hongwei Cui*, Meijia Yu and Xuran Xie
  • Henan University of Science and Technology, Luoyang, China

The final, formatted version of the article will be published soon.

Flower thinning plays a vital role in peach production and significantly affects fruit yield and quality. Obtaining precise information about inflorescences is key to scientific thinning and refined orchard management. However, accurate detection of peach inflorescences remains challenging because of complex and changeable lighting conditions, dense occlusion between flowers, and large scale differences in real orchard environments. To address these problems, this study proposes an enhanced YOLOv11s peach inflorescence detection model, termed MDI-YOLOv11, for accurate and stable recognition of flowers and buds. Given the small size and frequent occlusion of peach inflorescences, the model combines a redesigned neck feature fusion structure with a backbone feature attention mechanism. Specifically, the RFCAConv module is added to the backbone network to increase sensitivity to salient regions, while a P2 layer for small-target detection is embedded in the neck network and integrated with the RepGFPN structure to enhance multi-scale feature fusion, thereby improving detection accuracy and adaptability in complex orchard environments. The model's performance was systematically assessed on a self-built dataset of 1,008 images. After sample balancing, the dataset contained 41,962 labeled target instances, including 22,803 flower targets and 19,159 bud targets, covering typical orchard scenes with varying illumination, color characteristics, and high-density occlusion. Five-fold cross-validation showed that MDI-YOLOv11 achieved an AP50 of 0.919 and an AR50 of 0.964 for peach inflorescence detection, with a detection time of 13.46 ms per image, 10.97 million parameters, and a model size of 21.51 MB, all of which meet practical application requirements. Compared with YOLOv11s, MDI-YOLOv11 improved both AP50 and AR50 by 0.033, and it outperformed YOLOv11m in both detection performance and model complexity. Based on the detection results of MDI-YOLOv11, row-by-row inflorescence density distribution maps were generated that intuitively display the spatial distribution of peach inflorescences. These results indicate that the proposed method enables efficient and accurate detection of peach flowers and the generation of inflorescence density maps, providing effective support for refined orchard management.
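To illustrate the final density-mapping step described above, the following is a minimal sketch, not the authors' code, of how per-row flower and bud counts could be aggregated from detector output. It assumes an Ultralytics-style YOLO interface; the weights file, image folder, class-index mapping, and the file-naming scheme that encodes the orchard row are all hypothetical.

```python
# Minimal sketch (assumptions, not the authors' pipeline): aggregate YOLO
# detections into per-row inflorescence counts for a density summary.
# Assumes image names encode the orchard row, e.g. "row03_img012.jpg".
from collections import defaultdict
from pathlib import Path

from ultralytics import YOLO

model = YOLO("mdi_yolov11.pt")          # hypothetical trained weights
class_names = {0: "flower", 1: "bud"}   # assumed class index mapping

row_counts = defaultdict(lambda: {"flower": 0, "bud": 0})

for img_path in sorted(Path("orchard_images").glob("*.jpg")):
    row_id = img_path.stem.split("_")[0]         # e.g. "row03"
    result = model(str(img_path), conf=0.25)[0]  # single-image inference
    for cls_idx in result.boxes.cls.int().tolist():
        label = class_names.get(cls_idx)
        if label is not None:
            row_counts[row_id][label] += 1

# Print a simple per-row summary; an actual density map would further map
# these counts onto tree positions along each row before plotting.
for row_id in sorted(row_counts):
    counts = row_counts[row_id]
    total = counts["flower"] + counts["bud"]
    print(f"{row_id}: {total} inflorescences "
          f"({counts['flower']} flowers, {counts['bud']} buds)")
```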

Keywords: deep learning, inflorescence detection, multi-scale feature fusion, peach tree, YOLOv11

Received: 10 Oct 2025; Accepted: 31 Jan 2026.

Copyright: © 2026 Ji, Lu, Ma, Lu, Yang, Cui, Yu and Xie. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Hongwei Cui

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.